WorldWideScience

Sample records for probabilistic source term

  1. Probabilistic source term predictions for use with decision support systems

    International Nuclear Information System (INIS)

    Grindon, E.; Kinniburgh, C.G.

    2003-01-01

    Full text: Decision Support Systems for use in off-site emergency management, following an incident at a Nuclear Power Plant (NPP) within Europe, are becoming accepted as a useful and appropriate tool to aid decision makers. An area that is not so well developed is the 'upstream' prediction of the source term released into the environment. Rapid prediction of this source term is crucial to the appropriate early management of a nuclear emergency. Today, the initial source term prediction would typically be based on simple tabulations taking little, or no, account of plant status. It is the interface between the inward-looking plant control room team and the outward-looking off-site emergency management team that needs to be addressed. This is not an easy proposition, as these two distinct disciplines have little common basis from which to communicate their immediate findings and concerns. Within the Euratom Fifth Framework Programme (FP5), complementary approaches to the pre-release stage are being developed, each based on software tools to help bridge this gap. Traditionally, source terms (or releases into the environment) provided for use with Decision Support Systems are estimated on a deterministic basis. These approaches use a single, deterministic assumption about plant status. The associated source term represents the 'best estimate' based on available information. No information is provided on the potential for uncertainty in the source term estimate. Using probabilistic methods, the outcome is typically a number of possible plant states, each with an associated source term and probability. These represent both the best estimate and the spread of the likely source term. However, this is a novel approach, and the usefulness of such source term prediction tools is yet to be tested on a wide scale. The benefits of probabilistic source term estimation are presented here, using as an example the SPRINT tool developed within the FP5 STERPS project.
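
    To make the contrast between deterministic and probabilistic source term estimation concrete, the sketch below shows how an ensemble of plant states with associated probabilities yields both a best estimate and a spread. This is not the STERPS/SPRINT implementation; all states, probabilities, and release values are invented for illustration.

        import numpy as np

        # Hypothetical plant states with probabilities and I-131 releases (Bq).
        # Values are illustrative only, not from the STERPS project.
        states = {
            "containment_intact": {"prob": 0.90, "release_bq": 1.0e10},
            "filtered_venting":   {"prob": 0.08, "release_bq": 5.0e12},
            "containment_bypass": {"prob": 0.02, "release_bq": 2.0e15},
        }

        probs = np.array([s["prob"] for s in states.values()])
        rels = np.array([s["release_bq"] for s in states.values()])

        mean_release = np.sum(probs * rels)           # probability-weighted best estimate
        order = np.argsort(rels)
        cdf = np.cumsum(probs[order])                 # discrete CDF over release magnitude
        p90 = rels[order][np.searchsorted(cdf, 0.9)]  # 90th percentile of the spread

        print(f"expected release: {mean_release:.3e} Bq, 90th percentile: {p90:.3e} Bq")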

  2. Probabilistic Dose Assessment from SB-LOCA Accident in Ujung Lemahabang Using TMI-2 Source Term

    Directory of Open Access Journals (Sweden)

    Sunarko

    2017-01-01

    Full text available. Probabilistic dose assessment and mapping for a nuclear accident condition are performed for the Ujung Lemahabang site in the Muria Peninsula region of Indonesia. The source term is obtained from inverse modeling of the Three Mile Island Unit 2 (TMI-2) PWR SB-LOCA reactor accident. The effluent consists of Xe-133, Kr-88, I-131, and Cs-137 released from a 50 m stack. A Lagrangian Particle Dispersion Method (LPDM) and a 3-dimensional mass-consistent wind field are employed to obtain the surface-level time-integrated air concentration and the spatial distribution of ground-level total dose in dry conditions. Site-specific meteorological data are taken from hourly records collected during the Site Feasibility Study period in Ujung Lemahabang. The effluent is released from a height of 50 meters at a uniform rate during a 6-hour period, and the dose is integrated over this period in a neutrally stable atmospheric condition. The maximum dose is below the regulatory limit of 1 mSv, and the radioactive plume spreads mostly inland to the W-SW and to the N-NE from the proposed plant toward the Java Sea. This paper demonstrates, for the first time for the Ujung Lemahabang region, a probabilistic method for assessing the possible spatial dose distribution given a hypothetical release and a set of meteorological data.
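
    For illustration, the following is a heavily simplified two-dimensional Lagrangian random-walk dispersion sketch in the spirit of an LPDM; the wind, diffusivity, domain, and grid values are assumptions, not the study's parameters.

        import numpy as np

        rng = np.random.default_rng(0)

        # Greatly simplified 2-D Lagrangian random-walk dispersion sketch
        # (uniform wind, constant eddy diffusivity); illustrative values only.
        n_particles = 20000
        u, v = 3.0, 0.5          # assumed mean wind components (m/s)
        K = 50.0                 # assumed horizontal eddy diffusivity (m^2/s)
        dt, n_steps = 60.0, 360  # 6-hour release, 1-minute steps

        x = np.zeros(n_particles)
        y = np.zeros(n_particles)
        grid = np.zeros((40, 40))               # 20 km x 20 km counting grid
        edges = np.linspace(-10000.0, 10000.0, 41)

        for _ in range(n_steps):
            # advection + random-walk diffusion step
            x += u * dt + rng.normal(0.0, np.sqrt(2 * K * dt), n_particles)
            y += v * dt + rng.normal(0.0, np.sqrt(2 * K * dt), n_particles)
            h, _, _ = np.histogram2d(x, y, bins=[edges, edges])
            grid += h * dt                      # time-integrated particle counts

        # normalise to a unit release to get a relative integrated concentration
        grid /= (n_particles * n_steps * dt)
        print("peak relative time-integrated concentration:", grid.max())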

  3. Some problems in the categorization of source terms

    International Nuclear Information System (INIS)

    Abbey, F.; Dunbar, I.H.; Hayns, M.R.; Nixon, W.

    1985-01-01

    In recent years, techniques for calculating source terms have been considerably improved. It would be unfortunate if the new information were to be blurred by the use of old schemes for the categorization of source terms. In the past, categorization schemes have been devised without the general principles of categorization and the available options being addressed explicitly. In this paper these principles are set out, providing a framework within which categorization schemes used in past probabilistic risk assessments, and possible future improvements, are discussed. In particular, the use of input from scoping consequence calculations in deciding how to group source terms, and the question of how modelling uncertainties may be expressed as uncertainties in the final category source terms, are considered.

  4. Improved Point-source Detection in Crowded Fields Using Probabilistic Cataloging

    Science.gov (United States)

    Portillo, Stephen K. N.; Lee, Benjamin C. G.; Daylan, Tansu; Finkbeiner, Douglas P.

    2017-10-01

    Cataloging is challenging in crowded fields because sources are extremely covariant with their neighbors and blending makes even the number of sources ambiguous. We present the first optical probabilistic catalog, cataloging a crowded (~0.1 sources per pixel brighter than 22nd mag in F606W) Sloan Digital Sky Survey r-band image from M2. Probabilistic cataloging returns an ensemble of catalogs inferred from the image and thus can capture source-source covariance and deblending ambiguities. By comparing to a traditional catalog of the same image and a Hubble Space Telescope catalog of the same region, we show that our catalog ensemble better recovers sources from the image. It goes more than a magnitude deeper than the traditional catalog while having a lower false-discovery rate brighter than 20th mag. We also present an algorithm for reducing this catalog ensemble to a condensed catalog that is similar to a traditional catalog, except that it explicitly marginalizes over source-source covariances and nuisance parameters. We show that this condensed catalog has a similar completeness and false-discovery rate to the catalog ensemble. Future telescopes will be more sensitive, and thus more of their images will be crowded. Probabilistic cataloging performs better than existing software in crowded fields and so should be considered when creating photometric pipelines in the Large Synoptic Survey Telescope era.

  5. Review and evaluation of the Millstone Unit 3 probabilistic safety study. Containment failure modes, radiological source terms and offsite consequences

    International Nuclear Information System (INIS)

    Khatib-Rahbar, M.; Pratt, W.; Ludewig, H.

    1985-09-01

    A technical review and evaluation of the Millstone Unit 3 probabilistic safety study has been performed. It was determined that: (1) long-term damage indices (latent fatalities, person-rem, etc.) are dominated by late failure of the containment, and (2) short-term damage indices (early fatalities, etc.) are dominated by bypass sequences for internally initiated events, while severe seismic sequences can also contribute significantly to early damage indices. These overall estimates of severe accident risk are extremely low compared with other societal sources of risk. Furthermore, the risks for Millstone-3 are comparable to risks from other nuclear plants at high-population sites. Seismically induced accidents dominate the severe accident risks at Millstone-3. Potential mitigative features were shown not to be cost-effective for internal events. A value-impact analysis for seismic events showed that a manually actuated containment spray system might be cost-effective.

  6. Regulatory impact of nuclear reactor accident source term assumptions. Technical report

    International Nuclear Information System (INIS)

    Pasedag, W.F.; Blond, R.M.; Jankowski, M.W.

    1981-06-01

    This report addresses the implications of reactor accident source term assumptions for accident evaluations, regulations and regulatory requirements, engineered safety features, emergency planning, probabilistic risk assessment, and licensing practice. An assessment of the impact of source term modifications and an evaluation of the effects in Design Basis Accident analyses, assuming a change of the chemical form of iodine from elemental iodine to cesium iodide, have been provided. Engineered safety features used in current LWR designs are found to be effective for all postulated combinations of iodine source terms under DBA conditions. In terms of potential accident consequences, it is not expected that the difference in chemical form between elemental iodine and cesium iodide would be significant. In order to account for current information on source terms, a spectrum of accident scenarios is discussed to realistically estimate the source terms resulting from a range of potential accident conditions.

  7. Some practical implications of source term reassessment

    International Nuclear Information System (INIS)

    1988-03-01

    This report provides a brief summary of the current knowledge of severe accident source terms and suggests how this knowledge might be applied to a number of specific aspects of reactor safety. In preparing the report, consideration has been restricted to source term issues relating to light water reactors (LWRs). Consideration has also generally been restricted to the consequences of hypothetical severe accidents rather than their probability of occurrence, although it is recognized that, in the practical application of source term research, it is necessary to take account of probability as well as consequences. The specific areas identified were as follows: Exploration of the new insights that are available into the management of severe accidents; Investigating the impact of source term research on emergency planning and response; Assessing the possibilities which exist in present reactor designs for preventing or mitigating the consequences of severe accidents and how these might be used effectively; Exploring the need for backfitting and assessing the implications of source term research for future designs; and Improving the quantification of the radiological consequences of hypothetical severe accidents for probabilistic safety assessments (PSAs) and informing the public about the realistic risks associated with nuclear power plants. 7 refs

  8. Fully probabilistic seismic source inversion – Part 1: Efficient parameterisation

    Directory of Open Access Journals (Sweden)

    S. C. Stähler

    2014-11-01

    Full text available. Seismic source inversion is a non-linear problem in seismology where not just the earthquake parameters themselves but also estimates of their uncertainties are of great practical importance. Probabilistic source inversion (Bayesian inference) is well suited to this challenge, provided that the parameter space can be chosen small enough to make Bayesian sampling computationally feasible. We propose a framework for PRobabilistic Inference of Seismic source Mechanisms (PRISM) that parameterises and samples earthquake depth, moment tensor, and source time function efficiently by using information from previous non-Bayesian inversions. The source time function is expressed as a weighted sum of a small number of empirical orthogonal functions, which were derived from a catalogue of >1000 source time functions (STFs) by principal component analysis. We use a likelihood model based on the cross-correlation misfit between observed and predicted waveforms. The resulting ensemble of solutions provides full uncertainty and covariance information for the source parameters, and permits propagating these source uncertainties into travel time estimates used for seismic tomography. The computational effort is such that routine, global estimation of earthquake mechanisms and source time functions from teleseismic broadband waveforms is feasible.
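
    The EOF parameterisation of source time functions can be sketched with a toy catalogue. The triangular pulses and the choice of five components below are illustrative stand-ins, not the real STF catalogue used by PRISM.

        import numpy as np

        rng = np.random.default_rng(1)

        # Toy catalogue of source time functions (rows), standing in for the
        # >1000 real STFs: triangular pulses with random widths.
        t = np.linspace(0.0, 50.0, 200)
        catalogue = np.array([
            np.interp(t, [0, w, 2 * w], [0.0, 1.0, 0.0], right=0.0)
            for w in rng.uniform(3.0, 20.0, 500)
        ])

        # Empirical orthogonal functions via SVD of the demeaned catalogue.
        mean_stf = catalogue.mean(axis=0)
        _, sv, vt = np.linalg.svd(catalogue - mean_stf, full_matrices=False)
        eofs = vt[:5]                      # keep a small number of EOFs

        # Any STF is then parameterised by 5 weights instead of 200 samples.
        target = catalogue[0]
        weights = eofs @ (target - mean_stf)
        reconstruction = mean_stf + weights @ eofs
        print("relative misfit:", np.linalg.norm(target - reconstruction)
              / np.linalg.norm(target))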

  9. Comparing Categorical and Probabilistic Fingerprint Evidence.

    Science.gov (United States)

    Garrett, Brandon; Mitchell, Gregory; Scurich, Nicholas

    2018-04-23

    Fingerprint examiners traditionally express conclusions in categorical terms, opining that impressions do or do not originate from the same source. Recently, probabilistic conclusions have been proposed, with examiners estimating the probability of a match between recovered and known prints. This study presented a nationally representative sample of jury-eligible adults with a hypothetical robbery case in which an examiner opined on the likelihood that a defendant's fingerprints matched latent fingerprints in categorical or probabilistic terms. We studied model language developed by the U.S. Defense Forensic Science Center to summarize results of statistical analysis of the similarity between prints. Participant ratings of the likelihood the defendant left prints at the crime scene and committed the crime were similar when exposed to categorical and strong probabilistic match evidence. Participants reduced these likelihoods when exposed to the weaker probabilistic evidence, but did not otherwise discriminate among the prints assigned different match probabilities. © 2018 American Academy of Forensic Sciences.

  10. Development of a Risk-Based Probabilistic Performance-Assessment Method for Long-Term Cover Systems - 2nd Edition

    International Nuclear Information System (INIS)

    Ho, Clifford K.; Arnold, Bill W.; Cochran, John R.; Taira, Randal Y.

    2002-01-01

    A probabilistic, risk-based performance-assessment methodology has been developed to assist designers, regulators, and stakeholders in the selection, design, and monitoring of long-term covers for contaminated subsurface sites. This report describes the method, the software tools that were developed, and an example that illustrates the probabilistic performance-assessment method using a repository site in Monticello, Utah. At the Monticello site, a long-term cover system is being used to isolate long-lived uranium mill tailings from the biosphere. Computer models were developed to simulate relevant features, events, and processes, including water flux through the cover, source-term release, vadose-zone transport, saturated-zone transport, gas transport, and exposure pathways. The component models were then integrated into a total-system performance-assessment model, and uncertainty distributions of important input parameters were constructed and sampled in a stochastic Monte Carlo analysis. Multiple realizations were simulated using the integrated model to produce cumulative distribution functions of the performance metrics, which were used to assess cover performance for both present and long-term future conditions. Performance metrics for this study included the water percolation reaching the uranium mill tailings, radon gas flux at the surface, groundwater concentrations, and dose. Results from uncertainty analyses, sensitivity analyses, and alternative design comparisons are presented for each of the performance metrics. The benefits of this methodology include a quantification of uncertainty, the identification of the parameters most important to performance (to prioritize site characterization and monitoring activities), and the ability to compare alternative designs using probabilistic evaluations of performance (for cost savings).
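
    A minimal sketch of the stochastic Monte Carlo step described above, with invented input distributions and a stand-in percolation model rather than the calibrated Monticello components:

        import numpy as np

        rng = np.random.default_rng(42)
        n = 10000  # Monte Carlo realizations

        # Illustrative uncertain inputs (distributions are invented placeholders).
        sat_k = rng.lognormal(mean=np.log(1e-7), sigma=1.0, size=n)  # m/s
        precip = rng.normal(380.0, 60.0, size=n)                     # mm/yr
        frac = rng.beta(2.0, 50.0, size=n)                           # percolating fraction

        # Stand-in component chain: percolation flux through the cover (mm/yr),
        # capped by what the underlying material can transmit (m/s -> mm/yr).
        percolation = np.minimum(precip * frac, sat_k * 1000.0 * 3.15e7)

        # Cumulative distribution function of the performance metric.
        xs = np.sort(percolation)
        cdf = np.arange(1, n + 1) / n
        p95 = xs[np.searchsorted(cdf, 0.95)]
        print(f"median {np.median(percolation):.2f} mm/yr, 95th pct {p95:.2f} mm/yr")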

  11. Using Bayesian Belief Network (BBN) modelling for Rapid Source Term Prediction. RASTEP Phase 1

    International Nuclear Information System (INIS)

    Knochenhauer, M.; Swaling, V.H.; Alfheim, P.

    2012-09-01

    The project is connected to the development of RASTEP, a computerized source term prediction tool aimed at providing a basis for improving off-site emergency management. RASTEP uses Bayesian belief networks (BBN) to model severe accident progression in a nuclear power plant, in combination with pre-calculated source terms (i.e., the amount, timing, and pathway of released radionuclides). The output is a set of possible source terms with associated probabilities. In the NKS project, a number of complex issues associated with the integration of probabilistic and deterministic analyses are addressed. These include issues related to the method for estimating source terms, signal validation, and sensitivity analysis. One major task within Phase 1 of the project addressed the problem of how to make the source term module flexible enough to give reliable and valid output throughout the accident scenario. Of the alternatives evaluated, it is recommended that RASTEP be connected to a fast-running source term prediction code, e.g., MARS, with a possibility of updating source terms based on real-time observations. (Author)
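
    The core inference step can be illustrated with a minimal discrete Bayes update over plant states. This is a sketch only; the states, probabilities, and observation below are invented and far simpler than the RASTEP BBN.

        # Minimal Bayesian-network-style update, in the spirit of (but far
        # simpler than) the RASTEP plant model; all numbers are invented.
        plant_states = {
            # state: (prior, P(high_drywell_pressure | state), source term label)
            "intact":        (0.90, 0.05, "negligible release"),
            "vented":        (0.08, 0.70, "filtered release"),
            "early_failure": (0.02, 0.95, "large early release"),
        }

        observation = True  # high drywell pressure observed

        # Bayes' rule over the discrete plant states.
        joint = {s: p * (lik if observation else 1 - lik)
                 for s, (p, lik, _) in plant_states.items()}
        z = sum(joint.values())
        for state, (_, _, term) in plant_states.items():
            print(f"P({state}) = {joint[state] / z:.3f} -> {term}")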

  12. Using Bayesian Belief Network (BBN) modelling for Rapid Source Term Prediction. RASTEP Phase 1

    Energy Technology Data Exchange (ETDEWEB)

    Knochenhauer, M.; Swaling, V.H.; Alfheim, P. [Scandpower AB, Sundbyberg (Sweden)]

    2012-09-15

    The project is connected to the development of RASTEP, a computerized source term prediction tool aimed at providing a basis for improving off-site emergency management. RASTEP uses Bayesian belief networks (BBN) to model severe accident progression in a nuclear power plant, in combination with pre-calculated source terms (i.e., the amount, timing, and pathway of released radionuclides). The output is a set of possible source terms with associated probabilities. In the NKS project, a number of complex issues associated with the integration of probabilistic and deterministic analyses are addressed. These include issues related to the method for estimating source terms, signal validation, and sensitivity analysis. One major task within Phase 1 of the project addressed the problem of how to make the source term module flexible enough to give reliable and valid output throughout the accident scenario. Of the alternatives evaluated, it is recommended that RASTEP be connected to a fast-running source term prediction code, e.g., MARS, with a possibility of updating source terms based on real-time observations. (Author)

  13. Application of the source term code package to obtain a specific source term for the Laguna Verde Nuclear Power Plant

    International Nuclear Information System (INIS)

    Souto, F.J.

    1991-06-01

    The main objective of the project was to use the Source Term Code Package (STCP) to obtain a specific source term for those accident sequences deemed dominant as a result of probabilistic safety analyses (PSA) for the Laguna Verde Nuclear Power Plant (CNLV). The following programme has been carried out to meet this objective: (a) implementation of the STCP, (b) acquisition of specific data for CNLV to execute the STCP, and (c) calculation of specific source terms for accident sequences at CNLV. The STCP has been implemented and validated on CDC 170/815 and CDC 180/860 mainframes as well as on a MicroVAX 3800 system. In order to obtain a plant-specific source term, data on the CNLV, including the initial core inventory, burn-up, primary containment structures, and materials, have been obtained and used for the calculations. Because STCP does not explicitly model containment failure, drywell failure in the form of a catastrophic rupture has been assumed. One of the most significant sequences from the point of view of possible off-site risk is the loss of off-site power with failure of the diesel generators and simultaneous loss of the high pressure core spray and reactor core isolation cooling systems. The probability of that event is approximately 4.5 × 10⁻⁶. This sequence has been analysed in detail, and the release fractions of radioisotope groups are given in the full report. 18 refs, 4 figs, 3 tabs

  14. Estimation of Source terms for Emergency Planning and Preparedness

    Energy Technology Data Exchange (ETDEWEB)

    Yi, Chul Un; Chung, Bag Soon; Ahn, Jae Hyun; Yoon, Duk Ho; Jeong, Chul Young; Lim, Jong Dae [Korea Electric Power Research Institute, Taejon (Korea, Republic of)]; Kang, Sun Gu; Suk, Ho; Park, Sung Kyu; Lim, Hac Kyu; Lee, Kwang Nam [Korea Power Engineering Company Consulting and Architecture Engineers (Korea, Republic of)]

    1997-12-31

    In this study, the severe accident sequences for each plant of concern (those with a high core damage frequency and significant accident consequences) were selected based on the results of probabilistic safety assessments, and the source terms and time-histories of various safety parameters under severe accidents were evaluated. Accident progression analysis for each selected accident sequence was performed with the MAAP code. It was determined that the measured values, dose rate and radioisotope concentration, could provide information to the operators on the occurrence and timing of core damage, reactor vessel failure, and containment failure during severe accidents. The radioactive concentration in the containment atmosphere, which may be measured by the PASS, was estimated. The estimated radioisotope concentrations can be used in emergency planning, evaluation of source term behavior in the containment, estimation of the degree and timing of core damage, analysis of severe accident phenomena, and estimation of the amount of radioisotopes released to the environment. (author). 50 refs., 60 figs.

  15. Uncertainty Quantification in Earthquake Source Characterization with Probabilistic Centroid Moment Tensor Inversion

    Science.gov (United States)

    Dettmer, J.; Benavente, R. F.; Cummins, P. R.

    2017-12-01

    This work considers probabilistic, non-linear centroid moment tensor inversion of data from earthquakes at teleseismic distances. The moment tensor is treated as deviatoric, and the centroid location is parametrized with fully unknown latitude, longitude, depth and time delay. The inverse problem is treated as fully non-linear in a Bayesian framework, and the posterior density is estimated with interacting Markov chain Monte Carlo methods which are implemented in parallel and allow for chain interaction. The source mechanism and location, including uncertainties, are fully described by the posterior probability density, and complex trade-offs between various metrics are studied. These include the percentage of double-couple component as well as fault orientation, and the probabilistic results are compared to results from earthquake catalogs. Additional focus is on the analysis of complex events which are commonly not well described by a single point source. These events are studied by jointly inverting for multiple centroid moment tensor solutions. The optimal number of sources is estimated by the Bayesian information criterion to ensure parsimonious solutions. [Supported by NSERC.]
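
    The model-selection step can be illustrated with the generic Bayesian information criterion under a Gaussian error model. This is a sketch with invented misfits and parameter counts, not the authors' implementation.

        import numpy as np

        def bic(residuals, k):
            # BIC under an i.i.d. Gaussian error model:
            # BIC = n*ln(RSS/n) + k*ln(n), additive constants dropped.
            n = residuals.size
            rss = np.sum(residuals ** 2)
            return n * np.log(rss / n) + k * np.log(n)

        # Invented waveform misfits for one- and two-source parameterisations.
        rng = np.random.default_rng(3)
        res_one = rng.normal(0.0, 1.2, 500)   # worse fit, fewer parameters
        res_two = rng.normal(0.0, 1.0, 500)   # better fit, more parameters

        # A deviatoric CMT has ~9 unknowns (5 MT components + lat, lon, depth, delay).
        print("1 source:", bic(res_one, 9))
        print("2 sources:", bic(res_two, 18))  # prefer the lower BIC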

  16. Probabilistic composition of preferences, theory and applications

    CERN Document Server

    Parracho Sant'Anna, Annibal

    2015-01-01

    Putting forward a unified presentation of the features and possible applications of probabilistic preferences composition, and serving as a methodology for decisions employing multiple criteria, this book maximizes reader insights into evaluation in probabilistic terms and into the development of composition approaches that do not depend on assigning weights to the criteria. With key applications in important areas of management such as failure modes and effects analysis and productivity analysis, together with explanations about the application of the concepts involved, this book makes available numerical examples of probabilistic transformation development and probabilistic composition. Useful not only as a reference source for researchers but also for teaching graduate courses in Production Engineering and Management Science, the key themes of the book will be of especial interest to researchers in the field of Operational Research.

  17. From probabilistic forecasts to statistical scenarios of short-term wind power production

    DEFF Research Database (Denmark)

    Pinson, Pierre; Papaefthymiou, George; Klockl, Bernd

    2009-01-01

    Short-term (up to 2-3 days ahead) probabilistic forecasts of wind power provide forecast users with highly valuable information on the uncertainty of expected wind generation. Whatever the type of these probabilistic forecasts, they are produced on a per-horizon basis, and hence do not inform on the development of the forecast uncertainty through forecast series. However, this additional information may be paramount for a large class of time-dependent and multistage decision-making problems, e.g. optimal operation of combined wind-storage systems or multiple-market trading with different gate closures. This issue is addressed here by describing a method that permits the generation of statistical scenarios of short-term wind generation that accounts for both the interdependence structure of prediction errors and the predictive distributions of wind power production. The method is based on the conversion…
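
    A sketch of one possible such conversion using a Gaussian copula: correlated normal draws are pushed through the predictive marginals to produce scenarios. The Beta marginals and the exponential correlation structure across horizons below are invented assumptions, not the paper's fitted quantities.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(7)

        # Predictive marginals for horizons 1..H, here Beta distributions as
        # stand-ins for the wind power predictive densities (invented numbers).
        H = 24
        a = np.linspace(2.0, 1.2, H)
        b = np.linspace(5.0, 2.0, H)

        # Interdependence structure of prediction errors: an exponential
        # correlation across horizons (assumed, for illustration).
        idx = np.arange(H)
        corr = np.exp(-np.abs(idx[:, None] - idx[None, :]) / 6.0)

        # Gaussian copula: correlated normals -> uniforms -> marginal quantiles.
        z = rng.multivariate_normal(np.zeros(H), corr, size=100)
        u = stats.norm.cdf(z)
        scenarios = stats.beta.ppf(u, a, b)   # 100 scenarios, each of length H
        print(scenarios.shape, scenarios[0, :5])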

  18. Is Probabilistic Evidence a Source of Knowledge?

    Science.gov (United States)

    Friedman, Ori; Turri, John

    2015-01-01

    We report a series of experiments examining whether people ascribe knowledge for true beliefs based on probabilistic evidence. Participants were less likely to ascribe knowledge for beliefs based on probabilistic evidence than for beliefs based on perceptual evidence (Experiments 1 and 2A) or testimony providing causal information (Experiment 2B).

  19. Advanced Reactor PSA Methodologies for System Reliability Analysis and Source Term Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Grabaskas, D.; Brunett, A.; Passerini, S.; Grelle, A.; Bucknor, M.

    2017-06-26

    Beginning in 2015, a project was initiated to update and modernize the probabilistic safety assessment (PSA) of the GE-Hitachi PRISM sodium fast reactor. This project is a collaboration between GE-Hitachi and Argonne National Laboratory (Argonne), and funded in part by the U.S. Department of Energy. Specifically, the role of Argonne is to assess the reliability of passive safety systems, complete a mechanistic source term calculation, and provide component reliability estimates. The assessment of passive system reliability focused on the performance of the Reactor Vessel Auxiliary Cooling System (RVACS) and the inherent reactivity feedback mechanisms of the metal fuel core. The mechanistic source term assessment attempted to provide a sequence specific source term evaluation to quantify offsite consequences. Lastly, the reliability assessment focused on components specific to the sodium fast reactor, including electromagnetic pumps, intermediate heat exchangers, the steam generator, and sodium valves and piping.

  20. Probabilistic tsunami hazard assessment at Seaside, Oregon, for near- and far-field seismic sources

    Science.gov (United States)

    Gonzalez, F.I.; Geist, E.L.; Jaffe, B.; Kanoglu, U.; Mofjeld, H.; Synolakis, C.E.; Titov, V.V.; Areas, D.; Bellomo, D.; Carlton, D.; Horning, T.; Johnson, J.; Newman, J.; Parsons, T.; Peters, R.; Peterson, C.; Priest, G.; Venturato, A.; Weber, J.; Wong, F.; Yalciner, A.

    2009-01-01

    The first probabilistic tsunami flooding maps have been developed. The methodology, called probabilistic tsunami hazard assessment (PTHA), integrates tsunami inundation modeling with methods of probabilistic seismic hazard assessment (PSHA). Application of the methodology to Seaside, Oregon, has yielded estimates of the spatial distribution of 100- and 500-year maximum tsunami amplitudes, i.e., amplitudes with 1% and 0.2% annual probability of exceedance. The 100-year tsunami is generated most frequently by far-field sources in the Alaska-Aleutian Subduction Zone and is characterized by maximum amplitudes that do not exceed 4 m, with an inland extent of less than 500 m. In contrast, the 500-year tsunami is dominated by local sources in the Cascadia Subduction Zone and is characterized by maximum amplitudes in excess of 10 m and an inland extent of more than 1 km. The primary sources of uncertainty in these results include those associated with interevent time estimates, modeling of background sea level, and accounting for temporal changes in bathymetry and topography. Nonetheless, PTHA represents an important contribution to tsunami hazard assessment techniques; viewed in the broader context of risk analysis, PTHA provides a method for quantifying estimates of the likelihood and severity of the tsunami hazard, which can then be combined with vulnerability and exposure to yield estimates of tsunami risk. Copyright 2009 by the American Geophysical Union.
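
    The probability arithmetic behind the 100- and 500-year amplitudes is the standard Poisson exceedance calculation, sketched below for a 50-year exposure window (the window length is an illustrative choice, not from the study):

        import numpy as np

        # Converting an annual exceedance rate to an exceedance probability over
        # a design life, assuming Poisson occurrence (standard PTHA arithmetic).
        rate_100yr = 1.0 / 100.0   # 1% annual probability of exceedance
        rate_500yr = 1.0 / 500.0   # 0.2% annual probability of exceedance

        for name, lam in [("100-year", rate_100yr), ("500-year", rate_500yr)]:
            p50 = 1.0 - np.exp(-lam * 50.0)   # chance of exceedance in 50 years
            print(f"{name} tsunami: P(exceedance in 50 yr) = {p50:.1%}")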

  1. A probabilistic analysis of cumulative carbon emissions and long-term planetary warming

    International Nuclear Information System (INIS)

    Fyke, Jeremy; Matthews, H Damon

    2015-01-01

    Efforts to mitigate and adapt to long-term climate change could benefit greatly from probabilistic estimates of cumulative carbon emissions due to fossil fuel burning and of the resulting CO2-induced planetary warming. Here we demonstrate the use of a reduced-form model to project these variables. We performed simulations using a large-ensemble framework with parametric uncertainty sampled to produce distributions of future cumulative emissions and consequent planetary warming. A hindcast ensemble of simulations captured 1980–2012 historical CO2 emissions trends, and an ensemble of future projection simulations generated a distribution of emission scenarios that qualitatively resembled the suite of Representative and Extended Concentration Pathways. The resulting cumulative carbon emission and temperature change distributions are characterized by 5–95th percentile ranges of 0.96–4.9 teratonnes C (Tt C) and 1.4 °C–8.5 °C, respectively, with 50th percentiles at 3.1 Tt C and 4.7 °C. Within the wide range of policy-related parameter combinations that produced these distributions, we found that low-emission simulations were characterized by both high carbon prices and low costs of non-fossil fuel energy sources, suggesting the importance of these two policy levers in particular for avoiding dangerous levels of climate warming. With this analysis we demonstrate a probabilistic approach to the challenge of identifying strategies for limiting cumulative carbon emissions and assessing likelihoods of surpassing dangerous temperature thresholds. (letter)

  2. A probabilistic justification for using tf.idf term weighting in information retrieval

    NARCIS (Netherlands)

    Hiemstra, Djoerd

    2000-01-01

    This paper presents a new probabilistic model of information retrieval. The most important modeling assumption made is that documents and queries are defined by an ordered sequence of single terms. This assumption is not made in well-known existing models of information retrieval, but is essential…
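
    A minimal sketch of a unigram language-model retrieval score of this general kind, interpolating document (tf-like) and collection (idf-like) term statistics; the toy corpus and smoothing weight below are invented, not the paper's experimental setup.

        import math

        # Score a query against each document by interpolating document and
        # collection term probabilities (a linear-interpolation language model).
        docs = {
            "d1": "probabilistic model of information retrieval".split(),
            "d2": "term weighting for text retrieval".split(),
        }
        collection = [t for d in docs.values() for t in d]
        lam = 0.5  # interpolation weight between document and collection models

        def score(query, doc):
            s = 0.0
            for t in query:
                p_doc = doc.count(t) / len(doc)                 # tf component
                p_col = collection.count(t) / len(collection)   # idf-like component
                s += math.log(lam * p_doc + (1 - lam) * p_col)
            return s

        q = "probabilistic retrieval".split()
        print({name: round(score(q, d), 3) for name, d in docs.items()})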

  3. Very-short-term wind power probabilistic forecasts by sparse vector autoregression

    DEFF Research Database (Denmark)

    Dowell, Jethro; Pinson, Pierre

    2016-01-01

    A spatio-temporal method for producing very-short-term parametric probabilistic wind power forecasts at a large number of locations is presented. Smart grids containing tens, or hundreds, of wind generators require skilled very-short-term forecasts to operate effectively, and spatial information is highly desirable. In addition, probabilistic forecasts are widely regarded as necessary for optimal power system management as they quantify the uncertainty associated with point forecasts. Here we work within a parametric framework based on the logit-normal distribution and forecast its parameters. The location parameter for multiple wind farms is modelled as a vector-valued spatio-temporal process, and the scale parameter is tracked by modified exponential smoothing. A state-of-the-art technique for fitting sparse vector autoregressive models is employed to model the location parameter and demonstrates…
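
    A toy sketch of this forecasting scheme under stated assumptions: the (sparse) VAR(1) coefficient matrix and the scale value below are invented, not fitted quantities.

        import numpy as np

        rng = np.random.default_rng(11)

        def logit(p):       # wind power in (0,1) -> real line
            return np.log(p / (1.0 - p))

        def inv_logit(y):
            return 1.0 / (1.0 + np.exp(-y))

        # Toy VAR(1) for the location parameter at 3 wind farms.
        A = np.array([[0.7, 0.1, 0.0],
                      [0.0, 0.8, 0.1],
                      [0.1, 0.0, 0.7]])

        y = logit(np.array([0.4, 0.5, 0.6]))   # current normalised power
        scale = 0.3                            # would be tracked by smoothing

        # One-step-ahead probabilistic forecast: propagate the location and
        # sample the logit-normal predictive distribution.
        loc = A @ y
        samples = inv_logit(rng.normal(loc, scale, size=(1000, 3)))
        print("5%/50%/95% quantiles per farm:")
        print(np.percentile(samples, [5, 50, 95], axis=0).round(3))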

  4. Probabilistic forward model for electroencephalography source analysis

    International Nuclear Information System (INIS)

    Plis, Sergey M; George, John S; Jun, Sung C; Ranken, Doug M; Volegov, Petr L; Schmidt, David M

    2007-01-01

    Source localization by electroencephalography (EEG) requires an accurate model of head geometry and tissue conductivity. The estimation of source time courses from EEG, or from EEG in conjunction with magnetoencephalography (MEG), requires a forward model consistent with true activity for the best outcome. Although MRI provides an excellent description of soft tissue anatomy, a high-resolution model of the skull (the dominant resistive component of the head) requires CT, which is not justified for routine physiological studies. Although a number of techniques have been employed to estimate tissue conductivity, no present techniques provide the noninvasive 3D tomographic mapping of conductivity that would be desirable. We introduce a formalism for probabilistic forward modeling that allows the propagation of uncertainties in model parameters into possible errors in source localization. We consider uncertainties in the conductivity profile of the skull, but the approach is general and can be extended to other kinds of uncertainties in the forward model. We and others have previously suggested the possibility of extracting the conductivity of the skull from measured electroencephalography data by simultaneously optimizing over dipole parameters and the conductivity values required by the forward model. Using Cramér-Rao bounds, we demonstrate that this approach does not improve localization results nor does it produce reliable conductivity estimates. We conclude that the conductivity of the skull has to be either accurately measured by an independent technique, or that the uncertainties in the conductivity values should be reflected in uncertainty in the source location estimates.

  5. Probabilist methods applied to electric source problems in nuclear safety

    International Nuclear Information System (INIS)

    Carnino, A.; Llory, M.

    1979-01-01

    Nuclear safety has frequently been asked to quantify safety margins and evaluate hazards. For this purpose, probabilistic methods have proved to be the most promising. Without completely replacing deterministic safety analysis, they are now commonly used at the reliability or availability stages of systems, as well as for determining the likely accident sequences. In this paper an application linked to the problem of electric sources is described, while at the same time indicating the methods used. This is the calculation of the probable loss of all the electric sources of a pressurized water nuclear power station, the evaluation of the reliability of the diesel generators by failure event trees, and the determination of the accident sequences which could be brought about by the 'total loss of electric sources' initiator and affect the installation or the environment. [fr]

  6. Using Bayesian Belief Network (BBN) modelling for rapid source term prediction. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Knochenhauer, M.; Swaling, V.H.; Dedda, F.D.; Hansson, F.; Sjoekvist, S.; Sunnegaerd, K. [Lloyd's Register Consulting AB, Sundbyberg (Sweden)]

    2013-10-15

    The project presented in this report deals with a number of complex issues related to the development of a tool for rapid source term prediction (RASTEP), based on a plant model represented as a Bayesian belief network (BBN) and a source term module which is used for assigning relevant source terms to BBN end states. Thus, RASTEP uses a BBN to model severe accident progression in a nuclear power plant in combination with pre-calculated source terms (i.e., the amount, composition, timing, and release path of released radionuclides). The output is a set of possible source terms with associated probabilities. One major set of issues is associated with the integration of probabilistic and deterministic analyses, dealing with the challenge of making the source term determination flexible enough to give reliable and valid output throughout the accident scenario. The potential for connecting RASTEP to a fast-running source term prediction code has been explored, as well as alternative ways of improving the deterministic connections of the tool. As part of the investigation, a comparison of two deterministic severe accident analysis codes has been performed. A second important task has been to develop a general method by which experts' beliefs can be included in a systematic way when defining the conditional probability tables (CPTs) in the BBN. Using this iterative method results in a reliable BBN even though expert judgements, with their associated uncertainties, have been used. It also simplifies verification and validation of the considerable amounts of quantitative data included in a BBN. (Author)

  7. Using Bayesian Belief Network (BBN) modelling for rapid source term prediction. Final report

    International Nuclear Information System (INIS)

    Knochenhauer, M.; Swaling, V.H.; Dedda, F.D.; Hansson, F.; Sjoekvist, S.; Sunnegaerd, K.

    2013-10-01

    The project presented in this report deals with a number of complex issues related to the development of a tool for rapid source term prediction (RASTEP), based on a plant model represented as a Bayesian belief network (BBN) and a source term module which is used for assigning relevant source terms to BBN end states. Thus, RASTEP uses a BBN to model severe accident progression in a nuclear power plant in combination with pre-calculated source terms (i.e., the amount, composition, timing, and release path of released radionuclides). The output is a set of possible source terms with associated probabilities. One major set of issues is associated with the integration of probabilistic and deterministic analyses, dealing with the challenge of making the source term determination flexible enough to give reliable and valid output throughout the accident scenario. The potential for connecting RASTEP to a fast-running source term prediction code has been explored, as well as alternative ways of improving the deterministic connections of the tool. As part of the investigation, a comparison of two deterministic severe accident analysis codes has been performed. A second important task has been to develop a general method by which experts' beliefs can be included in a systematic way when defining the conditional probability tables (CPTs) in the BBN. Using this iterative method results in a reliable BBN even though expert judgements, with their associated uncertainties, have been used. It also simplifies verification and validation of the considerable amounts of quantitative data included in a BBN. (Author)

  8. Input to the PRAST computer code used in the SRS probabilistic risk assessment

    International Nuclear Information System (INIS)

    Kearnaghan, D.P.

    1992-01-01

    The PRAST (Production Reactor Algorithm for Source Terms) computer code was developed by Westinghouse Savannah River Company and Science Applications International Corporation for the quantification of source terms for the Savannah River Site (SRS) Reactor Probabilistic Risk Assessment. PRAST requires as input a set of release fractions, decontamination factors, transfer fractions, and source term characteristics that accurately reflect the conditions that are evaluated by PRAST. This document links the analyses which form the basis for the PRAST input parameters. In addition, it gives the distributions of those input parameters that are uncertain and considered to be important to the evaluation of the source terms to the environment.
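
    The bookkeeping implied by these inputs can be sketched as a simple chain from core inventory to environmental release; all numbers below are invented placeholders, not PRAST data.

        # Generic source-term bookkeeping of the kind PRAST's inputs describe.
        inventory_bq = 5.0e18       # core inventory of a nuclide group (invented)
        release_fraction = 0.20     # fraction released from fuel
        transfer_fraction = 0.50    # fraction transported to the release point
        decontamination = 100.0     # decontamination factor along the pathway

        source_term_bq = (inventory_bq * release_fraction
                          * transfer_fraction / decontamination)
        print(f"release to environment: {source_term_bq:.2e} Bq")  # 5.00e+15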

  9. Evaluating bacterial gene-finding HMM structures as probabilistic logic programs.

    Science.gov (United States)

    Mørk, Søren; Holmes, Ian

    2012-03-01

    Probabilistic logic programming offers a powerful way to describe and evaluate structured statistical models. To investigate the practicality of probabilistic logic programming for structure learning in bioinformatics, we undertook a simplified bacterial gene-finding benchmark in PRISM, a probabilistic dialect of Prolog. We evaluate Hidden Markov Model structures for bacterial protein-coding gene potential, including a simple null model structure, three structures based on existing bacterial gene finders, and two novel model structures. We test standard versions as well as ADPH length modeling and three-state versions of the five model structures. The models are all represented as probabilistic logic programs and evaluated using the PRISM machine learning system in terms of statistical information criteria and gene-finding prediction accuracy, in two bacterial genomes. Neither of our implementations of the two most widely used model structures performs best in terms of statistical information criteria or prediction performance, suggesting that better-fitting models might be achievable. The source code of all PRISM models, data and additional scripts are freely available for download at: http://github.com/somork/codonhmm. Supplementary data are available at Bioinformatics online.

  10. Conditional Probabilistic Population Forecasting

    OpenAIRE

    Sanderson, Warren C.; Scherbov, Sergei; O'Neill, Brian C.; Lutz, Wolfgang

    2004-01-01

    Since policy-makers often prefer to think in terms of alternative scenarios, the question has arisen as to whether it is possible to make conditional population forecasts in a probabilistic context. This paper shows that it is both possible and useful to make these forecasts. We do this with two different kinds of examples. The first is the probabilistic analog of deterministic scenario analysis. Conditional probabilistic scenario analysis is essential for policy-makers because...

  11. From sub-source to source: Interpreting results of biological trace investigations using probabilistic models

    NARCIS (Netherlands)

    Oosterman, W.T.; Kokshoorn, B.; Maaskant-van Wijk, P.A.; de Zoete, J.

    2015-01-01

    The current method of reporting a putative cell type is based on a non-probabilistic assessment of test results by the forensic practitioner. Additionally, the association between donor and cell type in mixed DNA profiles can be exceedingly complex. We present a probabilistic model for…

  12. Conditional Probabilistic Population Forecasting

    OpenAIRE

    Sanderson, W.C.; Scherbov, S.; O'Neill, B.C.; Lutz, W.

    2003-01-01

    Since policy makers often prefer to think in terms of scenarios, the question has arisen as to whether it is possible to make conditional population forecasts in a probabilistic context. This paper shows that it is both possible and useful to make these forecasts. We do this with two different kinds of examples. The first is the probabilistic analog of deterministic scenario analysis. Conditional probabilistic scenario analysis is essential for policy makers because it allows them to answer "what if"...

  13. Conditional probabilistic population forecasting

    OpenAIRE

    Sanderson, Warren; Scherbov, Sergei; O'Neill, Brian; Lutz, Wolfgang

    2003-01-01

    Since policy-makers often prefer to think in terms of alternative scenarios, the question has arisen as to whether it is possible to make conditional population forecasts in a probabilistic context. This paper shows that it is both possible and useful to make these forecasts. We do this with two different kinds of examples. The first is the probabilistic analog of deterministic scenario analysis. Conditional probabilistic scenario analysis is essential for policy-makers because it allows them...

  14. Integration of highly probabilistic sources into optical quantum architectures: perpetual quantum computation

    International Nuclear Information System (INIS)

    Devitt, Simon J; Stephens, Ashley M; Munro, William J; Nemoto, Kae

    2011-01-01

    In this paper, we introduce a design for an optical topological cluster state computer constructed exclusively from a single quantum component. Unlike previous efforts, we eliminate the need for on-demand, high-fidelity photon sources and detectors and replace them with the same device utilized to create photon/photon entanglement. This introduces highly probabilistic elements into the optical architecture while maintaining complete specificity of the structure and operation for a large-scale computer. Photons in this system are continually recycled back into the preparation network, allowing for an arbitrarily deep three-dimensional cluster to be prepared using a comparatively small number of photonic qubits and, consequently, the elimination of high-frequency, deterministic photon sources.

  15. A reconnaissance assessment of probabilistic earthquake accelerations at the Nevada Test Site

    International Nuclear Information System (INIS)

    Perkins, D.M.; Thenhaus, P.C.; Hanson, S.L.; Algermissen, S.T.

    1986-01-01

    We have made two interim assessments of the probabilistic ground-motion hazard for the potential nuclear-waste disposal facility at the Nevada Test Site (NTS). The first assessment used historical seismicity and generalized source zones and source faults in the immediate vicinity of the facility. This model produced relatively high probabilistic ground motions, comparable to the higher of two earlier estimates, which was obtained by averaging seismicity in a 400-km-radius circle around the site. The high ground-motion values appear to be caused in part by nuclear-explosion aftershocks remaining in the catalog even after the explosions themselves have been removed. The second assessment used particularized source zones and source faults in a region substantially larger than NTS to provide a broad context of probabilistic ground motion estimates at other locations of the study region. Source faults are mapped or inferred faults having lengths of 5 km or more. Source zones are defined by boundaries separating fault groups on the basis of direction and density. For this assessment, earthquake recurrence has been estimated primarily from historic seismicity prior to nuclear testing. Long-term recurrence for large-magnitude events is constrained by geological estimates of recurrence in a regime in which the large-magnitude earthquakes would occur with predominantly normal mechanisms. 4 refs., 10 figs

  16. Advanced neutron source reactor probabilistic flow blockage assessment

    International Nuclear Information System (INIS)

    Ramsey, C.T.

    1995-08-01

    The Phase I Level I Probabilistic Risk Assessment (PRA) of the conceptual design of the Advanced Neutron Source (ANS) Reactor identified core flow blockage as the most likely internal event leading to fuel damage. The flow blockage event frequency used in the original ANS PRA was based primarily on the flow blockage work done for the High Flux Isotope Reactor (HFIR) PRA. This report examines potential flow blockage scenarios and calculates an estimate of the likelihood of debris-induced fuel damage. The bulk of the report is based specifically on the conceptual design of ANS with a 93%-enriched, two-element core; insights into the impact of the proposed three-element core are examined in Sect. 5. In addition to providing a probability (uncertainty) distribution for the likelihood of core flow blockage, this ongoing effort will serve to indicate potential areas of concern to be focused on in the preliminary design for elimination or mitigation. It will also serve as a loose-parts management tool.

  17. Fission-product source terms

    International Nuclear Information System (INIS)

    Lorenz, R.A.

    1981-01-01

    This presentation consists of a review of fission-product source terms for light water reactor (LWR) fuel. A source term is the quantity of fission products released under specified conditions that can be used to calculate the consequences of the release. The source term usually defines release from breached fuel-rod cladding but could also describe release from the primary coolant system, the reactor containment shell, or the site boundary. The source term would be different for each locality, and the chemical and physical forms of the fission products could also differ.

  18. On synchronous parallel computations with independent probabilistic choice

    International Nuclear Information System (INIS)

    Reif, J.H.

    1984-01-01

    This paper introduces probabilistic choice to synchronous parallel machine models, in particular parallel RAMs. The power of probabilistic choice in parallel computations is illustrated by parallelizing some known probabilistic sequential algorithms. The authors characterize the computational complexity of time-, space-, and processor-bounded probabilistic parallel RAMs in terms of the computational complexity of probabilistic sequential RAMs. They show that parallelism uniformly speeds up time-bounded probabilistic sequential RAM computations by nearly a quadratic factor. They also show that probabilistic choice can be eliminated from parallel computations by introducing nonuniformity.

  19. A probabilistic framework for acoustic emission source localization in plate-like structures

    International Nuclear Information System (INIS)

    Dehghan Niri, E; Salamone, S

    2012-01-01

    This paper proposes a probabilistic approach for acoustic emission (AE) source localization in isotropic plate-like structures based on an extended Kalman filter (EKF). The proposed approach consists of two main stages. During the first stage, time-of-flight (TOF) measurements of Lamb waves are carried out by a continuous wavelet transform (CWT), accounting for systematic errors due to the Heisenberg uncertainty; the second stage uses an EKF to iteratively estimate the AE source location and the wave velocity. The advantages of the proposed algorithm over the traditional methods include the capability of: (1) taking into account uncertainties in TOF measurements and wave velocity and (2) efficiently fusing multi-sensor data to perform AE source localization. The performance of the proposed approach is validated through pencil-lead breaks performed on an aluminum plate at systematic grid locations. The plate was instrumented with an array of four piezoelectric transducers in two different configurations. (paper)
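
    A compact sketch of an EKF of this general kind, estimating source position and wave speed from times of flight. It assumes a known emission time, a simplification relative to the paper's two-stage approach; the geometry, noise levels, and priors are invented.

        import numpy as np

        rng = np.random.default_rng(5)

        # Four sensors at the corners of a 1 m x 1 m plate (assumed geometry).
        sensors = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
        true_state = np.array([0.3, 0.4, 5000.0])   # source x, y (m), speed (m/s)

        def h(state):
            # Predicted times of flight for state [x, y, v].
            d = np.linalg.norm(sensors - state[:2], axis=1)
            return d / state[2]

        def jacobian(state):
            d = np.linalg.norm(sensors - state[:2], axis=1)
            H = np.zeros((len(sensors), 3))
            H[:, 0] = (state[0] - sensors[:, 0]) / (state[2] * d)
            H[:, 1] = (state[1] - sensors[:, 1]) / (state[2] * d)
            H[:, 2] = -d / state[2] ** 2
            return H

        # Initial guess (plate centre, rough speed) and generous covariance.
        s = np.array([0.5, 0.5, 4000.0])
        P = np.diag([0.25, 0.25, 1.0e6])
        Q = np.diag([1e-4, 1e-4, 100.0])            # small process noise
        R = np.eye(len(sensors)) * (2.0e-6) ** 2    # TOF noise, 2 microseconds

        for _ in range(8):   # simulated pencil-lead breaks at one spot
            P = P + Q        # prediction step (static source, random-walk model)
            z = h(true_state) + rng.normal(0.0, 2.0e-6, len(sensors))
            H = jacobian(s)
            K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
            s = s + K @ (z - h(s))
            P = (np.eye(3) - K @ H) @ P

        print("estimated source:", s[:2].round(3), "speed:", round(s[2], 1))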

  20. Probabilistic reasoning with graphical security models

    NARCIS (Netherlands)

    Kordy, Barbara; Pouly, Marc; Schweitzer, Patrick

    This work provides a computational framework for meaningful probabilistic evaluation of attack–defense scenarios involving dependent actions. We combine the graphical security modeling technique of attack–defense trees with probabilistic information expressed in terms of Bayesian networks. In order…

  1. Dose assessments for Greifswald and Cadarache with new source terms from ITER NSSR-1

    International Nuclear Information System (INIS)

    Raskob, W.; Forschungszentrum Karlsruhe GmbH Technik und Umwelt; Hasemann, I.

    1997-08-01

    Probabilistic dose assessments for accidental atmospheric releases of various ITER source terms which contain tritium and/or activation products were performed for the sites of Greifswald, Germany, and Cadarache, France. No country-specific rules were applied, and the input parameters were adapted as far as possible to those used within former ITER studies to achieve a better comparability with site-independent dose assessments performed in the frame of ITER. The calculations were based on source terms which, for the first time, contain a combination of tritium and activation products. This allowed a better judgement of the contribution of the individual fusion-relevant materials to the total dose. The results were compared to site-independent dose limits defined in the frame of ITER. Source terms for two different categories, representing 'extremely unlikely events' (CAT-IV) and 'hypothetical sequences' (CAT-V), were investigated. In no case did the release scenarios of category CAT-IV exceed the ITER limits. In addition, early doses from the hypothetical scenarios of type CAT-V were still below 50 mSv or 100 mSv, values which are commonly used as lower reference values for evacuation in many potential home countries of ITER. Only the banning of food products was found to be a potential countermeasure which may affect larger areas. (orig.) [de]

  2. Probabilistic Role Models and the Guarded Fragment

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2004-01-01

    We propose a uniform semantic framework for interpreting probabilistic concept subsumption and probabilistic role quantification through statistical sampling distributions. This general semantic principle serves as the foundation for the development of a probabilistic version of the guarded fragment of first-order logic. A characterization of equivalence in that logic in terms of bisimulations is given.

  3. Probabilistic role models and the guarded fragment

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2006-01-01

    We propose a uniform semantic framework for interpreting probabilistic concept subsumption and probabilistic role quantification through statistical sampling distributions. This general semantic principle serves as the foundation for the development of a probabilistic version of the guarded fragment of first-order logic. A characterization of equivalence in that logic in terms of bisimulations is given.

  4. CONSTRUCTION OF A CALIBRATED PROBABILISTIC CLASSIFICATION CATALOG: APPLICATION TO 50k VARIABLE SOURCES IN THE ALL-SKY AUTOMATED SURVEY

    International Nuclear Information System (INIS)

    Richards, Joseph W.; Starr, Dan L.; Miller, Adam A.; Bloom, Joshua S.; Brink, Henrik; Crellin-Quick, Arien; Butler, Nathaniel R.

    2012-01-01

    With growing data volumes from synoptic surveys, astronomers necessarily must become more abstracted from the discovery and introspection processes. Given the scarcity of follow-up resources, there is a particularly sharp onus on the frameworks that replace these human roles to provide accurate and well-calibrated probabilistic classification catalogs. Such catalogs inform the subsequent follow-up, allowing consumers to optimize the selection of specific sources for further study and permitting rigorous treatment of classification purities and efficiencies for population studies. Here, we describe a process to produce a probabilistic classification catalog of variability with machine learning from a multi-epoch photometric survey. In addition to producing accurate classifications, we show how to estimate calibrated class probabilities and motivate the importance of probability calibration. We also introduce a methodology for feature-based anomaly detection, which allows discovery of objects in the survey that do not fit within the predefined class taxonomy. Finally, we apply these methods to sources observed by the All-Sky Automated Survey (ASAS), and release the Machine-learned ASAS Classification Catalog (MACC), a 28-class probabilistic classification catalog of 50,124 ASAS sources in the ASAS Catalog of Variable Stars. We estimate that MACC achieves a sub-20% classification error rate and demonstrate that the class posterior probabilities are reasonably calibrated. MACC classifications compare favorably to the classifications of several previous domain-specific ASAS papers and to the ASAS Catalog of Variable Stars, which had classified only 24% of those sources into one of 12 science classes.
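
    A reliability check of the kind used to argue that class posteriors are "reasonably calibrated" can be sketched as follows; the scores and labels here are synthetic, not MACC data.

        import numpy as np

        rng = np.random.default_rng(9)

        # Among sources assigned probability ~p, a fraction ~p should truly
        # belong to the class if the posteriors are well calibrated.
        probs = rng.uniform(0.0, 1.0, 5000)            # classifier posteriors
        labels = rng.uniform(0.0, 1.0, 5000) < probs   # calibrated by construction

        bins = np.linspace(0.0, 1.0, 11)
        which = np.digitize(probs, bins) - 1
        for b in range(10):
            sel = which == b
            if sel.any():
                print(f"predicted {bins[b]:.1f}-{bins[b + 1]:.1f}: "
                      f"observed fraction {labels[sel].mean():.2f}")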

  5. CONSTRUCTION OF A CALIBRATED PROBABILISTIC CLASSIFICATION CATALOG: APPLICATION TO 50k VARIABLE SOURCES IN THE ALL-SKY AUTOMATED SURVEY

    Energy Technology Data Exchange (ETDEWEB)

    Richards, Joseph W.; Starr, Dan L.; Miller, Adam A.; Bloom, Joshua S.; Brink, Henrik; Crellin-Quick, Arien [Astronomy Department, University of California, Berkeley, CA 94720-3411 (United States)]; Butler, Nathaniel R., E-mail: jwrichar@stat.berkeley.edu [School of Earth and Space Exploration, Arizona State University, Tempe, AZ 85287 (United States)]

    2012-12-15

    With growing data volumes from synoptic surveys, astronomers necessarily must become more abstracted from the discovery and introspection processes. Given the scarcity of follow-up resources, there is a particularly sharp onus on the frameworks that replace these human roles to provide accurate and well-calibrated probabilistic classification catalogs. Such catalogs inform the subsequent follow-up, allowing consumers to optimize the selection of specific sources for further study and permitting rigorous treatment of classification purities and efficiencies for population studies. Here, we describe a process to produce a probabilistic classification catalog of variability with machine learning from a multi-epoch photometric survey. In addition to producing accurate classifications, we show how to estimate calibrated class probabilities and motivate the importance of probability calibration. We also introduce a methodology for feature-based anomaly detection, which allows discovery of objects in the survey that do not fit within the predefined class taxonomy. Finally, we apply these methods to sources observed by the All-Sky Automated Survey (ASAS), and release the Machine-learned ASAS Classification Catalog (MACC), a 28-class probabilistic classification catalog of 50,124 ASAS sources in the ASAS Catalog of Variable Stars. We estimate that MACC achieves a sub-20% classification error rate and demonstrate that the class posterior probabilities are reasonably calibrated. MACC classifications compare favorably to the classifications of several previous domain-specific ASAS papers and to the ASAS Catalog of Variable Stars, which had classified only 24% of those sources into one of 12 science classes.

  6. Scenario for a Short-Term Probabilistic Seismic Hazard Assessment (PSHA) in Chiayi, Taiwan

    Directory of Open Access Journals (Sweden)

    Chung-Han Chan

    2013-01-01

    Full Text Available Using seismic activity and the Meishan earthquake sequence that occurred from 1904 to 1906, a scenario for short-term probabilistic seismic hazards in the Chiayi region of Taiwan is assessed. The long-term earthquake occurrence rate in Taiwan was evaluated using a smoothing kernel. The highest seismicity rate was calculated around the Chiayi region. To consider earthquake interactions, the rate-and-state friction model was introduced to estimate the seismicity rate evolution due to the Coulomb stress change. As imparted by the 1904 Touliu earthquake, stress changes near the 1906 Meishan and Yangshuigang epicenters were higher than the magnitude of tidal triggering. With regard to the impact of the Meishan earthquake, the region close to the Yangshuigang earthquake epicenter had a +0.75 bar stress increase. The results indicated significant interaction between the three damaging events. Considering the path and site effects using ground motion prediction equations, a probabilistic seismic hazard in the form of a hazard evolution and a hazard map was assessed. A significant elevation in hazards following the three earthquakes in the sequence was determined. The results illustrate a possible scenario for seismic hazards in the Chiayi region which may take place repeatedly in the future. Such a scenario provides essential information on earthquake preparation, devastation estimations, emergency sheltering, utility restoration, and structure reconstruction.
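
    The seismicity-rate evolution referred to in this record is commonly computed with the Dieterich (1994) rate-and-state response to a Coulomb stress step. The sketch below is a minimal illustration with assumed parameter values (A-sigma, aftershock relaxation time), not the values used in the study.

    ```python
    # Minimal sketch of the Dieterich (1994) rate-and-state seismicity-rate
    # response to a Coulomb stress step. All parameter values are assumptions.
    import numpy as np

    def seismicity_rate(t, dcff, r0=1.0, a_sigma=0.1, t_a=5.0):
        """Rate R(t) after a stress step dcff (bar), relative to background r0.

        a_sigma : constitutive parameter A times normal stress (bar, assumed)
        t_a     : aftershock relaxation time (years, assumed)
        """
        gamma = (np.exp(-dcff / a_sigma) - 1.0) * np.exp(-t / t_a) + 1.0
        return r0 / gamma

    t = np.linspace(0.0, 20.0, 5)            # years after the stress step
    print(seismicity_rate(t, dcff=0.75))     # +0.75 bar step, as for Yangshuigang
    ```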

  7. The Analytical Repository Source-Term (AREST) model: Description and documentation

    International Nuclear Information System (INIS)

    Liebetrau, A.M.; Apted, M.J.; Engel, D.W.; Altenhofen, M.K.; Strachan, D.M.; Reid, C.R.; Windisch, C.F.; Erikson, R.L.; Johnson, K.I.

    1987-10-01

    The geologic repository system consists of several components, one of which is the engineered barrier system. The engineered barrier system interfaces with natural barriers that constitute the setting of the repository. A model that simulates the releases from the engineered barrier system into the natural barriers of the geosphere, called a source-term model, is an important component of any model for assessing the overall performance of the geologic repository system. The Analytical Repository Source-Term (AREST) model being developed is one such model. This report describes the current state of development of the AREST model and the code in which the model is implemented. The AREST model consists of three component models and five process models that describe the post-emplacement environment of a waste package. All of these components are combined within a probabilistic framework. The component models are a waste package containment (WPC) model that simulates the corrosion and degradation processes which eventually result in waste package containment failure; a waste package release (WPR) model that calculates the rates of radionuclide release from the failed waste package; and an engineered system release (ESR) model that controls the flow of information among all AREST components and process models and combines release output from the WPR model with failure times from the WPC model to produce estimates of total release. 167 refs., 40 figs., 12 tabs

  8. The Multimedia Environmental Pollutant Assessment System (MEPAS)®: Source-term release formulations

    International Nuclear Information System (INIS)

    Streile, G.P.; Shields, K.D.; Stroh, J.L.; Bagaasen, L.M.; Whelan, G.; McDonald, J.P.; Droppo, J.G.; Buck, J.W.

    1996-11-01

    This report is one of a series of reports that document the mathematical models in the Multimedia Environmental Pollutant Assessment System (MEPAS). Developed by Pacific Northwest National Laboratory for the US Department of Energy, MEPAS is an integrated impact assessment software implementation of physics-based fate and transport models in air, soil, and water media. Outputs are estimates of exposures and health risk assessments for radioactive and hazardous pollutants. Each of the MEPAS formulation documents covers a major MEPAS component such as source-term, atmospheric, vadose zone/groundwater, surface water, and health exposure/health impact assessment. Other MEPAS documentation reports cover the sensitivity/uncertainty formulations and the database parameter constituent property estimation methods. The pollutant source-term release component is documented in this report. MEPAS simulates the release of contaminants from a source, transport through the air, groundwater, surface water, or overland pathways, and transfer through food chains and exposure pathways to the exposed individual or population. For human health impacts, risks are computed for carcinogens and hazard quotients for noncarcinogens. MEPAS is implemented on a desktop computer with a user-friendly interface that allows the user to define the problem, input the required data, and execute the appropriate models for both deterministic and probabilistic analyses

  9. Probabilistic Harmonic Modeling of Wind Power Plants

    DEFF Research Database (Denmark)

    Guest, Emerson; Jensen, Kim H.; Rasmussen, Tonny Wederberg

    2017-01-01

    A probabilistic sequence domain (SD) harmonic model of a grid-connected voltage-source converter is used to estimate harmonic emissions in a wind power plant (WPP) comprised of Type-IV wind turbines. The SD representation naturally partitioned converter-generated voltage harmonics into those with deterministic phase and those with probabilistic phase. A case study performed on a string of ten 3 MW Type-IV wind turbines implemented in PSCAD was used to verify the probabilistic SD harmonic model. The probabilistic SD harmonic model can be employed in the planning phase of WPP projects to assess harmonic...

  10. Very Short-term Nonparametric Probabilistic Forecasting of Renewable Energy Generation - with Application to Solar Energy

    DEFF Research Database (Denmark)

    Golestaneh, Faranak; Pinson, Pierre; Gooi, Hoay Beng

    2016-01-01

    Due to the inherent uncertainty involved in renewable energy forecasting, uncertainty quantification is a key input to maintain acceptable levels of reliability and profitability in power system operation. A proposal is formulated and evaluated here for the case of solar power generation, when only...... approach to generate very short-term predictive densities, i.e., for lead times between a few minutes to one hour ahead, with fast frequency updates. We rely on an Extreme Learning Machine (ELM) as a fast regression model, trained in varied ways to obtain both point and quantile forecasts of solar power...... generation. Four probabilistic methods are implemented as benchmarks. Rival approaches are evaluated based on a number of test cases for two solar power generation sites in different climatic regions, allowing us to show that our approach results in generation of skilful and reliable probabilistic forecasts...

  11. Probabilistic logics and probabilistic networks

    CERN Document Server

    Haenni, Rolf; Wheeler, Gregory; Williamson, Jon; Andrews, Jill

    2014-01-01

    Probabilistic Logic and Probabilistic Networks presents a groundbreaking framework within which various approaches to probabilistic logic naturally fit. Additionally, the text shows how to develop computationally feasible methods to mesh with this framework.

  12. Design parameters and source terms: Volume 3, Source terms

    International Nuclear Information System (INIS)

    1987-10-01

    The Design Parameters and Source Terms Document was prepared in accordance with a DOE request to provide data for the environmental impact study to be performed in the future for the Deaf Smith County, Texas, site for a nuclear waste repository in salt. This document updates a previous unpublished report by Stearns Catalytic Corporation (SCC), entitled ''Design Parameters and Source Terms for a Two-Phase Repository in Salt,'' 1985, to the level of the Site Characterization Plan - Conceptual Design Report. The previous unpublished SCC Study identifies the data needs for the Environmental Assessment effort for seven possible salt repository sites. 11 refs., 9 tabs

  13. Global/local methods for probabilistic structural analysis

    Science.gov (United States)

    Millwater, H. R.; Wu, Y.-T.

    1993-04-01

    A probabilistic global/local method is proposed to reduce the computational requirements of probabilistic structural analysis. A coarser global model is used for most of the computations, with a more refined local model used only at key probabilistic conditions. The global model is used to establish the cumulative distribution function (cdf) and the Most Probable Point (MPP). The local model then uses the predicted MPP to adjust the cdf value. The global/local method is used within the advanced mean value probabilistic algorithm. The local model can be more refined with respect to the global model in terms of finer mesh, smaller time step, tighter tolerances, etc., and can be used with linear or nonlinear models. The basis for this approach is described in terms of the correlation between the global and local models, which can be estimated from the global and local MPPs. A numerical example is presented using the NESSUS probabilistic structural analysis program with the finite element method used for the structural modeling. The results clearly indicate a significant computer savings with minimal loss in accuracy.

  14. PARTITION: A program for defining the source term/consequence analysis interface in the NUREG--1150 probabilistic risk assessments

    International Nuclear Information System (INIS)

    Iman, R.L.; Helton, J.C.; Johnson, J.D.

    1990-05-01

    The individual plant analyses in the US Nuclear Regulatory Commission's reassessment of the risk from commercial nuclear power plants (NUREG-1150) consist of four parts: systems analysis, accident progression analysis, source term analysis, and consequence analysis. Careful definition of the interfaces between these parts is necessary for both information flow and computational efficiency. This document has been designed for users of the PARTITION computer program developed by the authors at Sandia National Laboratories for defining the interface between the source term analysis (performed with the XXSOR programs) and the consequence analysis (performed with the MACCS program). This report provides a tutorial that details how the interactive partitioning is performed, along with detailed information on the partitioning process. The PARTITION program was written in ANSI standard FORTRAN 77 to make the code as machine-independent (i.e., portable) as possible. 9 refs., 4 figs

  15. Very short-term probabilistic forecasting of wind power with generalized logit-Normal distributions

    DEFF Research Database (Denmark)

    Pinson, Pierre

    2012-01-01

    Very-short-term probabilistic forecasts, which are essential for an optimal management of wind generation, ought to account for the non-linear and double-bounded nature of that stochastic process. They take here the form of discrete–continuous mixtures of generalized logit–normal distributions and probability masses at the bounds. Both auto-regressive and conditional parametric auto-regressive models are considered for the dynamics of their location and scale parameters. Estimation is performed in a recursive least squares framework with exponential forgetting. The superiority of this proposal over...

  16. Variational approach to probabilistic finite elements

    Science.gov (United States)

    Belytschko, T.; Liu, W. K.; Mani, A.; Besterfield, G.

    1991-08-01

    Probabilistic finite element methods (PFEM), synthesizing the power of finite element methods with second-moment techniques, are formulated for various classes of problems in structural and solid mechanics. Time-invariant random materials, geometric properties and loads are incorporated in terms of their fundamental statistics viz. second-moments. Analogous to the discretization of the displacement field in finite element methods, the random fields are also discretized. Preserving the conceptual simplicity, the response moments are calculated with minimal computations. By incorporating certain computational techniques, these methods are shown to be capable of handling large systems with many sources of uncertainties. By construction, these methods are applicable when the scale of randomness is not very large and when the probabilistic density functions have decaying tails. The accuracy and efficiency of these methods, along with their limitations, are demonstrated by various applications. Results obtained are compared with those of Monte Carlo simulation and it is shown that good accuracy can be obtained for both linear and nonlinear problems. The methods are amenable to implementation in deterministic FEM based computer codes.
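
    The second-moment idea underlying PFEM can be illustrated with a first-order propagation of input means and variances through a response function. The bar-elongation response and all statistics below are illustrative assumptions, not the paper's formulation.

    ```python
    # Minimal first-order second-moment (perturbation) sketch: propagate input
    # means and variances through a response function. Illustrative only.
    import numpy as np

    def response(E, P, L=2.0, A=1e-4):
        """Elongation of an elastic bar: u = P*L/(E*A) (stand-in response)."""
        return P * L / (E * A)

    mu = np.array([210e9, 1e4])               # mean Young's modulus, mean load
    var = np.array([(20e9) ** 2, (1e3) ** 2])  # assumed input variances

    # Finite-difference gradient at the mean point
    eps = mu * 1e-6
    grad = np.array([
        (response(mu[0] + eps[0], mu[1]) - response(mu[0] - eps[0], mu[1])) / (2 * eps[0]),
        (response(mu[0], mu[1] + eps[1]) - response(mu[0], mu[1] - eps[1])) / (2 * eps[1]),
    ])

    mean_u = response(*mu)                    # first-order mean response
    var_u = np.sum(grad ** 2 * var)           # variance, uncorrelated inputs
    print(mean_u, np.sqrt(var_u))
    ```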

  17. A linear process-algebraic format for probabilistic systems with data

    NARCIS (Netherlands)

    Katoen, Joost P.; van de Pol, Jan Cornelis; Stoelinga, Mariëlle Ida Antoinette; Timmer, Mark; Gomes, L.; Khomenko, V.; Fernandes, J.M.

    This paper presents a novel linear process algebraic format for probabilistic automata. The key ingredient is a symbolic transformation of probabilistic process algebra terms that incorporate data into this linear format while preserving strong probabilistic bisimulation. This generalises similar

  18. Force Limited Vibration Testing: Computation C2 for Real Load and Probabilistic Source

    Science.gov (United States)

    Wijker, J. J.; de Boer, A.; Ellenbroek, M. H. M.

    2014-06-01

    method is suitable for computing the value of the parameter C2. When no mathematical model of the source is available, estimates of the value of C2 can be found in the literature. In this paper a probabilistic mathematical representation of the unknown source is proposed, such that the asparagus patch model of the source can be approximated. The computation of the value of C2 can be done in conjunction with the CMSA method, knowing the apparent mass of the load and the random acceleration specification at the interface between load and source, respectively. Strength and stiffness design rules for spacecraft, instrumentation, units, etc., as mentioned in ECSS Standards and Handbooks, Launch Vehicle User's Manuals, papers, and books, are followed. A probabilistic description of the design parameters is foreseen. As an example, a simple experiment has been worked out.

  19. Integrated Deterministic-Probabilistic Safety Assessment Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Kudinov, P.; Vorobyev, Y.; Sanchez-Perea, M.; Queral, C.; Jimenez Varas, G.; Rebollo, M. J.; Mena, L.; Gomez-Magin, J.

    2014-02-01

    IDPSA (Integrated Deterministic-Probabilistic Safety Assessment) is a family of methods which use tightly coupled probabilistic and deterministic approaches to address respective sources of uncertainties, enabling risk-informed decision making in a consistent manner. The starting point of the IDPSA framework is that safety justification must be based on the coupling of deterministic (consequences) and probabilistic (frequency) considerations to address the mutual interactions between stochastic disturbances (e.g. failures of the equipment, human actions, stochastic physical phenomena) and the deterministic response of the plant (i.e. transients). This paper gives a general overview of some IDPSA methods as well as some possible applications to PWR safety analyses. (Author)

  20. Design parameters and source terms: Volume 2, Source terms: Revision 0

    International Nuclear Information System (INIS)

    1987-09-01

    The Design Parameters and Source Terms Document was prepared in accordance with a DOE request to provide data for the environmental impact study to be performed in the future for the Deaf Smith County, Texas, site for a nuclear waste repository in salt. This document updates a previous unpublished report to the level of the Site Characterization Plan - Conceptual Design Report (SCP-CDR). The previous study identifies the data needs for the Environmental Assessment effort for seven possible salt repository sites. Volume 2 contains tables of source terms

  1. Probabilistic Tsunami Hazard Assessment: the Seaside, Oregon Pilot Study

    Science.gov (United States)

    Gonzalez, F. I.; Geist, E. L.; Synolakis, C.; Titov, V. V.

    2004-12-01

    A pilot study of Seaside, Oregon is underway, to develop methodologies for probabilistic tsunami hazard assessments that can be incorporated into Flood Insurance Rate Maps (FIRMs) developed by FEMA's National Flood Insurance Program (NFIP). Current NFIP guidelines for tsunami hazard assessment rely on the science, technology and methodologies developed in the 1970s; although generally regarded as groundbreaking and state-of-the-art for its time, this approach is now superseded by modern methods that reflect substantial advances in tsunami research achieved in the last two decades. In particular, post-1990 technical advances include: improvements in tsunami source specification; improved tsunami inundation models; better computational grids by virtue of improved bathymetric and topographic databases; a larger database of long-term paleoseismic and paleotsunami records and short-term, historical earthquake and tsunami records that can be exploited to develop improved probabilistic methodologies; better understanding of earthquake recurrence and probability models. The NOAA-led U.S. National Tsunami Hazard Mitigation Program (NTHMP), in partnership with FEMA, USGS, NSF and Emergency Management and Geotechnical agencies of the five Pacific States, incorporates these advances into site-specific tsunami hazard assessments for coastal communities in Alaska, California, Hawaii, Oregon and Washington. NTHMP hazard assessment efforts currently focus on developing deterministic, "credible worst-case" scenarios that provide valuable guidance for hazard mitigation and emergency management. The NFIP focus, on the other hand, is on actuarial needs that require probabilistic hazard assessments such as those that characterize 100- and 500-year flooding events. There are clearly overlaps in NFIP and NTHMP objectives. NTHMP worst-case scenario assessments that include an estimated probability of occurrence could benefit the NFIP; NFIP probabilistic assessments of 100- and 500-yr

  2. Probabilistic programming in Python using PyMC3

    Directory of Open Access Journals (Sweden)

    John Salvatier

    2016-04-01

    Full Text Available Probabilistic programming allows for automatic Bayesian inference on user-defined probabilistic models. Recent advances in Markov chain Monte Carlo (MCMC) sampling allow inference on increasingly complex models. This class of MCMC, known as Hamiltonian Monte Carlo, requires gradient information which is often not readily available. PyMC3 is a new open source probabilistic programming framework written in Python that uses Theano to compute gradients via automatic differentiation as well as compile probabilistic programs on-the-fly to C for increased speed. Contrary to other probabilistic programming languages, PyMC3 allows model specification directly in Python code. The lack of a domain specific language allows for great flexibility and direct interaction with the model. This paper is a tutorial-style introduction to this software package.
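
    A minimal example of the model-specification style the abstract describes, assuming pymc3 is installed and using synthetic data; it is illustrative, not taken from the paper.

    ```python
    # Minimal PyMC3 sketch: priors, likelihood, and NUTS sampling in plain Python.
    import numpy as np
    import pymc3 as pm

    data = np.random.normal(loc=1.0, scale=2.0, size=100)   # synthetic data

    with pm.Model() as model:
        mu = pm.Normal("mu", mu=0.0, sd=10.0)        # prior on the mean
        sigma = pm.HalfNormal("sigma", sd=5.0)       # prior on the spread
        obs = pm.Normal("obs", mu=mu, sd=sigma, observed=data)
        trace = pm.sample(1000, tune=1000)           # Hamiltonian MC (NUTS) by default

    print(pm.summary(trace))                         # posterior means, intervals
    ```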

  3. Design parameters and source terms: Volume 2, Source terms: Revision 0

    International Nuclear Information System (INIS)

    1987-10-01

    The Design Parameters and Source Terms Document was prepared in accordance with a DOE request to provide data for the environmental impact study to be performed in the future for the Deaf Smith County, Texas, site for a nuclear waste repository in salt. This document updates a previous unpublished report by Stearns Catalytic Corporation (SCC), entitled ''Design Parameters and Source Terms for a Two-Phase Repository in Salt,'' 1985, to the level of the Site Characterization Plan - Conceptual Design Report. The previous unpublished SCC Study identifies the data needs for the Environmental Assessment effort for seven possible salt repository sites. 2 tabs

  4. Probabilistic Linguistic Power Aggregation Operators for Multi-Criteria Group Decision Making

    Directory of Open Access Journals (Sweden)

    Agbodah Kobina

    2017-12-01

    Full Text Available As an effective aggregation tool, power average (PA) allows the input arguments being aggregated to support and reinforce each other, which provides more versatility in the information aggregation process. Under the probabilistic linguistic term environment, we deeply investigate the new power aggregation (PA) operators for fusing the probabilistic linguistic term sets (PLTSs). In this paper, we firstly develop the probabilistic linguistic power average (PLPA) and the weighted probabilistic linguistic power average (WPLPA) operators, as well as the probabilistic linguistic power geometric (PLPG) and the weighted probabilistic linguistic power geometric (WPLPG) operators. At the same time, we carefully analyze the properties of these new aggregation operators. With the aid of the WPLPA and WPLPG operators, we further design the approaches for the application of multi-criteria group decision-making (MCGDM) with PLTSs. Finally, we use an illustrative example to expound our proposed methods and verify their performances.
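
    The power average underlying the PLPA/WPLPA operators can be shown numerically. The sketch below uses the common support function Sup(a, b) = 1 - |a - b| on normalized crisp arguments, a simplifying assumption in place of the probabilistic linguistic term sets of the paper.

    ```python
    # Minimal power average (PA) sketch: each argument is weighted by the
    # support it receives from the others, so outliers are down-weighted.
    import numpy as np

    def power_average(a):
        a = np.asarray(a, dtype=float)
        # T(ai) = sum over j != i of Sup(ai, aj), with Sup(a, b) = 1 - |a - b|
        T = np.array([sum(1.0 - abs(ai - aj) for j, aj in enumerate(a) if j != i)
                      for i, ai in enumerate(a)])
        w = (1.0 + T) / np.sum(1.0 + T)      # PA weights
        return float(np.dot(w, a))

    print(power_average([0.2, 0.25, 0.3, 0.9]))   # the outlier 0.9 is down-weighted
    ```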

  5. A global probabilistic tsunami hazard assessment from earthquake sources

    Science.gov (United States)

    Davies, Gareth; Griffin, Jonathan; Lovholt, Finn; Glimsdal, Sylfest; Harbitz, Carl; Thio, Hong Kie; Lorito, Stefano; Basili, Roberto; Selva, Jacopo; Geist, Eric L.; Baptista, Maria Ana

    2017-01-01

    Large tsunamis occur infrequently but have the capacity to cause enormous numbers of casualties, damage to the built environment and critical infrastructure, and economic losses. A sound understanding of tsunami hazard is required to underpin management of these risks, and while tsunami hazard assessments are typically conducted at regional or local scales, globally consistent assessments are required to support international disaster risk reduction efforts, and can serve as a reference for local and regional studies. This study presents a global-scale probabilistic tsunami hazard assessment (PTHA), extending previous global-scale assessments based largely on scenario analysis. Only earthquake sources are considered, as they represent about 80% of the recorded damaging tsunami events. Globally extensive estimates of tsunami run-up height are derived at various exceedance rates, and the associated uncertainties are quantified. Epistemic uncertainties in the exceedance rates of large earthquakes often lead to large uncertainties in tsunami run-up. Deviations between modelled tsunami run-up and event observations are quantified, and found to be larger than suggested in previous studies. Accounting for these deviations in PTHA is important, as it leads to a pronounced increase in predicted tsunami run-up for a given exceedance rate.

  6. Short-term Probabilistic Load Forecasting with the Consideration of Human Body Amenity

    Directory of Open Access Journals (Sweden)

    Ning Lu

    2013-02-01

    Full Text Available Load forecasting is the basis of power system planning and design. It is important for the economic operation and reliability assurance of power systems. However, the results of load forecasting given by most existing methods are deterministic. This study aims at probabilistic load forecasting. First, support vector machine regression is used to obtain deterministic load forecasts that take human body amenity into consideration. Then probabilistic load forecasts at a certain confidence level are derived by analyzing the error distribution law corresponding to each heat index interval. The final simulation shows that this probabilistic forecasting method is easy to implement and can provide more information than deterministic forecasting results, and thus is helpful for decision-makers to make reasonable decisions.
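
    A minimal sketch of the two-stage scheme the abstract outlines: a support-vector-regression point forecast followed by empirical error quantiles. The synthetic data, features, and the single error bin are illustrative assumptions rather than the authors' setup.

    ```python
    # Minimal sketch: SVR point forecast plus prediction intervals from the
    # empirical residual distribution. Data and features are synthetic.
    import numpy as np
    from sklearn.svm import SVR

    rng = np.random.default_rng(1)
    X = rng.uniform(0, 1, size=(500, 3))       # e.g. hour, temperature, heat index
    y = 100 + 40 * X[:, 2] + rng.normal(0, 5, size=500)   # synthetic load (MW)

    model = SVR(kernel="rbf", C=100.0).fit(X[:400], y[:400])
    resid = y[:400] - model.predict(X[:400])   # training residuals

    point = model.predict(X[400:])
    lo, hi = np.quantile(resid, [0.05, 0.95])  # 90% empirical error band
    print(point[0] + lo, point[0], point[0] + hi)
    ```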

  7. PRECIS -- A probabilistic risk assessment system

    International Nuclear Information System (INIS)

    Peterson, D.M.; Knowlton, R.G. Jr.

    1996-01-01

    A series of computer tools has been developed to conduct the exposure assessment and risk characterization phases of human health risk assessments within a probabilistic framework. The tools are collectively referred to as the Probabilistic Risk Evaluation and Characterization Investigation System (PRECIS). With this system, a risk assessor can calculate the doses and risks associated with multiple environmental and exposure pathways, for both chemicals and radioactive contaminants. Exposure assessment models in the system account for transport of contaminants to receptor points from a source zone originating in unsaturated soils above the water table. In addition to performing calculations of dose and risk based on initial concentrations, PRECIS can also be used in an inverse manner to compute soil concentrations in the source area that must not be exceeded if prescribed limits on dose or risk are to be met. Such soil contaminant levels, referred to as soil guidelines, are computed for both single contaminants and chemical mixtures and can be used as action levels or cleanup levels. Probabilistic estimates of risk, dose and soil guidelines are derived using Monte Carlo techniques
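
    The Monte Carlo exposure calculation performed by systems of this kind can be sketched as follows; the soil-ingestion pathway, parameter distributions, and slope factor are all illustrative assumptions, not PRECIS inputs.

    ```python
    # Minimal Monte Carlo exposure/risk sketch: sample uncertain exposure
    # parameters, compute dose and risk per realization, report percentiles.
    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000

    conc = rng.lognormal(mean=np.log(50.0), sigma=0.5, size=n)   # mg/kg in soil
    ingest = rng.triangular(50, 100, 200, size=n) * 1e-6         # kg soil/day
    bw = rng.normal(70.0, 10.0, size=n).clip(min=40.0)           # body weight, kg
    slope = 1.5e-3                                               # risk per mg/kg-day

    dose = conc * ingest / bw                                    # mg/kg-day
    risk = slope * dose
    print(np.percentile(risk, [50, 95]))                         # median, 95th
    ```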

  8. Overview of plant specific source terms and their impact on risk

    International Nuclear Information System (INIS)

    Desaedeleer, G.

    2004-01-01

    Probabilistic risk assessment and safety assessment focus on systems and measures to prevent core meltdown, and they integrate many aspects of design and operation. Through plant systems analysis, PRA maps initiating event frequencies onto plant damage states, utilizes fault tree and event tree logic models, and may include 'external event' analyses such as fire, flood, wind, and seismic events. Percentage contributions of sequences to the core damage frequency are shown for the following plants, taken as examples: ZION, EDISON, OCONEE 3, SEABROOK, SIZEWELL B, MILLSTONE 3, and RINGHALS 2. The presentation includes a comparison of the following initiating event frequencies: loss of off-site power; small LOCA; large LOCA; steam generator tube rupture; loss of feedwater; turbine trip; and reactor trip. Consequence analysis deals with dispersion and depletion of radioactivity in the atmosphere and health effects; factors in the off-site emergency plan are analyzed with codes that address the weather conditions, provide mapping of source terms, and produce risk diagrams for early fatalities and for latent cancer fatalities

  9. Visualizing Probabilistic Proof

    OpenAIRE

    Guerra-Pujol, Enrique

    2015-01-01

    The author revisits the Blue Bus Problem, a famous thought-experiment in law involving probabilistic proof, and presents simple Bayesian solutions to different versions of the blue bus hypothetical. In addition, the author expresses his solutions in standard and visual formats, i.e. in terms of probabilities and natural frequencies.

  10. An Advanced Bayesian Method for Short-Term Probabilistic Forecasting of the Generation of Wind Power

    Directory of Open Access Journals (Sweden)

    Antonio Bracale

    2015-09-01

    Full Text Available Currently, among renewable distributed generation systems, wind generators are receiving a great deal of interest due to the great economic, technological, and environmental incentives they involve. However, the uncertainties due to the intermittent nature of wind energy make it difficult to operate electrical power systems optimally and make decisions that satisfy the needs of all the stakeholders of the electricity energy market. Thus, there is increasing interest in determining how to forecast wind power production accurately. Most of the methods that have been published in the relevant literature provide deterministic forecasts, even though great interest has been focused recently on probabilistic forecast methods. In this paper, an advanced probabilistic method is proposed for short-term forecasting of wind power production. A mixture of two Weibull distributions was used as a probability function to model the uncertainties associated with wind speed. Then, a Bayesian inference approach with a particularly effective autoregressive integrated moving average (ARIMA) model was used to determine the parameters of the mixture Weibull distribution. Numerical applications are also presented to provide evidence of the forecasting performance of the Bayesian-based approach.
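
    The wind-speed uncertainty model in this record, a two-component Weibull mixture, can be sketched by Monte Carlo sampling and propagation through a turbine power curve. Mixture parameters and the power curve below are assumed for illustration, not fitted values from the paper.

    ```python
    # Minimal sketch: sample a two-component Weibull mixture of wind speeds
    # and push it through a simple turbine power curve. All values assumed.
    import numpy as np

    rng = np.random.default_rng(7)
    n = 50_000

    # With probability w draw from Weibull(k1, scale c1), else Weibull(k2, c2)
    w, (k1, c1), (k2, c2) = 0.6, (2.0, 6.0), (3.0, 11.0)
    comp = rng.uniform(size=n) < w
    v = np.where(comp, c1 * rng.weibull(k1, n), c2 * rng.weibull(k2, n))

    def power(v, cut_in=3.0, rated_v=12.0, cut_out=25.0, rated_p=2.0):
        """Piecewise turbine curve (MW), cubic between cut-in and rated speed."""
        p = np.clip(rated_p * ((v - cut_in) / (rated_v - cut_in)) ** 3, 0.0, rated_p)
        p[(v < cut_in) | (v > cut_out)] = 0.0
        return p

    p = power(v)
    print(np.percentile(p, [10, 50, 90]))    # probabilistic power forecast bands
    ```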

  11. A linear process-algebraic format for probabilistic systems with data (extended version)

    NARCIS (Netherlands)

    Katoen, Joost P.; van de Pol, Jan Cornelis; Stoelinga, Mariëlle Ida Antoinette; Timmer, Mark

    2010-01-01

    This paper presents a novel linear process-algebraic format for probabilistic automata. The key ingredient is a symbolic transformation of probabilistic process algebra terms that incorporate data into this linear format while preserving strong probabilistic bisimulation. This generalises similar

  12. A Probabilistic Framework for Security Scenarios with Dependent Actions

    NARCIS (Netherlands)

    Kordy, Barbara; Pouly, Marc; Schweizer, Patrick; Albert, Elvira; Sekereinsk, Emil

    2014-01-01

    This work addresses the growing need of performing meaningful probabilistic analysis of security. We propose a framework that integrates the graphical security modeling technique of attack–defense trees with probabilistic information expressed in terms of Bayesian networks. This allows us to perform

  13. Probabilistic finite elements

    Science.gov (United States)

    Belytschko, Ted; Wing, Kam Liu

    1987-01-01

    In the Probabilistic Finite Element Method (PFEM), finite element methods have been efficiently combined with second-order perturbation techniques to provide an effective method for informing the designer of the range of response which is likely in a given problem. The designer must provide as input the statistical character of the input variables, such as yield strength, load magnitude, and Young's modulus, by specifying their mean values and their variances. The output then consists of the mean response and the variance in the response. Thus the designer is given a much broader picture of the predicted performance than with simply a single response curve. These methods are applicable to a wide class of problems, provided that the scale of randomness is not too large and the probabilistic density functions possess decaying tails. By incorporating the computational techniques we have developed in the past 3 years for efficiency, the probabilistic finite element methods are capable of handling large systems with many sources of uncertainties. Sample results for an elastic-plastic ten-bar structure and an elastic-plastic plane continuum with a circular hole subject to cyclic loadings with the yield stress on the random field are given.

  14. A probabilistic quantitative risk assessment model for the long-term work zone crashes.

    Science.gov (United States)

    Meng, Qiang; Weng, Jinxian; Qu, Xiaobo

    2010-11-01

    Work zones, especially long-term work zones, increase traffic conflicts and cause safety problems. Proper casualty risk assessment for a work zone is of importance for both traffic safety engineers and travelers. This paper develops a novel probabilistic quantitative risk assessment (QRA) model to evaluate the casualty risk combining frequency and consequence of all accident scenarios triggered by long-term work zone crashes. The casualty risk is measured by the individual risk and societal risk. The individual risk can be interpreted as the frequency of a driver/passenger being killed or injured, and the societal risk describes the relation between frequency and the number of casualties. The proposed probabilistic QRA model consists of the estimation of work zone crash frequency, an event tree and consequence estimation models. There are seven intermediate events--age (A), crash unit (CU), vehicle type (VT), alcohol (AL), light condition (LC), crash type (CT) and severity (S)--in the event tree. Since the estimated probability of some intermediate events may have large uncertainty, it is characterized by a random variable. The consequence estimation model takes into account the combined effects of speed and emergency medical service response time (ERT) on the consequence of work zone crashes. Finally, a numerical example based on the Southeast Michigan work zone crash data is carried out. The numerical results show that there will be a 62% decrease of individual fatality risk and 44% reduction of individual injury risk if the mean travel speed is slowed down by 20%. In addition, there will be a 5% reduction of individual fatality risk and 0.05% reduction of individual injury risk if ERT is reduced by 20%. In other words, slowing down speed is more effective than reducing ERT in casualty risk mitigation.
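
    The risk aggregation in such an event tree reduces to multiplying a crash frequency by the branch probabilities along each path and summing over paths, as in the minimal sketch below; the two paths and every number are illustrative assumptions.

    ```python
    # Minimal event-tree sketch: individual fatality risk as crash frequency
    # times the product of branch probabilities along each casualty path.
    crash_freq = 12.0          # work zone crashes per year (assumed)

    # Each path: probabilities for the intermediate events A, CU, VT, AL, LC, CT,
    # followed by P(severity = fatality | path). Values are illustrative.
    paths = [
        {"branch_probs": [0.3, 0.6, 0.7, 0.1, 0.4, 0.2], "p_fatal": 0.05},
        {"branch_probs": [0.7, 0.4, 0.3, 0.9, 0.6, 0.8], "p_fatal": 0.01},
    ]

    individual_fatality_risk = 0.0
    for path in paths:
        p_path = 1.0
        for p in path["branch_probs"]:
            p_path *= p                      # probability of this event sequence
        individual_fatality_risk += crash_freq * p_path * path["p_fatal"]

    print(f"{individual_fatality_risk:.2e} fatalities/year")
    ```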

  15. Probabilistic assessment of the long-term performance of the Panel Mine tailings area

    International Nuclear Information System (INIS)

    Balins, J.K.; Davis, J.B.; Payne, R.A.

    1994-01-01

    Rio Algom's Panel Uranium Mine originally operated between 1958 and 1961. It was reactivated in 1979 and operated continuously until 1990. In all, the mine produced about 14 million tons of potentially acid generating, low level radioactive uranium tailings; about 5% pyrite (by weight) with less than 0.1% U3O8. The tailings area consists of two rock rimmed basins. Topographic lows around the perimeter are closed by a total of six containment dams. To minimize the acid generating potential within the tailings, a decommissioning plan to flood the impounded tailings is being implemented. The anticipated performance of engineered structures (dams, spillways, channels, etc.) and the flooded tailings concept, over time periods in the order of thousands of years, have been addressed using probabilistic methods, based on subjective probability distributions consistent with available site specific information. The probable costs associated with long-term inspection and maintenance of the facility, as well as the probable costs and environmental consequences (e.g. tailings releases) associated with potential dam failures due to disruptive events such as floods, droughts and earthquakes were determined using a probabilistic model which consists of five, essentially independent, sub-models: a Maintenance Model, an Earthquake Response Model, a Flood Response Model, a Drought Model and an Integration Model. The principal conclusion derived from this assessment is that, for a well designed, constructed and maintained facility, there is very little likelihood that water and/or tailings solids will be released as a result of a containment dam failure; annual probability of the order of 10^-6. Failure to maintain the facility over the long-term significantly increases the likelihood of dam failure with resultant release of water and suspended tailings solids

  16. The NUREG-1150 probabilistic risk assessment for the Grand Gulf nuclear station

    International Nuclear Information System (INIS)

    Brown, T.D.; Breeding, R.J.; Jow, H.N.; Higgins, S.J.; Shiver, A.W.; Helton, J.C.

    1992-01-01

    This paper summarizes the findings of the probabilistic risk assessment (PRA) for Unit 1 of the Grand Gulf Nuclear Station performed in support of NUREG-1150. The emphasis is on the 'back-end' analyses, that is, the accident progression, source term, and consequence analyses, and the risk results obtained when the results of these analyses are combined with the accident frequency analysis. The offsite risk from internal initiating events was found to be quite low, both with respect to the safety goals and to the other plants analyzed in NUREG-1150. The offsite risk is dominated by short-term station blackout plant damage states. The long-term blackout group and the anticipated transients without scram (ATWS) group contribute considerably less to risk. Transients in which the power conversion system is unavailable are very minor contributors to risk. The low values for risk can be attributed to low core damage frequency, good emergency response, and plant features that reduce the potential source term. (orig.)

  17. A Probabilistic Short-Term Water Demand Forecasting Model Based on the Markov Chain

    Directory of Open Access Journals (Sweden)

    Francesca Gagliardi

    2017-07-01

    Full Text Available This paper proposes a short-term water demand forecasting method based on the use of the Markov chain. This method provides estimates of future demands by calculating probabilities that the future demand value will fall within pre-assigned intervals covering the expected total variability. More specifically, two models based on homogeneous and non-homogeneous Markov chains were developed and presented. These models, together with two benchmark models (based on artificial neural network and naïve methods), were applied to three real-life case studies for the purpose of forecasting the respective water demands from 1 to 24 h ahead. The results obtained show that the model based on a homogeneous Markov chain provides more accurate short-term forecasts than the one based on a non-homogeneous Markov chain, which is in line with the artificial neural network model. Both Markov chain models enable probabilistic information regarding the stochastic demand forecast to be easily obtained.
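
    A minimal sketch of the homogeneous-Markov-chain forecaster the abstract describes: discretize demand into intervals, estimate a transition matrix from counts, and read the next-step probability distribution from the current state's row. The demand series here is synthetic.

    ```python
    # Minimal homogeneous Markov chain demand forecaster on synthetic data.
    import numpy as np

    rng = np.random.default_rng(3)
    demand = 50 + 10 * np.sin(np.arange(2000) * 2 * np.pi / 24) + rng.normal(0, 2, 2000)

    bins = np.quantile(demand, np.linspace(0, 1, 9))   # 8 demand intervals
    state = np.clip(np.digitize(demand, bins) - 1, 0, 7)

    P = np.zeros((8, 8))
    for s, s_next in zip(state[:-1], state[1:]):       # transition counts
        P[s, s_next] += 1
    P /= P.sum(axis=1, keepdims=True)                  # row-normalize

    current = state[-1]
    print(P[current])   # probability that the next demand falls in each interval
    ```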

  18. An analog ensemble for short-term probabilistic solar power forecast

    International Nuclear Information System (INIS)

    Alessandrini, S.; Delle Monache, L.; Sperati, S.; Cervone, G.

    2015-01-01

    Highlights: • A novel method for solar power probabilistic forecasting is proposed. • The forecast accuracy does not depend on the nominal power. • The impact of climatology on forecast accuracy is evaluated. - Abstract: The energy produced by photovoltaic farms has a variable nature depending on astronomical and meteorological factors. The former are the solar elevation and the solar azimuth, which are easily predictable without any uncertainty. The amount of liquid water met by the solar radiation within the troposphere is the main meteorological factor influencing the solar power production, as a fraction of short wave solar radiation is reflected by the water particles and cannot reach the earth surface. The total cloud cover is a meteorological variable often used to indicate the presence of liquid water in the troposphere and has a limited predictability, which is also reflected on the global horizontal irradiance and, as a consequence, on solar photovoltaic power prediction. This lack of predictability makes the solar energy integration into the grid challenging. A cost-effective utilization of solar energy over a grid strongly depends on the accuracy and reliability of the power forecasts available to the Transmission System Operators (TSOs). Furthermore, several countries have in place legislation requiring solar power producers to pay penalties proportional to the errors of day-ahead energy forecasts, which makes the accuracy of such predictions a determining factor for producers to reduce their economic losses. Probabilistic predictions can provide accurate deterministic forecasts along with a quantification of their uncertainty, as well as a reliable estimate of the probability to overcome a certain production threshold. In this paper we propose the application of an analog ensemble (AnEn) method to generate probabilistic solar power forecasts (SPF). The AnEn is based on an historical set of deterministic numerical weather prediction (NWP) model
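
    The analog ensemble idea can be sketched compactly: find the historical NWP forecasts most similar to the current one and use the observations that verified them as the predictive ensemble. The data and the plain Euclidean similarity metric below are simplifying assumptions relative to the published method.

    ```python
    # Minimal analog ensemble (AnEn) sketch on synthetic forecast/observation
    # pairs; the published method uses a lead-time-windowed weighted metric.
    import numpy as np

    rng = np.random.default_rng(5)
    hist_fcst = rng.uniform(0, 1, size=(3000, 4))   # past NWP predictors
    hist_obs = hist_fcst[:, 0] * 0.8 + rng.normal(0, 0.05, 3000)  # verified power

    def analog_ensemble(current_fcst, n_members=20):
        d = np.linalg.norm(hist_fcst - current_fcst, axis=1)
        analogs = np.argsort(d)[:n_members]         # best-matching past forecasts
        return hist_obs[analogs]                    # their observations = ensemble

    ens = analog_ensemble(np.array([0.6, 0.4, 0.7, 0.2]))
    print(ens.mean(), np.percentile(ens, [10, 90]))
    ```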

  19. Computation of probabilistic hazard maps and source parameter estimation for volcanic ash transport and dispersion

    Energy Technology Data Exchange (ETDEWEB)

    Madankan, R. [Department of Mechanical and Aerospace Engineering, University at Buffalo (United States); Pouget, S. [Department of Geology, University at Buffalo (United States); Singla, P., E-mail: psingla@buffalo.edu [Department of Mechanical and Aerospace Engineering, University at Buffalo (United States); Bursik, M. [Department of Geology, University at Buffalo (United States); Dehn, J. [Geophysical Institute, University of Alaska, Fairbanks (United States); Jones, M. [Center for Computational Research, University at Buffalo (United States); Patra, A. [Department of Mechanical and Aerospace Engineering, University at Buffalo (United States); Pavolonis, M. [NOAA-NESDIS, Center for Satellite Applications and Research (United States); Pitman, E.B. [Department of Mathematics, University at Buffalo (United States); Singh, T. [Department of Mechanical and Aerospace Engineering, University at Buffalo (United States); Webley, P. [Geophysical Institute, University of Alaska, Fairbanks (United States)

    2014-08-15

    Volcanic ash advisory centers are charged with forecasting the movement of volcanic ash plumes, for aviation, health and safety preparation. Deterministic mathematical equations model the advection and dispersion of these plumes. However initial plume conditions – height, profile of particle location, volcanic vent parameters – are known only approximately at best, and other features of the governing system such as the windfield are stochastic. These uncertainties make forecasting plume motion difficult. As a result of these uncertainties, ash advisories based on a deterministic approach tend to be conservative, and many times over/under estimate the extent of a plume. This paper presents an end-to-end framework for generating a probabilistic approach to ash plume forecasting. This framework uses an ensemble of solutions, guided by Conjugate Unscented Transform (CUT) method for evaluating expectation integrals. This ensemble is used to construct a polynomial chaos expansion that can be sampled cheaply, to provide a probabilistic model forecast. The CUT method is then combined with a minimum variance condition, to provide a full posterior pdf of the uncertain source parameters, based on observed satellite imagery. The April 2010 eruption of the Eyjafjallajökull volcano in Iceland is employed as a test example. The puff advection/dispersion model is used to hindcast the motion of the ash plume through time, concentrating on the period 14–16 April 2010. Variability in the height and particle loading of that eruption is introduced through a volcano column model called bent. Output uncertainty due to the assumed uncertain input parameter probability distributions, and a probabilistic spatial-temporal estimate of ash presence are computed.

  20. The Role of Language in Building Probabilistic Thinking

    Science.gov (United States)

    Nacarato, Adair Mendes; Grando, Regina Célia

    2014-01-01

    This paper is based on research that investigated the development of probabilistic language and thinking by students 10-12 years old. The focus was on the adequate use of probabilistic terms in social practice. A series of tasks was developed for the investigation and completed by the students working in groups. The discussions were video recorded…

  1. Probabilistic risk assessment support of emergency preparedness at the Savannah River Site

    International Nuclear Information System (INIS)

    O'Kula, K.R.; Baker, W.H.; Simpkins, A.A.; Taylor, R.P.; Wagner, K.C.; Amos, C.N.

    1992-01-01

    Integration of the Probabilistic Risk Assessment (PRA) for K Reactor operation into related technical areas at the Savannah River Site (SRS) includes coordination with several onsite organizations responsible for maintaining and upgrading emergency preparedness capabilities. Major functional categories of the PRA application are scenario development and source term algorithm enhancement. Insights and technologies from the SRS PRA have facilitated development of: (1) credible timelines for scenarios; (2) algorithms tied to plant instrumentation to provide best-estimate source terms for dose projection; and (3) expert-system logic models to implement informed counter-measures to assure onsite and offsite safety following accidental releases. The latter methodology, in particular, is readily transferable to other reactor and non-reactor facilities at SRS and represents a distinct advance relative to emergency preparedness capabilities elsewhere in the DOE complex

  2. Probabilistic safety assessment framework of pebble-bed modular high-temperature gas-cooled reactor

    International Nuclear Information System (INIS)

    Liu Tao; Tong Jiejuan; Zhao Jun; Cao Jianzhu; Zhang Liguo

    2009-01-01

    After an investigation of the probabilistic safety assessment (PSA) frameworks of similar reactor types, a PSA framework for the Pebble-bed Modular High-Temperature Gas-cooled Reactor (HTR-PM) was developed in line with its own design characteristics. It is an integral framework that runs through the event sequence structure, with initiating events at the beginning and source term categories at the end. The analysis shows that it is the HTR-PM design features that determine its PSA framework. (authors)

  3. Probabilistic short-term volcanic hazard in phases of unrest: A case study for tephra fallout

    Science.gov (United States)

    Selva, Jacopo; Costa, Antonio; Sandri, Laura; Macedonio, Giovanni; Marzocchi, Warner

    2014-12-01

    During volcanic crises, volcanologists estimate the impact of possible imminent eruptions usually through deterministic modeling of the effects of one or a few preestablished scenarios. Although such an approach may bring important information to the decision makers, the sole use of deterministic scenarios does not allow scientists to properly take into consideration all uncertainties, and it cannot be used to assess the risk quantitatively because the latter unavoidably requires a probabilistic approach. We present a model based on the concept of the Bayesian event tree (hereinafter named BET_VH_ST, standing for Bayesian event tree for short-term volcanic hazard), for short-term, near-real-time probabilistic volcanic hazard analysis formulated for any potentially hazardous phenomenon accompanying an eruption. The specific goal of BET_VH_ST is to produce a quantitative assessment of the probability of exceedance of any potential level of intensity for a given volcanic hazard due to eruptions within restricted time windows (hours to days) in any area surrounding the volcano, accounting for all natural and epistemic uncertainties. BET_VH_ST properly assesses the conditional probability at each level of the event tree, accounting for any relevant information derived from the monitoring system, theoretical models, and the past history of the volcano, and propagating any relevant epistemic uncertainty underlying these assessments. As an application example of the model, we apply BET_VH_ST to assess short-term volcanic hazard related to tephra loading during the Major Emergency Simulation Exercise, a major exercise at Mount Vesuvius that took place from 19 to 23 October 2006, consisting of a blind simulation of Vesuvius reactivation, from the early warning phase up to the final eruption, including the evacuation of a sample of about 2000 people from the area at risk. The results show that BET_VH_ST is able to produce short-term forecasts of the impact of tephra fall during a rapidly

  4. Probabilistic numerics and uncertainty in computations.

    Science.gov (United States)

    Hennig, Philipp; Osborne, Michael A; Girolami, Mark

    2015-07-08

    We deliver a call to arms for probabilistic numerical methods: algorithms for numerical tasks, including linear algebra, integration, optimization and solving differential equations, that return uncertainties in their calculations. Such uncertainties, arising from the loss of precision induced by numerical calculation with limited time or hardware, are important for much contemporary science and industry. Within applications such as climate science and astrophysics, the need to make decisions on the basis of computations with large and complex data have led to a renewed focus on the management of numerical uncertainty. We describe how several seminal classic numerical methods can be interpreted naturally as probabilistic inference. We then show that the probabilistic view suggests new algorithms that can flexibly be adapted to suit application specifics, while delivering improved empirical performance. We provide concrete illustrations of the benefits of probabilistic numeric algorithms on real scientific problems from astrometry and astronomical imaging, while highlighting open problems with these new algorithms. Finally, we describe how probabilistic numerical methods provide a coherent framework for identifying the uncertainty in calculations performed with a combination of numerical algorithms (e.g. both numerical optimizers and differential equation solvers), potentially allowing the diagnosis (and control) of error sources in computations.

  5. Probabilistic numerical discrimination in mice.

    Science.gov (United States)

    Berkay, Dilara; Çavdaroğlu, Bilgehan; Balcı, Fuat

    2016-03-01

    Previous studies showed that both human and non-human animals can discriminate between different quantities (i.e., time intervals, numerosities) with a limited level of precision due to their endogenous/representational uncertainty. In addition, other studies have shown that subjects can modulate their temporal categorization responses adaptively by incorporating information gathered regarding probabilistic contingencies into their time-based decisions. Despite the psychophysical similarities between the interval timing and nonverbal counting functions, the sensitivity of count-based decisions to probabilistic information remains an unanswered question. In the current study, we investigated whether exogenous probabilistic information can be integrated into numerosity-based judgments by mice. In the task employed in this study, reward was presented either after few (i.e., 10) or many (i.e., 20) lever presses, the last of which had to be emitted on the lever associated with the corresponding trial type. In order to investigate the effect of probabilistic information on performance in this task, we manipulated the relative frequency of different trial types across different experimental conditions. We evaluated the behavioral performance of the animals under models that differed in terms of their assumptions regarding the cost of responding (e.g., logarithmically increasing vs. no response cost). Our results showed for the first time that mice could adaptively modulate their count-based decisions based on the experienced probabilistic contingencies in directions predicted by optimality.

  6. Probabilistic Decision Graphs - Combining Verification and AI Techniques for Probabilistic Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2004-01-01

    We adopt probabilistic decision graphs developed in the field of automated verification as a tool for probabilistic model representation and inference. We show that probabilistic inference has linear time complexity in the size of the probabilistic decision graph, that the smallest probabilistic ...

  7. Source Terms for HLW Glass Canisters

    International Nuclear Information System (INIS)

    J.S. Tang

    2000-01-01

    This calculation is prepared by the Monitored Geologic Repository (MGR) Waste Package Design Section. The objective of this calculation is to determine the source terms that include radionuclide inventory, decay heat, and radiation sources due to gamma rays and neutrons for the high-level radioactive waste (HLW) from the West Valley Demonstration Project (WVDP), Savannah River Site (SRS), Hanford Site (HS), and Idaho National Engineering and Environmental Laboratory (INEEL). This calculation also determines the source terms of the canister containing the SRS HLW glass and immobilized plutonium. The scope of this calculation is limited to source terms for a time period out to one million years. The results of this calculation may be used to carry out performance assessment of the potential repository and to evaluate radiation environments surrounding the waste packages (WPs). This calculation was performed in accordance with the Development Plan ''Source Terms for HLW Glass Canisters'' (Ref. 7.24)

  8. Learning Probabilistic Logic Models from Probabilistic Examples.

    Science.gov (United States)

    Chen, Jianzhong; Muggleton, Stephen; Santos, José

    2008-10-01

    We revisit an application developed originally using abductive Inductive Logic Programming (ILP) for modeling inhibition in metabolic networks. The example data was derived from studies of the effects of toxins on rats using Nuclear Magnetic Resonance (NMR) time-trace analysis of their biofluids together with background knowledge representing a subset of the Kyoto Encyclopedia of Genes and Genomes (KEGG). We now apply two Probabilistic ILP (PILP) approaches - abductive Stochastic Logic Programs (SLPs) and PRogramming In Statistical modeling (PRISM) - to the application. Both approaches support abductive learning and probability predictions. Abductive SLPs are a PILP framework that provides possible worlds semantics to SLPs through abduction. Instead of learning logic models from non-probabilistic examples as done in ILP, the PILP approach applied in this paper is based on a general technique for introducing probability labels within a standard scientific experimental setting involving control and treated data. Our results demonstrate that the PILP approach provides a way of learning probabilistic logic models from probabilistic examples, and the PILP models learned from probabilistic examples lead to a significant decrease in error accompanied by improved insight from the learned results compared with the PILP models learned from non-probabilistic examples.

  9. Cumulative Dominance and Probabilistic Sophistication

    NARCIS (Netherlands)

    Wakker, P.P.; Sarin, R.H.

    2000-01-01

    Machina & Schmeidler (Econometrica, 60, 1992) gave preference conditions for probabilistic sophistication, i.e. decision making where uncertainty can be expressed in terms of (subjective) probabilities without commitment to expected utility maximization. This note shows that simpler and more general

  10. 10 CFR 50.67 - Accident source term.

    Science.gov (United States)

    2010-01-01

    ... Conditions of Licenses and Construction Permits § 50.67 Accident source term. (a) Applicability. The requirements of this section apply to holders of operating licenses issued prior to January 10, 1997, who seek to revise the current accident source term used in their design basis...

  11. Development of optimization-based probabilistic earthquake scenarios for the city of Tehran

    Science.gov (United States)

    Zolfaghari, M. R.; Peyghaleh, E.

    2016-01-01

    This paper presents the methodology and a practical example for the application of an optimization process to select earthquake scenarios which best represent probabilistic earthquake hazard in a given region. The method is based on simulation of a large dataset of potential earthquakes, representing the long-term seismotectonic characteristics in a given region. The simulation process uses Monte-Carlo simulation and regional seismogenic source parameters to generate a synthetic earthquake catalogue consisting of a large number of earthquakes, each characterized with magnitude, location, focal depth and fault characteristics. Such a catalogue provides full distributions of events in time, space and size; however, it demands large computational power when used for risk assessment, particularly when other sources of uncertainties are involved in the process. To reduce the number of selected earthquake scenarios, a mixed-integer linear program formulation is developed in this study. This approach results in a reduced set of optimization-based probabilistic earthquake scenarios, while maintaining the shape of the hazard curves and the full probabilistic picture by minimizing the error between hazard curves derived from the full and reduced sets of synthetic earthquake scenarios. To test the model, the regional seismotectonic and seismogenic characteristics of northern Iran are used to simulate a set of 10,000 years' worth of events consisting of some 84,000 earthquakes. The optimization model is then performed multiple times with various input data, taking into account the probabilistic seismic hazard for Tehran city as the main constraint. The sensitivity of the selected scenarios to the user-specified site/return period error-weight is also assessed. The methodology could reduce the run time of full probabilistic earthquake studies like seismic hazard and risk assessment. The reduced set is representative of the contributions of all possible earthquakes; however, it requires far less

  12. Procedures for conducting probabilistic safety assessments of nuclear power plants (level 2). Accident progression, containment analysis and estimation of accident source terms

    International Nuclear Information System (INIS)

    1995-01-01

    The present publication on Level 2 PSA is based on a compilation and review of practices in various Member States. It complements Safety Series No. 50-P-4, issued in 1992, on Procedures for Conducting Probabilistic Safety Assessments of Nuclear Power Plants (Level 1). Refs, figs and tabs

  13. Validation of in vitro probabilistic tractography

    DEFF Research Database (Denmark)

    Dyrby, Tim B.; Sogaard, L.V.; Parker, G.J.

    2007-01-01

    assessed the anatomical validity and reproducibility of in vitro multi-fiber probabilistic tractography against two invasive tracers: the histochemically detectable biotinylated dextran amine and manganese-enhanced magnetic resonance imaging. Post mortem DWI was used to ensure that most of the sources...

  14. Performance assessment of deterministic and probabilistic weather predictions for the short-term optimization of a tropical hydropower reservoir

    Science.gov (United States)

    Mainardi Fan, Fernando; Schwanenberg, Dirk; Alvarado, Rodolfo; Assis dos Reis, Alberto; Naumann, Steffi; Collischonn, Walter

    2016-04-01

    Hydropower is the most important electricity source in Brazil. During recent years, it accounted for 60% to 70% of the total electric power supply. Marginal costs of hydropower are lower than for thermal power plants; therefore, there is a strong economic motivation to maximize its share. On the other hand, hydropower depends on the availability of water, which has a natural variability. Its extremes lead to the risks of power production deficits during droughts and of safety issues in the reservoir and downstream river reaches during flood events. One building block of the proper management of hydropower assets is the short-term forecast of reservoir inflows as input for an online, event-based optimization of the release strategy. While deterministic forecasts and optimization schemes are the established techniques for short-term reservoir management, the use of probabilistic ensemble forecasts and stochastic optimization techniques is receiving growing attention, and a number of studies have shown their benefits. The present work shows one of the first hindcasting and closed-loop control experiments for a multi-purpose hydropower reservoir in a tropical region of Brazil. The case study is the hydropower project (HPP) Três Marias, located in southeast Brazil. The HPP reservoir is operated with two main objectives: (i) hydroelectricity generation and (ii) flood control at Pirapora City, located 120 km downstream of the dam. In the experiments, precipitation forecasts based on observed data, as well as deterministic and probabilistic forecasts with 50 ensemble members from the ECMWF, are used as forcing of the MGB-IPH hydrological model to generate streamflow forecasts over a period of 2 years. The online optimization relies on deterministic and multi-stage stochastic versions of a model predictive control scheme. Results for the perfect forecasts show the potential benefit of the online optimization and indicate a desired forecast lead time of 30 days. In comparison, the use of

  15. Calculation of source terms for NUREG-1150

    International Nuclear Information System (INIS)

    Breeding, R.J.; Williams, D.C.; Murfin, W.B.; Amos, C.N.; Helton, J.C.

    1987-10-01

    The source terms estimated for NUREG-1150 are generally based on the Source Term Code Package (STCP), but the actual source term calculations used in computing risk are performed by much smaller codes which are specific to each plant. This was done because the method of estimating the uncertainty in risk for NUREG-1150 requires hundreds of source term calculations for each accident sequence. This is clearly impossible with a large, detailed code like the STCP. The small plant-specific codes are based on simple algorithms and utilize adjustable parameters. The values of the parameters appearing in these codes are derived from the available STCP results. To determine the uncertainty in the estimation of the source terms, these parameters were varied as specified by an expert review group. This method was used to account for the uncertainties in the STCP results and the uncertainties in phenomena not considered by the STCP
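
    The parametric idea described above, small plant-specific codes whose adjustable parameters are sampled rather than fixed, can be illustrated with a minimal Monte Carlo sketch. All distributions and the inventory value are invented for illustration; the actual NUREG-1150 parameters and their expert-elicited distributions are not reproduced here.

      # Sketch of a parametric source-term code: the environmental release is
      # a product of adjustable fractions, each sampled from a distribution
      # standing in for an expert-elicited range. All values are invented.
      import numpy as np

      rng = np.random.default_rng(1)
      n = 10_000
      core_inventory_bq = 1.0e18                        # hypothetical inventory, Bq

      frac_fuel_release = 10 ** rng.uniform(-2, 0, n)   # illustrative ranges,
      frac_vessel_escape = 10 ** rng.uniform(-2, 0, n)  # not NUREG-1150 values
      frac_containment_escape = 10 ** rng.uniform(-4, -1, n)

      source_term = (core_inventory_bq * frac_fuel_release
                     * frac_vessel_escape * frac_containment_escape)

      for q in (0.05, 0.50, 0.95):
          print(f"{q:.0%} quantile: {np.quantile(source_term, q):.3e} Bq")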

  16. Probabilistic Tsunami Hazard Analysis

    Science.gov (United States)

    Thio, H. K.; Ichinose, G. A.; Somerville, P. G.; Polet, J.

    2006-12-01

    thousands of earthquake scenarios. We have carried out preliminary tsunami hazard calculations for different return periods for western North America and Hawaii based on thousands of earthquake scenarios around the Pacific rim and along the coast of North America. We will present tsunami hazard maps for several return periods and also discuss how to use these results for probabilistic inundation and runup mapping. Our knowledge of certain types of tsunami sources is very limited (e.g. submarine landslides), but a probabilistic framework for tsunami hazard evaluation can include even such sources and their uncertainties and present the overall hazard in a meaningful and consistent way.
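
    A minimal sketch of the hazard-curve aggregation behind such scenario-based calculations: the annual exceedance rate at each amplitude level is the rate-weighted sum of per-scenario exceedance probabilities. The scenario rates and the lognormal amplitude model below are illustrative assumptions, not values from the study.

      # Sketch: hazard curve as lambda(a) = sum_i rate_i * P(A > a | scenario i).
      import numpy as np
      from scipy.stats import norm

      rng = np.random.default_rng(2)
      n_scen = 1000
      rates = rng.exponential(1e-3, n_scen)          # annual scenario rates (toy)
      median_amp = rng.lognormal(-1.0, 1.0, n_scen)  # scenario median amplitude, m
      sigma = 0.5                                    # lognormal spread (toy)

      amps = np.linspace(0.1, 5.0, 50)               # amplitude levels, m
      exceed = norm.sf((np.log(amps)[None, :]
                        - np.log(median_amp)[:, None]) / sigma)
      hazard = (rates[:, None] * exceed).sum(axis=0) # annual exceedance rate

      for rp in (100, 500):                          # return periods, years
          a_rp = np.interp(-1.0 / rp, -hazard, amps) # invert decreasing curve
          print(f"{rp}-yr amplitude: {a_rp:.2f} m")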

  17. Bayesian assignment of gene ontology terms to gene expression experiments

    Science.gov (United States)

    Sykacek, P.

    2012-01-01

    Motivation: Gene expression assays allow for genome scale analyses of molecular biological mechanisms. State-of-the-art data analysis provides lists of involved genes, either by calculating significance levels of mRNA abundance or by Bayesian assessments of gene activity. A common problem of such approaches is the difficulty of interpreting the biological implication of the resulting gene lists. This led to an increased interest in methods for inferring high-level biological information. A common approach for representing high level information is by inferring gene ontology (GO) terms which may be attributed to the expression data experiment. Results: This article proposes a probabilistic model for GO term inference. Modelling assumes that gene annotations to GO terms are available and gene involvement in an experiment is represented by posterior probabilities over gene-specific indicator variables. Such probability measures result from many Bayesian approaches for expression data analysis. The proposed model combines these indicator probabilities in a probabilistic fashion and provides a probabilistic GO term assignment as a result. Experiments on synthetic and microarray data suggest that advantages of the proposed probabilistic GO term inference over statistical test-based approaches are particularly evident for sparsely annotated GO terms and in situations of large uncertainty about gene activity. Provided that appropriate annotations exist, the proposed approach is easily applied to inferring other high level assignments like pathways. Availability: Source code under GPL license is available from the author. Contact: peter.sykacek@boku.ac.at PMID:22962488

  18. Bayesian assignment of gene ontology terms to gene expression experiments.

    Science.gov (United States)

    Sykacek, P

    2012-09-15

    Gene expression assays allow for genome scale analyses of molecular biological mechanisms. State-of-the-art data analysis provides lists of involved genes, either by calculating significance levels of mRNA abundance or by Bayesian assessments of gene activity. A common problem of such approaches is the difficulty of interpreting the biological implication of the resulting gene lists. This led to an increased interest in methods for inferring high-level biological information. A common approach for representing high level information is by inferring gene ontology (GO) terms which may be attributed to the expression data experiment. This article proposes a probabilistic model for GO term inference. Modelling assumes that gene annotations to GO terms are available and gene involvement in an experiment is represented by posterior probabilities over gene-specific indicator variables. Such probability measures result from many Bayesian approaches for expression data analysis. The proposed model combines these indicator probabilities in a probabilistic fashion and provides a probabilistic GO term assignment as a result. Experiments on synthetic and microarray data suggest that advantages of the proposed probabilistic GO term inference over statistical test-based approaches are particularly evident for sparsely annotated GO terms and in situations of large uncertainty about gene activity. Provided that appropriate annotations exist, the proposed approach is easily applied to inferring other high level assignments like pathways. Source code under GPL license is available from the author. peter.sykacek@boku.ac.at.
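
    The two records above do not spell out the combination rule, so the sketch below uses a noisy-OR over gene-level indicator posteriors as one plausible way to turn gene activity probabilities into a probabilistic GO-term assignment; it is not necessarily the cited paper's model, and the gene names, probabilities and annotations are hypothetical.

      # Sketch: combine gene-level posterior activity probabilities into a
      # probabilistic GO-term assignment via noisy-OR (an assumed rule).
      import numpy as np

      p_gene = {"geneA": 0.9, "geneB": 0.4, "geneC": 0.05, "geneD": 0.7}

      annotations = {                       # hypothetical GO annotations
          "GO:0006955": ["geneA", "geneB"],
          "GO:0008152": ["geneC"],
          "GO:0006915": ["geneB", "geneC", "geneD"],
      }

      for term, genes in annotations.items():
          # Term is "on" if at least one annotated gene is involved:
          p_term = 1.0 - np.prod([1.0 - p_gene[g] for g in genes])
          print(f"{term}: P(assigned) = {p_term:.3f}")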

  19. Probabilistic inversion for chicken processing lines

    International Nuclear Information System (INIS)

    Cooke, Roger M.; Nauta, Maarten; Havelaar, Arie H.; Fels, Ine van der

    2006-01-01

    We discuss an application of probabilistic inversion techniques to a model of campylobacter transmission in chicken processing lines. Such techniques are indicated when we wish to quantify a model which is new and perhaps unfamiliar to the expert community. In this case there are no measurements for estimating model parameters, and experts are typically unable to give a considered judgment. In such cases, experts are asked to quantify their uncertainty regarding variables which can be predicted by the model. The experts' distributions (after combination) are then pulled back onto the parameter space of the model, a process termed 'probabilistic inversion'. This study illustrates two such techniques, iterative proportional fitting (IPF) and PARmeter fitting for uncertain models (PARFUM). In addition, we illustrate how expert judgement on predicted observable quantities in combination with probabilistic inversion may be used for model validation and/or model criticism
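
    Of the two techniques named above, iterative proportional fitting is the easier to sketch: a joint distribution is alternately rescaled until its marginals match target (e.g. expert-elicited) marginals. The table and targets below are toy values.

      # Sketch of iterative proportional fitting (IPF): rescale a joint
      # distribution until its marginals match target marginals. Toy values.
      import numpy as np

      table = np.array([[10.0, 20.0], [30.0, 40.0]])  # initial joint counts
      row_target = np.array([0.3, 0.7])               # target row marginals
      col_target = np.array([0.6, 0.4])               # target column marginals

      p = table / table.sum()
      for _ in range(100):
          p *= (row_target / p.sum(axis=1))[:, None]  # match row marginals
          p *= (col_target / p.sum(axis=0))[None, :]  # match column marginals

      print(p)
      print("rows:", p.sum(axis=1), "cols:", p.sum(axis=0))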

  20. Subsurface Shielding Source Term Specification Calculation

    International Nuclear Information System (INIS)

    S.Su

    2001-01-01

    The purpose of this calculation is to establish appropriate and defensible waste-package radiation source terms for use in repository subsurface shielding design. This calculation supports the shielding design for the waste emplacement and retrieval system, and subsurface facility system. The objective is to identify the limiting waste package and specify its associated source terms including source strengths and energy spectra. Consistent with the Technical Work Plan for Subsurface Design Section FY 01 Work Activities (CRWMS M and O 2001, p. 15), the scope of work includes the following: (1) Review source terms generated by the Waste Package Department (WPD) for various waste forms and waste package types, and compile them for shielding-specific applications. (2) Determine acceptable waste package specific source terms for use in subsurface shielding design, using a reasonable and defensible methodology that is not unduly conservative. This calculation is associated with the engineering and design activity for the waste emplacement and retrieval system, and subsurface facility system. The technical work plan for this calculation is provided in CRWMS M and O 2001. Development and performance of this calculation conforms to the procedure, AP-3.12Q, Calculations

  1. Recent developments of the NESSUS probabilistic structural analysis computer program

    Science.gov (United States)

    Millwater, H.; Wu, Y.-T.; Torng, T.; Thacker, B.; Riha, D.; Leung, C. P.

    1992-01-01

    The NESSUS probabilistic structural analysis computer program combines state-of-the-art probabilistic algorithms with general purpose structural analysis methods to compute the probabilistic response and the reliability of engineering structures. Uncertainty in loading, material properties, geometry, boundary conditions and initial conditions can be simulated. The structural analysis methods include nonlinear finite element and boundary element methods. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. The scope of the code has recently been expanded to include probabilistic life and fatigue prediction of structures in terms of component and system reliability and risk analysis of structures considering cost of failure. The code is currently being extended to structural reliability considering progressive crack propagation. Several examples are presented to demonstrate the new capabilities.

  2. Rockfall hazard assessment integrating probabilistic physically based rockfall source detection (Norddal municipality, Norway).

    Science.gov (United States)

    Yugsi Molina, F. X.; Oppikofer, T.; Fischer, L.; Hermanns, R. L.; Taurisano, A.

    2012-04-01

    Traditional techniques to assess rockfall hazard are partially based on probabilistic analysis. Stochastic methods have been used for run-out analysis of rock blocks to estimate the trajectories that a detached block will follow during its fall until it stops due to kinetic energy loss. However, the selection of rockfall source areas is usually defined either by multivariate analysis or by field observations. In either case, a physically based approach is not used for source area detection. We present an example of rockfall hazard assessment that integrates a probabilistic rockfall run-out analysis with a stochastic assessment of the rockfall source areas using kinematic stability analysis in a GIS environment. The method has been tested for a steep, more than 200 m high rock wall located in the municipality of Norddal (Møre og Romsdal county, Norway), where a large number of people are exposed to snow avalanches, rockfalls, or debris flows. The area was selected following the recently published hazard mapping plan of Norway. The cliff is formed by medium- to coarse-grained quartz-dioritic to granitic gneisses of Proterozoic age. Scree deposits, the product of recent rockfall activity, are found at the bottom of the rock wall. Large blocks can be found several tens of meters away from the cliff in Sylte, the main locality in the Norddal municipality. Structural characterization of the rock wall was done using terrestrial laser scanning (TLS) point clouds in the software Coltop3D (www.terranum.ch), and the results were validated with field data. Orientation datasets from the structural characterization were analyzed separately to assess best-fit probability density functions (PDFs) for both the dip angle and the dip direction angle of each discontinuity set. A GIS-based stochastic kinematic analysis was then carried out using the discontinuity set orientations and the friction angle as random variables. An airborne laser scanning digital elevation model (ALS-DEM) with 1 m
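
    A GIS-free sketch of the stochastic kinematic analysis described above: dip angle, dip direction and friction angle are drawn from fitted distributions and a Markland-type planar-sliding test is evaluated per sample. The distribution parameters and the plus/minus 20 degree lateral limit are invented for illustration.

      # Sketch: Monte Carlo kinematic test for planar sliding with random
      # discontinuity orientation and friction angle. All parameters are toy.
      import numpy as np

      rng = np.random.default_rng(3)
      n = 100_000
      slope_dip, slope_aspect = 70.0, 180.0    # cell slope angle/aspect, deg

      dip = rng.normal(55.0, 8.0, n)           # discontinuity dip angle
      dip_dir = rng.normal(175.0, 15.0, n)     # discontinuity dip direction
      phi = rng.normal(30.0, 3.0, n)           # friction angle

      daylights = dip < slope_dip              # plane daylights in the slope
      slides = dip > phi                       # dip exceeds friction angle
      aligned = np.abs((dip_dir - slope_aspect + 180) % 360 - 180) < 20

      p_fail = np.mean(daylights & slides & aligned)
      print(f"kinematic failure probability for this cell: {p_fail:.3f}")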

  3. ITER Safety Task NID-5A, Subtask 1-1: Source terms and energies - initial tritium source terms. Final report

    International Nuclear Information System (INIS)

    Fong, C.; Kalyanam, K.M.; Tanaka, M.R.; Sood, S.; Natalizio, A.; Delisle, M.

    1995-02-01

    The overall objective of the Early Safety and Environmental Characterization Study (ESECS) is to assess the environmental impact of tritium using appropriate assumptions on a hypothetical site for ITER, having the reference site characteristics as proposed by the JCT. The objective of this work under the above subtask 1-1, NID-5a, is to determine environmental source terms (i.e., process source term x containment release fraction) for the fuel cycle and cooling systems. The work is based on inventories and process source terms (i.e., inventory x mobilization fraction), provided by others (under Task NID 3b). The results of this work form the basis for the determination, by others, of the off-site dose (i.e., environmental source term x dose/release ratio). For the determination of the environmental source terms, the TMAP4 code has been utilized (ref 1). This code is approved by ITER for safety assessment. Volume 3 is a compilation of appendices giving detailed results of the study

  4. ITER Safety Task NID-5A, Subtask 1-1: Source terms and energies - initial tritium source terms. Final report

    International Nuclear Information System (INIS)

    Fong, C.; Kalyanam, K.M.; Tanaka, M.R.; Sood, S.; Natalizio, A.; Delisle, M.

    1995-02-01

    The overall objective of the Early Safety and Environmental Characterization Study (ESECS) is to assess the environmental impact of tritium using appropriate assumptions on a hypothetical site for ITER, having the reference site characteristics as proposed by the JCT. The objective of this work under the above subtask 1-1, NID-5a, is to determine environmental source terms (i.e., process source term x containment release fraction) for the fuel cycle and cooling systems. The work is based on inventories and process source terms (i.e., inventory x mobilization fraction), provided by others (under Task NID 3b). The results of this work form the basis for the determination, by others, of the off-site dose (i.e., environmental source term x dose/release ratio). For the determination of the environmental source terms, the TMAP4 code has been utilized (ref 1). This code is approved by ITER for safety assessment. 6 refs

  5. Probabilistic tsunami hazard assessment for Point Lepreau Generating Station

    Energy Technology Data Exchange (ETDEWEB)

    Mullin, D., E-mail: dmullin@nbpower.com [New Brunswick Power Corporation, Point Lepreau Generating Station, Point Lepreau (Canada); Alcinov, T.; Roussel, P.; Lavine, A.; Arcos, M.E.M.; Hanson, K.; Youngs, R., E-mail: trajce.alcinov@amecfw.com, E-mail: patrick.roussel@amecfw.com [AMEC Foster Wheeler Environment & Infrastructure, Dartmouth, NS (Canada)

    2015-07-01

    In 2012 the Geological Survey of Canada published a preliminary probabilistic tsunami hazard assessment in Open File 7201 that presents the most up-to-date information on all potential tsunami sources in a probabilistic framework on a national level, thus providing the underlying basis for conducting site-specific tsunami hazard assessments. However, the assessment identified a poorly constrained hazard for the Atlantic Coastline and recommended further evaluation. As a result, NB Power has embarked on performing a Probabilistic Tsunami Hazard Assessment (PTHA) for Point Lepreau Generating Station. This paper provides the methodology and progress of the hazard evaluation results for Point Lepreau G.S. (author)

  6. Use of the t-distribution to construct seismic hazard curves for seismic probabilistic safety assessments

    Energy Technology Data Exchange (ETDEWEB)

    Yee, Eric [KEPCO International Nuclear Graduate School, Dept. of Nuclear Power Plant Engineering, Ulsan (Korea, Republic of)

    2017-03-15

    Seismic probabilistic safety assessments are used to help understand the impact potential seismic events can have on the operation of a nuclear power plant. An important component of a seismic probabilistic safety assessment is the seismic hazard curve, which shows the frequency of seismic events. However, these hazard curves are estimated assuming a normal distribution of the seismic events. This may not be a well-founded assumption given the number of recorded events at each source-to-site distance. The use of a normal distribution makes the calculations significantly easier but may underestimate or overestimate the rarer events, which are of concern to nuclear power plants. This paper presents a preliminary exploration of the effect of using a distribution that perhaps better represents the distribution of events, such as the t-distribution. The integration of a probability distribution with potentially larger tails essentially pushes the hazard curves outward, suggesting a different range of frequencies for use in seismic probabilistic safety assessments. Therefore, the use of a more realistic distribution results in an increase in the frequency calculations, suggesting that rare events are less rare than previously thought in terms of seismic probabilistic safety assessment. However, the opposite was observed with the ground motion prediction equation considered.

  7. Use of the t-distribution to construct seismic hazard curves for seismic probabilistic safety assessments

    International Nuclear Information System (INIS)

    Yee, Eric

    2017-01-01

    Seismic probabilistic safety assessments are used to help understand the impact potential seismic events can have on the operation of a nuclear power plant. An important component of a seismic probabilistic safety assessment is the seismic hazard curve, which shows the frequency of seismic events. However, these hazard curves are estimated assuming a normal distribution of the seismic events. This may not be a well-founded assumption given the number of recorded events at each source-to-site distance. The use of a normal distribution makes the calculations significantly easier but may underestimate or overestimate the rarer events, which are of concern to nuclear power plants. This paper presents a preliminary exploration of the effect of using a distribution that perhaps better represents the distribution of events, such as the t-distribution. The integration of a probability distribution with potentially larger tails essentially pushes the hazard curves outward, suggesting a different range of frequencies for use in seismic probabilistic safety assessments. Therefore, the use of a more realistic distribution results in an increase in the frequency calculations, suggesting that rare events are less rare than previously thought in terms of seismic probabilistic safety assessment. However, the opposite was observed with the ground motion prediction equation considered
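
    The tail effect discussed in the two records above can be seen directly by comparing survival functions; the sketch below assumes scipy and uses illustrative location, scale and degrees-of-freedom values.

      # Sketch: tail exceedance under a normal vs. a heavier-tailed
      # t-distribution with the same location and scale. Values illustrative.
      from scipy.stats import norm, t

      mu, sigma, dof = 0.0, 1.0, 5
      for z in (2.0, 3.0, 4.0):                # thresholds in scale units
          p_n = norm.sf(z, loc=mu, scale=sigma)
          p_t = t.sf(z, df=dof, loc=mu, scale=sigma)
          print(f"z={z}: normal={p_n:.2e}  t(df={dof})={p_t:.2e} "
                f"ratio={p_t / p_n:.1f}")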

  8. Analyses of computer programs for the probabilistic estimation of design earthquake and seismological characteristics of the Korean Peninsula

    International Nuclear Information System (INIS)

    Lee, Gi Hwa

    1997-11-01

    The purpose of the present study is to develop predictive equations, adequate for the Korean Peninsula, from simulated motions, and to analyze and utilize the computer programs for the probabilistic estimation of design earthquakes. In Part I of the report, computer programs for the probabilistic estimation of design earthquakes are analyzed and applied to the seismic hazard characterization of the Korean Peninsula. In Part II of the report, available instrumental earthquake records are analyzed to estimate earthquake source characteristics and medium properties, which are incorporated into the simulation process. Earthquake records are then simulated using the estimated parameters. Finally, predictive equations constructed from the simulation are given in terms of magnitude and hypocentral distances

  9. Dose assessments for Greifswald and Cadarache with updated source terms from ITER NSSR-2

    International Nuclear Information System (INIS)

    Raskob, W.; Hasemann, I.

    1998-08-01

    The International Thermonuclear Experimental Reactor ITER is in its late engineering phase. One of the most important safety aspects - in particular for achieving public acceptance - is to assure that releases of hazardous material are minimal during normal operation and for accidental events, even very unlikely ones. For this purpose, probabilistic dose assessments for accidental atmospheric releases of various ITER source terms which contain tritium and/or activation products were performed for the sites of Greifswald, Germany, and Cadarache, France. In addition, routine releases into the atmosphere and hydrosphere have been evaluated. No country-specific rules were applied, and the input parameters were adapted as far as possible to those used within former studies to achieve better comparability with site-independent dose assessments performed in the frame of ITER. The calculations were based on source terms which, for the first time, contain a combination of tritium and activation products. This allowed a better judgment of the contribution of the individual fusion-relevant materials to the total dose. The results were compared to site-independent dose limits defined in the frame of ITER. Annual doses from routine releases (CAT-I) are below 0.1 μSv for the aquatic scenarios and are close to 1 μSv for the atmospheric source terms. Source terms for two different categories of accidental releases, representing 'extremely unlikely events' (CAT-IV) and 'hypothetical sequences' (CAT-V), were investigated. In none of these cases do the release scenarios of category CAT-IV exceed the ITER limits. In addition, relevant characteristic quantities of the early dose distribution from the hypothetical scenarios of type CAT-V are still below 50 mSv or 100 mSv, values which are commonly used as lower reference values for evacuation in many potential home countries of ITER. These site specific assessments confirmed that the proposed release limits and thus the derived dose

  10. ITER Safety Task NID-5A, Subtask 1-1: Source terms and energies - initial tritium source terms. Final report

    International Nuclear Information System (INIS)

    Fong, C.; Kalyanam, K.M.; Tanaka, M.R.; Sood, S.; Natalizio, A.; Delisle, M.

    1995-02-01

    The overall objective of the Early Safety and Environmental Characterization Study (ESECS) is to assess the environmental impact of tritium using appropriate assumptions on a hypothetical site for ITER, having the reference site characteristics as proposed by the JCT. The objective of this work under the above subtask 1-1, NID-5a, is to determine environmental source terms (i.e., process source term x containment release fraction) for the fuel cycle and cooling systems. The work is based on inventories and process source terms (i.e., inventory x mobilization fraction), provided by others (under Task NID 3b). The results of this work form the basis for the determination, by others, of the off-site dose (i.e., environmental source term x dose/release ratio). For the determination of the environmental source terms, the TMAP4 code has been utilized (ref 1). This code is approved by ITER for safety assessment. Volume 2 is a compilation of appendices giving detailed results of the study. 5 figs

  11. Aerosol behavior and light water reactor source terms

    International Nuclear Information System (INIS)

    Abbey, F.; Schikarski, W.O.

    1988-01-01

    The major developments in nuclear aerosol modeling following the accident at pressurized water reactor Unit 2 at Three Mile Island are briefly reviewed and the state of the art is summarized. The importance and implications of these developments for severe accident source terms for light water reactors are then discussed in general terms. The treatment is not aimed at identifying specific source term values but is intended rather to illustrate trends, to assess the adequacy of the understanding of major aspects of aerosol behavior for source term prediction, and to demonstrate in qualitative terms the effect of various aspects of reactor design. Areas where improved understanding of aerosol behavior might lead to further reductions in current source term predictions are also considered

  12. Probabilistic solution of the Dirac equation

    International Nuclear Information System (INIS)

    Blanchard, P.; Combe, P.

    1985-01-01

    Various probabilistic representations of the 2, 3 and 4 dimensional Dirac equation are given in terms of expectation with respect to stochastic jump processes and are used to derive the nonrelativistic limit even in the presence of an external electromagnetic field. (orig.)

  13. Convolution product construction of interactions in probabilistic physical models

    International Nuclear Information System (INIS)

    Ratsimbarison, H.M.; Raboanary, R.

    2007-01-01

    This paper aims to give a probabilistic construction of interactions which may be relevant for building physical theories such as interacting quantum field theories. We start with the path-integral definition of the partition function in quantum field theory, which recalls the probabilistic nature of this physical theory. From a Gaussian law considered as the free theory, an interacting theory is constructed by a nontrivial convolution product between the free theory and an interaction term which is also a probability law. The resulting theory, again a probability law, exhibits two properties already present in today's theories of interactions such as gauge theory: the interaction term does not depend on the free term, and two different free theories can be implemented with the same interaction.
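
    A minimal numerical sketch of the construction described above: convolving a Gaussian "free" density with a second probability density yields another probability law whose variance is the sum of the two. The grid and the uniform interaction law are illustrative choices, not the paper's.

      # Sketch: the convolution product of a Gaussian "free" law with a
      # uniform "interaction" law is again a probability law.
      import numpy as np

      dx = 0.01
      x = np.arange(-10, 10, dx)
      free = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)    # free theory: Gaussian
      interaction = np.where(np.abs(x) < 1, 0.5, 0.0)  # uniform on [-1, 1]

      interacting = np.convolve(free, interaction, mode="same") * dx

      print("total mass:", np.trapz(interacting, x))        # ~ 1
      print("variance  :", np.trapz(x**2 * interacting, x)) # ~ 1 + 1/3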

  14. Reachability Analysis in Probabilistic Biological Networks.

    Science.gov (United States)

    Gabr, Haitham; Todor, Andrei; Dobra, Alin; Kahveci, Tamer

    2015-01-01

    Extra-cellular molecules trigger a response inside the cell by initiating a signal at special membrane receptors (i.e., sources), which is then transmitted to reporters (i.e., targets) through various chains of interactions among proteins. Understanding whether such a signal can reach from membrane receptors to reporters is essential in studying the cell response to extra-cellular events. This problem is drastically complicated due to the unreliability of the interaction data. In this paper, we develop a novel method, called PReach (Probabilistic Reachability), that precisely computes the probability that a signal can reach from a given collection of receptors to a given collection of reporters when the underlying signaling network is uncertain. This is a very difficult computational problem with no known polynomial-time solution. PReach represents each uncertain interaction as a bi-variate polynomial. It transforms the reachability problem to a polynomial multiplication problem. We introduce novel polynomial collapsing operators that associate polynomial terms with possible paths between sources and targets as well as the cuts that separate sources from targets. These operators significantly shrink the number of polynomial terms and thus the running time. PReach has much better time complexity than the recent solutions for this problem. Our experimental results on real data sets demonstrate that this improvement leads to orders of magnitude of reduction in the running time over the most recent methods. Availability: All the data sets used, the software implemented and the alignments found in this paper are available at http://bioinformatics.cise.ufl.edu/PReach/.
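
    For intuition, the quantity PReach computes can be reproduced on a tiny network by brute-force enumeration of edge realisations, at exponential cost in the number of edges, which is exactly what PReach's polynomial collapsing avoids. The network and edge probabilities below are hypothetical.

      # Sketch: exact source-to-target reachability probability on a small
      # network with independently present edges, by full enumeration.
      from itertools import product

      edges = {("R", "a"): 0.9, ("a", "b"): 0.8,
               ("a", "T"): 0.3, ("b", "T"): 0.7}
      source, target = "R", "T"

      def reachable(present):
          # Graph search over the edges present in this realisation.
          frontier, seen = [source], {source}
          while frontier:
              u = frontier.pop()
              for (a, b) in present:
                  if a == u and b not in seen:
                      seen.add(b)
                      frontier.append(b)
          return target in seen

      p_reach = 0.0
      items = list(edges.items())
      for bits in product([0, 1], repeat=len(items)):
          present = [edge for (edge, _), bit in zip(items, bits) if bit]
          weight = 1.0
          for (_, p), bit in zip(items, bits):
              weight *= p if bit else 1.0 - p
          if reachable(present):
              p_reach += weight

      print(f"P({source} reaches {target}) = {p_reach:.4f}")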

  15. Global Infrasound Association Based on Probabilistic Clutter Categorization

    Science.gov (United States)

    Arora, Nimar; Mialle, Pierrick

    2016-04-01

    The IDC advances its methods and continuously improves its automatic system for the infrasound technology. The IDC focuses on enhancing the automatic system for the identification of valid signals and the optimization of the network detection threshold by identifying ways to refine signal characterization methodology and association criteria. An objective of this study is to reduce the number of associated infrasound arrivals that are rejected from the automatic bulletins when generating the reviewed event bulletins. Indeed, a considerable number of signal detections are due to local clutter sources such as microbaroms, waterfalls, dams, gas flares, surf (ocean breaking waves) etc. These sources are either too diffuse or too local to form events. Worse still, the repetitive nature of this clutter leads to a large number of false event hypotheses due to the random matching of clutter at multiple stations. Previous studies, for example [1], have worked on categorization of clutter using long term trends on detection azimuth, frequency, and amplitude at each station. In this work we continue the same line of reasoning to build a probabilistic model of clutter that is used as part of NETVISA [2], a Bayesian approach to network processing. The resulting model is a fusion of seismic, hydroacoustic and infrasound processing built on a unified probabilistic framework. References: [1] Infrasound categorization Towards a statistics based approach. J. Vergoz, P. Gaillard, A. Le Pichon, N. Brachet, and L. Ceranna. ITW 2011 [2] NETVISA: Network Processing Vertically Integrated Seismic Analysis. N. S. Arora, S. Russell, and E. Sudderth. BSSA 2013

  16. The European source-term evaluation code ASTEC: status and applications, including CANDU plant applications

    International Nuclear Information System (INIS)

    Van Dorsselaere, J.P.; Giordano, P.; Kissane, M.P.; Montanelli, T.; Schwinges, B.; Ganju, S.; Dickson, L.

    2004-01-01

    Research on light-water reactor severe accidents (SA) is still required in a limited number of areas in order to confirm accident-management plans. Thus, 49 European organizations have linked their SA research in a durable way through SARNET (Severe Accident Research and management NETwork), part of the European 6th Framework Programme. One goal of SARNET is to consolidate the integral code ASTEC (Accident Source Term Evaluation Code, developed by IRSN and GRS) as the European reference tool for safety studies; SARNET efforts include extending the application scope to reactor types other than PWR (including VVER) such as BWR and CANDU. ASTEC is used in IRSN's Probabilistic Safety Analysis level 2 of 900 MWe French PWRs. An earlier version of ASTEC's SOPHAEROS module, including improvements by AECL, is being validated as the Canadian Industry Standard Toolset code for FP-transport analysis in the CANDU Heat Transport System. Work with ASTEC has also been performed by Bhabha Atomic Research Centre, Mumbai, on IPHWR containment thermal hydraulics. (author)

  17. Probabilistic insurance

    OpenAIRE

    Wakker, P.P.; Thaler, R.H.; Tversky, A.

    1997-01-01

    Probabilistic insurance is an insurance policy involving a small probability that the consumer will not be reimbursed. Survey data suggest that people dislike probabilistic insurance and demand more than a 20% reduction in the premium to compensate for a 1% default risk. While these preferences are intuitively appealing they are difficult to reconcile with expected utility theory. Under highly plausible assumptions about the utility function, willingness to pay for probabilistic i...

  18. Integrating probabilistic models of perception and interactive neural networks: a historical and tutorial review.

    Science.gov (United States)

    McClelland, James L

    2013-01-01

    This article seeks to establish a rapprochement between explicitly Bayesian models of contextual effects in perception and neural network models of such effects, particularly the connectionist interactive activation (IA) model of perception. The article is in part an historical review and in part a tutorial, reviewing the probabilistic Bayesian approach to understanding perception and how it may be shaped by context, and also reviewing ideas about how such probabilistic computations may be carried out in neural networks, focusing on the role of context in interactive neural networks, in which both bottom-up and top-down signals affect the interpretation of sensory inputs. It is pointed out that connectionist units that use the logistic or softmax activation functions can exactly compute Bayesian posterior probabilities when the bias terms and connection weights affecting such units are set to the logarithms of appropriate probabilistic quantities. Bayesian concepts such as the prior, likelihood, (joint and marginal) posterior, probability matching and maximizing, and calculating vs. sampling from the posterior are all reviewed and linked to neural network computations. Probabilistic and neural network models are explicitly linked to the concept of a probabilistic generative model that describes the relationship between the underlying target of perception (e.g., the word intended by a speaker or other source of sensory stimuli) and the sensory input that reaches the perceiver for use in inferring the underlying target. It is shown how a new version of the IA model called the multinomial interactive activation (MIA) model can sample correctly from the joint posterior of a proposed generative model for perception of letters in words, indicating that interactive processing is fully consistent with principled probabilistic computation. Ways in which these computations might be realized in real neural systems are also considered.
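
    The claim above, that logistic/softmax units compute exact posteriors when biases are log priors and weights are log likelihoods, can be checked in a few lines; the priors and the likelihood table below are made up for illustration.

      # Sketch: a softmax unit with b = log prior and W = log likelihood
      # reproduces the Bayesian posterior exactly.
      import numpy as np

      priors = np.array([0.7, 0.2, 0.1])          # P(h), three hypotheses
      lik = np.array([[0.10, 0.20, 0.30, 0.40],   # P(x | h), four inputs
                      [0.25, 0.25, 0.25, 0.25],
                      [0.40, 0.30, 0.20, 0.10]])

      x = np.array([0.0, 0.0, 1.0, 0.0])          # one-hot observed input

      net = np.log(lik) @ x + np.log(priors)      # net input of the units
      softmax = np.exp(net) / np.exp(net).sum()

      post = priors * lik[:, np.argmax(x)]        # direct Bayes, for comparison
      post /= post.sum()

      print("softmax:", softmax)
      print("bayes  :", post)                     # identical up to rounding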

  19. Mechanistic facility safety and source term analysis

    International Nuclear Information System (INIS)

    PLYS, M.G.

    1999-01-01

    A PC-based computer program was created for facility safety and source term analysis at Hanford. The program has been successfully applied to mechanistic prediction of source terms from chemical reactions in underground storage tanks, hydrogen combustion in double-contained receiver tanks, and process evaluation including the potential for runaway reactions in spent nuclear fuel processing. Model features include user-defined facility room, flow path and heat conductor geometry; user-defined non-ideal vapor and aerosol species; pressure- and density-driven gas flows; aerosol transport and deposition; and structure to accommodate facility-specific source terms. Example applications are presented here

  20. Probabilistic Unawareness

    Directory of Open Access Journals (Sweden)

    Mikaël Cozic

    2016-11-01

    The modeling of awareness and unawareness is a significant topic in the doxastic logic literature, where it is usually tackled in terms of full belief operators. The present paper aims at a treatment in terms of partial belief operators. It draws upon the modal probabilistic logic that was introduced by Aumann (1999) at the semantic level, and then axiomatized by Heifetz and Mongin (2001). The paper embodies in this framework those properties of unawareness that have been highlighted in the seminal paper by Modica and Rustichini (1999). Their paper deals with full belief, but we argue that the properties in question also apply to partial belief. Our main result is a (soundness and completeness) theorem that reunites the two strands, modal and probabilistic, of doxastic logic.

  1. Basic design of parallel computational program for probabilistic structural analysis

    International Nuclear Information System (INIS)

    Kaji, Yoshiyuki; Arai, Taketoshi; Gu, Wenwei; Nakamura, Hitoshi

    1999-06-01

    In our laboratory, as part of 'development of a damage evaluation method for structural brittle materials by microscopic fracture mechanics and probabilistic theory' (nuclear computational science cross-over research), we examine computational methods for a super-parallel computation system coupled with a material strength theory, based on microscopic fracture mechanics for latent cracks, and a continuum structural model, in order to develop new structural reliability evaluation methods for ceramic structures. This technical report presents the review results regarding probabilistic structural mechanics theory, the basic terms of the formulae, and the parallel computation programming methods that relate to the principal terms in the basic design of the computational mechanics program. (author)

  2. Basic design of parallel computational program for probabilistic structural analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kaji, Yoshiyuki; Arai, Taketoshi [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Gu, Wenwei; Nakamura, Hitoshi

    1999-06-01

    In our laboratory, as part of 'development of a damage evaluation method for structural brittle materials by microscopic fracture mechanics and probabilistic theory' (nuclear computational science cross-over research), we examine computational methods for a super-parallel computation system coupled with a material strength theory, based on microscopic fracture mechanics for latent cracks, and a continuum structural model, in order to develop new structural reliability evaluation methods for ceramic structures. This technical report presents the review results regarding probabilistic structural mechanics theory, the basic terms of the formulae, and the parallel computation programming methods that relate to the principal terms in the basic design of the computational mechanics program. (author)

  3. Revised accident source terms for light-water reactors

    Energy Technology Data Exchange (ETDEWEB)

    Soffer, L. [Nuclear Regulatory Commission, Washington, DC (United States)

    1995-02-01

    This paper presents revised accident source terms for light-water reactors, incorporating the severe accident research insights gained in this area over the last 15 years. Current LWR reactor accident source terms used for licensing date from 1962 and are contained in Regulatory Guides 1.3 and 1.4. These specify that 100% of the core inventory of noble gases and 25% of the iodine fission products are assumed to be instantaneously available for release from the containment. The chemical form of the iodine fission products is also assumed to be predominantly elemental iodine. These assumptions have strongly affected present nuclear air cleaning requirements by emphasizing rapid actuation of spray systems and filtration systems optimized to retain elemental iodine. A proposed revision of reactor accident source terms and some implications for nuclear air cleaning requirements was presented at the 22nd DOE/NRC Nuclear Air Cleaning Conference. A draft report was issued by the NRC for comment in July 1992. Extensive comments were received, with the most significant comments involving (a) release fractions for both volatile and non-volatile species in the early in-vessel release phase, (b) gap release fractions of the noble gases, iodine and cesium, and (c) the timing and duration of the release phases. The final source term report is expected to be issued in late 1994. Although the revised source terms are intended primarily for future plants, current nuclear power plants may also request use of revised accident source term insights in licensing. This paper emphasizes additional information obtained since the 22nd Conference, including studies on fission product removal mechanisms, results obtained from improved severe accident code calculations and the resolution of major comments, and their impact upon the revised accident source terms. Revised accident source terms for both BWRs and PWRs are presented.
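
    The licensing arithmetic implied by the 1962 assumptions quoted above is a one-line product of inventory and release fraction; the sketch below uses the fractions cited in the record with a hypothetical I-131 core inventory.

      # Sketch: activity available for release = core inventory x release
      # fraction. The I-131 inventory value is hypothetical.
      core_inventory_i131_bq = 3.0e18    # hypothetical I-131 inventory, Bq

      frac_noble_gases = 1.00            # 100% of noble gases (RG 1.3/1.4)
      frac_iodine = 0.25                 # 25% of iodine fission products

      available = core_inventory_i131_bq * frac_iodine
      print(f"I-131 available for release: {available:.2e} Bq")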

  4. Revised accident source terms and control room habitability

    International Nuclear Information System (INIS)

    Lahti, G.P.; Hubner, R.S.; Johnson, W.J.; Schwartz, B.C.

    1993-01-01

    In April 1992, the NRC staff presented to the Commissioners the draft NUREG "Revised Accident Source Terms for Light-Water Nuclear Power Plants." This document is the culmination of more than ten years of NRC-sponsored research and represents the first change in the NRC's position on source terms since TID-14844 was issued in 1962. The purpose of this paper is to investigate the impact of the revised source terms on the current approach to analyzing control room habitability as required by 10 CFR 50. Sample calculations are presented that identify aspects of the model requiring clarification before the implementation of the revised source terms. 6 refs., 4 tabs

  5. Application of a probabilistic model of rainfall-induced shallow landslides to complex hollows

    NARCIS (Netherlands)

    Talebi, A.; Uijlenhoet, R.; Troch, P.A.

    2008-01-01

    Recently, D'Odorico and Fagherazzi (2003) proposed "A probabilistic model of rainfall-triggered shallow landslides in hollows" (Water Resour. Res., 39, 2003). Their model describes the long-term evolution of colluvial deposits through a probabilistic soil mass balance at a point. Further building

  6. Qualitative uncertainty analysis in probabilistic safety assessment context

    International Nuclear Information System (INIS)

    Apostol, M.; Constantin, M.; Turcu, I.

    2007-01-01

    In the Probabilistic Safety Assessment (PSA) context, an uncertainty analysis is performed either to estimate the uncertainty in the final results (the risk to public health and safety) or to estimate the uncertainty in some intermediate quantities (the core damage frequency, the radionuclide release frequency or the fatality frequency). The identification and evaluation of uncertainty are important tasks because they lend credibility to the results and help in the decision-making process. Uncertainty analysis can be performed qualitatively or quantitatively. This paper performs a preliminary qualitative uncertainty analysis by identifying the major uncertainties in the PSA Level 1-Level 2 interface and in the other two major procedural steps of a Level 2 PSA, i.e., the analysis of accident progression and containment behaviour, and the analysis of the source term for severe accidents. One should mention that a Level 2 PSA for a Nuclear Power Plant (NPP) involves the evaluation and quantification of the mechanisms, amounts and probabilities of subsequent radioactive material releases from the containment. According to NUREG-1150, an important task in source term analysis is fission product transport analysis. The uncertainties related to the isotope distribution in the CANDU NPP primary circuit and the isotope masses transferred into the containment, computed using the SOPHAEROS module of the ASTEC computer code, will also be presented. (authors)

  7. GUI program to compute probabilistic seismic hazard analysis

    International Nuclear Information System (INIS)

    Shin, Jin Soo; Chi, H. C.; Cho, J. C.; Park, J. H.; Kim, K. G.; Im, I. S.

    2005-12-01

    The first stage of development of a program to compute probabilistic seismic hazard has been completed, based on a Graphic User Interface (GUI). The main program consists of three parts: the data input process, the probabilistic seismic hazard analysis, and the result output process. The first part has been developed in this term and the others are under development. The probabilistic seismic hazard analysis needs various input data representing attenuation formulae, a seismic zoning map, and an earthquake event catalog. The input procedures of previous programs, based on a text interface, took much time for data preparation, and the data could not be checked directly on screen to prevent erroneous input. The new program simplifies the input process and enables the data to be checked graphically in order to minimize artificial error as far as possible

  8. The characterisation and evaluation of uncertainty in probabilistic risk analysis

    International Nuclear Information System (INIS)

    Parry, G.W.; Winter, P.W.

    1980-10-01

    The sources of uncertainty in probabilistic risk analysis are discussed using the event/fault tree methodology as an example. The role of statistics in quantifying these uncertainties is investigated. A class of uncertainties is identified which is, at present, unquantifiable, using either classical or Bayesian statistics. It is argued that Bayesian statistics is the more appropriate vehicle for the probabilistic analysis of rare events and a short review is given with some discussion on the representation of ignorance. (author)

  9. Efficient probabilistic inference in generic neural networks trained with non-probabilistic feedback.

    Science.gov (United States)

    Orhan, A Emin; Ma, Wei Ji

    2017-07-26

    Animals perform near-optimal probabilistic inference in a wide range of psychophysical tasks. Probabilistic inference requires trial-to-trial representation of the uncertainties associated with task variables and subsequent use of this representation. Previous work has implemented such computations using neural networks with hand-crafted and task-dependent operations. We show that generic neural networks trained with a simple error-based learning rule perform near-optimal probabilistic inference in nine common psychophysical tasks. In a probabilistic categorization task, error-based learning in a generic network simultaneously explains a monkey's learning curve and the evolution of qualitative aspects of its choice behavior. In all tasks, the number of neurons required for a given level of performance grows sublinearly with the input population size, a substantial improvement on previous implementations of probabilistic inference. The trained networks develop a novel sparsity-based probabilistic population code. Our results suggest that probabilistic inference emerges naturally in generic neural networks trained with error-based learning rules. Behavioural tasks often require probability distributions to be inferred about task-specific variables. Here, the authors demonstrate that generic neural networks can be trained using a simple error-based learning rule to perform such probabilistic computations efficiently without any need for task-specific operations.

  10. Probabilistic Structural Analysis of SSME Turbopump Blades: Probabilistic Geometry Effects

    Science.gov (United States)

    Nagpal, V. K.

    1985-01-01

    A probabilistic study was initiated to evaluate the effects of geometric and material property tolerances on the structural response of turbopump blades. To complete this study, a number of important probabilistic variables were identified which are considered to affect the structural response of the blade. In addition, a methodology was developed to statistically quantify the influence of these probabilistic variables in an optimized way. The identified variables include random geometric and material property perturbations, different loadings and a probabilistic combination of these loadings. The influences of these probabilistic variables are to be quantified by evaluating the blade structural response. Studies of the geometric perturbations were conducted for a flat plate geometry as well as for a Space Shuttle Main Engine blade geometry using a special-purpose code based on the finite element approach. Analyses indicate that the variances of the perturbations about given mean values have a significant influence on the response.

  11. Development of Probabilistic Flood Inundation Mapping For Flooding Induced by Dam Failure

    Science.gov (United States)

    Tsai, C.; Yeh, J. J. J.

    2017-12-01

    A primary function of flood inundation mapping is to forecast flood hazards and assess potential losses. However, uncertainties limit the reliability of inundation hazard assessments. Major sources of uncertainty should be taken into consideration by an optimal flood management strategy. This study focuses on the 20 km reach downstream of the Shihmen Reservoir in Taiwan. A dam-failure-induced flood herein provides the upstream boundary conditions for flood routing. The two major sources of uncertainty considered in the hydraulic model and the flood inundation mapping herein are uncertainty in the dam break model and uncertainty in the roughness coefficient. The perturbance moment method is applied to a dam break model and the hydrosystem model to develop probabilistic flood inundation mapping. Various numbers of uncertain variables can be considered in these models and the variability of the outputs can be quantified. Probabilistic flood inundation mapping for dam-break-induced floods can thus be developed, with consideration of output variability, using the commonly used HEC-RAS model. Different probabilistic flood inundation mappings are discussed and compared. Probabilistic flood inundation mappings are expected to provide new physical insights in support of the evaluation of reservoir flooded areas of concern.

  12. On Probabilistic Alpha-Fuzzy Fixed Points and Related Convergence Results in Probabilistic Metric and Menger Spaces under Some Pompeiu-Hausdorff-Like Probabilistic Contractive Conditions

    OpenAIRE

    De la Sen, M.

    2015-01-01

    In the framework of complete probabilistic metric spaces and, in particular, in probabilistic Menger spaces, this paper investigates some relevant properties of convergence of sequences to probabilistic α-fuzzy fixed points under some types of probabilistic contractive conditions.

  13. Probabilistic modelling and analysis of stand-alone hybrid power systems

    International Nuclear Information System (INIS)

    Lujano-Rojas, Juan M.; Dufo-López, Rodolfo; Bernal-Agustín, José L.

    2013-01-01

    As a part of the Hybrid Intelligent Algorithm, a model based on an ANN (artificial neural network) has been proposed in this paper to represent hybrid system behaviour, considering the uncertainty related to wind speed and solar radiation, battery bank lifetime, and fuel prices. The Hybrid Intelligent Algorithm suggests a combination of probabilistic analysis based on a Monte Carlo simulation approach and artificial neural network training embedded in a genetic algorithm optimisation model. The installation of a typical hybrid system was analysed. Probabilistic analysis was used to generate an input–output dataset of 519 samples that was later used to train the ANNs to reduce the computational effort required. The generalisation ability of the ANNs was measured in terms of RMSE (Root Mean Square Error), MBE (Mean Bias Error), MAE (Mean Absolute Error), and R-squared estimators using another data group of 200 samples. The results obtained from the estimation of the expected energy not supplied, the probability of a determined reliability level, and the estimation of the expected value of net present cost show that the presented model is able to represent the main characteristics of a typical hybrid power system under uncertain operating conditions. - Highlights: • This paper presents a probabilistic model for a stand-alone hybrid power system. • The model considers the main sources of uncertainty related to renewable resources. • The Hybrid Intelligent Algorithm has been applied to represent hybrid system behaviour. • The installation of a typical hybrid system was analysed. • The results obtained from the case study validate the presented model
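
    The generalisation estimators named in the record above (RMSE, MBE, MAE, R-squared) are simple to compute; the sketch below applies them to a held-out set of hypothetical ANN predictions, with all data invented.

      # Sketch of the four error estimators on toy held-out predictions.
      import numpy as np

      rng = np.random.default_rng(4)
      y_true = rng.uniform(0, 100, 200)           # held-out targets (toy)
      y_pred = y_true + rng.normal(0, 5, 200)     # hypothetical ANN outputs

      err = y_pred - y_true
      rmse = np.sqrt(np.mean(err**2))             # Root Mean Square Error
      mbe = np.mean(err)                          # Mean Bias Error
      mae = np.mean(np.abs(err))                  # Mean Absolute Error
      r2 = 1 - np.sum(err**2) / np.sum((y_true - y_true.mean())**2)

      print(f"RMSE={rmse:.2f} MBE={mbe:.2f} MAE={mae:.2f} R2={r2:.3f}")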

  14. Probabilistic Insurance

    NARCIS (Netherlands)

    Wakker, P.P.; Thaler, R.H.; Tversky, A.

    1997-01-01

    Probabilistic insurance is an insurance policy involving a small probability that the consumer will not be reimbursed. Survey data suggest that people dislike probabilistic insurance and demand more than a 20% reduction in premium to compensate for a 1% default risk. These observations cannot be

  15. Probabilistic Insurance

    NARCIS (Netherlands)

    P.P. Wakker (Peter); R.H. Thaler (Richard); A. Tversky (Amos)

    1997-01-01

    textabstractProbabilistic insurance is an insurance policy involving a small probability that the consumer will not be reimbursed. Survey data suggest that people dislike probabilistic insurance and demand more than a 20% reduction in the premium to compensate for a 1% default risk. While these

  16. A framework for the probabilistic analysis of meteotsunamis

    Science.gov (United States)

    Geist, Eric L.; ten Brink, Uri S.; Gove, Matthew D.

    2014-01-01

    A probabilistic technique is developed to assess the hazard from meteotsunamis. Meteotsunamis are unusual sea-level events, generated when the speed of an atmospheric pressure or wind disturbance is comparable to the phase speed of long waves in the ocean. A general aggregation equation is proposed for the probabilistic analysis, based on previous frameworks established for both tsunamis and storm surges, incorporating different sources and source parameters of meteotsunamis. Parameterization of atmospheric disturbances and numerical modeling are performed for the computation of maximum meteotsunami wave amplitudes near the coast. A historical record of pressure disturbances is used to establish a continuous analytic distribution of each parameter as well as the overall Poisson rate of occurrence. A demonstration study is presented for the northeast U.S. in which only isolated atmospheric pressure disturbances from squall lines and derechos are considered. For this study, Automated Surface Observing System stations are used to determine the historical parameters of squall lines from 2000 to 2013. The probabilistic equations are implemented using a Monte Carlo scheme, where a synthetic catalog of squall lines is compiled by sampling the parameter distributions. For each entry in the catalog, ocean wave amplitudes are computed using a numerical hydrodynamic model. Aggregation of the results from the Monte Carlo scheme yields a meteotsunami hazard curve that plots the annualized rate of exceedance with respect to maximum event amplitude for a particular location along the coast. Results from using multiple synthetic catalogs, resampled from the parent parameter distributions, yield mean and quantile hazard curves. Further refinements and improvements for the probabilistic analysis of meteotsunamis are discussed.
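
    The Monte Carlo scheme described above can be sketched end to end on toy inputs: each synthetic catalogue samples squall-line parameters from assumed distributions, a stand-in amplitude formula (not a hydrodynamic model) converts events to coastal amplitudes, and quantiles across catalogues give mean and quantile hazard curves. All distributions and the amplitude formula are invented.

      # Sketch: Monte Carlo meteotsunami hazard with mean/quantile curves.
      import numpy as np

      rng = np.random.default_rng(5)
      rate_per_year, years, n_cat = 4.0, 1000, 50
      amps = np.linspace(0.05, 2.0, 40)           # amplitude levels, m

      curves = np.empty((n_cat, amps.size))
      for k in range(n_cat):
          n_ev = rng.poisson(rate_per_year * years)
          speed = rng.normal(25.0, 5.0, n_ev)     # disturbance speed, m/s
          dp = rng.exponential(1.5, n_ev)         # pressure jump, hPa
          # Stand-in amplitude model peaking near the long-wave speed:
          amp = 0.01 * dp * (1 + 5 * np.exp(-((speed - 30.0) / 5.0) ** 2))
          curves[k] = (amp[:, None] > amps[None, :]).sum(axis=0) / years

      mean_curve = curves.mean(axis=0)
      q16, q84 = np.quantile(curves, [0.16, 0.84], axis=0)
      a100 = np.interp(-0.01, -mean_curve, amps)  # ~100-yr amplitude
      print(f"mean 100-yr amplitude: {a100:.2f} m")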

  17. Case studies in the application of probabilistic safety assessment techniques to radiation sources. Final report of a coordinated research project 2001-2003

    International Nuclear Information System (INIS)

    2006-04-01

    Radiation sources are used worldwide in many industrial and medical applications. In general, the safety record associated with their use has been very good. However, accidents involving these sources have occasionally resulted in unplanned exposures to individuals. When assessed prospectively, this type of exposure is termed a 'potential exposure'. The International Commission on Radiological Protection (ICRP) has recommended the assessment of potential exposures that may result from radiation sources and has suggested that probabilistic safety assessment (PSA) techniques may be used in this process. Also, Paragraph 2.13 of the International Basic Safety Standards for Protection against Ionizing Radiation and for the Safety of Radiation Sources (BSS) requires that the authorization process for radiation sources include an assessment of all exposures, including potential exposures, which may result from the use of a radiation source. In light of the ICRP's work described above, and the possibility that PSA techniques could be used in exposure assessments that are required by the BSS, the IAEA initiated a coordinated research project (CRP) to study the benefits and limitations of the application of PSA techniques to radiation sources. The results of this CRP are presented in this publication. It should be noted that these results are based solely on the work performed, and the conclusions drawn, by the research teams involved in this CRP. It is intended that international organizations involved in radiation protection will review the information in this report and will take account of it during the development of guidance and requirements related to the assessment of potential exposures from radiation sources. Also, it is anticipated that the risk insights obtained through the studies will be considered by medical practitioners, facility staff and management, equipment designers, and regulators in their safety management and risk evaluation activities. A draft

  18. Chernobyl source term estimation

    International Nuclear Information System (INIS)

    Gudiksen, P.H.; Harvey, T.F.; Lange, R.

    1990-09-01

    The Chernobyl source term available for long-range transport was estimated by integration of radiological measurements with atmospheric dispersion modeling and by reactor core radionuclide inventory estimation in conjunction with WASH-1400 release fractions associated with specific chemical groups. The model simulations revealed that the radioactive cloud became segmented during the first day, with the lower section heading toward Scandinavia and the upper part heading in a southeasterly direction with subsequent transport across Asia to Japan, the North Pacific, and the west coast of North America. By optimizing the agreement between the observed cloud arrival times and duration of peak concentrations measured over Europe, Japan, Kuwait, and the US with the model-predicted concentrations, it was possible to derive source term estimates for those radionuclides measured in airborne radioactivity. This was extended to radionuclides that were largely unmeasured in the environment by performing a reactor core radionuclide inventory analysis to obtain release fractions for the various chemical transport groups. These analyses indicated that essentially all of the noble gases, 60% of the radioiodines, 40% of the radiocesium, 10% of the tellurium and about 1% or less of the more refractory elements were released. These estimates are in excellent agreement with those obtained on the basis of worldwide deposition measurements. The Chernobyl source term was several orders of magnitude greater than those associated with the Windscale and TMI reactor accidents. However, the Cs-137 from the Chernobyl event is about 6% of that released by the US and USSR atmospheric nuclear weapon tests, while the I-131 and Sr-90 released by the Chernobyl accident was only about 0.1% of that released by the weapon tests. 13 refs., 2 figs., 7 tabs

  19. Source term and radiological consequences of the Chernobyl accident

    International Nuclear Information System (INIS)

    Mourad, R.

    1987-09-01

    This report presents the results of a study of the source term and radiological consequences of the Chernobyl accident. The results are presented in two parts. The first part was performed during the first 2 months following the accident and dealt with the evaluation of the source term and an estimate of individual doses in the European countries outside the Soviet Union. The second part was performed after August 25-29, 1986, when the Soviets presented at an IAEA Conference in Vienna detailed information about the accident, including the source term and radiological consequences in the Soviet Union. The second part of the study reconfirms the source term evaluated in the first part and in addition deals with the radiological consequences in the Soviet Union. Source term and individual doses are calculated from measured post-accident data, reported by the Soviet Union and European countries, using the microcomputer program PEAR (Public Exposure from Accident Releases). 22 refs

  20. Review of the Diablo Canyon probabilistic risk assessment

    International Nuclear Information System (INIS)

    Bozoki, G.E.; Fitzpatrick, R.G.; Bohn, M.P.; Sabek, M.G.; Ravindra, M.K.; Johnson, J.J.

    1994-08-01

    This report details the review of the Diablo Canyon Probabilistic Risk Assessment (DCPRA). The study was performed under contract from the Probabilistic Risk Analysis Branch, Office of Nuclear Regulatory Research, USNRC, by Brookhaven National Laboratory. The DCPRA is a full-scope Level I effort and, although the review touched on all aspects of the PRA, the internal events and seismic events received the vast majority of the review effort. The report includes a number of independent systems analyses, sensitivity studies, and importance analyses, as well as conclusions on the adequacy of the DCPRA for use in the Diablo Canyon Long Term Seismic Program.

  1. Phase 1 immobilized low-activity waste operational source term

    International Nuclear Information System (INIS)

    Burbank, D.A.

    1998-01-01

    This report presents an engineering analysis of the Phase 1 privatization feeds to establish an operational source term for storage and disposal of immobilized low-activity waste packages at the Hanford Site. The source term information is needed to establish a preliminary estimate of the numbers of remote-handled and contact-handled waste packages. A discussion of the uncertainties and their impact on the source term and waste package distribution is also presented. It should be noted that this study is concerned with operational impacts only. Source terms used for accident scenarios would differ due to alpha and beta radiation which were not significant in this study

  2. Integration of Probabilistic Exposure Assessment and Probabilistic Hazard Characterization

    NARCIS (Netherlands)

    Voet, van der H.; Slob, W.

    2007-01-01

    A method is proposed for integrated probabilistic risk assessment where exposure assessment and hazard characterization are both included in a probabilistic way. The aim is to specify the probability that a random individual from a defined (sub)population will have an exposure high enough to cause a

  3. Probabilistic Seismic Hazard Analysis for Yemen

    Directory of Open Access Journals (Sweden)

    Rakesh Mohindra

    2012-01-01

    Full Text Available A stochastic-event probabilistic seismic hazard model, which can be used further for estimates of seismic loss and seismic risk analysis, has been developed for the territory of Yemen. An updated composite earthquake catalogue has been compiled using the databases from two basic sources and several research publications. The spatial distribution of earthquakes from the catalogue was used to define and characterize the regional earthquake source zones for Yemen. To capture all possible scenarios in the seismic hazard model, a stochastic event set has been created consisting of 15,986 events generated from 1,583 fault segments in the delineated seismic source zones. Distribution of horizontal peak ground acceleration (PGA) was calculated for all stochastic events considering epistemic uncertainty in ground-motion modeling using three suitable ground-motion prediction relationships, which were applied with equal weight. The probabilistic seismic hazard maps were created showing PGA and MSK seismic intensity at 10% and 50% probability of exceedance in 50 years, considering local soil site conditions. The resulting PGA for 10% probability of exceedance in 50 years (return period 475 years) ranges from 0.2 g to 0.3 g in western Yemen and generally is less than 0.05 g across central and eastern Yemen. The largest contributors to Yemen’s seismic hazard are the events from the West Arabian Shield seismic zone.
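
    A quick check of the return-period arithmetic quoted in the abstract: under a Poisson occurrence model, the probability of exceedance in an exposure time T relates to the annual rate λ by P = 1 − exp(−λT), so 10% in 50 years corresponds to a roughly 475-year return period.

```python
import math

def return_period(p_exceed, years):
    # annual exceedance rate from probability of exceedance in `years`
    lam = -math.log(1.0 - p_exceed) / years
    return 1.0 / lam

print(return_period(0.10, 50))   # ~474.6 years (the "475-year" map)
print(return_period(0.50, 50))   # ~72.6 years
```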

  4. Deterministic and probabilistic interval prediction for short-term wind power generation based on variational mode decomposition and machine learning methods

    International Nuclear Information System (INIS)

    Zhang, Yachao; Liu, Kaipei; Qin, Liang; An, Xueli

    2016-01-01

    Highlights: • Variational mode decomposition is adopted to process the original wind power series. • A novel combined model based on machine learning methods is established. • An improved differential evolution algorithm is proposed for weight adjustment. • Probabilistic interval prediction is performed by quantile regression averaging. - Abstract: Given the increasingly significant energy crisis, the exploitation and utilization of new clean energy sources is attracting more and more attention. As an important category of renewable energy, wind power has become the most rapidly growing renewable energy source in China. However, the intermittency and volatility of wind power have restricted the large-scale integration of wind turbines into power systems. High-precision wind power forecasting is an effective measure to alleviate the negative influence of wind power generation on power systems. In this paper, a novel combined model is proposed to improve the prediction performance of short-term wind power forecasting. Variational mode decomposition is first adopted to handle the instability of the raw wind power series, and the subseries are reconstructed by measuring the sample entropy of the decomposed modes. Base models are then established for each subseries. On this basis, the combined model is developed based on the optimal virtual prediction scheme, the weight matrix of which is dynamically adjusted by a self-adaptive multi-strategy differential evolution algorithm. In addition, a probabilistic interval prediction model based on quantile regression averaging and variational mode decomposition-based hybrid models is presented to quantify the potential risks of the wind power series. The simulation results indicate that: (1) the normalized mean absolute errors of the proposed combined model from one-step to three-step forecasting are 4.34%, 6.49% and 7.76%, respectively, which are much lower than those of the base models and the hybrid
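
    A compact sketch of the quantile regression averaging (QRA) step mentioned in the highlights: observations are quantile-regressed on the point forecasts of several base models, yielding predictive intervals. The base forecasts here are synthetic; this is not the paper's combined model.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 500
actual = 50 + 10 * np.sin(np.linspace(0, 20, n))      # "observed" wind power
base1 = actual + rng.normal(0, 3, n)                  # base model forecasts
base2 = actual + rng.normal(1, 5, n)
X = sm.add_constant(np.column_stack([base1, base2]))

quantiles = {}
for q in (0.05, 0.50, 0.95):
    # pinball-loss regression of the observations on the point forecasts
    quantiles[q] = sm.QuantReg(actual, X).fit(q=q).predict(X)

coverage = np.mean((actual >= quantiles[0.05]) & (actual <= quantiles[0.95]))
print(f"empirical coverage of the nominal 90% interval: {coverage:.1%}")
```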

  5. Probabilistic Networks

    DEFF Research Database (Denmark)

    Jensen, Finn Verner; Lauritzen, Steffen Lilholt

    2001-01-01

    This article describes the basic ideas and algorithms behind specification and inference in probabilistic networks based on directed acyclic graphs, undirected graphs, and chain graphs.

  6. Source term analyses under severe accidents for KNGR

    Energy Technology Data Exchange (ETDEWEB)

    Song, Yong Mann; Park, Soo Yong

    2001-03-01

    In this study, the in-containment source term for LOFW (Loss of Feed Water), which has been shown to be the most frequent core-melt accident, is calculated and compared with the NUREG-1465 source term. This study provides not only new source term data using MELCOR 1.8.4 and its state-of-the-art models but also a basis for evaluating the KNGR design and its mitigation capability under severe accidents. As the selected accident is identical to LOFW-S17, which was analyzed using MAAP by KEPCO, with the only difference being 2 SITs, a mutual comparison of the results is especially expected.

  7. Probabilistic cellular automata.

    Science.gov (United States)

    Agapie, Alexandru; Andreica, Anca; Giuclea, Marius

    2014-09-01

    Cellular automata are binary lattices used for modeling complex dynamical systems. The automaton evolves iteratively from one configuration to another, using some local transition rule based on the number of ones in the neighborhood of each cell. With respect to the number of cells allowed to change per iteration, we speak of either synchronous or asynchronous automata. If randomness is involved to some degree in the transition rule, we speak of probabilistic automata, otherwise they are called deterministic. With either type of cellular automaton we are dealing with, the main theoretical challenge stays the same: starting from an arbitrary initial configuration, predict (with highest accuracy) the end configuration. If the automaton is deterministic, the outcome simplifies to one of two configurations, all zeros or all ones. If the automaton is probabilistic, the whole process is modeled by a finite homogeneous Markov chain, and the outcome is the corresponding stationary distribution. Based on our previous results for the asynchronous case, which connect the probability of a configuration in the stationary distribution to its number of zero-one borders, the article offers both numerical and theoretical insight into the long-term behavior of synchronous cellular automata.
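
    The long-term behaviour described above is easy to probe numerically. The sketch below runs a synchronous probabilistic cellular automaton on a ring with an invented local rule (the probability of becoming 1 grows with the number of ones among the two neighbours) and estimates the stationary density of ones by time-averaging.

```python
import numpy as np

rng = np.random.default_rng(3)

def step(config, p=(0.05, 0.5, 0.95)):
    # p[k] = probability a cell is 1 next step, given k ones among its
    # two nearest neighbours (all cells updated synchronously)
    k = np.roll(config, 1) + np.roll(config, -1)
    return (rng.random(config.size) < np.take(p, k)).astype(int)

config = rng.integers(0, 2, 100)     # random initial configuration
density = []
for t in range(2000):
    config = step(config)
    density.append(config.mean())
# discard a burn-in; the remainder approximates the stationary density
print(f"long-run density of ones ~ {np.mean(density[500:]):.3f}")
```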

  8. Poisson-Like Spiking in Circuits with Probabilistic Synapses

    Science.gov (United States)

    Moreno-Bote, Rubén

    2014-01-01

    Neuronal activity in cortex is variable both spontaneously and during stimulation, and it has the remarkable property that it is Poisson-like over broad ranges of firing rates covering from virtually zero to hundreds of spikes per second. The mechanisms underlying cortical-like spiking variability over such a broad continuum of rates are currently unknown. We show that neuronal networks endowed with probabilistic synaptic transmission, a well-documented source of variability in cortex, robustly generate Poisson-like variability over several orders of magnitude in their firing rate without fine-tuning of the network parameters. Other sources of variability, such as random synaptic delays or spike generation jittering, do not lead to Poisson-like variability at high rates because they cannot be sufficiently amplified by recurrent neuronal networks. We also show that probabilistic synapses predict Fano factor constancy of synaptic conductances. Our results suggest that synaptic noise is a robust and sufficient mechanism for the type of variability found in cortex. PMID:25032705
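
    A toy version of the mechanism: presynaptic spikes reach a cell through unreliable synapses that transmit with probability p, and the trial-to-trial variability of the summed input is compared with the Poisson benchmark (Fano factor 1). The counts, probabilities and sizes are invented; for Bernoulli transmission the expected Fano factor is 1 − p.

```python
import numpy as np

rng = np.random.default_rng(4)
n_trials, n_inputs, p_release = 2000, 200, 0.3

counts = np.empty(n_trials)
for i in range(n_trials):
    # each of the n_inputs presynaptic spikes is transmitted with prob p
    counts[i] = (rng.random(n_inputs) < p_release).sum()

fano = counts.var() / counts.mean()
print(f"Fano factor of transmitted-input counts: {fano:.2f} "
      f"(Bernoulli prediction: {1 - p_release:.2f}, Poisson: 1.00)")
```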

  9. A simple method for estimating potential source term bypass fractions from confinement structures

    International Nuclear Information System (INIS)

    Kalinich, D.A.; Paddleford, D.F.

    1997-01-01

    Confinement structures house many of the operating processes at the Savannah River Site (SRS). Under normal operating conditions, a confinement structure in conjunction with its associated ventilation systems prevents the release of radiological material to the environment. However, under potential accident conditions, the performance of the ventilation systems and the integrity of the structure may be challenged. In order to calculate the radiological consequences associated with a potential accident (e.g., fires, explosions, spills), it is necessary to determine the fraction of the source term initially generated by the accident that escapes from the confinement structure to the environment. While it would be desirable to estimate the potential bypass fraction using sophisticated control-volume/flow-path computer codes (e.g., CONTAIN, MELCOR) in order to take as much credit as possible for the mitigative effects of the confinement structure, there are many instances where using such codes is not tractable due to limits on the level of effort allotted to perform the analysis. Moreover, the current review environment, with its emphasis on deterministic/bounding rather than probabilistic/best-estimate analysis, discourages using analytical techniques that require the consideration of a large number of parameters. Discussed herein is a simplified control-volume/flow-path approach for calculating the source term bypass fraction that is amenable to solution in a spreadsheet or with a commercial mathematical solver (e.g., MathCad or Mathematica). It considers the effects of wind and fire pressure gradients on the structure, ventilation system operation, and Halon discharges. Simple models are used to characterize the engineered and non-engineered flow paths. By making judicious choices for the limited set of problem parameters, the results from this approach can be defended as bounding and conservative.
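
    A minimal single-volume version of the flow-path idea, with invented numbers: unfiltered leakage through a crack is modelled with an orifice-type relation driven by the accident pressure difference, and the bypass fraction is the unfiltered share of the total outflow. The actual spreadsheet model tracks several volumes and paths.

```python
import math

def leak_flow(delta_p, area, c_d=0.6, rho=1.2):
    # orifice-type volumetric flow through a non-engineered path
    # delta_p in Pa, area in m^2, air density rho in kg/m^3 -> m^3/s
    return c_d * area * math.sqrt(2.0 * delta_p / rho)

q_filtered = 5.0                                  # filtered exhaust, m^3/s (assumed)
q_leak = leak_flow(delta_p=250.0, area=0.01)      # assumed fire-induced dP, crack area
bypass = q_leak / (q_leak + q_filtered)
print(f"leak flow = {q_leak:.3f} m^3/s, source term bypass fraction = {bypass:.3f}")
```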

  10. An assessment of the acute dietary exposure to glyphosate using deterministic and probabilistic methods.

    Science.gov (United States)

    Stephenson, C L; Harris, C A; Clarke, R

    2018-02-01

    Use of glyphosate in crop production can lead to residues of the active substance and related metabolites in food. Glyphosate has never been considered acutely toxic; however, in 2015 the European Food Safety Authority (EFSA) proposed an acute reference dose (ARfD). This differs from the Joint FAO/WHO Meeting on Pesticide Residues (JMPR), which in 2016, in line with its existing position, concluded that an ARfD was not necessary for glyphosate. This paper makes a comprehensive assessment of short-term dietary exposure to glyphosate from potentially treated crops grown in the EU and imported third-country food sources. European Union and global deterministic models were used to make estimates of short-term dietary exposure (generally defined as up to 24 h). Estimates were refined using food-processing information, residue monitoring data, national dietary exposure models, and basic probabilistic approaches to estimating dietary exposure. Calculated exposure levels were compared to the ARfD, considered to be the amount of a substance that can be consumed in a single meal, or 24-h period, without appreciable health risk. Acute dietary intakes were below the ARfD. Probabilistic exposure estimates showed that the acute intake on no person-days exceeded 10% of the ARfD, even for the pessimistic scenario.
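
    For orientation, a deterministic short-term intake check of the kind such assessments start from: intake = large portion × highest residue / body weight, compared against the ARfD. The commodity numbers below are invented; 0.5 mg/kg bw is the ARfD value proposed by EFSA.

```python
arfd = 0.5                  # mg/kg bw per day (EFSA's proposed ARfD)
large_portion_kg = 0.4      # assumed large portion of the commodity
highest_residue = 2.0       # mg/kg, assumed highest residue found
body_weight_kg = 60.0

intake = large_portion_kg * highest_residue / body_weight_kg
print(f"intake = {intake:.4f} mg/kg bw = {100 * intake / arfd:.1f}% of the ARfD")
```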

  11. A simplified approach to evaluating severe accident source term for PWR

    International Nuclear Information System (INIS)

    Huang, Gaofeng; Tong, Lili; Cao, Xuewu

    2014-01-01

    Highlights: • Traditional source term evaluation approaches have been studied. • A simplified approach to source term evaluation for a 600 MW PWR is studied. • Five release categories are established. - Abstract: In early NPP designs, no plant-specific severe accident source term evaluation was considered, and generic source terms have been used for some NPPs. In order to implement a best estimate, a plant-specific source term evaluation should be carried out. Traditional source term evaluation approaches (the mechanistic approach and the parametric approach) have some difficulties associated with their implementation, and they are not consistent with cost-benefit assessment. A simplified approach for evaluating the severe accident source term for a PWR is studied. For the simplified approach, a simplified containment event tree is established. Through selection of representative cases, evaluation of weighting coefficients, computation of representative source term cases, and weighted computation, five containment release categories are established: containment bypass, containment isolation failure, containment early failure, containment late failure, and intact containment.
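
    The "weighted computation" step reduces each release category to a probability-weighted combination of its representative cases; a back-of-envelope version with invented weights and Cs release fractions:

```python
import numpy as np

case_weights = np.array([0.5, 0.3, 0.2])       # assumed weights of representative cases
cs_release = np.array([0.02, 0.10, 0.35])      # assumed Cs release fractions per case
category_release = case_weights @ cs_release   # weighted source term for the category
print(f"category Cs release fraction: {category_release:.3f}")
```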

  12. Applications of probabilistic techniques at NRC

    International Nuclear Information System (INIS)

    Thadani, A.; Rowsome, F.; Speis, T.

    1984-01-01

    The NRC is currently making extensive use of probabilistic safety assessment in reactor regulation. Most of these applications have been introduced into regulatory activities in the past few years. Plant probabilistic safety studies are being utilized as a design tool in applications for standard designs and for the assessment of plants located in regions of particularly high population density. There is considerable motivation for licensees to perform plant-specific probabilistic studies for many, if not all, of the existing operating nuclear power plants, as a tool for prioritizing the implementation of the many outstanding licensing actions at these plants as well as for recommending the elimination of a number of issues which are judged to be insignificant in terms of their contribution to safety and risk. Risk assessment perspectives are being used in the prioritization of generic safety issues, the development of technical resolutions of unresolved safety issues, assessing the safety significance of proposed new regulatory requirements, assessing the safety significance of some of the occurrences at operating facilities, and in environmental impact analyses of license applications as required by the National Environmental Policy Act. (orig.)

  13. Radiological and chemical source terms for Solid Waste Operations Complex

    International Nuclear Information System (INIS)

    Boothe, G.F.

    1994-01-01

    The purpose of this document is to describe the radiological and chemical source terms for the major projects of the Solid Waste Operations Complex (SWOC), including Project W-112, Project W-133 and Project W-100 (WRAP 2A). For purposes of this document, the term 'source term' means the design basis inventory. All of the SWOC source terms involve the estimation of the radiological and chemical contents of various waste packages from different waste streams, and the inventories of these packages within facilities or within a scope of operations. The composition of some of the waste is not known precisely; consequently, conservative assumptions were made to ensure that the source term represents a bounding case (i.e., it is expected that the source term would not be exceeded). As better information is obtained on the radiological and chemical contents of waste packages and more accurate facility-specific models are developed, this document should be revised as appropriate. Radiological source terms are needed to perform shielding and external dose calculations, to estimate routine airborne releases, to perform release calculations and dose estimates for safety documentation, to calculate the maximum possible fire loss and specific source terms for individual fire areas, etc. Chemical source terms (i.e., inventories of combustible, flammable, explosive or hazardous chemicals) are used to determine combustible loading, fire protection requirements, personnel exposures to hazardous chemicals from routine and accident conditions, and a wide variety of other safety and environmental requirements.

  14. Probabilistic Logical Characterization

    DEFF Research Database (Denmark)

    Hermanns, Holger; Parma, Augusto; Segala, Roberto

    2011-01-01

    Probabilistic automata exhibit both probabilistic and non-deterministic choice. They are therefore a powerful semantic foundation for modeling concurrent systems with random phenomena arising in many applications ranging from artificial intelligence, security, and systems biology to performance modeling. Several variations of bisimulation and simulation relations have proved to be useful as means to abstract and compare different automata. This paper develops a taxonomy of logical characterizations of these relations on image-finite and image-infinite probabilistic automata.

  15. Probabilistic metric spaces

    CERN Document Server

    Schweizer, B

    2005-01-01

    Topics include special classes of probabilistic metric spaces, topologies, and several related structures, such as probabilistic normed and inner-product spaces. 1983 edition, updated with 3 new appendixes. Includes 17 illustrations.

  16. A robust probabilistic approach for variational inversion in shallow water acoustic tomography

    International Nuclear Information System (INIS)

    Berrada, M; Badran, F; Crépon, M; Thiria, S; Hermand, J-P

    2009-01-01

    This paper presents a variational methodology for inverting shallow water acoustic tomography (SWAT) measurements. The aim is to determine the vertical profile of the speed of sound c(z), knowing the acoustic pressures generated by a frequency source and collected by a sparse vertical hydrophone array (VRA). A variational approach that minimizes a cost function measuring the distance between observations and their modeled equivalents is used. A regularization term in the form of a quadratic restoring term to a background is also added. To avoid inverting the variance–covariance matrix associated with the above-weighted quadratic background, this work proposes to model the sound speed vector using probabilistic principal component analysis (PPCA). The PPCA introduces an optimum reduced number of non-correlated latent variables η, which determine a new control vector and a new regularization term, expressed as η^T η. The PPCA represents a rigorous formalism for the use of a priori information and allows an efficient implementation of the variational inverse method
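
    A numerical sketch of the reparameterization: an ensemble of prior sound-speed profiles supplies a mean and principal directions, the profile is written as c(eta) = c_mean + W @ eta with a few latent variables, and the cost becomes misfit plus the simple eta^T eta term, with no covariance matrix to invert. The forward model is a toy linear operator, not an acoustic propagation code, and all values are invented.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(5)
# synthetic ensemble of prior sound-speed profiles: 200 profiles x 50 depths
ensemble = 1500.0 + 0.1 * rng.normal(0, 1, (200, 50)).cumsum(axis=1)

c_mean = ensemble.mean(axis=0)
_, s, Vt = np.linalg.svd(ensemble - c_mean, full_matrices=False)
k = 3                                              # retained latent dimensions
W = Vt[:k].T * (s[:k] / np.sqrt(len(ensemble)))    # scaled principal directions

G = rng.normal(0, 1, (20, 50))                     # toy linear forward operator
eta_true = np.array([1.0, -0.5, 0.2])
obs = G @ (c_mean + W @ eta_true)                  # synthetic "hydrophone" data

def cost(eta):
    resid = obs - G @ (c_mean + W @ eta)
    return resid @ resid + eta @ eta               # misfit + eta^T eta regularizer

eta_hat = minimize(cost, np.zeros(k)).x
print("recovered latent variables:", np.round(eta_hat, 2))
```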

  17. Probabilistic record linkage.

    Science.gov (United States)

    Sayers, Adrian; Ben-Shlomo, Yoav; Blom, Ashley W; Steele, Fiona

    2016-06-01

    Studies involving the use of probabilistic record linkage are becoming increasingly common. However, the methods underpinning probabilistic record linkage are not widely taught or understood, and therefore these studies can appear to be a 'black box' research tool. In this article, we aim to describe the process of probabilistic record linkage through a simple exemplar. We first introduce the concept of deterministic linkage and contrast this with probabilistic linkage. We illustrate each step of the process using a simple exemplar and describe the data structure required to perform a probabilistic linkage. We describe the process of calculating and interpreting match weights and how to convert match weights into posterior probabilities of a match using Bayes theorem. We conclude this article with a brief discussion of some of the computational demands of record linkage, how you might assess the quality of your linkage algorithm, and how epidemiologists can maximize the value of their record-linked research using robust record linkage methods.
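
    The arithmetic of such an exemplar condenses to a few lines: each compared field contributes log2(m/u) on agreement (or log2((1−m)/(1−u)) on disagreement), the weights are summed, and Bayes' theorem turns the total likelihood ratio and a prior match probability into a posterior. All m/u values and the prior below are invented.

```python
import math

fields = {   # field: (m = P(agree | true match), u = P(agree | non-match))
    "surname":    (0.95, 0.01),
    "birth_year": (0.98, 0.05),
    "postcode":   (0.90, 0.02),
}
agreement = {"surname": True, "birth_year": True, "postcode": False}

total_weight = 0.0
for field, (m, u) in fields.items():
    if agreement[field]:
        total_weight += math.log2(m / u)
    else:
        total_weight += math.log2((1 - m) / (1 - u))

prior = 1e-4                         # P(record pair is a match) before comparison
lr = 2.0 ** total_weight             # overall likelihood ratio
posterior = lr * prior / (lr * prior + (1 - prior))
print(f"match weight = {total_weight:.2f}, posterior P(match) = {posterior:.4f}")
```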

  18. Probabilistic seismic hazards: Guidelines and constraints in evaluating results

    International Nuclear Information System (INIS)

    Sadigh, R.K.; Power, M.S.

    1989-01-01

    In conducting probabilistic seismic hazard analyses, consideration of the dispersion as well as the upper bounds on ground motion is of great significance. In particular, the truncation of ground motion levels at some upper limit has a major influence on the computed hazard at low-to-very-low probability levels. Additionally, other deterministic guidelines and constraints should be considered in evaluating probabilistic seismic hazard results. In contrast to probabilistic seismic hazard evaluations, mean-plus-one-standard-deviation ground motions are typically used for deterministic estimates of ground motions from maximum events that may affect a structure. To be consistent with standard deterministic practice, these maximum estimates of ground motion should be the highest levels considered for the site, and they should be associated with the largest possible event occurring at the site. Furthermore, the relationship between ground motion level and probability of exceedance should reflect a transition from purely probabilistic assessments of ground motion at high probability levels, where there are multiple chances for events, to a deterministic upper-bound ground motion at very low probability levels, where there is very limited opportunity for maximum events to occur. In interplate regions, where the seismic sources may be characterized by a high-to-very-high rate of activity, the deterministic bounds will be approached or exceeded by the computed probabilistic hazard values at annual probabilities of exceedance typically as high as 10^-2 to 10^-3. Thus, at these or lower probability levels, probabilistically computed hazard values can be readily interpreted in the light of the deterministic constraints.
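
    The influence of an upper truncation is easy to demonstrate numerically. Assuming, purely for illustration, a lognormal ground-motion distribution with invented median and log-standard deviation, the snippet compares the probability of exceeding a high PGA level with and without truncating at +3 sigma:

```python
import numpy as np
from scipy import stats

median_pga, beta = 0.2, 0.6          # median (g) and log-std, assumed
level = 1.0                          # target PGA level (g)

z = (np.log(level) - np.log(median_pga)) / beta
p_untruncated = stats.norm.sf(z)

z_max = 3.0                          # truncation at median + 3 sigma (log space)
if z >= z_max:
    p_truncated = 0.0
else:                                # renormalized truncated-normal exceedance
    p_truncated = (stats.norm.sf(z) - stats.norm.sf(z_max)) / (1 - stats.norm.sf(z_max))

print(f"P(PGA > {level} g): untruncated {p_untruncated:.2e}, truncated {p_truncated:.2e}")
```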

  19. Effect of source term composition on offsite doses

    International Nuclear Information System (INIS)

    Karahalios, P.; Gardner, R.

    1985-01-01

    The development of new realistic accident source terms has identified the need to establish a basis for comparing the impact of such source terms. This paper attempts to develop a generalized basis of comparison by investigating contributions to offsite acute whole body doses from each group of radionuclides being released to the atmosphere, using CRAC2. The paper also investigates the effect of important parameters such as regional meteorology, sheltering, and duration of release. Finally, the paper focuses on significant changes in the relative importance of individual radionuclide groups in PWR2, SST1, and a revision of the Stone and Webster proposed interim source term

  20. Design parameters and source terms: Volume 3, Source terms: Revision 0

    International Nuclear Information System (INIS)

    1987-09-01

    The Design Parameters and Source Terms Document was prepared in accordance with a DOE request and to provide data for the environmental impact study to be performed in the future for the Deaf Smith County, Texas, site for a nuclear waste repository in salt. This document updates a previous unpublished report to the level of the Site Characterization Plan - Conceptual Design Report (SCP-CDR). The previous unpublished SCC Study identifies the data needs for the Environmental Assessment effort for seven possible salt repository sites.

  1. Leveraging stochastic differential equations for probabilistic forecasting of wind power using a dynamic power curve

    DEFF Research Database (Denmark)

    Iversen, Jan Emil Banning; Morales González, Juan Miguel; Møller, Jan Kloppenborg

    2017-01-01

    Short-term (hours to days) probabilistic forecasts of wind power generation provide useful information about the associated uncertainty of these forecasts. Standard probabilistic forecasts are usually issued on a per-horizon-basis, meaning that they lack information about the development of the u...

  2. Probability distribution functions of δ15N and δ18O in groundwater nitrate to probabilistically solve complex mixing scenarios

    Science.gov (United States)

    Chrystal, A.; Heikoop, J. M.; Davis, P.; Syme, J.; Hagerty, S.; Perkins, G.; Larson, T. E.; Longmire, P.; Fessenden, J. E.

    2010-12-01

    Elevated nitrate (NO3-) concentrations in drinking water pose a health risk to the public. The dual stable isotopic signatures of δ15N and δ18O in NO3- in surface- and groundwater are often used to identify and distinguish among sources of NO3- (e.g., sewage, fertilizer, atmospheric deposition). In oxic groundwaters where no denitrification is occurring, direct calculations of mixing fractions using a mass balance approach can be performed if three or fewer sources of NO3- are present, and if the stable isotope ratios of the source terms are defined. There are several limitations to this approach. First, direct calculations of mixing fractions are not possible when four or more NO3- sources may be present. Simple mixing calculations also rely upon treating source isotopic compositions as a single value; however these sources themselves exhibit ranges in stable isotope ratios. More information can be gained by using a probabilistic approach to account for the range and distribution of stable isotope ratios in each source. Fitting probability density functions (PDFs) to the isotopic compositions for each source term reveals that some values within a given isotopic range are more likely to occur than others. We compiled a data set of dual isotopes in NO3- sources by combining our measurements with data collected through extensive literature review. We fit each source term with a PDF, and show a new method to probabilistically solve multiple component mixing scenarios with source isotopic composition uncertainty. This method is based on a modified use of a tri-linear diagram. First, source term PDFs are sampled numerous times using a variation of stratified random sampling, Latin Hypercube Sampling. For each set of sampled source isotopic compositions, a reference point is generated close to the measured groundwater sample isotopic composition. This point is used as a vertex to form all possible triangles between all pairs of sampled source isotopic compositions
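
    A condensed numerical version of the sampling-and-mixing idea (using plain random draws rather than the tri-linear construction of the paper): source δ15N/δ18O values are drawn from assumed normal PDFs, the three-source mass-balance system is solved for each draw, and infeasible fraction vectors are discarded, leaving a distribution of mixing fractions. All source statistics and the sample composition are invented.

```python
import numpy as np

rng = np.random.default_rng(6)
# source: ((mean d15N, sd), (mean d18O, sd)) -- illustrative values only
sources = {
    "sewage":      ((12.0, 2.0), ( 2.0, 1.5)),
    "fertilizer":  (( 0.0, 2.0), ( 0.0, 2.0)),
    "atmospheric": (( 2.0, 2.0), (60.0, 5.0)),
}
sample = np.array([1.0, 6.0, 8.0])   # [mass balance, d15N, d18O] of groundwater

accepted = []
for _ in range(20_000):
    cols = [[1.0, rng.normal(*n), rng.normal(*o)] for n, o in sources.values()]
    A = np.array(cols).T             # rows: mass, d15N and d18O balances
    f = np.linalg.solve(A, sample)   # mixing fractions for this draw
    if np.all((f >= 0.0) & (f <= 1.0)):
        accepted.append(f)

accepted = np.array(accepted)
q = np.percentile(accepted, [5, 50, 95], axis=0).T
for name, (lo, med, hi) in zip(sources, q):
    print(f"{name:12s} fraction: median {med:.2f} (5-95%: {lo:.2f}-{hi:.2f})")
```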

  3. A probabilistic approach for the estimation of earthquake source parameters from spectral inversion

    Science.gov (United States)

    Supino, M.; Festa, G.; Zollo, A.

    2017-12-01

    The amplitude spectrum of a seismic signal related to an earthquake source carries information about the size of the rupture, moment, stress and energy release. Furthermore, it can be used to characterize the Green's function of the medium crossed by the seismic waves. We describe the earthquake amplitude spectrum assuming a generalized Brune's (1970) source model, and direct P- and S-waves propagating in a layered velocity model, characterized by a frequency-independent Q attenuation factor. The observed displacement spectrum indeed depends on three source parameters: the seismic moment (through the low-frequency spectral level), the corner frequency (a proxy for the fault length) and the high-frequency decay parameter. These parameters are strongly correlated with each other and with the quality factor Q; a rigorous estimation of the associated uncertainties and parameter resolution is thus needed to obtain reliable estimates. In this work, the uncertainties are characterized adopting a probabilistic approach to parameter estimation. Assuming an L2-norm based misfit function, we perform a global exploration of the parameter space to find the absolute minimum of the cost function, and we then explore the joint a-posteriori probability density function associated with the cost function around that minimum to extract the correlation matrix of the parameters. The global exploration relies on building a Markov chain in the parameter space and on combining a deterministic minimization with a random exploration of the space (basin-hopping technique). The joint pdf is built from the misfit function using the maximum likelihood principle and assuming a Gaussian-like distribution of the parameters. It is then computed on a grid centered at the global minimum of the cost function. The numerical integration of the pdf finally provides the mean, variance and correlation matrix associated with the set of best-fit parameters describing the model. Synthetic tests are performed to
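
    A self-contained sketch of the fitting step: a Brune-type spectrum (flat low-frequency level, corner frequency, high-frequency decay) is fitted to synthetic log-amplitudes with scipy's basin-hopping, which combines a deterministic local minimizer with random jumps, as the abstract outlines. Attenuation is omitted and all values are invented.

```python
import numpy as np
from scipy.optimize import basinhopping

f = np.logspace(-1, 1.5, 120)                       # frequency axis, Hz

def log_spectrum(params, f):
    log_omega0, log_fc, gamma = params              # level, corner freq, HF decay
    return log_omega0 - np.log10(1.0 + (f / 10.0 ** log_fc) ** gamma)

true = (2.0, np.log10(1.5), 2.0)
obs = log_spectrum(true, f) + np.random.default_rng(7).normal(0, 0.05, f.size)

def misfit(params):                                 # L2 norm in log amplitude
    return np.sum((obs - log_spectrum(params, f)) ** 2)

result = basinhopping(misfit, x0=[1.0, 0.0, 2.0], niter=50, seed=7)
log_omega0, log_fc, gamma = result.x
print(f"level {log_omega0:.2f}, fc {10 ** log_fc:.2f} Hz, gamma {gamma:.2f}")
```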

  4. Towards port sustainability through probabilistic models: Bayesian networks

    Directory of Open Access Journals (Sweden)

    B. Molina

    2018-04-01

    Full Text Available An infrastructure manager needs to know the relations between variables. Using Bayesian networks, variables can be classified, predicted and diagnosed, and the posterior probability of unknown variables can be estimated from known ones. The proposed methodology has generated a database of port variables, classified as economic, social, environmental and institutional, as addressed in the smart-port studies carried out across the whole Spanish Port System. The network has been built as a directed acyclic graph, which lets us read relationships in terms of parents and children. In probabilistic terms, it can be concluded from the constructed network that the most decisive variables for port sustainability are those that form part of the institutional dimension. It has also been concluded that Bayesian networks allow uncertainty to be modeled probabilistically even when the number of variables is high, as occurs in port planning and exploitation.
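
    The kind of query such a network answers can be shown with a two-node toy example (an invented institutional parent and a sustainability child), computing the posterior of the parent given evidence on the child with Bayes' theorem; real networks like the paper's simply chain many such computations.

```python
# invented conditional probability table for a two-node network
p_inst = 0.6                                   # P(institutional conditions favorable)
p_sust_given_inst = {True: 0.8, False: 0.3}    # P(port sustainable | institutional)

# marginal P(sustainable), then posterior P(institutional | sustainable)
p_sust = p_sust_given_inst[True] * p_inst + p_sust_given_inst[False] * (1 - p_inst)
posterior = p_sust_given_inst[True] * p_inst / p_sust
print(f"P(institutional favorable | port sustainable) = {posterior:.2f}")
```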

  5. The Contrast Effect in Temporal and Probabilistic Discounting

    Science.gov (United States)

    Chen, Cheng; He, Guibing

    2016-01-01

    In this information age, messages related to time, and uncertainty surround us. At the same time, our daily lives are filled with decisions accompanied by temporal delay or uncertainty. Will such information influence our temporal and probabilistic discounting? The authors address this question from the perspectives of decision by sampling (DbS) theory and psychological distance theory. Studies 1 and 2 investigated the effect of contextual messages on temporal discounting and probabilistic discounting, respectively. The results indicated that participants who memorized messages about long-term and low-probability events rated delay or uncertainty as mentally closer and exhibited a less degree of value discounting than those who memorized messages regarding short-term and high-probability events. In addition, a sense of distance from present or reality mediated the effect of contextual messages on value discounting. The implications of the current findings for theory and applications are discussed. PMID:27014122

  6. A Geometric Presentation of Probabilistic Satisfiability

    OpenAIRE

    Morales-Luna, Guillermo

    2010-01-01

    By considering probability distributions over the set of assignments, the expected truth-value assignments to propositional variables are extended through linear operators, and the expected truth values of the clauses in any given conjunctive form are likewise extended through linear maps. The probabilistic satisfiability problems are discussed in terms of the introduced linear extensions. The case of multiple truth values is also discussed.

  7. Bayesian source term determination with unknown covariance of measurements

    Science.gov (United States)

    Belal, Alkomiet; Tichý, Ondřej; Šmídl, Václav

    2017-04-01

    Determination of the source term of a release of hazardous material into the atmosphere is a very important task for emergency response. We are concerned with the problem of estimating the source term in the conventional linear inverse problem y = Mx, where the relationship between the vector of observations y and the unknown source term x is described by the source-receptor-sensitivity (SRS) matrix M. Since the system is typically ill-conditioned, the problem is recast as the optimization problem min_{R,B} (y - Mx)^T R^{-1} (y - Mx) + x^T B^{-1} x. The first term minimizes the error of the measurements with covariance matrix R, and the second term is a regularization of the source term. Different types of regularization arise for different choices of the matrices R and B; for example, Tikhonov regularization takes the covariance matrix B to be the identity matrix multiplied by a scalar parameter. In this contribution, we adopt a Bayesian approach to make inference on the unknown source term x as well as the unknown R and B. We assume the prior on x to be Gaussian with zero mean and unknown diagonal covariance matrix B. The covariance matrix of the likelihood, R, is also unknown. We consider two potential choices for the structure of the matrix R: the first is a diagonal matrix, and the second is a locally correlated structure using information on the topology of the measuring network. Since exact inference of the model is intractable, an iterative variational Bayes algorithm is used for simultaneous estimation of all model parameters. The practical usefulness of our contribution is demonstrated by applying the resulting algorithm to real data from the European Tracer Experiment (ETEX). This research is supported by EEA/Norwegian Financial Mechanism under project MSMT-28477/2014 Source-Term Determination of Radionuclide Releases by Inverse Atmospheric Dispersion Modelling (STRADI).
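
    For known R and B the minimizer has the familiar closed form x = (M^T R^{-1} M + B^{-1})^{-1} M^T R^{-1} y; the sketch below wraps that step in a crude fixed-point re-estimation of the diagonal of B, loosely imitating the role the variational Bayes iterations play in the paper (this is not the paper's algorithm). The SRS matrix and data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(8)
n_obs, n_src = 40, 12
M = rng.lognormal(0.0, 1.0, (n_obs, n_src))        # toy SRS matrix
x_true = np.zeros(n_src)
x_true[3] = 5.0                                    # a single-release source term
y = M @ x_true + rng.normal(0, 0.5, n_obs)

R_inv = np.eye(n_obs) / 0.25                       # observation precision (known here)
b_diag = np.ones(n_src)                            # initial prior variances (diag of B)
for _ in range(20):
    B_inv = np.diag(1.0 / b_diag)
    x_hat = np.linalg.solve(M.T @ R_inv @ M + B_inv, M.T @ R_inv @ y)
    b_diag = x_hat ** 2 + 1e-6                     # crude update of prior variances

print("estimated source term:", np.round(x_hat, 2))
```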

  8. A comparison of world-wide uses of severe reactor accident source terms

    International Nuclear Information System (INIS)

    Ang, M.L.; Frid, W.; Kersting, E.J.; Friederichs, H.G.; Lee, R.Y.; Meyer-Heine, A.; Powers, D.A.; Soda, K.; Sweet, D.

    1994-09-01

    The definitions of source terms to reactor containments and source terms to the environment are discussed. A comparison is made between the TID-14844 example source term and the alternative source term described in NUREG-1465. Comparisons of these source terms to the containments and those used in France, Germany, Japan, Sweden, and the United Kingdom are made. Source terms to the environment calculated in NUREG-1500 and WASH-1400 are discussed. Again, these source terms are compared to those now being used in France, Germany, Japan, Sweden, and the United Kingdom. It is concluded that source terms to the containment suggested in NUREG-1465 are not greatly more conservative than those used in other countries. Technical bases for the source terms are similar. The regulatory use of the current understanding of radionuclide behavior varies among countries

  9. Lessons Learned from Characterization, Performance Assessment, and EPA Regulatory Review of the 1996 Actinide Source Term for the Waste Isolation Pilot Plant

    International Nuclear Information System (INIS)

    Larson, K.W.; Moore, R.C.; Nowak, E.J.; Papenguth, H.W.; Jow, H.

    1999-01-01

    The Waste Isolation Pilot Plant (WIPP) is a US Department of Energy (DOE) facility for the permanent disposal of transuranic waste from defense activities. In 1996, the DOE submitted the Title 40 CFR Part 191 Compliance Certification Application for the Waste Isolation Pilot Plant (CCA) to the US Environmental Protection Agency (EPA). The CCA included a probabilistic performance assessment (PA) conducted by Sandia National Laboratories to establish compliance with the quantitative release limits defined in 40 CFR 191.13. An experimental program to collect data relevant to the actinide source term began around 1989, which eventually supported the 1996 CCA PA actinide source term model. The actinide source term provided an estimate of mobile dissolved and colloidal Pu, Am, U, Th, and Np concentrations in their stable oxidation states, and accounted for effects of uncertainty in the chemistry of brines in waste disposal areas. The experimental program and the actinide source term included in the CCA PA underwent EPA review lasting more than 1 year. Experiments were initially conducted to develop data relevant to the wide range of potential future conditions in waste disposal areas. Interim, preliminary performance assessments and actinide source term models provided insight allowing refinement of experiments and models. Expert peer review provided additional feedback and confidence in the evolving experimental program. By 1995, the chemical database and PA predictions of WIPP performance were considered reliable enough to support the decision to add an MgO backfill to waste rooms to control chemical conditions and reduce uncertainty in actinide concentrations, especially for Pu and Am. Important lessons learned through the characterization, PA modeling, and regulatory review of the actinide source term are (1) experimental characterization and PA should evolve together, with neither activity completely dominating the other, (2) the understanding of physical processes

  10. Quantification of severe accident source terms of a Westinghouse 3-loop plant

    International Nuclear Information System (INIS)

    Lee Min; Ko, Y.-C.

    2008-01-01

    Integrated severe accident analysis codes are used to quantify the source terms of the representative sequences identified in PSA studies. The characteristics of these source terms depend on the detailed design of the plant and the accident scenario. A historical perspective on radioactive source terms is provided. The grouping of radionuclides in different source terms or source term quantification tools based on TID-14844, NUREG-1465, and WASH-1400 is compared. The radionuclide release phenomena and models adopted in the integrated severe accident analysis codes STCP and MAAP4 are described. In the present study, the severe accident source terms for risk quantification of the Maanshan Nuclear Power Plant of Taiwan Power Company are quantified using the MAAP 4.0.4 code. A methodology is developed to quantify the source terms of each source term category (STC) identified in the Level II PSA analysis of the plant. The characteristics of the source terms obtained are compared with other source terms. The plant analyzed employs a Westinghouse-designed 3-loop pressurized water reactor (PWR) with a large dry containment.

  11. Disjunctive Probabilistic Modal Logic is Enough for Bisimilarity on Reactive Probabilistic Systems

    OpenAIRE

    Bernardo, Marco; Miculan, Marino

    2016-01-01

    Larsen and Skou characterized probabilistic bisimilarity over reactive probabilistic systems with a logic including true, negation, conjunction, and a diamond modality decorated with a probabilistic lower bound. Later on, Desharnais, Edalat, and Panangaden showed that negation is not necessary to characterize the same equivalence. In this paper, we prove that the logical characterization holds also when conjunction is replaced by disjunction, with negation still being not necessary. To this e...

  12. Probabilistic Assessment of Severe Accident Consequence in West Bangka

    Science.gov (United States)

    Sunarko; Su'ud, Zaki

    2017-07-01

    Probabilistic dose assessment for a severe accident condition is performed for the West Bangka area. The source term from the WASH-1400 reactor analysis is used as a conservative release scenario for a 1000 MWe PWR. Seven groups of isotopes are used in the simulation based on core inventory and release fraction. The population distribution for Muntok district and the area within a 100 km radius is obtained from 2014 data. Meteorological data are provided through cyclic sampling from a database containing two years of site-specific hourly records for the 2014-2015 period. The PC-COSYMA segmented plume dispersion code is used to investigate the consequences of the assumed accident scenario. The results indicate that the early or deterministic effect is important for areas close to the release point, while the long-term or stochastic effect is related to the population distribution and covers an area of up to 100 km from the release point. The mean annual expected values for early mortality and late mortality for the population within a 100 km radius of the Muntok site are 2.38×10^-4 yr^-1 and 1.33×10^-3 yr^-1, respectively.

  13. Probabilistic Structural Analysis Methods (PSAM) for select space propulsion system structural components

    Science.gov (United States)

    Cruse, T. A.

    1987-01-01

    The objective is the development of several modular structural analysis packages capable of predicting the probabilistic response distribution for key structural variables such as maximum stress, natural frequencies, transient response, etc. The structural analysis packages are to include stochastic modeling of loads, material properties, geometry (tolerances), and boundary conditions. The solution is to be in terms of the cumulative probability of exceedance distribution (CDF) and confidence bounds. Two methods of probability modeling are to be included as well as three types of structural models - probabilistic finite-element method (PFEM); probabilistic approximate analysis methods (PAAM); and probabilistic boundary element methods (PBEM). The purpose in doing probabilistic structural analysis is to provide the designer with a more realistic ability to assess the importance of uncertainty in the response of a high performance structure. Probabilistic Structural Analysis Method (PSAM) tools will estimate structural safety and reliability, while providing the engineer with information on the confidence that should be given to the predicted behavior. Perhaps most critically, the PSAM results will directly provide information on the sensitivity of the design response to those variables which are seen to be uncertain.

  14. Probabilistic Structural Analysis Methods for select space propulsion system structural components (PSAM)

    Science.gov (United States)

    Cruse, T. A.; Burnside, O. H.; Wu, Y.-T.; Polch, E. Z.; Dias, J. B.

    1988-01-01

    The objective is the development of several modular structural analysis packages capable of predicting the probabilistic response distribution for key structural variables such as maximum stress, natural frequencies, transient response, etc. The structural analysis packages are to include stochastic modeling of loads, material properties, geometry (tolerances), and boundary conditions. The solution is to be in terms of the cumulative probability of exceedance distribution (CDF) and confidence bounds. Two methods of probability modeling are to be included as well as three types of structural models - probabilistic finite-element method (PFEM); probabilistic approximate analysis methods (PAAM); and probabilistic boundary element methods (PBEM). The purpose in doing probabilistic structural analysis is to provide the designer with a more realistic ability to assess the importance of uncertainty in the response of a high performance structure. Probabilistic Structural Analysis Method (PSAM) tools will estimate structural safety and reliability, while providing the engineer with information on the confidence that should be given to the predicted behavior. Perhaps most critically, the PSAM results will directly provide information on the sensitivity of the design response to those variables which are seen to be uncertain.

  15. Probabilistic Structural Analysis Program

    Science.gov (United States)

    Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.

    2010-01-01

    NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes the probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and lifing methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include non-linear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available, such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.

  16. Probabilistic Accident Progression Analysis with application to a LMFBR design

    International Nuclear Information System (INIS)

    Jamali, K.M.

    1982-01-01

    A method for the probabilistic analysis of accident sequences in nuclear power plant systems, referred to as 'Probabilistic Accident Progression Analysis' (PAPA), is described. Distinctive features of PAPA include: (1) definition and analysis of initiator-dependent accident sequences at the component level; (2) a new fault-tree simplification technique; (3) a new technique for assessing the effect of uncertainties in the failure probabilities on the probabilistic ranking of accident sequences; (4) techniques for the quantification of dependent failures of similar components, including an iterative technique for high-population components. The methodology is applied to the Shutdown Heat Removal System (SHRS) of the Clinch River Breeder Reactor Plant during its short-term (0-2 h) operation. Major contributors to this probability are the initiators loss of main feedwater system, loss of offsite power, and normal shutdown

  17. A Bayesian Method for Short-Term Probabilistic Forecasting of Photovoltaic Generation in Smart Grid Operation and Control

    Directory of Open Access Journals (Sweden)

    Gabriella Ferruzzi

    2013-02-01

    Full Text Available A new short-term probabilistic forecasting method is proposed to predict the probability density function of the hourly active power generated by a photovoltaic system. Firstly, the probability density function of the hourly clearness index is forecasted making use of a Bayesian autoregressive time-series model; the model takes into account the dependence of the solar radiation on some meteorological variables, such as cloud cover and humidity. Then, a Monte Carlo simulation procedure is used to evaluate the predictive probability density function of the hourly active power by applying the photovoltaic system model to random samples of the clearness index distribution. A numerical application demonstrates the effectiveness and advantages of the proposed forecasting method.
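
    The two-stage structure reduces to a few lines once a predictive clearness-index distribution is available. Here a clipped normal stands in for the Bayesian autoregressive predictive density, a toy plant model maps each sampled index to power, and the percentiles of the resulting sample approximate the predictive density of the hourly active power. All constants are invented.

```python
import numpy as np

rng = np.random.default_rng(9)
# stand-in for the predictive distribution of the hourly clearness index
kt = np.clip(rng.normal(0.55, 0.12, 10_000), 0.0, 1.0)

def pv_power(kt, ghi_clear=900.0, area_m2=50.0, efficiency=0.17):
    # toy PV model: irradiance = kt * clear-sky GHI, times area and efficiency
    return kt * ghi_clear * area_m2 * efficiency / 1000.0   # kW

power = pv_power(kt)
p5, p50, p95 = np.percentile(power, [5, 50, 95])
print(f"hourly power: median {p50:.1f} kW, 90% interval [{p5:.1f}, {p95:.1f}] kW")
```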

  18. Probabilistic eruption forecasting at short and long time scales

    Science.gov (United States)

    Marzocchi, Warner; Bebbington, Mark S.

    2012-10-01

    Any effective volcanic risk mitigation strategy requires a scientific assessment of the future evolution of a volcanic system and its eruptive behavior. Some consider the onus should be on volcanologists to provide simple but emphatic deterministic forecasts. This traditional way of thinking, however, does not deal with the implications of inherent uncertainties, both aleatoric and epistemic, that are inevitably present in observations, monitoring data, and interpretation of any natural system. In contrast to deterministic predictions, probabilistic eruption forecasting attempts to quantify these inherent uncertainties utilizing all available information to the extent that it can be relied upon and is informative. As with many other natural hazards, probabilistic eruption forecasting is becoming established as the primary scientific basis for planning rational risk mitigation actions: at short-term (hours to weeks or months), it allows decision-makers to prioritize actions in a crisis; and at long-term (years to decades), it is the basic component for land use and emergency planning. Probabilistic eruption forecasting consists of estimating the probability of an eruption event and where it sits in a complex multidimensional time-space-magnitude framework. In this review, we discuss the key developments and features of models that have been used to address the problem.

  19. The Dependency of Probabilistic Tsunami Hazard Assessment on Magnitude Limits of Seismic Sources in the South China Sea and Adjoining Basins

    Science.gov (United States)

    Li, Hongwei; Yuan, Ye; Xu, Zhiguo; Wang, Zongchen; Wang, Juncheng; Wang, Peitao; Gao, Yi; Hou, Jingming; Shan, Di

    2017-06-01

    The South China Sea (SCS) and its adjacent small basins, including the Sulu Sea and the Celebes Sea, are commonly identified as a tsunami-prone region by the historical records of seismicity and tsunamis. However, quantification of the tsunami hazard in the SCS region has remained an intractable issue due to the highly complex tectonic setting and the multiple seismic sources within and surrounding this area. Probabilistic Tsunami Hazard Assessment (PTHA) is performed in the present study to evaluate the tsunami hazard in the SCS region based on a brief review of seismological and tsunami records. Five regional and local potential tsunami sources are tentatively identified, and earthquake catalogs are generated using Monte Carlo simulation following the tapered Gutenberg-Richter relationship for each zone. Considering the lack of consensus on the magnitude upper bound for each seismic source, as well as its critical role in PTHA, the major concern of the present study is to define the upper and lower limits of the tsunami hazard in the SCS region comprehensively by adopting different corner magnitudes derived by multiple principles and approaches, including TGR regression of the historical catalog, fault-length scaling, tectonic and seismic moment balance, and repetition of the historical largest event. The results show that the tsunami hazard in the SCS and adjoining basins is subject to large variations when adopting different corner magnitudes, with the upper bounds being 2-6 times the lower. The probabilistic tsunami hazard maps for specified return periods reveal a much higher threat from the Cotabato Trench and the Sulawesi Trench in the Celebes Sea, whereas the tsunami hazard for the coasts of the SCS and Sulu Sea is relatively moderate, yet non-negligible. By combining an empirical method with numerical studies of historical tsunami events, the present PTHA results are tentatively validated. The correspondence lends confidence to our study. Considering the proximity of major sources to population-laden cities
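
    The catalog-generation step can be sketched compactly: the tapered Gutenberg-Richter relation has complementary CDF F(m) = (m_t/m)^beta * exp((m_t - m)/m_c) in seismic moment, which is inverted numerically here to draw magnitudes; the corner magnitude (through m_c) is exactly the quantity whose choice the study varies. Parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(10)

def sample_tapered_gr(n, beta=0.65, mag_threshold=6.0, mag_corner=8.5):
    moment = lambda mw: 10.0 ** (1.5 * mw + 9.1)      # Mw -> seismic moment (N m)
    m_t, m_c = moment(mag_threshold), moment(mag_corner)
    grid = np.logspace(np.log10(m_t), np.log10(m_c) + 1.5, 4000)
    ccdf = (m_t / grid) ** beta * np.exp((m_t - grid) / m_c)   # P(M > m)
    u = rng.uniform(0.0, 1.0, n)
    m = np.interp(u, ccdf[::-1], grid[::-1])          # invert the monotone CCDF
    return (np.log10(m) - 9.1) / 1.5                  # back to Mw

catalog = sample_tapered_gr(100_000)
print(f"events >= Mw 8: {(catalog >= 8.0).sum()}, max Mw: {catalog.max():.2f}")
```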

  20. Assessing the long-term probabilistic volcanic hazard for tephra fallout in Reykjavik, Iceland: a preliminary multi-source analysis

    Science.gov (United States)

    Tonini, Roberto; Barsotti, Sara; Sandri, Laura; Tumi Guðmundsson, Magnús

    2015-04-01

    Icelandic volcanism is largely dominated by basaltic magma. Nevertheless, the presence of glaciers over many Icelandic volcanic systems results in frequent phreatomagmatic eruptions and associated tephra production, making explosive eruptions the most common type of volcanic activity. Jökulhlaups are commonly considered the major volcanic hazard in Iceland because of their high frequency and potentially very devastating local impact. Tephra fallout is also frequent and can impact larger areas. It is driven by the wind, whose direction can change with both altitude and season, making it impossible to predict a priori where the tephra will be deposited during the next eruption. Most of the volcanic activity in Iceland occurs in the central eastern part of the island, over 100 km to the east of the main population centre around the capital Reykjavík. The hazard from tephra fallout in Reykjavík is therefore expected to be smaller than for communities settled near the main volcanic systems. However, within the framework of quantitative hazard and risk analyses, less frequent and/or less intense phenomena should not be neglected, since their risk evaluation depends on the effects suffered by the selected target. This is particularly true if the target is highly vulnerable, such as a large urban area or important infrastructure. In this work we present a preliminary analysis aiming to perform a Probabilistic Volcanic Hazard Assessment (PVHA) for tephra fallout, focused on a target area that includes the municipality of Reykjavík and the Keflavík international airport. This approach inverts the more common perspective, in which the hazard analysis is focused on the source (the volcanic system), and follows a multi-source approach: the idea is to quantify, homogeneously, the hazard due to the main hazardous volcanoes that could pose a tephra fallout threat to the municipality of Reykjavík and the Keflavík airport. PVHA for each volcanic system is calculated independently and the results

  1. Comparison between Canadian probabilistic safety assessment methods formulated by Atomic Energy of Canada Limited and probabilistic risk assessment methods

    International Nuclear Information System (INIS)

    Shapiro, H.S.; Smith, J.E.

    1989-01-01

    The procedures used by Atomic Energy of Canada Limited (AECL) to perform probabilistic safety assessments (PSAs) differ somewhat from the conventionally accepted probabilistic risk assessment (PRA) procedures used elsewhere. In Canada, PSA is used by AECL as an audit tool for an evolving design; the purpose is to assess the safety of the plant in engineering terms. Thus, the PSA procedures are geared toward providing engineering feedback so that necessary changes can be made to the design at an early stage, input can be made to operating procedures, and test and maintenance programs can be optimized in terms of costs. Most PRAs, by contrast, are performed on plants that are already built. Their main purpose is to establish the core melt frequency and the risk to the public due to core melt; moreover, at that stage any design modification is very expensive. The differences in purpose and timing between PSA and PRA have resulted in differences in methodology and scope. The PSA procedures are used on all plants being designed by AECL

  2. A probabilistic assessment of health risks associated with short-term exposure to tropospheric ozone

    Energy Technology Data Exchange (ETDEWEB)

    Whitfield, R.G; Biller, W.F.; Jusko, M.J.; Keisler, J.M.

    1996-06-01

    The work described in this report is part of a larger risk assessment sponsored by the U.S. Environmental Protection Agency. Earlier efforts developed exposure-response relationships for acute health effects among populations engaged in heavy exertion. Those efforts also developed a probabilistic national ambient air quality standards exposure model and a general methodology for integrating probabilistic exposure-response relationships and exposure estimates to calculate overall risk results. Recently published data make it possible to model additional health endpoints (for exposure at moderate exertion), including hospital admissions. New air quality and exposure estimates for alternative national ambient air quality standards for ozone are combined with exposure-response models to produce risk results for hospital admissions and acute health effects. Sample results explain the methodology and introduce the risk output formats.
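
    The integration step described above, combining probabilistic exposure-response relationships with population exposure estimates, can be sketched as a small Monte Carlo calculation. The sketch below is illustrative only: the exposure levels, person-occurrence counts, and Beta-distributed response probabilities are placeholder assumptions, not the report's fitted values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical population exposure estimate: person-occurrences at each
# 1-hr ozone exposure level (ppm) at moderate exertion
exposure_ppm = np.array([0.06, 0.08, 0.10, 0.12])
person_occurrences = np.array([4e6, 1.5e6, 4e5, 5e4])

# Probabilistic exposure-response: the response probability at each level
# is itself uncertain; represent it with a Beta distribution per level
alpha = np.array([2.0, 4.0, 8.0, 12.0])
beta = np.array([998.0, 796.0, 592.0, 388.0])

n_draws = 10_000
# One draw of the whole response curve per iteration -> one total-risk estimate
risk_per_level = rng.beta(alpha, beta, size=(n_draws, exposure_ppm.size))
total_cases = risk_per_level @ person_occurrences

for q in (5, 50, 95):
    print(f"{q:2d}th percentile of excess cases: {np.percentile(total_cases, q):,.0f}")
```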

  3. Duplicate Detection in Probabilistic Data

    NARCIS (Netherlands)

    Panse, Fabian; van Keulen, Maurice; de Keijzer, Ander; Ritter, Norbert

    2009-01-01

    Collected data often contains uncertainties. Probabilistic databases have been proposed to manage uncertain data. To combine data from multiple autonomous probabilistic databases, an integration of probabilistic data has to be performed. Until now, however, data integration approaches have focused

  4. Estimation of Source Term Behaviors in an SBO Sequence in a Typical 1000 MWth PWR and Comparison with Other Source Term Results

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Tae Woon; Han, Seok Jung; Ahn, Kwang Il; Fynan, Douglas; Jung, Yong Hoon [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    Since the Three Mile Island (TMI, 1979), Chernobyl (1986), and Fukushima Daiichi (2011) accidents, the assessment of radiological source term effects on the environment has been a key concern of nuclear safety. The Fukushima Daiichi accident involved a long-term station blackout (SBO). Using worst-case assumptions, as in the Fukushima accident, on the accident sequence and on the availability of safety systems, the thermal-hydraulic behavior, core relocation, and environmental source term are estimated for a long-term SBO accident in the OPR-1000 reactor. MELCOR code version 1.8.6 is used in this analysis. The source term results estimated in this study are compared with previous studies and with the Fukushima accident estimates in the UNSCEAR 2013 report. This study estimates that 11% of the iodine and 2% of the cesium inventory can be released to the environment; the UNSCEAR 2013 report estimated that 2-8% of the iodine and 1-3% of the cesium were released to the environment. The release fractions of iodine and cesium to the environment are therefore similar.

  5. Confluence reduction for probabilistic systems

    NARCIS (Netherlands)

    Timmer, Mark; van de Pol, Jan Cornelis; Stoelinga, Mariëlle Ida Antoinette

    In this presentation we introduce a novel technique for state space reduction of probabilistic specifications, based on a newly developed notion of confluence for probabilistic automata. We proved that this reduction preserves branching probabilistic bisimulation and can be applied on-the-fly. To

  6. Probabilistic systems coalgebraically: A survey

    Science.gov (United States)

    Sokolova, Ana

    2011-01-01

    We survey the work on both discrete and continuous-space probabilistic systems as coalgebras, starting with how probabilistic systems are modeled as coalgebras and followed by a discussion of their bisimilarity and behavioral equivalence, mentioning results that follow from the coalgebraic treatment of probabilistic systems. It is interesting to note that, for different reasons, for both discrete and continuous probabilistic systems it may be more convenient to work with behavioral equivalence than with bisimilarity. PMID:21998490

  7. Probabilistic Modeling of Wind Turbine Drivetrain Components

    DEFF Research Database (Denmark)

    Rafsanjani, Hesam Mirzaei

    Wind energy is one of several energy sources in the world and a rapidly growing industry in the energy sector. When placed in offshore or onshore locations, wind turbines are exposed to wave excitations, highly dynamic wind loads and/or the wakes from other wind turbines. Therefore, most components in a wind turbine experience highly dynamic and time-varying loads. These components may fail due to wear or fatigue, and this can lead to unplanned shutdown repairs that are very costly. The design by deterministic methods using safety factors is generally unable to account for the many uncertainties. Thus, a reliability assessment should be based on probabilistic methods where stochastic modeling of failures is performed. This thesis focuses on probabilistic models and the stochastic modeling of the fatigue life of the wind turbine drivetrain. Hence, two approaches are considered for stochastic modeling...

  8. Selection of models to calculate the LLW source term

    International Nuclear Information System (INIS)

    Sullivan, T.M.

    1991-10-01

    Performance assessment of an LLW disposal facility begins with an estimation of the rate at which radionuclides migrate out of the facility (i.e., the source term). The focus of this work is to develop a methodology for calculating the source term. In general, the source term is influenced by the radionuclide inventory, the wasteforms and containers used to dispose of the inventory, and the physical processes that lead to release from the facility (fluid flow, container degradation, wasteform leaching, and radionuclide transport). In turn, many of these physical processes are influenced by the design of the disposal facility (e.g., infiltration of water). The complexity of the problem and the absence of appropriate data prevent development of an entirely mechanistic representation of radionuclide release from a disposal facility. Typically, a number of assumptions, based on knowledge of the disposal system, are used to simplify the problem. This document provides a brief overview of disposal practices and reviews existing source term models as background for selecting appropriate models for estimating the source term. The selection rationale and the mathematical details of the models are presented. Finally, guidance is presented for combining the inventory data with appropriate mechanisms describing release from the disposal facility. 44 refs., 6 figs., 1 tab

  9. Review of SFR In-Vessel Radiological Source Term Studies

    International Nuclear Information System (INIS)

    Suk, Soo Dong; Lee, Yong Bum

    2008-10-01

    An effort has been made in this study to search for and review the publicly available literature on the phenomena related to the release of radionuclides and aerosols to the reactor containment of sodium fast reactor (SFR) plants (i.e., the in-vessel source term), produced in Japan and in Europe (France, Germany, and the UK) over the last few decades. The review focuses on the experimental programs investigating the phenomena that determine the source term, with a brief review of the supporting analytical models and computer programs. In this report, the research programs conducted to investigate core disruptive accident (CDA) bubble behavior in the sodium pool, which determines the 'primary' or 'instantaneous' source term, are first introduced. The studies performed to determine the 'delayed' source term are then described, covering the various stages of the phenomena and processes involved: fission product (FP) release from fuel, evaporation release from the surface of the pool, iodine mass transfer from fission gas bubbles, FP deposition, and aerosol release from core-concrete interaction. The research programs investigating the release and transport of FPs and aerosols in the reactor containment (i.e., the in-containment source term) are not described in this report

  10. Utility view of the source term and air cleaning

    International Nuclear Information System (INIS)

    Littlefield, P.S.

    1985-01-01

    The utility view of the source term and air cleaning is discussed. The source term is made up of: (1) the noble gases, which there has been a tendency to ignore in the past because it was thought nothing could be done about them anyway; (2) the halogens, which past Air Cleaning Conferences have dealt with in terms of charcoal and other removal systems; and (3) the solid components of the source term, which particulate filters are designed to handle. Air cleaning systems consist of filters, adsorbers, containment sprays, suppression pools in boiling water reactors, and ice beds in ice condenser-equipped plants. The feasibility and cost of air cleaning systems are discussed

  11. Levels, sources and probabilistic health risks of polycyclic aromatic hydrocarbons in the agricultural soils from sites neighboring suburban industries in Shanghai.

    Science.gov (United States)

    Tong, Ruipeng; Yang, Xiaoyi; Su, Hanrui; Pan, Yue; Zhang, Qiuzhuo; Wang, Juan; Long, Mingce

    2018-03-01

    The levels, sources and quantitative probabilistic health risks of polycyclic aromatic hydrocarbons (PAHs) in agricultural soils in the vicinity of power, steel and petrochemical plants in the suburbs of Shanghai are discussed. The total concentration of 16 PAHs in the soils ranges from 223 to 8214 ng/g. The sources of PAHs were analyzed by both isomeric ratios and a principal component analysis-multiple linear regression method. The results indicate that the PAHs mainly originated from the incomplete combustion of coal and oil. The probabilistic assessments of both the carcinogenic and non-carcinogenic risks posed by PAHs in soils, with adult farmers as the receptors of concern, were quantitatively calculated by Monte Carlo simulation. The estimated total carcinogenic risk (TCR) for the agricultural soils has a 45% probability of exceeding the acceptable threshold value (10^-6), indicating potential adverse health effects. However, all non-carcinogenic risks are below the threshold value. Oral intake is the dominant exposure pathway, accounting for 77.7% of the TCR, while inhalation intake is negligible. The three PAHs contributing most to the TCR are BaP (64.35%), DBA (17.56%) and InP (9.06%). Sensitivity analyses indicate that exposure frequency has the greatest impact on the total risk uncertainty, followed by the exposure dose through oral intake and the exposure duration. These results indicate that it is essential to manage the health risks of PAH-contaminated agricultural soils in the vicinity of typical industries in megacities.
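
    The Monte Carlo risk calculation described above follows the standard incremental lifetime cancer risk (ILCR) chain: sample the uncertain exposure inputs, compute a chronic daily intake, and multiply by a slope factor. A minimal sketch for the oral pathway is given below; all distribution parameters are illustrative placeholders, not the paper's fitted values.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Illustrative input distributions (placeholders, not the paper's values)
conc = rng.lognormal(np.log(300.0), 0.8, n)   # BaP-equivalent soil conc., ng/g
ir = rng.triangular(50.0, 100.0, 200.0, n)    # soil ingestion rate, mg/day
ef = rng.triangular(180.0, 250.0, 350.0, n)   # exposure frequency, days/yr
ed = rng.triangular(20.0, 30.0, 40.0, n)      # exposure duration, yr
bw = rng.normal(60.0, 8.0, n)                 # body weight, kg
sf = 7.3                                      # BaP oral slope factor, (mg/kg/day)^-1
at = 70.0 * 365.0                             # averaging time, days

# Chronic daily intake, mg/kg/day; 1e-9 converts (ng/g)*(mg/day) to mg/day
cdi = conc * ir * 1e-9 * ef * ed / (bw * at)
ilcr = cdi * sf

print(f"median ILCR: {np.median(ilcr):.2e}")
print(f"P(ILCR > 1e-6): {np.mean(ilcr > 1e-6):.2f}")

# Crude sensitivity ranking: Spearman rank correlation of inputs with risk
def rank(a):
    return np.argsort(np.argsort(a))

for name, x in [("conc", conc), ("ir", ir), ("ef", ef), ("ed", ed), ("bw", bw)]:
    rho = np.corrcoef(rank(x), rank(ilcr))[0, 1]
    print(f"Spearman rho, {name} vs. ILCR: {rho:+.2f}")
```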

  12. Iodine chemistry effect on source term assessments. A MELCOR 1.8.6 YT study of a PWR severe accident sequence

    International Nuclear Information System (INIS)

    Herranz, Luis E.; Garcia, Monica; Otero, Bernadette

    2009-01-01

    Level-2 Probabilistic Safety Analysis (PSA) has proved to be a powerful tool for gaining insight into multiple aspects of severe accidents: the phenomena with the greatest potential to lead to containment failure, the performance of safety systems, and even the identification of additional accident management measures that could mitigate the consequences of such an event. A major result of Level-2 PSA is the iodine content of the source term, since iodine is largely responsible for the radiological impact during the first few days after a hypothetical severe accident. Iodine chemistry is known to affect iodine behavior considerably and, although understanding has improved substantially since the early 1990s, a thorough understanding is still missing and most PSA studies do not address it when assessing severe accident scenarios. This paper emphasizes the quantitative and qualitative significance of considering iodine chemistry in Level-2 PSA estimates. To do so, a cold-leg-break, low-pressure severe accident sequence of an actual pressurized water reactor has been analyzed with the MELCOR 1.8.6 YT code. Two sets of calculations, with and without chemistry, have been carried out and compared. The study shows that including iodine chemistry could roughly double the predicted iodine release to the environment, of which around 60% would be in gaseous form. From these results it is concluded that exploratory studies on the potential effect of iodine chemistry on source term estimates should be carried out. (author)

  13. Probabilistic maps of the white matter tracts with known associated functions on the neonatal brain atlas: Application to evaluate longitudinal developmental trajectories in term-born and preterm-born infants.

    Science.gov (United States)

    Akazawa, Kentaro; Chang, Linda; Yamakawa, Robyn; Hayama, Sara; Buchthal, Steven; Alicata, Daniel; Andres, Tamara; Castillo, Deborrah; Oishi, Kumiko; Skranes, Jon; Ernst, Thomas; Oishi, Kenichi

    2016-03-01

    Diffusion tensor imaging (DTI) has been widely used to investigate the development of the neonatal and infant brain, and deviations related to various diseases or medical conditions like preterm birth. In this study, we created a probabilistic map of fiber pathways with known associated functions, on a published neonatal multimodal atlas. The pathways-of-interest include the superficial white matter (SWM) fibers just beneath the specific cytoarchitectonically defined cortical areas, which were difficult to evaluate with existing DTI analysis methods. The Jülich cytoarchitectonic atlas was applied to define cortical areas related to specific brain functions, and the Dynamic Programming (DP) method was applied to delineate the white matter pathways traversing through the SWM. Probabilistic maps were created for pathways related to motor, somatosensory, auditory, visual, and limbic functions, as well as major white matter tracts, such as the corpus callosum, the inferior fronto-occipital fasciculus, and the middle cerebellar peduncle, by delineating these structures in eleven healthy term-born neonates. In order to characterize maturation-related changes in diffusivity measures of these pathways, the probabilistic maps were then applied to DTIs of 49 healthy infants who were longitudinally scanned at three time-points, approximately five weeks apart. First, we investigated the normal developmental pattern based on 19 term-born infants. Next, we analyzed 30 preterm-born infants to identify developmental patterns related to preterm birth. Last, we investigated the difference in diffusion measures between these groups to evaluate the effects of preterm birth on the development of these functional pathways. Term-born and preterm-born infants both demonstrated a time-dependent decrease in diffusivity, indicating postnatal maturation in these pathways, with laterality seen in the corticospinal tract and the optic radiation. The comparison between term- and preterm

  14. Low-level radioactive waste performance assessments: Source term modeling

    International Nuclear Information System (INIS)

    Icenhour, A.S.; Godbee, H.W.; Miller, L.F.

    1995-01-01

    Low-level radioactive wastes (LLW) generated by government and commercial operations need to be isolated from the environment for at least 300 to 500 yr. Most existing sites for the storage or disposal of LLW employ the shallow-land burial approach. However, the U.S. Department of Energy currently emphasizes the use of engineered systems (e.g., packaging, concrete and metal barriers, and water collection systems). Future commercial LLW disposal sites may include such systems to mitigate radionuclide transport through the biosphere. Performance assessments must be conducted for LLW disposal facilities. These studies include comprehensive evaluations of radionuclide migration from the waste package, through the vadose zone, and within the water table. Atmospheric transport mechanisms are also studied. Figure 1 illustrates the performance assessment process. Estimates of the release of radionuclides from the waste packages (i.e., source terms) are used for the subsequent hydrogeologic calculations required by a performance assessment. Computer models are typically used to describe the complex interactions of water with LLW and to determine the transport of radionuclides. Commonly used computer programs for evaluating source terms include GWSCREEN, BLT (Breach-Leach-Transport), DUST (Disposal Unit Source Term), BARRIER (Ref. 5), as well as SOURCE1 and SOURCE2 (which are used in this study). The SOURCE1 and SOURCE2 codes were prepared by Rogers and Associates Engineering Corporation for the Oak Ridge National Laboratory (ORNL). SOURCE1 is designed for tumulus-type facilities, and SOURCE2 is tailored for silo, well-in-silo, and trench-type disposal facilities. This paper focuses on the source term for ORNL disposal facilities, and it describes improved computational methods for determining radionuclide transport from waste packages

  15. Literature study of source term research for PWRs

    Energy Technology Data Exchange (ETDEWEB)

    Sponton, L.L.; Nilsson, Lars

    2001-04-01

    A literature survey has been carried out in support of ongoing source term calculations with the MELCOR code of some severe accident scenarios for the Swedish Ringhals 2 pressurised water reactor (PWR). Research into severe accidents in power reactors, and into the source term for the subsequent release of radioisotopes, intensified after the Harrisburg accident and has produced a large number of reports and papers. This survey was therefore limited to research concerning PWR-type reactors, with emphasis on papers related to MELCOR code development. A background is given, relating to some historic documents, and then more recent research, after 1990, is reviewed. Of special interest is the ongoing Phebus programme, which is producing new and important results of benefit to the development and validation of, among others, the MELCOR code. It is concluded that source term calculations involve the simulation of many interacting, complex physical phenomena, which results in large uncertainties. The research has, however, led to considerable improvements over the years: the uncertainty in source term predictions has been reduced by one to two orders of magnitude, from the simpler codes of the early 1980s to the more realistic codes of today, like MELCOR.

  17. Reassessment of the technical bases for estimating source terms. Final report

    International Nuclear Information System (INIS)

    Silberberg, M.; Mitchell, J.A.; Meyer, R.O.; Ryder, C.P.

    1986-07-01

    This document describes a major advance in the technology for calculating source terms from postulated accidents at US light-water reactors. The improved technology consists of (1) an extensive data base from severe accident research programs initiated following the TMI accident, (2) a set of coupled and integrated computer codes (the Source Term Code Package), which models key aspects of fission product behavior under severe accident conditions, and (3) a number of detailed mechanistic codes that bridge the gap between the data base and the Source Term Code Package. The improved understanding of severe accident phenomena has also allowed an identification of significant sources of uncertainty, which should be considered in estimating source terms. These sources of uncertainty are also described in this document. The current technology provides a significant improvement in evaluating source terms over that available at the time of the Reactor Safety Study (WASH-1400) and, because of this significance, the Nuclear Regulatory Commission staff is recommending its use

  18. On the effectiveness of surface assimilation in probabilistic nowcasts of planetary boundary layer profiles

    Science.gov (United States)

    Rostkier-Edelstein, Dorita; Hacker, Joshua

    2013-04-01

    Surface observations comprise a widely available, inexpensive and reliable source of information about the state of the near-surface planetary boundary layer (PBL). Operational data assimilation systems have encountered several difficulties in assimilating them effectively, due among other factors to their local-scale representativeness, the transient coupling between the surface and the atmosphere aloft, and the balance constraints usually imposed. A long-term goal of this work is to find an efficient system for probabilistic PBL nowcasting that can be employed wherever surface observations are present. Earlier work showed that surface observations can be an important source of information when combined with a single-column model (SCM) and an ensemble filter (EF). Here we extend that work to quantify the probabilistic skill of ensemble SCM predictions with a model of added complexity. We adopt a factor separation analysis to quantify the contribution of surface assimilation, relative to that of selected model components (parameterized radiation and externally imposed horizontal advection), to the probabilistic skill of the system, and any beneficial or detrimental interactions between them. To assess the real utility of the flow-dependent covariances estimated with the EF and of the SCM of the PBL, we compare the skill of the SCM/EF system to that of a reference system based on climatological covariances and a 30-min persistence model. The reference consists of a dressing technique, whereby a deterministic 3D mesoscale forecast (e.g., from the WRF model) is adjusted and dressed with uncertainty using a seasonal sample of mesoscale forecasts and surface forecast errors. Results show that assimilation of surface observations can improve deterministic and probabilistic profile predictions more significantly than major model improvements. Flow-dependent covariances estimated with the SCM/EF show a clear advantage over the use of climatological covariances when the flow is characterized by wide variability, when
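
    The core of the SCM/EF idea, spreading the information in a single surface observation up through a profile via ensemble-estimated covariances, can be sketched with a toy perturbed-observation ensemble Kalman update. The ensemble, observation error, and vertical correlation structure below are invented placeholders, not the study's configuration.

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy stand-in for an SCM ensemble: 50 members, 20-level temperature profile
n_ens, n_lev = 50, 20
z = np.linspace(0.0, 2000.0, n_lev)                 # m, level heights
truth = 290.0 - 6.5e-3 * z                          # K, simple lapse-rate profile
ens = (truth
       + rng.normal(0.0, 1.0, (n_ens, n_lev))       # uncorrelated noise
       + np.outer(rng.normal(0.0, 1.0, n_ens),      # surface-rooted perturbation
                  np.exp(-z / 500.0)))              # decaying with height

obs_err = 0.5                                       # K, surface thermometer error
y_obs = truth[0] + rng.normal(0.0, obs_err)         # one surface observation

# Ensemble Kalman update: regress every level onto the observed surface value
hx = ens[:, 0]                                      # obs operator: lowest level
cov_xy = np.cov(ens.T, hx)[:n_lev, -1]              # flow-dependent covariances
gain = cov_xy / (np.var(hx, ddof=1) + obs_err**2)   # Kalman gain, one per level
innov = y_obs - hx + rng.normal(0.0, obs_err, n_ens)  # perturbed observations
ens_post = ens + np.outer(innov, gain)

print(f"prior spread at ~500 m: {ens[:, 5].std():.2f} K, "
      f"posterior: {ens_post[:, 5].std():.2f} K")
```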

  19. Evaluation of severe accident risks: Methodology for the containment, source term, consequence, and risk integration analyses. Volume 1, Revision 1

    International Nuclear Information System (INIS)

    Gorham, E.D.; Breeding, R.J.; Brown, T.D.; Harper, F.T.; Helton, J.C.; Murfin, W.B.; Hora, S.C.

    1993-12-01

    NUREG-1150 examines the risk to the public from five nuclear power plants. The NUREG-1150 plant studies are Level III probabilistic risk assessments (PRAs) and, as such, they consist of four analysis components: accident frequency analysis, accident progression analysis, source term analysis, and consequence analysis. This volume summarizes the methods utilized in performing the last three components and the assembly of these analyses into an overall risk assessment. The NUREG-1150 analysis approach is based on the following ideas: (1) general and relatively fast-running models for the individual analysis components, (2) well-defined interfaces between the individual analysis components, (3) use of Monte Carlo techniques together with an efficient sampling procedure to propagate uncertainties, (4) use of expert panels to develop distributions for important phenomenological issues, and (5) automation of the overall analysis. Many features of the new analysis procedures were adopted to facilitate a comprehensive treatment of uncertainty in the complete risk analysis. Uncertainties in the accident frequency, accident progression and source term analyses were included in the overall uncertainty assessment. The uncertainties in the consequence analysis were not included in this assessment. A large effort was devoted to the development of procedures for obtaining expert opinion and the execution of these procedures to quantify parameters and phenomena for which there is large uncertainty and divergent opinions in the reactor safety community

  20. Students’ difficulties in probabilistic problem-solving

    Science.gov (United States)

    Arum, D. P.; Kusmayadi, T. A.; Pramudya, I.

    2018-03-01

    Many errors can be identified when students solve mathematics problems, particularly probabilistic problems. The present study aims to investigate students' difficulties in solving probabilistic problems, focusing on analyzing and describing the errors students make while solving them. This research used a qualitative method with a case study strategy. The subjects were ten 9th-grade students selected by purposive sampling. The data consist of the students' probabilistic problem-solving results and recorded interviews regarding their difficulties in solving the problems. The data were analyzed descriptively using the steps of Miles and Huberman. The results show that students' difficulties in solving probabilistic problems can be divided into three categories: first, difficulties in understanding the probabilistic problem; second, difficulties in choosing and using appropriate solution strategies; third, difficulties with the computational process. The results suggest that students are not yet able to use their knowledge and ability to respond to probabilistic problems. It is therefore important for mathematics teachers to plan probabilistic learning that can optimize students' probabilistic thinking ability.

  1. Near-source mobile methane emission estimates using EPA Method 33A and a novel probabilistic approach as a basis for leak quantification in urban areas

    Science.gov (United States)

    Albertson, J. D.

    2015-12-01

    Methane emissions from underground pipeline leaks remain an ongoing issue in the development of accurate methane emission inventories for the natural gas supply chain. Application of mobile methods during routine street surveys would help address this issue, but there are large uncertainties in current approaches. In this paper, we describe results from a series of near-source (< 30 m) controlled methane releases where an instrumented van was used to measure methane concentrations during both fixed location sampling and during mobile traverses immediately downwind of the source. The measurements were used to evaluate the application of EPA Method 33A for estimating methane emissions downwind of a source and also to test the application of a new probabilistic approach for estimating emission rates from mobile traverse data.

  2. A General Framework for Probabilistic Characterizing Formulae

    DEFF Research Database (Denmark)

    Sack, Joshua; Zhang, Lijun

    2012-01-01

    Recently, a general framework on characteristic formulae was proposed by Aceto et al. It offers a simple theory that allows one to easily obtain characteristic formulae of many non-probabilistic behavioral relations. Our paper studies their techniques in a probabilistic setting. We provide a general method for determining characteristic formulae of behavioral relations for probabilistic automata using fixed-point probability logics. We consider such behavioral relations as simulations and bisimulations, probabilistic bisimulations, probabilistic weak simulations, and probabilistic forward...

  3. A probabilistic Hu-Washizu variational principle

    Science.gov (United States)

    Liu, W. K.; Belytschko, T.; Besterfield, G. H.

    1987-01-01

    A Probabilistic Hu-Washizu Variational Principle (PHWVP) for the Probabilistic Finite Element Method (PFEM) is presented. This formulation is developed for both linear and nonlinear elasticity. The PHWVP allows incorporation of the probabilistic distributions for the constitutive law, compatibility condition, equilibrium, domain and boundary conditions into the PFEM. Thus, a complete probabilistic analysis can be performed where all aspects of the problem are treated as random variables and/or fields. The Hu-Washizu variational formulation is available in many conventional finite element codes thereby enabling the straightforward inclusion of the probabilistic features into present codes.

  4. Memristive Probabilistic Computing

    KAUST Repository

    Alahmadi, Hamzah

    2017-10-01

    In the era of the Internet of Things and Big Data, unconventional techniques are rising to accommodate the large size of data and the resource constraints. New computing structures are advancing based on non-volatile memory technologies and different processing paradigms. Additionally, the intrinsic resiliency of current applications leads to the development of creative techniques in computation. In those applications, approximate computing provides a perfect fit for optimizing energy efficiency while compromising on accuracy. In this work, we build probabilistic adders based on stochastic memristors. The probabilistic adders are analyzed with respect to the stochastic behavior of the underlying memristors. Multiple adder implementations are investigated and compared. The memristive probabilistic adder provides a different approach from typical approximate CMOS adders. Furthermore, it allows for large area savings and design flexibility in trading performance against power savings. While reaching a performance level similar to approximate CMOS adders, the memristive adder achieves a 60% power saving. An image-compression application is investigated using the memristive probabilistic adders with the performance and energy trade-off.
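
    The behavior of such a probabilistic adder can be approximated at a high level by letting each gate output flip with some error probability, a crude stand-in for stochastic memristor switching rather than the device model used in the work. A minimal sketch, with the 1% flip probability and 8-bit width chosen arbitrarily:

```python
import random

def noisy(bit, p_err):
    """Gate output that flips with probability p_err (stand-in for
    stochastic memristor switching)."""
    return bit ^ (random.random() < p_err)

def prob_full_adder(a, b, cin, p_err):
    s = noisy(a ^ b ^ cin, p_err)
    cout = noisy((a & b) | (cin & (a | b)), p_err)
    return s, cout

def prob_ripple_add(x, y, nbits=8, p_err=0.01):
    total, carry = 0, 0
    for i in range(nbits):
        bit, carry = prob_full_adder((x >> i) & 1, (y >> i) & 1, carry, p_err)
        total |= int(bit) << i
    return total

random.seed(0)
trials = 10_000
errors = sum(prob_ripple_add(200, 45) != 245 for _ in range(trials))
print(f"fraction of inexact sums at p_err = 0.01: {errors / trials:.3f}")
```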

  5. Hazardous constituent source term. Revision 2

    International Nuclear Information System (INIS)

    1994-01-01

    The Department of Energy (DOE) has several facilities that either generate and/or store transuranic (TRU)-waste from weapons program research and production. Much of this waste also contains hazardous waste constituents as regulated under Subtitle C of the Resource Conservation and Recovery Act (RCRA). Toxicity characteristic metals in the waste principally include lead, occurring in leaded rubber gloves and shielding. Other RCRA metals may occur as contaminants in pyrochemical salt, soil, debris, and sludge and solidified liquids, as well as in equipment resulting from decontamination and decommissioning activities. Volatile organic compounds (VOCS) contaminate many waste forms as a residue adsorbed on surfaces or occur in sludge and solidified liquids. Due to the presence of these hazardous constituents, applicable disposal regulations include land disposal restrictions established by Hazardous and Solid Waste Amendments (HSWA). The DOE plans to dispose of TRU-mixed waste from the weapons program in the Waste Isolation Pilot Plant (WIPP) by demonstrating no-migration of hazardous constituents. This paper documents the current technical basis for methodologies proposed to develop a post-closure RCRA hazardous constituent source term. For the purposes of demonstrating no-migration, the hazardous constituent source term is defined as the quantities of hazardous constituents that are available for transport after repository closure. Development of the source term is only one of several activities that will be involved in the no-migration demonstration. The demonstration will also include uncertainty and sensitivity analyses of contaminant transport

  6. Real time source term and dose assessment

    International Nuclear Information System (INIS)

    Breznik, B.; Kovac, A.; Mlakar, P.

    2001-01-01

    The Dose Projection Programme is a tool for decision making in case of a nuclear emergency. The essential input data for a quick emergency evaluation in the case of a hypothetical pressurised water reactor accident are the following: source term, core damage assessment, fission product radioactivity, release source term, and the critical exposure pathways for the early phase of the release. A reduced set of radionuclides and simplified calculations can be used in the dose calculation algorithm. A simple expert-system personal computer programme has been developed for the Krsko Nuclear Power Plant for dose projection within a radius of a few kilometres of the pressurised water reactor in the early phase of an accident. The input data are instantaneous core activity data, core damage indicators, release fractions, reduction factors of the release pathways, spray operation, release timing, and the dispersion coefficient. The main dose projection steps are: accurate in-core radioactivity determination using the reactor power as input; core damage and in-containment source term assessment based on quick instrument indications or on activity analysis data; definition by the user of the release pathway for a typical PWR accident scenario; and a dose calculation performed only for the exposure pathways critical for decisions about evacuation or sheltering in the early phase of an accident. (author)
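
    The projection chain described above reduces to a product of factors per nuclide: core inventory x damage fraction x release fraction x pathway reduction x dispersion x intake x dose coefficient. A minimal sketch of that chain follows; every numerical value is an illustrative placeholder, not Krsko plant data.

```python
# Early-phase inhalation dose projection with a reduced nuclide set.
# All values below are illustrative placeholders, not plant data.

core_inventory_bq = {"I-131": 3.0e18, "Cs-137": 2.0e17, "Xe-133": 6.0e18}
core_damage_fraction = 0.3                                   # from damage indicators
release_fraction = {"I-131": 0.4, "Cs-137": 0.3, "Xe-133": 1.0}
pathway_reduction = {"I-131": 0.01, "Cs-137": 0.005, "Xe-133": 1.0}  # sprays, pools
chi_over_q = 1.0e-4        # s/m^3, dispersion coefficient at the receptor
breathing_rate = 3.3e-4    # m^3/s
# Inhalation dose coefficients, Sv/Bq (order-of-magnitude values; noble gas ~ 0)
dcf = {"I-131": 7.4e-9, "Cs-137": 4.6e-9, "Xe-133": 0.0}

dose_sv = 0.0
for nuc, inventory in core_inventory_bq.items():
    released = (inventory * core_damage_fraction
                * release_fraction[nuc] * pathway_reduction[nuc])   # Bq to environment
    dose_sv += released * chi_over_q * breathing_rate * dcf[nuc]    # inhalation dose

print(f"projected early-phase inhalation dose: {dose_sv * 1e3:.0f} mSv")
```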

  7. Prediction Uncertainty and Groundwater Management: Approaches to get the Most out of Probabilistic Outputs

    Science.gov (United States)

    Peeters, L. J.; Mallants, D.; Turnadge, C.

    2017-12-01

    Groundwater impact assessments are increasingly being undertaken in a probabilistic framework whereby various sources of uncertainty (model parameters, model structure, boundary conditions, and calibration data) are taken into account. This has resulted in groundwater impact metrics being presented as probability density functions and/or cumulative distribution functions, spatial maps displaying isolines of percentile values for specific metrics, etc. Groundwater management, on the other hand, typically uses single values (i.e., a deterministic framework) to evaluate what decisions are required to protect groundwater resources. For instance, in New South Wales, Australia, a nominal drawdown value of two metres is specified by the NSW Aquifer Interference Policy as a trigger-level threshold. In many cases, when drawdowns induced by groundwater extraction exceed two metres, "make-good" provisions are enacted (such as the surrendering of extraction licenses). The information obtained from a quantitative uncertainty analysis can be used to guide decision making in several ways. Two examples are discussed here: the first would not require modification of existing "deterministic" trigger or guideline values, whereas the second assumes that the regulatory criteria are also expressed in probabilistic terms. The first example is a straightforward interpretation of calculated percentile values for specific impact metrics. The second example goes a step further, as the existing deterministic thresholds do not currently allow a probabilistic interpretation; e.g., there is no statement that "the probability of exceeding the threshold shall not be larger than 50%". It would indeed be sensible to have a set of thresholds with an associated acceptable probability of exceedance (or probability of not exceeding a threshold) that decreases as the impact increases. We here illustrate how both the prediction uncertainty and management rules can be expressed in a
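
    The two interpretations can be made concrete with a few lines of code: given a posterior sample of predicted drawdown at a receptor, either report a percentile against the deterministic 2 m trigger, or report the exceedance probability against a probabilistic rule. The drawdown sample and the 20% acceptable-exceedance figure below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(11)

# Hypothetical posterior sample of predicted drawdown at a receptor bore (m)
drawdown = rng.lognormal(np.log(1.2), 0.5, 5000)

# Interpretation 1: compare a percentile with the deterministic 2 m trigger
p95 = np.percentile(drawdown, 95)
print(f"95th percentile drawdown: {p95:.2f} m (trigger: 2.00 m)")

# Interpretation 2: probabilistic rule, e.g. "P(drawdown > 2 m) below 20%"
p_exceed = np.mean(drawdown > 2.0)
print(f"P(drawdown > 2 m) = {p_exceed:.2f} (acceptable: < 0.20)")
```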

  8. Source Term Model for Fine Particle Resuspension from Indoor Surfaces

    National Research Council Canada - National Science Library

    Kim, Yoojeong; Gidwani, Ashok; Sippola, Mark; Sohn, Chang W

    2008-01-01

    This Phase I effort developed a source term model for particle resuspension from indoor surfaces to be used as a source term boundary condition for CFD simulation of particle transport and dispersion in a building...

  9. Automatized near-real-time short-term Probabilistic Volcanic Hazard Assessment of tephra dispersion before eruptions: BET_VHst for Vesuvius and Campi Flegrei during recent exercises

    Science.gov (United States)

    Selva, Jacopo; Costa, Antonio; Sandri, Laura; Rouwet, Dmtri; Tonini, Roberto; Macedonio, Giovanni; Marzocchi, Warner

    2015-04-01

    Probabilistic Volcanic Hazard Assessment (PVHA) represents the most complete scientific contribution for planning rational strategies aimed at mitigating the risk posed by volcanic activity at different time scales. The definition of the space-time window for PVHA is related to the kind of risk mitigation actions under consideration. Short temporal intervals (days to weeks) are important for short-term risk mitigation actions like the evacuation of a volcanic area. During volcanic unrest episodes or eruptions, it is of primary importance to produce short-term tephra fallout forecasts, and to update them frequently to account for the rapidly evolving situation. This information is obviously crucial for crisis management, since tephra may heavily affect building stability, public health, transportation and evacuation routes (airports, trains, road traffic) and lifelines (electric power supply). In this study, we propose a methodology named BET_VHst (Selva et al. 2014) for short-term PVHA of volcanic tephra dispersal, based on automatic interpretation of measurements from the monitoring system and on physical models of tephra dispersal from all possible vent positions and eruptive sizes, driven by frequently updated meteorological forecasts. The large uncertainty at all the steps required for the analysis, both aleatory and epistemic, is treated by means of Bayesian inference and statistical mixing of long- and short-term analyses. The BET_VHst model is here presented through its implementation during two exercises organized for volcanoes in the Neapolitan area: MESIMEX for Mt. Vesuvius, and VUELCO for Campi Flegrei. References: Selva J., Costa A., Sandri L., Macedonio G., Marzocchi W. (2014) Probabilistic short-term volcanic hazard in phases of unrest: a case study for tephra fallout, J. Geophys. Res., 119, doi: 10.1002/2014JB011252

  10. A probabilistic tsunami hazard assessment for Indonesia

    Science.gov (United States)

    Horspool, N.; Pranantyo, I.; Griffin, J.; Latief, H.; Natawidjaja, D. H.; Kongko, W.; Cipta, A.; Bustaman, B.; Anugrah, S. D.; Thio, H. K.

    2014-11-01

    Probabilistic hazard assessments are a fundamental tool for assessing the threats posed by hazards to communities and are important for underpinning evidence-based decision-making regarding risk mitigation activities. Indonesia has been the focus of intense tsunami risk mitigation efforts following the 2004 Indian Ocean tsunami, but this has been largely concentrated on the Sunda Arc with little attention to other tsunami prone areas of the country such as eastern Indonesia. We present the first nationally consistent probabilistic tsunami hazard assessment (PTHA) for Indonesia. This assessment produces time-independent forecasts of tsunami hazards at the coast using data from tsunami generated by local, regional and distant earthquake sources. The methodology is based on the established Monte Carlo approach to probabilistic seismic hazard assessment (PSHA) and has been adapted to tsunami. We account for sources of epistemic and aleatory uncertainty in the analysis through the use of logic trees and sampling probability density functions. For short return periods (100 years) the highest tsunami hazard is the west coast of Sumatra, south coast of Java and the north coast of Papua. For longer return periods (500-2500 years), the tsunami hazard is highest along the Sunda Arc, reflecting the larger maximum magnitudes. The annual probability of experiencing a tsunami with a height of > 0.5 m at the coast is greater than 10% for Sumatra, Java, the Sunda islands (Bali, Lombok, Flores, Sumba) and north Papua. The annual probability of experiencing a tsunami with a height of > 3.0 m, which would cause significant inundation and fatalities, is 1-10% in Sumatra, Java, Bali, Lombok and north Papua, and 0.1-1% for north Sulawesi, Seram and Flores. The results of this national-scale hazard assessment provide evidence for disaster managers to prioritise regions for risk mitigation activities and/or more detailed hazard or risk assessment.
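
    The Monte Carlo machinery behind such a PTHA can be illustrated in miniature: simulate a long synthetic catalog of earthquakes for one source zone, map each event to a coastal tsunami height with a scatter term, and count threshold exceedances. Everything below (rates, magnitude bounds, and the log-linear height model) is an invented single-source toy, not the assessment's source model.

```python
import numpy as np

rng = np.random.default_rng(7)

years = 100_000           # synthetic catalog length
annual_rate = 0.05        # Mw >= 7.0 events/yr for a hypothetical source zone
b = 1.0                   # Gutenberg-Richter b-value
mw_min, mw_max = 7.0, 9.0

# Truncated Gutenberg-Richter magnitudes by inverse-transform sampling
n_events = rng.poisson(annual_rate * years)
beta = b * np.log(10.0)
u = rng.uniform(0.0, 1.0, n_events)
mw = mw_min - np.log(1.0 - u * (1.0 - np.exp(-beta * (mw_max - mw_min)))) / beta

# Toy coastal-height model: log-linear in magnitude with lognormal scatter
log_h = -8.0 + 1.1 * mw + rng.normal(0.0, 0.6, n_events)
height = np.exp(log_h)

for threshold in (0.5, 3.0):
    rate = np.sum(height > threshold) / years      # events/yr exceeding threshold
    p_annual = 1.0 - np.exp(-rate)                 # Poissonian annual probability
    print(f"annual P(height > {threshold} m) = {p_annual:.4f}")
```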

  11. A probabilistic tsunami hazard assessment for Indonesia

    Science.gov (United States)

    Horspool, N.; Pranantyo, I.; Griffin, J.; Latief, H.; Natawidjaja, D. H.; Kongko, W.; Cipta, A.; Bustaman, B.; Anugrah, S. D.; Thio, H. K.

    2014-05-01

    Probabilistic hazard assessments are a fundamental tool for assessing the threats posed by hazards to communities and are important for underpinning evidence-based decision making on risk mitigation activities. Indonesia has been the focus of intense tsunami risk mitigation efforts following the 2004 Indian Ocean Tsunami, but this has been largely concentrated on the Sunda Arc, with little attention to other tsunami prone areas of the country such as eastern Indonesia. We present the first nationally consistent Probabilistic Tsunami Hazard Assessment (PTHA) for Indonesia. This assessment produces time-independent forecasts of tsunami hazard at the coast from tsunami generated by local, regional and distant earthquake sources. The methodology is based on the established Monte Carlo approach to probabilistic seismic hazard assessment (PSHA) and has been adapted to tsunami. We account for sources of epistemic and aleatory uncertainty in the analysis through the use of logic trees and through sampling probability density functions. For short return periods (100 years) the highest tsunami hazard is the west coast of Sumatra, south coast of Java and the north coast of Papua. For longer return periods (500-2500 years), the tsunami hazard is highest along the Sunda Arc, reflecting larger maximum magnitudes along the Sunda Arc. The annual probability of experiencing a tsunami with a height at the coast of > 0.5 m is greater than 10% for Sumatra, Java, the Sunda Islands (Bali, Lombok, Flores, Sumba) and north Papua. The annual probability of experiencing a tsunami with a height of > 3.0 m, which would cause significant inundation and fatalities, is 1-10% in Sumatra, Java, Bali, Lombok and north Papua, and 0.1-1% for north Sulawesi, Seram and Flores. The results of this national scale hazard assessment provide evidence for disaster managers to prioritise regions for risk mitigation activities and/or more detailed hazard or risk assessment.

  12. Scoping Analysis of Source Term and Functional Containment Attenuation Factors

    Energy Technology Data Exchange (ETDEWEB)

    Pete Lowry

    2012-10-01

    In order to meet future regulatory requirements, the Next Generation Nuclear Plant (NGNP) Project must fully establish and validate the mechanistic modular high temperature gas-cooled reactor (HTGR) source term. This is not possible at this stage in the project, as significant uncertainties in the final design remain unresolved. In the interim, however, there is a need to establish an approximate characterization of the source term. The NGNP team developed a simplified parametric model to establish mechanistic source term estimates for a set of proposed HTGR configurations.

  15. Probabilistic Logic and Probabilistic Networks

    NARCIS (Netherlands)

    Haenni, R.; Romeijn, J.-W.; Wheeler, G.; Williamson, J.

    2009-01-01

    While in principle probabilistic logics might be applied to solve a range of problems, in practice they are rarely applied at present. This is perhaps because they seem disparate, complicated, and computationally intractable. However, we shall argue in this programmatic paper that several approaches

  16. Fast probabilistic file fingerprinting for big data.

    Science.gov (United States)

    Tretyakov, Konstantin; Laur, Sven; Smant, Geert; Vilo, Jaak; Prins, Pjotr

    2013-01-01

    Biological data acquisition is raising new challenges, both in data analysis and handling. Not only is it proving hard to analyze the data at the rate it is generated today, but simply reading and transferring data files can be prohibitively slow due to their size. This primarily concerns logistics within and between data centers, but is also important for workstation users in the analysis phase. Common usage patterns, such as comparing and transferring files, are proving computationally expensive and are tying down shared resources. We present an efficient method for calculating file uniqueness for large scientific data files, that takes less computational effort than existing techniques. This method, called Probabilistic Fast File Fingerprinting (PFFF), exploits the variation present in biological data and computes file fingerprints by sampling randomly from the file instead of reading it in full. Consequently, it has a flat performance characteristic, correlated with data variation rather than file size. We demonstrate that probabilistic fingerprinting can be as reliable as existing hashing techniques, with provably negligible risk of collisions. We measure the performance of the algorithm on a number of data storage and access technologies, identifying its strengths as well as limitations. Probabilistic fingerprinting may significantly reduce the use of computational resources when comparing very large files. Utilisation of probabilistic fingerprinting techniques can increase the speed of common file-related workflows, both in the data center and for workbench analysis. The implementation of the algorithm is available as an open-source tool named pfff, as a command-line tool as well as a C library. The tool can be downloaded from http://biit.cs.ut.ee/pfff.
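
    The sampling idea is simple enough to sketch: hash the file size plus a fixed pseudorandom set of byte chunks instead of the whole file, so the cost is flat in file size. The sketch below illustrates the concept only; it does not reproduce the pfff tool's actual sampling scheme or output format, and the chunk and sample counts are arbitrary.

```python
import hashlib
import os
import random

def probabilistic_fingerprint(path, n_samples=1024, chunk=64, seed=0):
    """Fingerprint a file by hashing a pseudorandom sample of byte chunks
    rather than reading the file in full."""
    size = os.path.getsize(path)
    h = hashlib.sha256()
    h.update(str(size).encode())             # file size distinguishes most files
    rng = random.Random(seed)                # fixed seed -> comparable fingerprints
    offsets = sorted(rng.randrange(max(size - chunk, 1)) for _ in range(n_samples))
    with open(path, "rb") as f:
        for off in offsets:                  # near-sequential reads of sampled chunks
            f.seek(off)
            h.update(f.read(chunk))
    return h.hexdigest()

if __name__ == "__main__":
    import tempfile
    with tempfile.NamedTemporaryFile(delete=False) as tmp:
        tmp.write(os.urandom(1 << 20))       # 1 MiB of random data
    print(probabilistic_fingerprint(tmp.name))
```

    Two files compared with the same seed and parameters collide only if the size and every sampled chunk agree, which, for data with natural variation, makes accidental collisions negligibly likely.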

  17. Review of radionuclide source terms used for performance-assessment analyses

    International Nuclear Information System (INIS)

    Barnard, R.W.

    1993-06-01

    Two aspects of the radionuclide source terms used for total-system performance assessment (TSPA) analyses have been reviewed. First, a detailed radionuclide inventory (i.e., one in which the reactor type, decay, and burnup are specified) is compared with the standard source-term inventory used in prior analyses. The latter assumes a fixed ratio of pressurized-water reactor (PWR) to boiling-water reactor (BWR) spent fuel, at specific amounts of burnup and at 10-year decay. TSPA analyses have been used to compare the simplified source term with the detailed one. The TSPA-91 analyses did not show a significant difference between the source terms. Second, the radionuclides used in source terms for TSPA aqueous-transport analyses have been reviewed to select ones that are representative of the entire inventory. It is recommended that two actinide decay chains be included (the 4n+2 "uranium" and 4n+3 "actinium" decay series), since these include several radionuclides that have potentially important release and dose characteristics. In addition, several fission products are recommended for the same reason. The choice of radionuclides should be influenced by other parameter assumptions, such as the solubility and retardation of the radionuclides

  18. Actinide Source Term Program, position paper. Revision 1

    International Nuclear Information System (INIS)

    Novak, C.F.; Papenguth, H.W.; Crafts, C.C.; Dhooge, N.J.

    1994-01-01

    The Actinide Source Term represents the quantity of actinides that could be mobilized within WIPP brines and could migrate with the brines away from the disposal room vicinity. This document presents the various proposed methods for estimating this source term, with a particular focus on defining these methods and evaluating the defensibility of the models for mobile actinide concentrations. The conclusions reached in this document are: the 92 PA "expert panel" model for mobile actinide concentrations is not defensible; and, although it is extremely conservative, the "inventory limits" model is the only existing defensible model for the actinide source term. The model effort in progress, "chemical modeling of mobile actinide concentrations", supported by a laboratory effort that is also in progress, is designed to provide a reasonable description of the system, be scientifically realistic, and supplant the "inventory limits" model

  19. Probabilistic risk assessment of aircraft impact on a spent nuclear fuel dry storage

    Energy Technology Data Exchange (ETDEWEB)

    Almomani, Belal, E-mail: balmomani@kaist.ac.kr [Department of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, 291 Daehak-ro, Yuseong-gu, Daejeon 305-701 (Korea, Republic of); Lee, Sanghoon, E-mail: shlee1222@kmu.ac.kr [Department of Mechanical and Automotive Engineering, Keimyung University, Dalgubeol-daero 1095, Dalseo-gu, Daegu (Korea, Republic of); Jang, Dongchan, E-mail: dongchan.jang@kaist.ac.kr [Department of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, 291 Daehak-ro, Yuseong-gu, Daejeon 305-701 (Korea, Republic of); Kang, Hyun Gook, E-mail: kangh6@rpi.edu [Department of Mechanical, Aerospace and Nuclear Engineering, Rensselaer Polytechnic Institute, Troy, NY 12180 (United States)

    2017-01-15

    Highlights: • A new risk assessment framework is proposed for aircraft impact into an interim dry storage facility. • It uses event tree analysis, response-structural analysis, consequence analysis, and Monte Carlo simulation. • A case study of the proposed procedure is presented to illustrate the methodology's application. - Abstract: This paper proposes a systematic risk evaluation framework, based on a probabilistic approach, for one of the most significant impact events on an interim dry storage facility: an aircraft crash. A realistic case study that includes a specific cask model and selected impact conditions is performed to demonstrate the practical applicability of the proposed framework. An event tree for a postulated aircraft crash, defining a set of impact conditions and storage cask responses, is constructed. Monte Carlo simulation is employed for the probabilistic treatment of the sources of uncertainty associated with the impact loads on the internal storage casks. The parameters whose uncertainties are managed probabilistically include the aircraft impact velocity, the compressive strength of the reinforced concrete wall, the missile shape factor, and the facility wall thickness. Failure probabilities of the impacted wall and of a single storage cask under the direct mechanical impact load caused by the aircraft crash are estimated. A finite element analysis is applied to simulate the postulated direct engine impact load on the cask body, and a source term analysis for the associated releases of radioactive materials, as well as an off-site consequence analysis, are performed. Finally, conditional risk contributions are represented by an event tree model. The case study results indicate no severe risk, as the radiological consequences do not exceed regulatory exposure limits for the public. This risk model can be used with any other representative detailed parameters and reference design concepts for
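
    The uncertainty treatment named above, sampling impact velocity, concrete strength, shape factor, and wall thickness, and counting demand-exceeds-capacity outcomes, can be sketched as follows. The distributions and the simple velocity-based perforation criterion are invented stand-ins for the structural response model, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1_000_000

# Illustrative distributions for the four uncertain impact parameters
velocity = rng.normal(120.0, 25.0, n)        # m/s, engine impact velocity
f_c = rng.normal(35.0, 4.0, n)               # MPa, concrete compressive strength
wall = rng.uniform(0.9, 1.1, n)              # m, facility wall thickness
shape = rng.triangular(0.6, 0.8, 1.0, n)     # missile shape factor

# Hypothetical perforation criterion expressed in velocity terms: capacity
# grows with sqrt(strength) and with wall thickness (a stand-in for an
# empirical perforation formula, not reproduced here)
capacity = 95.0 * np.sqrt(f_c / 35.0) * (wall / 1.0)
demand = velocity * shape

p_failure = np.mean(demand > capacity)
print(f"estimated wall perforation probability given a hit: {p_failure:.3f}")
```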

  20. Development of source term PIRT of Fukushima Daiichi NPPs accident

    International Nuclear Information System (INIS)

    Suehiro, S.; Okamoto, K.

    2017-01-01

    The severe accident evaluation committee of the AESJ (Atomic Energy Society of Japan) developed a thermal hydraulic PIRT (Phenomena Identification and Ranking Table) and a source term PIRT based on findings from the Fukushima Daiichi NPPs accident. These PIRTs aimed to explore the debris distribution and the current condition in the NPPs with high accuracy, and to identify the higher-priority phenomena for improving the analytical technology used to predict severe accident phenomena with the codes. The source term PIRT was divided into 3 phases in the time domain and 9 categories in the spatial domain. A total of 68 phenomena were extracted, and their importance from the viewpoint of the source term was ranked through brainstorming and discussion. This paper describes the developed source term PIRT list and summarizes the high-ranked phenomena in each phase. (author)

  1. The latest results from source term research. Overview and outlook

    Energy Technology Data Exchange (ETDEWEB)

    Herranz, Luis E. [Centro de Investigaciones Energeticas Medio Ambientales y Tecnologica (CIEMAT), Madrid (Spain); Haste, Tim [Centre d' Etudes de Cadarache, Paul-Lez-Durance (France). Institut de Radioprotection et de Surete Nucleaire (IRSN); Kaerkelae, Teemu [VTT Technical Research Centre of Finland Ltd, Espoo (Finland)

    2016-12-15

    Source term research has continued internationally for more than 30 years, increasing confidence in calculations of the potential radioactive release to the environment after a severe reactor accident. Important experimental data have been obtained, mainly under international frameworks such as OECD/NEA and EURATOM. Specifically, Phebus FP provides major insights into fission product release and transport. Results are included in severe accident analysis codes. Data from international projects are being interpreted with a view to further improvements in these codes. This paper synthesizes the recent main outcomes from source term research on these topics, and on source term mitigation. It highlights knowledge gaps remaining and discusses ways to proceed. Aside from this further knowledge-driven research, there is consensus on the need to assess the source term predictive ability of current system codes, taking account of scale-up from experiment to reactor conditions.

  2. Systems analysis approach to probabilistic modeling of fault trees

    International Nuclear Information System (INIS)

    Bartholomew, R.J.; Qualls, C.R.

    1985-01-01

    A method of probabilistic modeling of fault tree logic combined with stochastic process theory (Markov modeling) has been developed. Systems are then quantitatively analyzed probabilistically in terms of their failure mechanisms, including common-cause/common-mode effects and time-dependent failure and/or repair rate effects that include synergistic and propagational mechanisms. The modeling procedure results in a state vector set of first-order, linear, inhomogeneous differential equations describing the time-dependent probabilities of failure described by the fault tree. The solutions of this Failure Mode State Variable (FMSV) model are cumulative probability distribution functions of the system. A method of appropriate synthesis of subsystems to form larger systems is developed and applied to practical nuclear power safety systems.
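
    The core of such a formulation is the linear system dP/dt = A·P for the state probabilities. The sketch below integrates a three-state example (intact, one train failed, system failed); the states and rates are illustrative assumptions, not the paper's model.

```python
# Hedged sketch of Markov modeling of a fault tree: integrate dP/dt = A @ P
# for a 3-state system (0: intact, 1: degraded, 2: failed). Rates are assumed.
import numpy as np
from scipy.integrate import solve_ivp

lam1, lam2, mu = 1e-3, 5e-4, 1e-2   # failure/repair rates [1/h] (assumed)

# Generator matrix; columns sum to zero so total probability is conserved.
A = np.array([
    [-lam1,         mu, 0.0],
    [ lam1, -(lam2+mu), 0.0],
    [  0.0,       lam2, 0.0],       # state 2 is absorbing (system failed)
])

sol = solve_ivp(lambda t, p: A @ p, (0.0, 1.0e4), [1.0, 0.0, 0.0])
print(f"P(system failed by 1e4 h) = {sol.y[2, -1]:.3e}")  # cumulative failure probability
```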

  3. Probabilistic programmable quantum processors

    International Nuclear Information System (INIS)

    Buzek, V.; Ziman, M.; Hillery, M.

    2004-01-01

    We analyze how to improve the performance of probabilistic programmable quantum processors. We show how the probability of success of the probabilistic processor can be enhanced by using the processor in loops. In addition, we show that an arbitrary SU(2) transformation of a qubit can be encoded in the program state of a universal programmable probabilistic quantum processor. The probability of success of this processor can be enhanced by a systematic correction of errors via conditional loops. Finally, we show that all our results can be generalized to qudits. (Abstract Copyright [2004], Wiley Periodicals, Inc.)
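
    The benefit of running the processor in conditional loops can be seen with a one-line calculation: if each pass succeeds independently with probability p and failures can be corrected and retried, k passes succeed with probability 1 - (1-p)^k. The sketch below assumes this simple independent-retry model, a stand-in for the paper's error-correcting loops.

```python
# Hedged sketch: cumulative success probability of a probabilistic processor
# run in loops, assuming independent retries with per-pass success p.
def cumulative_success(p: float, k: int) -> float:
    """Probability that at least one of k independent passes succeeds."""
    return 1.0 - (1.0 - p) ** k

for k in (1, 2, 5, 10):
    print(f"k={k:2d}  P(success)={cumulative_success(0.25, k):.4f}")
```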

  4. Use of a probabilistic safety study in the design of the Italian reference PWR

    International Nuclear Information System (INIS)

    Richardson, D.C.; Russino, G.; Valentini, V.

    1985-01-01

    The intent of this paper is to provide a description of the experience gained in performing a Probabilistic Safety Study (PSS) on the proposed Italian reference pressurized water reactor. The experience revealed that, through careful application of probabilistic techniques, Probabilistic Risk Assessment (PRA) can be used as a tool to develop an optimum plant design in terms of safety and cost. Furthermore, the PSS can also be maintained as a living document and a tool to assess additional regulatory requirements that may be imposed during the construction and operational life of the plant. Through the use of flexible probabilistic techniques, the probabilistic safety model can provide a living safety assessment starting from the conceptual design and continuing through the construction, testing and operational phases. Moreover, the probabilistic safety model can be used during the operational phase of the plant as a method to evaluate the operational experience and identify potential problems before they occur. The experience, overall, provided additional insights into the various aspects of the plant's design and operation that would not have been identified through the use of traditional safety evaluation techniques.

  5. Source terms derived from analyses of hypothetical accidents, 1950-1986

    International Nuclear Information System (INIS)

    Stratton, W.R.

    1987-01-01

    This paper reviews the history of reactor accident source term assumptions. After the Three Mile Island accident, a number of theoretical and experimental studies re-examined possible accident sequences and source terms. Some of these results are summarized in this paper

  6. Spent fuel assembly source term parameters

    International Nuclear Information System (INIS)

    Barrett, P.R.; Foadian, H.; Rashid, Y.R.; Seager, K.D.; Gianoulakis, S.E.

    1993-01-01

    Containment of cask contents by a transport cask is a function of the cask body, one or more closure lids, and the various bolting hardware and seals associated with the cavity closure and other containment penetrations. In addition, characteristics of the cask contents that impede the movement of radionuclides from their origin to the external environment also provide containment. In essence, multiple release barriers exist in series in transport casks, and the magnitude of the releasable activity in the cask is considerably lower than the total activity of its contents. A source term approach accounts for the magnitude of the releasable activity available in the cask by assessing the degree of barrier resistance to release provided by material characteristics and inherent barriers that impede the release of radioactive contents. Standardized methodologies for defining the source terms of spent-fuel transport packages in compliance with the specified regulations have recently been developed. An essential part of applying the source term methodology involves characterizing the response of the spent fuel under regulatory conditions of transport. Thermal and structural models of the cask and fuel are analyzed and used to predict fuel rod failure probabilities. The inputs to these analyses and failure evaluations cover a wide range of geometrical and material properties. An important issue in the development of these models is the sensitivity of the radioactive source term generated during transport to individual parameters such as temperature and fluence level. This paper provides a summary of sensitivity analyses concentrating on the structural response and failure predictions of the spent fuel assemblies.

  7. Next-generation probabilistic seismicity forecasting

    Energy Technology Data Exchange (ETDEWEB)

    Hiemer, S.

    2014-07-01

    novel automated method to investigate the significance of spatial b-value variations. The method incorporates an objective data-driven partitioning scheme, which is based on penalized likelihood estimates. These well-defined criteria avoid the difficult choice of commonly applied spatial mapping parameters, such as grid spacing or size of mapping radii. We construct a seismicity forecast that includes spatial b-value variations and demonstrate our model’s skill and reliability when applied to data from California. All proposed probabilistic seismicity forecasts were subjected to evaluation methods using state-of-the-art algorithms provided by the 'Collaboratory for the Study of Earthquake Predictability' infrastructure. First, we evaluated the statistical agreement between the forecasted and observed rates of target events in terms of number, space and magnitude. Secondly, we assessed the performance of one forecast relative to another. We find that the forecasts presented in this thesis are reliable and show significant skill with respect to established classical forecasts. These next-generation probabilistic seismicity forecasts can thus provide hazard information that is potentially useful in reducing earthquake losses and enhancing community preparedness and resilience. (author)

  8. Next-generation probabilistic seismicity forecasting

    International Nuclear Information System (INIS)

    Hiemer, S.

    2014-01-01

    novel automated method to investigate the significance of spatial b-value variations. The method incorporates an objective data-driven partitioning scheme, which is based on penalized likelihood estimates. These well-defined criteria avoid the difficult choice of commonly applied spatial mapping parameters, such as grid spacing or size of mapping radii. We construct a seismicity forecast that includes spatial b-value variations and demonstrate our model’s skill and reliability when applied to data from California. All proposed probabilistic seismicity forecasts were subjected to evaluation methods using state-of-the-art algorithms provided by the 'Collaboratory for the Study of Earthquake Predictability' infrastructure. First, we evaluated the statistical agreement between the forecasted and observed rates of target events in terms of number, space and magnitude. Secondly, we assessed the performance of one forecast relative to another. We find that the forecasts presented in this thesis are reliable and show significant skill with respect to established classical forecasts. These next-generation probabilistic seismicity forecasts can thus provide hazard information that is potentially useful in reducing earthquake losses and enhancing community preparedness and resilience. (author)

  9. PROBABILISTIC RELATIONAL MODELS OF COMPLETE IL-SEMIRINGS

    OpenAIRE

    Tsumagari, Norihiro

    2012-01-01

    This paper studies basic properties of probabilistic multirelations, which generalize the semantic domain of probabilistic systems, and then provides two probabilistic models of complete IL-semirings using probabilistic multirelations. It is also shown that these models need not be models of complete idempotent semirings.

  10. Advanced Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Technical Exchange Meeting

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Curtis [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2013-09-01

    During FY13, the INL developed an advanced SMR PRA framework which has been described in the report Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Detailed Technical Framework Specification, INL/EXT-13-28974 (April 2013). In this framework, the following areas are considered: probabilistic models to provide information specific to advanced SMRs; representation of specific SMR design issues such as co-located modules and passive safety features; use of modern open-source and readily available analysis methods; internal and external events resulting in impacts to safety; all-hazards considerations; methods to support the identification of design vulnerabilities; and mechanistic and probabilistic data needs to support modeling and tools. In order to describe this framework more fully and obtain feedback on the proposed approaches, the INL hosted a technical exchange meeting during August 2013. This report describes the outcomes of that meeting.

  11. Probabilistic safety analysis vs probabilistic fracture mechanics -relation and necessary merging

    International Nuclear Information System (INIS)

    Nilsson, Fred

    1997-01-01

    A comparison is made between some general features of probabilistic fracture mechanics (PFM) and probabilistic safety assessment (PSA) in its standard form. We conclude that: the result from PSA is a numerically expressed level of confidence in the system based on the state of current knowledge, not an objective measure of risk; it is important to carefully define the precise nature of the probabilistic statement and relate it to a well defined situation; standardisation of PFM methods is necessary; PFM seems to be the only way to obtain estimates of the pipe break probability; service statistics are of doubtful value because of scarcity of data and statistical inhomogeneity; and collection of service data should be directed towards the occurrence of growing cracks.

  12. Towards a multilevel cognitive probabilistic representation of space

    Science.gov (United States)

    Tapus, Adriana; Vasudevan, Shrihari; Siegwart, Roland

    2005-03-01

    This paper addresses the problem of perception and representation of space for a mobile agent. A probabilistic hierarchical framework is suggested as a solution to this problem. The method proposed is a combination of probabilistic belief with "Object Graph Models" (OGM). The world is viewed from a topological optic, in terms of objects and relationships between them. The hierarchical representation that we propose permits an efficient and reliable modeling of the information that the mobile agent would perceive from its environment. The integration of both navigational and interactional capabilities through efficient representation is also addressed. Experiments on a set of images taken from the real world that validate the approach are reported. This framework draws on the general understanding of human cognition and perception and contributes towards the overall efforts to build cognitive robot companions.

  13. An Individual-based Probabilistic Model for Fish Stock Simulation

    Directory of Open Access Journals (Sweden)

    Federico Buti

    2010-08-01

    Full Text Available We define an individual-based probabilistic model of sole (Solea solea) behaviour. The individual model is given in terms of an Extended Probabilistic Discrete Timed Automaton (EPDTA), a new formalism that is introduced in the paper and that is shown to be interpretable as a Markov decision process. A given EPDTA model can be probabilistically model-checked by a suitable translation into syntax accepted by existing model-checkers. In order to simulate the dynamics of a given population of soles in different environmental scenarios, an agent-based simulation environment is defined in which each agent implements the behaviour of the given EPDTA model. By varying the probabilities and the characteristic functions embedded in the EPDTA model it is possible to represent different scenarios and to tune the model itself by comparing the results of the simulations with real data about the sole stock in the North Adriatic Sea, available from the recent project SoleMon. The simulator is presented and made available for its adaptation to other species.

  14. A probabilistic graphical model based stochastic input model construction

    International Nuclear Information System (INIS)

    Wan, Jiang; Zabaras, Nicholas

    2014-01-01

    Model reduction techniques have been widely used in modeling of high-dimensional stochastic input in uncertainty quantification tasks. However, the probabilistic modeling of random variables projected into reduced-order spaces presents a number of computational challenges. Due to the curse of dimensionality, the underlying dependence relationships between these random variables are difficult to capture. In this work, a probabilistic graphical model based approach is employed to learn the dependence by running a number of conditional independence tests using observation data. Thus a probabilistic model of the joint PDF is obtained and the PDF is factorized into a set of conditional distributions based on the dependence structure of the variables. The estimation of the joint PDF from data is then transformed to estimating conditional distributions under reduced dimensions. To improve the computational efficiency, a polynomial chaos expansion is further applied to represent the random field in terms of a set of standard random variables. This technique is combined with both linear and nonlinear model reduction methods. Numerical examples are presented to demonstrate the accuracy and efficiency of the probabilistic graphical model based stochastic input models. - Highlights: • Data-driven stochastic input models without the assumption of independence of the reduced random variables. • The problem is transformed to a Bayesian network structure learning problem. • Examples are given in flows in random media
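
    A minimal sketch of the edge-decision step in structure learning is given below: test whether two reduced variables are conditionally independent given a third, using the Fisher z-transform of the partial correlation. This Gaussian partial-correlation test is a common stand-in; the paper's exact conditional independence test may differ.

```python
# Hedged sketch: conditional independence test for graphical-model structure
# learning via partial correlation and the Fisher z-transform.
import numpy as np
from scipy.stats import norm

def ci_test(x, y, z, alpha=0.05):
    """Return True if x is conditionally independent of y given z."""
    prec = np.linalg.inv(np.corrcoef(np.column_stack([x, y, z]), rowvar=False))
    r = -prec[0, 1] / np.sqrt(prec[0, 0] * prec[1, 1])  # partial corr of (x, y) | z
    n, k = len(x), 1                                    # k = size of conditioning set
    zstat = 0.5 * np.log((1 + r) / (1 - r)) * np.sqrt(n - k - 3)
    pval = 2.0 * (1.0 - norm.cdf(abs(zstat)))
    return pval > alpha

rng = np.random.default_rng(0)
z = rng.normal(size=2000)
x = z + 0.1 * rng.normal(size=2000)   # x and y are linked only through z
y = z + 0.1 * rng.normal(size=2000)
print("x independent of y given z?", ci_test(x, y, z))   # expected: True
```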

  15. Release modes and processes relevant to source-term calculations at Yucca Mountain

    International Nuclear Information System (INIS)

    Apted, M.J.

    1994-01-01

    The feasibility of permanent disposal of radioactive high-level waste (HLW) in repositories located in deep geologic formations is being studied world-wide. The most credible release pathway is interaction between groundwater and nuclear waste forms, followed by migration of radionuclide-bearing groundwater to the accessible environment. Under hydrologically unsaturated conditions, vapor transport of volatile radionuclides is also possible. The near-field encompasses the waste packages composed of engineered barriers (e.g. man-made materials, such as vitrified waste forms and corrosion-resistant containers), while the far-field includes the natural barriers (e.g. host rock, hydrologic setting). Taken together, these two subsystems define a series of multiple, redundant barriers that act to assure the safe isolation of nuclear waste. In the U.S., the Department of Energy (DOE) is investigating the feasibility of safe, long-term disposal of high-level nuclear waste at the Yucca Mountain site in Nevada. The proposed repository horizon is located in non-welded tuffs within the unsaturated zone (i.e. above the water table) at Yucca Mountain. The purpose of this paper is to describe the source-term models for radionuclide release from waste packages at the Yucca Mountain site. The first section describes the conceptual release modes that are relevant for this site and waste package design, based on a consideration of the performance of currently proposed engineered barriers under expected and unexpected conditions. No attempt is made to assess the reasonableness or probability of occurrence of any specific release mode. The following section reviews the waste-form characteristics that are required to model and constrain the release of radionuclides from the waste package. The next section presents mathematical models for the conceptual release modes, selected from those that have been implemented into a probabilistic total system assessment code developed for the Electric Power

  16. Development of a Probabilistic Flood Hazard Assessment (PFHA) for the nuclear safety

    Science.gov (United States)

    Ben Daoued, Amine; Guimier, Laurent; Hamdi, Yasser; Duluc, Claire-Marie; Rebour, Vincent

    2016-04-01

    The purpose of this study is to lay the basis for a probabilistic flood hazard assessment (PFHA). Probabilistic assessment of external floods has become a topic of current interest to the nuclear scientific community. Probabilistic approaches complement deterministic approaches by exploring a set of scenarios and associating a probability with each of them. These approaches aim to identify all possible failure scenarios, combining their probabilities, in order to cover all possible sources of risk. They are based on the distributions of initiators and/or of the variables characterizing these initiators. The PFHA can characterize, for example, the water level at a defined point of interest in the nuclear site. This probabilistic flood hazard characterization takes into account all the phenomena that can contribute to the flooding of the site. The main steps of the PFHA are: i) identification of flooding phenomena (rains, sea water level, etc.) and screening of the phenomena relevant to the nuclear site, ii) identification and probabilization of the parameters associated with the selected flooding phenomena, iii) propagation of the probabilized parameters from the source to the point of interest in the site, iv) derivation of hazard curves and aggregation of the flooding phenomena contributions at the point of interest, taking into account the uncertainties. Within this framework, the methodology of the PFHA has been developed for several flooding phenomena (rain and/or sea water level, etc.) and then implemented and tested with a simplified case study. In the same logic, our study is still in progress to take into account other flooding phenomena and to carry out more case studies.
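
    The aggregation step iv) can be sketched compactly: once each phenomenon has a hazard curve at the point of interest, and assuming the phenomena are independent, the combined exceedance probability of a level x is 1 - Π(1 - P_i(x)). The two example curves below are illustrative, not site data.

```python
# Hedged sketch: aggregate per-phenomenon flood hazard curves at the point of
# interest, assuming independence between phenomena. Curves are illustrative.
import numpy as np

levels = np.linspace(0.0, 5.0, 51)               # water level at the site [m]
p_rain = 1e-2 * np.exp(-levels / 0.8)            # assumed rain-driven hazard curve
p_sea  = 5e-3 * np.exp(-levels / 1.5)            # assumed sea-level hazard curve

p_combined = 1.0 - (1.0 - p_rain) * (1.0 - p_sea)
for x in (1.0, 2.0, 3.0):
    i = np.searchsorted(levels, x)
    print(f"P(level > {x:.0f} m) = {p_combined[i]:.2e} per year")
```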

  17. Probabilistic Structural Analysis Theory Development

    Science.gov (United States)

    Burnside, O. H.

    1985-01-01

    The objective of the Probabilistic Structural Analysis Methods (PSAM) project is to develop analysis techniques and computer programs for predicting the probabilistic response of critical structural components for current and future space propulsion systems. This technology will play a central role in establishing system performance and durability. The first year's technical activity is concentrating on probabilistic finite element formulation strategy and code development. Work is also in progress to survey critical materials and space shuttle main engine components. The probabilistic finite element computer program NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) is being developed. The final probabilistic code will have, in the general case, the capability of performing nonlinear dynamic analysis of stochastic structures. It is the goal of the approximate methods effort to increase problem-solving efficiency relative to finite element methods by using energy methods to generate trial solutions which satisfy the structural boundary conditions. These approximate methods will be less computer intensive relative to the finite element approach.

  18. 14th International Probabilistic Workshop

    CERN Document Server

    Taerwe, Luc; Proske, Dirk

    2017-01-01

    This book presents the proceedings of the 14th International Probabilistic Workshop that was held in Ghent, Belgium in December 2016. Probabilistic methods are currently of crucial importance for research and developments in the field of engineering, which face challenges presented by new materials and technologies and rapidly changing societal needs and values. Contemporary needs related to, for example, performance-based design, service-life design, life-cycle analysis, product optimization, assessment of existing structures and structural robustness give rise to new developments as well as accurate and practically applicable probabilistic and statistical engineering methods to support these developments. These proceedings are a valuable resource for anyone interested in contemporary developments in the field of probabilistic engineering applications.

  19. Representation of human behaviour in probabilistic safety analysis

    International Nuclear Information System (INIS)

    Whittingham, R.B.

    1991-01-01

    This paper provides an overview of the representation of human behaviour in probabilistic safety assessment. Human performance problems which may result in errors leading to accidents are considered in terms of methods of identification using task analysis, screening analysis of critical errors, representation and quantification of human errors in fault trees and event trees and error reduction measures. (author) figs., tabs., 43 refs

  20. A probabilistic assessment of large scale wind power development for long-term energy resource planning

    Science.gov (United States)

    Kennedy, Scott Warren

    A steady decline in the cost of wind turbines and increased experience in their successful operation have brought this technology to the forefront of viable alternatives for large-scale power generation. Methodologies for understanding the costs and benefits of large-scale wind power development, however, are currently limited. In this thesis, a new and widely applicable technique for estimating the social benefit of large-scale wind power production is presented. The social benefit is based upon wind power's energy and capacity services and the avoidance of environmental damages. The approach uses probabilistic modeling techniques to account for the stochastic interaction between wind power availability, electricity demand, and conventional generator dispatch. A method for including the spatial smoothing effect of geographically dispersed wind farms is also introduced. The model has been used to analyze potential offshore wind power development to the south of Long Island, NY. If natural gas combined cycle (NGCC) and integrated gasifier combined cycle (IGCC) are the alternative generation sources, wind power exhibits a negative social benefit due to its high capacity cost and the relatively low emissions of these advanced fossil-fuel technologies. Environmental benefits increase significantly if charges for CO2 emissions are included. Results also reveal a diminishing social benefit as wind power penetration increases. The dependence of wind power benefits on natural gas and coal prices is also discussed. In power systems with a high penetration of wind generated electricity, the intermittent availability of wind power may influence hourly spot prices. A price responsive electricity demand model is introduced that shows a small increase in wind power value when consumers react to hourly spot prices. The effectiveness of this mechanism depends heavily on estimates of the own- and cross-price elasticities of aggregate electricity demand. This work makes a valuable

  1. Some probabilistic aspects of fracture

    International Nuclear Information System (INIS)

    Thomas, J.M.

    1982-01-01

    Some probabilistic aspects of fracture in structural and mechanical components are examined. The principles of fracture mechanics, material quality and inspection uncertainty are formulated into a conceptual and analytical framework for prediction of failure probability. The role of probabilistic fracture mechanics in a more global context of risk and optimization of decisions is illustrated. An example, where Monte Carlo simulation was used to implement a probabilistic fracture mechanics analysis, is discussed. (orig.)
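
    A minimal sketch of such a Monte Carlo probabilistic fracture mechanics analysis is given below: sample the crack size and fracture toughness, and count failures where the stress intensity factor K = Y·σ·√(πa) exceeds the toughness K_Ic. The distributions and values are illustrative assumptions, not the example from the paper.

```python
# Hedged Monte Carlo sketch of probabilistic fracture mechanics: failure when
# the stress intensity factor exceeds the fracture toughness. Values assumed.
import numpy as np

rng = np.random.default_rng(1)
N = 200_000

a    = rng.exponential(2e-3, N)    # crack depth [m] (assumed distribution)
K_Ic = rng.normal(60.0, 6.0, N)    # fracture toughness [MPa*m^0.5] (assumed)
sigma, Y = 200.0, 1.12             # applied stress [MPa], geometry factor (assumed)

K = Y * sigma * np.sqrt(np.pi * a) # stress intensity factor
print(f"P(fracture) ~ {np.mean(K > K_Ic):.2e}")
```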

  2. Quantitative probabilistic functional diffusion mapping in newly diagnosed glioblastoma treated with radiochemotherapy.

    Science.gov (United States)

    Ellingson, Benjamin M; Cloughesy, Timothy F; Lai, Albert; Nghiemphu, Phioanh L; Liau, Linda M; Pope, Whitney B

    2013-03-01

    Functional diffusion mapping (fDM) is a cancer imaging technique that uses voxel-wise changes in apparent diffusion coefficients (ADC) to evaluate response to treatment. Despite promising initial results, uncertainty in image registration remains the largest barrier to widespread clinical application. The current study introduces a probabilistic approach to fDM quantification to overcome some of these limitations. A total of 143 patients with newly diagnosed glioblastoma who were undergoing standard radiochemotherapy were enrolled in this retrospective study. Traditional and probabilistic fDMs were calculated using ADC maps acquired before and after therapy. Probabilistic fDMs were calculated by applying random, finite translational and rotational perturbations to both pre- and posttherapy ADC maps, then repeating the calculation of fDMs reflecting changes after treatment, resulting in probabilistic fDMs showing the voxel-wise probability of fDM classification. Probabilistic fDMs were then compared with traditional fDMs in their ability to predict progression-free survival (PFS) and overall survival (OS). Probabilistic fDMs applied to patients with newly diagnosed glioblastoma treated with radiochemotherapy demonstrated shortened PFS and OS among patients with a large volume of tumor with decreasing ADC evaluated at the posttreatment time with respect to the baseline scans. Alternatively, patients with a large volume of tumor with increasing ADC evaluated at the posttreatment time with respect to baseline scans were more likely to progress later and live longer. Probabilistic fDMs performed better than traditional fDMs at predicting 12-month PFS and 24-month OS using receiver operating characteristic analysis. Univariate log-rank analysis on Kaplan-Meier data also revealed that probabilistic fDMs could better separate patients on the basis of PFS and OS, compared with traditional fDMs. Results suggest that probabilistic fDMs are a more predictive biomarker in
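
    The perturbation idea can be sketched in a few lines: repeatedly apply small random translations and rotations to the registered ADC maps, reclassify each voxel, and average the classifications into a voxel-wise probability map. The threshold, perturbation magnitudes, and stand-in data below are illustrative assumptions, not the study's settings.

```python
# Hedged sketch of probabilistic fDM: Monte Carlo over registration
# perturbations yields a per-voxel probability of fDM classification.
import numpy as np
from scipy.ndimage import rotate, shift

rng = np.random.default_rng(7)
pre  = rng.normal(1.2, 0.2, (64, 64))            # stand-in pre-therapy ADC slice
post = pre + rng.normal(0.0, 0.1, (64, 64))      # stand-in post-therapy ADC slice
THRESH = 0.25                                    # fDM change threshold (assumed)

prob = np.zeros_like(pre)
M = 100                                          # number of random perturbations
for _ in range(M):
    dx, dy = rng.normal(0.0, 0.5, 2)             # translation [voxels] (assumed scale)
    ang = rng.normal(0.0, 1.0)                   # rotation [degrees] (assumed scale)
    p = rotate(shift(pre, (dx, dy)), ang, reshape=False)
    prob += (p - post) > THRESH                  # voxel classified as decreasing ADC
prob /= M                                        # voxel-wise classification probability

print("max voxel-wise probability:", prob.max())
```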

  3. Spallation Neutron Source Accident Terms for Environmental Impact Statement Input

    Energy Technology Data Exchange (ETDEWEB)

    Devore, J.R.; Harrington, R.M.

    1998-08-01

    This report is about accidents with the potential to release radioactive materials into the environment surrounding the Spallation Neutron Source (SNS). As shown in Chap. 2, the inventories of radioactivity at the SNS are dominated by the target facility. Source terms for a wide range of target facility accidents, from anticipated events to worst-case beyond-design-basis events, are provided in Chaps. 3 and 4. The most important criterion applied to these accident source terms is that they should not underestimate potential release. Therefore, conservative methodology was employed for the release estimates. Although the source terms are very conservative, excessive conservatism has been avoided by basing the releases on physical principles. Since it is envisioned that the SNS facility may eventually (after about 10 years) be expanded and modified to support a 4-MW proton beam operational capability, the source terms estimated in this report are applicable to a 4-MW operating proton beam power unless otherwise specified. This is bounding with regard to the 1-MW facility that will be built and operated initially. See further discussion below in Sect. 1.2.

  4. Source term estimation during incident response to severe nuclear power plant accidents

    International Nuclear Information System (INIS)

    McKenna, T.J.; Giitter, J.G.

    1988-10-01

    This document presents a method of source term estimation that reflects the current understanding of source term behavior and that can be used during an event. The various methods of estimating radionuclide release to the environment (source terms) as a result of an accident at a nuclear power reactor are discussed. The major factors affecting potential radionuclide releases off site (source terms) as a result of nuclear power plant accidents are described. The quantification of these factors based on plant instrumentation also is discussed. A range of accident conditions from those within the design basis to the most severe accidents possible are included in the text. A method of gross estimation of accident source terms and their consequences off site is presented. 39 refs., 48 figs., 19 tabs

  5. Flowsheets and source terms for radioactive waste projections

    International Nuclear Information System (INIS)

    Forsberg, C.W.

    1985-03-01

    Flowsheets and source terms used to generate radioactive waste projections in the Integrated Data Base (IDB) Program are given. Volumes of each waste type generated per unit product throughput have been determined for the following facilities: uranium mining, UF6 conversion, uranium enrichment, fuel fabrication, boiling-water reactors (BWRs), pressurized-water reactors (PWRs), and fuel reprocessing. Source terms for DOE/defense wastes have been developed. Expected wastes from typical decommissioning operations for each facility type have been determined. All wastes are also characterized by isotopic composition at time of generation and by general chemical composition. 70 references, 21 figures, 53 tables

  6. Risk assessment in long-term storage of spent nuclear fuel

    International Nuclear Information System (INIS)

    Ahn, T.; Guttmann, J.; Mohseni, A.

    2013-01-01

    This paper presents probabilistic risk-informed approaches that the Nuclear Regulatory Commission (NRC) staff is planning to consider in preparing regulatory bases for long-term storage of spent nuclear fuel (SNF) for up to 300 years. Due to uncertainties associated with long-term SNF storage, the NRC is considering a probabilistic risk-informed approach as well as a deterministic design-based approach. The uncertainties considered here are primarily associated with materials aging of the canister and SNF in the cask system during long-term storage of SNF. This paper discusses some potential risk contributors involved in long-term SNF storage. Methods of performance evaluation are presented that assess the various types of risks involved. They include deterministic evaluation, probabilistic evaluation, and consequence assessment under normal conditions and the conditions of accidents and natural hazards. Some potentially important technical issues resulting from the consideration of a probabilistic risk-informed evaluation of cask system performance are discussed for the canister and SNF integrity. These issues are also discussed in comparison with the deterministic approach, as appropriate. Probabilistic risk-informed methods can provide insights that deterministic methods may not capture. Two specific examples include stress corrosion cracking of the canister and hydrogen-induced cladding failure. These examples are discussed in more detail in terms of their effects on radionuclide release and nuclear subcriticality associated with the failure. The plan to consider probabilistic risk-informed approaches is anticipated to provide helpful regulatory insights for long-term storage of SNF that provide reasonable assurance of public health and safety. (authors)

  7. Probabilistic methods used in NUSS

    International Nuclear Information System (INIS)

    Fischer, J.; Giuliani, P.

    1985-01-01

    Probabilistic considerations are used implicitly or explicitly in all technical areas. In the NUSS codes and guides, design and siting are the two areas where most use is made of these concepts. A brief review of the relevant documents in these two areas is made in this paper. It covers the documents where either probabilistic considerations are implied or where probabilistic approaches are recommended in the evaluation of situations and of events. In the siting guides the review mainly covers the analysis of seismic, hydrological and external man-made events, as well as some aspects of the analysis of extreme meteorological events. Probabilistic methods are recommended in the design guides but they are not made a requirement. There are several reasons for this, mainly the lack of reliable data and the absence of quantitative safety limits or goals against which to judge the design analysis. As far as practical, engineering judgement should be backed up by quantitative probabilistic analysis. Examples are given and the concept of design basis as used in NUSS design guides is explained. (author)

  8. Relation between source term and emergency planning for nuclear power plants

    International Nuclear Information System (INIS)

    Shi Zhongqi; Yang Ling

    1992-01-01

    Some background information on the severe accidents and source terms related to nuclear power plant emergency planning is presented. The new source term information in NUREG-0956 and NUREG-1150, and possible changes in emergency planning requirements in the U.S.A., are briefly reviewed. It is suggested that a principle be used in selecting source terms for establishing the emergency planning policy, and a method be used in determining the Emergency Planning Zone (EPZ) size in China. Based on the research results of (1) the EPZ size of PWR nuclear power plants being built in China, and (2) the impact of reactor size and selected source terms on the EPZ size, it is concluded that the suggested principle and method are suitable and feasible for PWR nuclear power plants in China.

  9. Evaluation of Probabilistic Disease Forecasts.

    Science.gov (United States)

    Hughes, Gareth; Burnett, Fiona J

    2017-10-01

    The statistical evaluation of probabilistic disease forecasts often involves calculation of metrics defined conditionally on disease status, such as sensitivity and specificity. However, for the purpose of disease management decision making, metrics defined conditionally on the result of the forecast (predictive values) are also important, although less frequently reported. In this context, the application of scoring rules in the evaluation of probabilistic disease forecasts is discussed. An index of separation with application in the evaluation of probabilistic disease forecasts, described in the clinical literature, is also considered and its relation to scoring rules illustrated. Scoring rules provide a principled basis for the evaluation of probabilistic forecasts used in plant disease management. In particular, the decomposition of scoring rules into interpretable components is an advantageous feature of their application in the evaluation of disease forecasts.
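
    As an illustration of the decomposition into interpretable components, the sketch below computes the Brier score of a set of binary disease forecasts and its Murphy decomposition into reliability, resolution, and uncertainty; the toy forecasts and outcomes are illustrative.

```python
# Hedged sketch: Brier score and Murphy decomposition (BS = REL - RES + UNC),
# binning forecasts by issued probability. Toy data, for illustration only.
import numpy as np

p = np.array([0.1, 0.1, 0.3, 0.3, 0.7, 0.7, 0.9, 0.9])  # forecast probabilities
o = np.array([0,   0,   1,   0,   1,   1,   1,   1])     # observed disease (0/1)

brier = np.mean((p - o) ** 2)
obar = o.mean()

rel = res = 0.0
for pk in np.unique(p):                  # one bin per distinct forecast value
    m = p == pk
    ok = o[m].mean()                     # observed frequency in the bin
    rel += m.mean() * (pk - ok) ** 2     # reliability (calibration error)
    res += m.mean() * (ok - obar) ** 2   # resolution (discrimination)
unc = obar * (1.0 - obar)                # base-rate uncertainty

print(f"Brier={brier:.4f}  REL={rel:.4f}  RES={res:.4f}  UNC={unc:.4f}")
print("decomposition holds:", np.isclose(brier, rel - res + unc))
```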

  10. Incorporating psychological influences in probabilistic cost analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kujawski, Edouard; Alvaro, Mariana; Edwards, William

    2004-01-08

    Today's typical probabilistic cost analysis assumes an "ideal" project that is devoid of the human and organizational considerations that heavily influence the success and cost of real-world projects. In the real world "Money Allocated Is Money Spent" (MAIMS principle); cost underruns are rarely available to protect against cost overruns, while task overruns are passed on to the total project cost. Realistic cost estimates therefore require a modified probabilistic cost analysis that simultaneously models the cost management strategy, including budget allocation. Psychological influences such as overconfidence in assessing uncertainties and dependencies among cost elements and risks are other important considerations that are generally not addressed. It should then be no surprise that actual project costs often exceed the initial estimates and projects are delivered late and/or with a reduced scope. This paper presents a practical probabilistic cost analysis model that incorporates recent findings in human behavior and judgment under uncertainty, dependencies among cost elements, the MAIMS principle, and project management practices. Uncertain cost elements are elicited from experts using the direct fractile assessment method and fitted with three-parameter Weibull distributions. The full correlation matrix is specified in terms of two parameters that characterize correlations among cost elements in the same and in different subsystems. The analysis is readily implemented using standard Monte Carlo simulation tools such as @RISK and Crystal Ball. The analysis of a representative design and engineering project substantiates that today's typical probabilistic cost analysis is likely to severely underestimate project cost for probability of success values of importance to contractors and procuring activities. The proposed approach provides a framework for developing a viable cost management strategy for
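
    A compact sketch of the roll-up described above: draw correlated cost elements (a Gaussian copula over three-parameter Weibull marginals), then apply the MAIMS principle so that underruns are not recovered while overruns pass through. All marginals, budgets, and correlations below are illustrative assumptions.

```python
# Hedged sketch of a MAIMS-style probabilistic cost roll-up with correlated
# three-parameter Weibull cost elements. All numbers are illustrative.
import numpy as np
from scipy.stats import norm, weibull_min

rng = np.random.default_rng(3)
N = 50_000

marginals = [(2.0, 8.0, 4.0), (1.5, 5.0, 3.0), (2.5, 10.0, 5.0)]  # (shape, loc, scale)
budgets = np.array([10.0, 6.5, 12.0])                             # allocations (assumed)

corr = np.array([[1.0, 0.5, 0.3],
                 [0.5, 1.0, 0.5],
                 [0.3, 0.5, 1.0]])                                # assumed correlations
u = norm.cdf(rng.normal(size=(N, 3)) @ np.linalg.cholesky(corr).T)  # Gaussian copula

costs = np.column_stack([weibull_min.ppf(u[:, i], c, loc=loc, scale=s)
                         for i, (c, loc, s) in enumerate(marginals)])

naive_total = costs.sum(axis=1)                        # underruns offset overruns
maims_total = np.maximum(costs, budgets).sum(axis=1)   # MAIMS: allocated money is spent

b = budgets.sum()
print(f"P(total <= budget) naive: {np.mean(naive_total <= b):.2f}, "
      f"MAIMS: {np.mean(maims_total <= b):.2f}")
```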

  11. Measuring reliability under epistemic uncertainty: Review on non-probabilistic reliability metrics

    Directory of Open Access Journals (Sweden)

    Kang Rui

    2016-06-01

    Full Text Available In this paper, a systematic review of non-probabilistic reliability metrics is conducted to assist the selection of appropriate reliability metrics to model the influence of epistemic uncertainty. Five frequently used non-probabilistic reliability metrics are critically reviewed, i.e., evidence-theory-based reliability metrics, interval-analysis-based reliability metrics, fuzzy-interval-analysis-based reliability metrics, possibility-theory-based reliability metrics (posbist reliability) and uncertainty-theory-based reliability metrics (belief reliability). It is pointed out that a qualified reliability metric that is able to consider the effect of epistemic uncertainty needs to (1) compensate for the conservatism in the estimates of the component-level reliability metrics caused by epistemic uncertainty, and (2) satisfy the duality axiom; otherwise it might lead to paradoxical and confusing results in engineering applications. The five commonly used non-probabilistic reliability metrics are compared in terms of these two properties, and the comparison can serve as a basis for the selection of appropriate reliability metrics.

  12. Evolution of source term definition and analysis

    International Nuclear Information System (INIS)

    Lutz, R.J. Jr.

    2004-01-01

    The objective of this presentation was to provide an overview of the evolution of accident fission product release analysis methodology and the obtained results; and to provide an overview of the source term implementation analysis in regulatory decisions

  13. A Sensitivity Study for an Evaluation of Input Parameters Effect on a Preliminary Probabilistic Tsunami Hazard Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Rhee, Hyun-Me; Kim, Min Kyu; Choi, In-Kil [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Sheen, Dong-Hoon [Chonnam National University, Gwangju (Korea, Republic of)

    2014-10-15

    The tsunami hazard analysis has been based on the seismic hazard analysis, which can be performed by deterministic or probabilistic methods. To consider the uncertainties in the hazard analysis, the probabilistic method is regarded as the more attractive approach; the various parameters and their weights are considered using a logic tree approach. Because many parameters are used in the hazard analysis, their uncertainties should be characterized through sensitivity analysis. As a preliminary application of probabilistic tsunami hazard analysis, a study for the Ulchin NPP site had been performed, using information on the fault sources published by the Atomic Energy Society of Japan (AESJ). The tsunami propagation was simulated using TSUNAMI 1.0, developed by the Japan Nuclear Energy Safety Organization (JNES), and the wave parameters were estimated from the simulation results. In this study, a sensitivity analysis for the fault sources selected in the previous studies has been performed. To analyze the effect of the parameters, a sensitivity analysis for the E3 fault source published by AESJ was performed. The effects of the recurrence interval, the potential maximum magnitude, and the beta value were determined from the sensitivity analysis results. The level of annual exceedance probability is affected by the recurrence interval, while wave heights are influenced by the potential maximum magnitude and the beta value. In the future, a sensitivity analysis for all fault sources in the western part of Japan published by AESJ will be performed.

  14. Evaluation of seismic reliability of steel moment resisting frames rehabilitated by concentric braces with probabilistic models

    Directory of Open Access Journals (Sweden)

    Fateme Rezaei

    2017-08-01

    Full Text Available The failure probability of a structure designed by "deterministic methods" can be higher than that of a structure designed in a similar situation using probabilistic methods and models that account for "uncertainties". The main purpose of this research was to evaluate the seismic reliability of steel moment resisting frames rehabilitated with concentric braces, using probabilistic models. To do so, three-story and nine-story steel moment resisting frames were designed based on the strength criteria of the Iranian code and then rehabilitated with concentric braces based on drift limitations. The probability of frame failure was evaluated using probabilistic models of the earthquake magnitude and location, the ground shaking intensity in the area of the structure, a probabilistic model of the building response (based on maximum lateral roof displacement), and probabilistic methods. These frames were analyzed under a subcrustal source by the sampling-based probabilistic method "Risk Tools" (RT). Comparing the exceedance probability curves of the building response (or selected points on them) for the three-story and nine-story model frames before and after rehabilitation, the seismic response of the rehabilitated frames was reduced and their reliability was improved. The main variables affecting the probability of frame failure were also determined using sensitivity analysis with the FORM probabilistic method. The variables most effective in reducing the probability of frame failure are the magnitude model, the ground shaking intensity model error and the magnitude model error.

  15. Probabilistic data integration and computational complexity

    Science.gov (United States)

    Hansen, T. M.; Cordua, K. S.; Mosegaard, K.

    2016-12-01

    Inverse problems in Earth Sciences typically refer to the problem of inferring information about properties of the Earth from observations of geophysical data (the result of nature's solution to the 'forward' problem). This problem can be formulated more generally as a problem of 'integration of information'. A probabilistic formulation of data integration is in principle simple: if all information available (from e.g. geology, geophysics, remote sensing, chemistry…) can be quantified probabilistically, then different algorithms exist that allow solving the data integration problem either through an analytical description of the combined probability function, or by sampling the probability function. In practice, however, probabilistic data integration may not be easy to apply successfully. This may be related to the use of sampling methods, which are known to be computationally costly. But another source of computational complexity is related to how the individual types of information are quantified. In one case, a data integration problem is demonstrated where the goal is to determine the existence of buried channels in Denmark, based on multiple sources of geo-information. One type of information being too informative (and hence conflicting) leads to a difficult sampling problem with unrealistic uncertainty. Resolving this conflict prior to data integration leads to an easy data integration problem, with no biases. In another case it is demonstrated how imperfections in the description of the geophysical forward model (related to solving the wave equation) can lead to a difficult data integration problem, with severe bias in the results. If the modeling error is accounted for, the data integration problem becomes relatively easy, with no apparent biases. Both examples demonstrate that biased information can have a dramatic effect on the computational efficiency of solving a data integration problem and lead to biased results, and under

  16. Probabilistic broadcasting of mixed states

    International Nuclear Information System (INIS)

    Li Lvjun; Li Lvzhou; Wu Lihua; Zou Xiangfu; Qiu Daowen

    2009-01-01

    It is well known that the non-broadcasting theorem proved by Barnum et al is a fundamental principle of quantum communication. As far as we are aware, optimal broadcasting (OB) is the only method to broadcast noncommuting mixed states approximately. In this paper, motivated by the probabilistic cloning of quantum states proposed by Duan and Guo, we propose a new way of broadcasting noncommuting mixed states, probabilistic broadcasting (PB), and we present a sufficient condition for PB of mixed states. To a certain extent, we generalize the probabilistic cloning theorem from pure states to mixed states, and in particular, we generalize the non-broadcasting theorem, since the case that commuting mixed states can be exactly broadcast can be thought of as a special instance of PB where the success ratio is 1. Moreover, we discuss probabilistic local broadcasting (PLB) of separable bipartite states.

  17. Probabilistic modeling of timber structures

    DEFF Research Database (Denmark)

    Köhler, Jochen; Sørensen, John Dalsgaard; Faber, Michael Havbro

    2007-01-01

    The present paper contains a proposal for the probabilistic modeling of timber material properties. It is produced in the context of the Probabilistic Model Code (PMC) of the Joint Committee on Structural Safety (JCSS) [Joint Committee on Structural Safety. Probabilistic Model Code, Internet Publication: www.jcss.ethz.ch; 2001] and of the COST action E24 'Reliability of Timber Structures' [COST Action E 24, Reliability of timber structures. Several meetings and Publications, Internet Publication: http://www.km.fgg.uni-lj.si/coste24/coste24.htm; 2005]. The present proposal is based on discussions and comments from participants of the COST E24 action and the members of the JCSS. The paper contains a description of the basic reference properties for timber strength parameters and ultimate limit state equations for timber components. The recommended probabilistic model for these basic properties

  18. Structural reliability codes for probabilistic design

    DEFF Research Database (Denmark)

    Ditlevsen, Ove Dalager

    1997-01-01

    probabilistic code format has not only a strong influence on the formal reliability measure, but also on the formal cost of failure to be associated if a design made to the target reliability level is considered to be optimal. In fact, the formal cost of failure can differ by several orders of magnitude between two different, but by and large equally justifiable, probabilistic code formats. Thus, the consequence is that a code format based on decision theoretical concepts and formulated as an extension of a probabilistic code format must specify formal values to be used as costs of failure. A principle of prudence is suggested for guiding the choice of the reference probabilistic code format for constant reliability. In the author's opinion there is an urgent need for establishing a standard probabilistic reliability code. This paper presents some considerations that may be debatable, but nevertheless point

  19. Global assessment of predictability of water availability: A bivariate probabilistic Budyko analysis

    Science.gov (United States)

    Wang, Weiguang; Fu, Jianyu

    2018-02-01

    Estimating continental water availability is of great importance for water resources management, in terms of maintaining ecosystem integrity and sustaining societal development. To more accurately quantify the predictability of water availability, a bivariate probabilistic Budyko approach was developed on the basis of the univariate probabilistic Budyko framework, using a copula-based joint distribution model to account for the dependence between the parameter ω of Wang-Tang's equation and the Normalized Difference Vegetation Index (NDVI), and was applied globally. The results indicate that the predictive performance for global water availability is conditional on the climatic condition. In comparison with the simple univariate distribution, the bivariate one produces a lower interquartile range under the same global dataset, especially in regions with higher NDVI values, highlighting the importance of developing the joint distribution by taking into account the dependence structure of the parameter ω and NDVI, which can provide a more accurate probabilistic evaluation of water availability.
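
    The conditional step can be sketched as follows: sample ω given an observed NDVI from a Gaussian copula and propagate it through a Budyko-type curve to get a probabilistic E/P estimate. Fu's equation is used below as a stand-in for the Wang-Tang formulation, and the marginals, copula correlation, and aridity index are illustrative assumptions.

```python
# Hedged sketch: copula-conditional sampling of the Budyko parameter w given
# NDVI, propagated through Fu's curve E/P = 1 + phi - (1 + phi^w)^(1/w).
# All marginals, the correlation, and the aridity index are assumed.
import numpy as np

rng = np.random.default_rng(11)
rho, N = 0.6, 20_000                       # copula correlation (assumed)

ndvi_obs = 0.7                             # observed NDVI; marginal N(0.5, 0.15) assumed
z_ndvi = (ndvi_obs - 0.5) / 0.15
z_w = rho * z_ndvi + np.sqrt(1.0 - rho**2) * rng.normal(size=N)  # conditional Gaussian
w = 1.0 + np.exp(0.3 * z_w)                # shifted-lognormal marginal keeps w > 1

phi = 1.4                                  # aridity index PET/P (assumed)
E_over_P = 1.0 + phi - (1.0 + phi**w) ** (1.0 / w)

q25, q50, q75 = np.percentile(E_over_P, [25, 50, 75])
print(f"E/P median {q50:.2f}, IQR [{q25:.2f}, {q75:.2f}]")
```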

  20. Level II Probabilistic Safety Analysis Methodology for the Application to GEN-IV Sodium-cooled Fast Reactor

    International Nuclear Information System (INIS)

    Park, S. Y.; Kim, T. W.; Han, S. H.; Jeong, H. Y.

    2010-03-01

    The Korea Atomic Energy Research Institute (KAERI) has been developing liquid metal reactor (LMR) design technologies under a National Nuclear R and D Program. Nevertheless, there is no experience of the probabilistic safety assessment (PSA) domestically for a fast reactor with the metal fuel. Therefore, the objective of this study is to establish the methodologies of risk assessment for the reference design of GEN-IV sodium fast reactor (SFR). An applicability of the PSA methodology of U. S. NRC and PRISM plant to the domestic GEN-IV SFR has been studied. The study contains a plant damage state analysis, a containment event tree analysis, and a source-term release category binning process

  1. Seismic hazard assessment of the Province of Murcia (SE Spain): analysis of source contribution to hazard

    Science.gov (United States)

    García-Mayordomo, J.; Gaspar-Escribano, J. M.; Benito, B.

    2007-10-01

    A probabilistic seismic hazard assessment of the Province of Murcia in terms of peak ground acceleration (PGA) and spectral accelerations [SA(T)] is presented in this paper. In contrast to most of the previous studies in the region, which were performed for PGA making use of intensity-to-PGA relationships, hazard is here calculated in terms of magnitude and using European spectral ground-motion models. Moreover, we have considered the most important faults in the region as specific seismic sources, and also comprehensively reviewed the earthquake catalogue. Hazard calculations are performed following the Probabilistic Seismic Hazard Assessment (PSHA) methodology using a logic tree, which accounts for three different seismic source zonings and three different ground-motion models. Hazard maps in terms of PGA and SA(0.1, 0.2, 0.5, 1.0 and 2.0 s) and coefficient of variation (COV) for the 475-year return period are shown. Subsequent analysis is focused on three sites of the province, namely, the cities of Murcia, Lorca and Cartagena, which are important industrial and tourism centres. Results at these sites have been analysed to evaluate the influence of the different input options. The most important factor affecting the results is the choice of the attenuation relationship, whereas the influence of the selected seismic source zonings appears strongly site dependent. Finally, we have performed an analysis of source contribution to hazard at each of these cities to provide preliminary guidance in devising specific risk scenarios. We have found that local source zones control the hazard for PGA and SA(T ≤ 1.0 s), although contribution from specific fault sources and long-distance north Algerian sources becomes significant from SA(0.5 s) onwards.
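
    The logic-tree machinery can be sketched compactly: each branch (a source zoning paired with a ground-motion model) produces an annual exceedance-rate curve, the branches are weight-averaged, and the 475-year PGA is read off the mean curve. The parametric branch curves and weights below are illustrative stand-ins, not the study's models.

```python
# Hedged sketch of logic-tree hazard aggregation: weight-average branch
# exceedance-rate curves and read off the 475-year PGA. Branches are assumed.
import numpy as np

pga = np.logspace(-2, 0.5, 200)                  # PGA grid [g]

def branch_rate(pga, a, b):
    """Assumed parametric annual exceedance rate for one branch."""
    return a * pga ** (-b)

branches = [((1e-3, 1.8), 0.5),                  # ((a, b), logic-tree weight)
            ((2e-3, 1.6), 0.3),
            ((5e-4, 2.0), 0.2)]
rate = sum(w * branch_rate(pga, a, b) for (a, b), w in branches)

target = 1.0 / 475.0                             # 475-year return period
pga_475 = np.interp(target, rate[::-1], pga[::-1])  # rate decreases with PGA
print(f"PGA at the 475-year return period ~ {pga_475:.3f} g")
```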

  2. Confluence Reduction for Probabilistic Systems (extended version)

    NARCIS (Netherlands)

    Timmer, Mark; Stoelinga, Mariëlle Ida Antoinette; van de Pol, Jan Cornelis

    2010-01-01

    This paper presents a novel technique for state space reduction of probabilistic specifications, based on a newly developed notion of confluence for probabilistic automata. We prove that this reduction preserves branching probabilistic bisimulation and can be applied on-the-fly. To support the

  3. Source term estimation during incident response to severe nuclear power plant accidents. Draft

    Energy Technology Data Exchange (ETDEWEB)

    McKenna, T J; Giitter, J

    1987-07-01

    The various methods of estimating radionuclide release to the environment (source terms) as a result of an accident at a nuclear power reactor are discussed. The major factors affecting potential radionuclide releases off site (source terms) as a result of nuclear power plant accidents are described. The quantification of these factors based on plant instrumentation also is discussed. A range of accident conditions from those within the design basis to the most severe accidents possible are included in the text. A method of gross estimation of accident source terms and their consequences off site is presented. The goal is to present a method of source term estimation that reflects the current understanding of source term behavior and that can be used during an event. (author)

  4. Source term estimation during incident response to severe nuclear power plant accidents. Draft

    International Nuclear Information System (INIS)

    McKenna, T.J.; Giitter, J.

    1987-01-01

    The various methods of estimating radionuclide release to the environment (source terms) as a result of an accident at a nuclear power reactor are discussed. The major factors affecting potential radionuclide releases off site (source terms) as a result of nuclear power plant accidents are described. The quantification of these factors based on plant instrumentation also is discussed. A range of accident conditions from those within the design basis to the most severe accidents possible are included in the text. A method of gross estimation of accident source terms and their consequences off site is presented. The goal is to present a method of source term estimation that reflects the current understanding of source term behavior and that can be used during an event. (author)

  5. US Department of Energy Approach to Probabilistic Evaluation of Long-Term Safety for a Potential Yucca Mountain Repository

    International Nuclear Information System (INIS)

    Dr. R. Dyer; Dr. R. Andrews; Dr. A. Van Luik

    2005-01-01

    Regulatory requirements being addressed in the US geological repository program for spent nuclear fuel and high-level waste disposal specify probabilistically defined mean-value dose limits. These dose limits reflect acceptable levels of risk. The probabilistic approach mandated by regulation calculates a 'risk of a dose': the risk of a potential given dose value at a specific time in the future to a hypothetical person. The mean value of the time-dependent performance measure needs to remain below an acceptable level defined by regulation. Because there are uncertain parameters that are important to system performance, the regulation mandates an analysis focused on the mean value of the performance measure, but one that also explores the 'full range of defensible and reasonable parameter distributions'. System performance evaluations should not be unduly influenced by 'extreme physical situations and parameter values'. Challenges in this approach lie in defending the scientific basis for the models selected, and the data and distributions sampled. A significant challenge lies in showing that uncertainties are properly identified and evaluated. A single-value parameter has no uncertainty, and where such values are used they need to be supported by scientific information showing the selected value is appropriate. Uncertainties are inherent in data, but are also introduced by creating parameter distributions from data sets, selecting models from among alternative models, abstracting models for use in probabilistic analysis, and selecting the range of initiating event probabilities for unlikely events. The goal of the assessment currently in progress is to evaluate the level of risk inherent in moving ahead to the next phase of repository development: construction. During the construction phase, more will be learned to inform a new long-term risk evaluation to support moving to the next phase: accepting waste. Therefore, though there was sufficient confidence of safety

  6. A probabilistic approach for representation of interval uncertainty

    International Nuclear Information System (INIS)

    Zaman, Kais; Rangavajhala, Sirisha; McDonald, Mark P.; Mahadevan, Sankaran

    2011-01-01

    In this paper, we propose a probabilistic approach to represent interval data for input variables in reliability and uncertainty analysis problems, using flexible families of continuous Johnson distributions. Such a probabilistic representation of interval data facilitates a unified framework for handling aleatory and epistemic uncertainty. For fitting probability distributions, methods such as moment matching are commonly used in the literature. However, unlike point data, where single estimates for the moments of data can be calculated, moments of interval data can only be computed in terms of upper and lower bounds. Finding bounds on the moments of interval data has been generally considered an NP-hard problem because it includes a search among the combinations of multiple values of the variables, including interval endpoints. In this paper, we present efficient algorithms based on continuous optimization to find the bounds on second and higher moments of interval data. With numerical examples, we show that the proposed bounding algorithms are scalable in polynomial time with respect to an increasing number of intervals. Using the bounds on moments computed using the proposed approach, we fit a family of Johnson distributions to interval data. Furthermore, using an optimization approach based on percentiles, we find the bounding envelopes of the family of distributions, termed a Johnson p-box. The idea of bounding envelopes for the family of Johnson distributions is analogous to the notion of an empirical p-box in the literature. Several sets of interval data with different numbers of intervals and types of overlap are presented to demonstrate the proposed methods. In contrast to the computationally expensive nested analysis that is typically required in the presence of interval variables, the proposed probabilistic representation enables inexpensive optimization-based strategies to estimate bounds on an output quantity of interest.
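
    The moment-bounding step can be illustrated concretely. In the sketch below, bounds on the mean of interval data come directly from the endpoints, the minimum variance is found by continuous optimization over the box (in the spirit of the paper's approach), and the maximum variance is found by brute-force vertex enumeration, which the paper's scalable algorithms are designed to avoid; the interval data set is made up.

        import numpy as np
        from itertools import product
        from scipy.optimize import minimize

        # Made-up interval data set (lower, upper bounds per observation).
        intervals = np.array([(2.0, 3.5), (2.8, 4.1), (1.5, 2.9),
                              (3.0, 5.0), (2.2, 3.3)])
        lo, hi = intervals[:, 0], intervals[:, 1]

        # Bounds on the mean are attained directly at the interval endpoints.
        mean_lo, mean_hi = lo.mean(), hi.mean()

        # Minimum variance: a convex continuous-optimization problem over the
        # box [lo, hi].
        res = minimize(lambda x: np.var(x), x0=0.5 * (lo + hi),
                       bounds=list(zip(lo, hi)))
        var_min = res.fun

        # Maximum variance sits at a vertex of the box. For this tiny example
        # we enumerate all 2^n endpoint combinations -- exactly the search the
        # paper's bounding algorithms are designed to avoid.
        var_max = max(np.var(np.where(mask, hi, lo))
                      for mask in product([False, True], repeat=len(lo)))

        print(f"mean     in [{mean_lo:.3f}, {mean_hi:.3f}]")
        print(f"variance in [{var_min:.4f}, {var_max:.4f}]")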

  7. Probabilistic methods for seasonal forecasting in a changing climate: Cox-type regression models

    NARCIS (Netherlands)

    Maia, A.H.N.; Meinke, H.B.

    2010-01-01

    For climate risk management, cumulative distribution functions (CDFs) are an important source of information. They are ideally suited to compare probabilistic forecasts of primary (e.g. rainfall) or secondary data (e.g. crop yields). Summarised as CDFs, such forecasts allow an easy quantitative

  8. Determination of source term for Krsko NPP extended fuel cycle

    International Nuclear Information System (INIS)

    Nemec, T.; Persic, A.; Zagar, T.; Zefran, B.

    2004-01-01

    The activity and composition of the potential radioactive releases (source term) are important in the decision making about off-site emergency measures in case of a release into the environment. The power uprate of Krsko NPP during modernization in 2000, as well as the change of fuel type and core design, have influenced the source term value. In 2003, a project of the 'Jozef Stefan' Institute and the Slovenian Nuclear Safety Administration determined a plant-specific source term for the new conditions of fuel type and burnup for the extended fuel cycle. Calculations of the activity and isotopic composition of the core have been performed with the ORIGEN-ARP program. Results showed that the core activity for the extended 15-month fuel cycle is slightly lower than for the 12-month cycles, mainly due to the larger share of fresh fuel. (author)

  9. Probabilistic finite elements for fracture mechanics

    Science.gov (United States)

    Besterfield, Glen

    1988-01-01

    The probabilistic finite element method (PFEM) is developed for probabilistic fracture mechanics (PFM). A finite element which has the near crack-tip singular strain embedded in the element is used. Probabilistic distributions, such as the expectation, covariance and correlation of the stress intensity factors, are calculated for random load, random material and random crack length. The method is computationally quite efficient and can be expected to determine the probability of fracture or reliability.
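
    The flavor of such a calculation can be shown without a finite element mesh by propagating first-order second-moment (FOSM) statistics through the closed-form stress intensity factor K = Y·σ·√(πa) for a random load and a random crack length. This is a hand-built stand-in for PFEM, not the paper's formulation, and all values are hypothetical.

        import numpy as np
        from scipy.stats import norm

        # FOSM sketch: statistics of the stress intensity factor
        # K = Y * sigma * sqrt(pi * a) under a random load sigma and a random
        # crack length a, then a probability of fracture against a toughness
        # K_Ic assuming normality. All values are hypothetical.
        Y = 1.12                        # geometry factor (deterministic here)
        mu_s, sd_s = 100.0, 10.0        # load, MPa
        mu_a, sd_a = 2.0e-3, 0.4e-3     # crack length, m

        mu_K = Y * mu_s * np.sqrt(np.pi * mu_a)          # first-order mean
        dK_ds = Y * np.sqrt(np.pi * mu_a)                # dK/d(sigma) at the means
        dK_da = Y * mu_s * np.sqrt(np.pi) / (2.0 * np.sqrt(mu_a))  # dK/da
        sd_K = np.hypot(dK_ds * sd_s, dK_da * sd_a)      # independent inputs

        K_Ic = 12.0                     # MPa*sqrt(m), hypothetical toughness
        beta = (K_Ic - mu_K) / sd_K     # reliability index
        print(f"E[K] = {mu_K:.2f}, sd[K] = {sd_K:.2f}, "
              f"P(fracture) = {norm.sf(beta):.2e}")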

  10. Accident source terms for boiling water reactors with high burnup cores.

    Energy Technology Data Exchange (ETDEWEB)

    Gauntt, Randall O.; Powers, Dana Auburn; Leonard, Mark Thomas

    2007-11-01

    The primary objective of this report is to provide the technical basis for development of recommendations for updates to the NUREG-1465 Source Term for BWRs that will extend its applicability to accidents involving high burnup (HBU) cores. However, a secondary objective is to re-examine the fundamental characteristics of the prescription for fission product release to containment described by NUREG-1465. This secondary objective is motivated by an interest in understanding the extent to which research into the release and behaviors of radionuclides under accident conditions has altered best-estimate calculations of the integral response of BWRs to severe core damage sequences and the resulting radiological source terms to containment. This report, therefore, documents specific results of fission product source term analyses that will form the basis for the HBU supplement to NUREG-1465. However, commentary is also provided on observed differences between the composite results of the source term calculations performed here and those reflected in NUREG-1465 itself.

  11. Selective application of revised source terms to operating nuclear power plants

    International Nuclear Information System (INIS)

    Moon, Joo Hyun; Song, Jae Hyuk; Lee, Young Wook; Ko, Hyun Seok; Kang, Chang Sun

    2001-01-01

    In the more than 30 years since 1962, when TID-14844 was promulgated, there has been a big change in the US NRC's regulatory position on using accident source terms for radiological assessment following a design basis accident (DBA). To replace the instantaneous source terms of TID-14844, the time-dependent source terms of NUREG-1465 were published in 1995. In the meantime, the radiological acceptance criteria for reactor site evaluation in 10 CFR Part 100 were also revised. In particular, the concept of total effective dose equivalent has been incorporated in accordance with the radiation protection standards set forth in the revised 10 CFR Part 20. Subsequently, the publication of Regulatory Guide 1.183 and the revision of Standard Review Plan 15.0.1 followed in 2000, which provided the licensee of an operating nuclear power reactor with acceptable guidance for applying the revised source term. The guidance allowed the holder of an operating license issued prior to January 10, 1997 to voluntarily revise the accident source terms used in the radiological consequence analyses of DBAs. Regarding the type of application, full and selective applications are suggested, based upon the scope and nature of the associated plant modifications being proposed. Whether it is full or selective, the actual application of the revised source terms to an operating plant is expected to have a large impact on its facility design basis. Considering the scope and cost of the analyses required for licensing, selective application seems more appealing to a licensee of an operating plant than full application. In this paper, hence, the selective application methodology is reviewed and is actually applied to the assessment of offsite radiological consequences following a LOCA at Ulchin Units 3 and 4, in order to identify and analyze the potential impacts due to application of the revised source terms and to assess the considerations taken in each application prior to its actual

  12. Short-term Probabilistic Forecasting of Wind Speed Using Stochastic Differential Equations

    DEFF Research Database (Denmark)

    Iversen, Jan Emil Banning; Morales González, Juan Miguel; Møller, Jan Kloppenborg

    2016-01-01

    It is widely accepted today that probabilistic forecasts of wind power production constitute valuable information for both wind power producers and power system operators to economically exploit this form of renewable energy, while mitigating the potential adverse effects related to its variable and uncertain nature. In this paper, we propose a modeling framework for wind speed that is based on stochastic differential equations. We show that stochastic differential equations allow us to naturally capture the time dependence structure of wind speed prediction errors (from 1 up to 24 hours ahead) and, most importantly, to derive point and quantile forecasts, predictive distributions, and time-path trajectories (also referred to as scenarios or ensemble forecasts), all by one single stochastic differential equation model characterized by a few parameters.
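
    A minimal sketch of the ingredients named in this abstract: a mean-reverting stochastic differential equation simulated with the Euler-Maruyama scheme, from which trajectories and quantile forecasts are read off. The Ornstein-Uhlenbeck-type drift and all parameter values are illustrative stand-ins for the model actually identified in the paper.

        import numpy as np

        # Euler-Maruyama simulation of a mean-reverting SDE for wind speed,
        # yielding trajectories and quantile forecasts from one small model.
        rng = np.random.default_rng(0)
        theta, mu, sigma = 0.3, 8.0, 1.2    # reversion rate, mean (m/s), volatility
        x0, dt, horizon = 10.0, 0.25, 24.0  # initial speed, step (h), lead time (h)
        n_steps, n_paths = int(horizon / dt), 500

        paths = np.empty((n_steps + 1, n_paths))
        paths[0] = x = np.full(n_paths, x0)
        for k in range(n_steps):
            dW = rng.normal(0.0, np.sqrt(dt), n_paths)   # Brownian increments
            x = x + theta * (mu - x) * dt + sigma * dW   # Euler-Maruyama step
            x = np.maximum(x, 0.0)                       # speeds are non-negative
            paths[k + 1] = x

        q10, q50, q90 = np.percentile(paths[-1], [10, 50, 90])
        print(f"24 h ahead: median {q50:.2f} m/s, "
              f"80% interval [{q10:.2f}, {q90:.2f}] m/s")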

  13. Probabilistic Mu-Calculus

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand; Mardare, Radu Iulian; Xue, Bingtian

    2016-01-01

    We introduce a version of the probabilistic µ-calculus (PMC) built on top of a probabilistic modal logic that allows encoding n-ary inequational conditions on transition probabilities. PMC extends previously studied calculi and we prove that, despite its expressiveness, it enjoys a series of good metaproperties. Firstly, we prove the decidability of satisfiability checking by establishing the small model property. An algorithm for deciding the satisfiability problem is developed. As a second major result, we provide a complete axiomatization for the alternation-free fragment of PMC. The completeness proof...

  14. Directional Unfolded Source Term (DUST) for Compton Cameras.

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, Dean J.; Horne, Steven M.; O'Brien, Sean; Thoreson, Gregory G.

    2018-03-01

    A Directional Unfolded Source Term (DUST) algorithm was developed to enable improved spectral analysis capabilities using data collected by Compton cameras. Achieving this objective required modification of the detector response function in the Gamma Detector Response and Analysis Software (GADRAS). Experimental data that were collected in support of this work include measurements of calibration sources at a range of separation distances and cylindrical depleted uranium castings.

  15. Probabilistic Tractography of the Cranial Nerves in Vestibular Schwannoma.

    Science.gov (United States)

    Zolal, Amir; Juratli, Tareq A; Podlesek, Dino; Rieger, Bernhard; Kitzler, Hagen H; Linn, Jennifer; Schackert, Gabriele; Sobottka, Stephan B

    2017-11-01

    Multiple recent studies have reported on diffusion tensor-based fiber tracking of cranial nerves in vestibular schwannoma, with conflicting results as to the accuracy of the method and the occurrence of cochlear nerve depiction. Probabilistic nontensor-based tractography might offer advantages in terms of better extraction of directional information from the underlying data in cranial nerves, which are of subvoxel size. Twenty-one patients with large vestibular schwannomas were recruited. The probabilistic tracking was run preoperatively and the position of the potential depictions of the facial and cochlear nerves was estimated postoperatively by 3 independent observers in a blinded fashion. The true position of the nerve was determined intraoperatively by the surgeon. Thereafter, the imaging-based estimated position was compared with the intraoperatively determined position. Tumor size, cystic appearance, and postoperative House-Brackmann score were analyzed with regard to the accuracy of the depiction of the nerves. The probabilistic tracking showed a connection that correlated to the position of the facial nerve in 81% of the cases and to the position of the cochlear nerve in 33% of the cases. Altogether, the resulting depiction did not correspond to the intraoperative position of any of the nerves in 3 cases. In a majority of cases, the position of the facial nerve, but not of the cochlear nerve, could be estimated by evaluation of the probabilistic tracking results. However, false depictions not corresponding to any nerve do occur and cannot be discerned as such from the image only.

  16. Formalizing Probabilistic Safety Claims

    Science.gov (United States)

    Herencia-Zapana, Heber; Hagen, George E.; Narkawicz, Anthony J.

    2011-01-01

    A safety claim for a system is a statement that the system, which is subject to hazardous conditions, satisfies a given set of properties. Following work by John Rushby and Bev Littlewood, this paper presents a mathematical framework that can be used to state and formally prove probabilistic safety claims. It also enables hazardous conditions, their uncertainties, and their interactions to be integrated into the safety claim. This framework provides a formal description of the probabilistic composition of an arbitrary number of hazardous conditions and their effects on system behavior. An example is given of a probabilistic safety claim for a conflict detection algorithm for aircraft in a 2D airspace. The motivation for developing this mathematical framework is that it can be used in an automated theorem prover to formally verify safety claims.

  17. A probabilistic approach of sum rules for heat polynomials

    International Nuclear Information System (INIS)

    Vignat, C; Lévêque, O

    2012-01-01

    In this paper, we show that the sum rules for generalized Hermite polynomials derived by Daboul and Mizrahi (2005 J. Phys. A: Math. Gen. http://dx.doi.org/10.1088/0305-4470/38/2/010) and by Graczyk and Nowak (2004 C. R. Acad. Sci., Ser. 1 338 849) can be interpreted and easily recovered using a probabilistic moment representation of these polynomials. The covariance property of the raising operator of the harmonic oscillator, which is at the origin of the identities proved in Daboul and Mizrahi, and the dimension reduction effect expressed in the main result of Graczyk and Nowak are both interpreted in terms of the rotational invariance of the Gaussian distributions. As an application of these results, we uncover a probabilistic moment interpretation of two classical integrals of the Wigner function that involve the associated Laguerre polynomials. (paper)
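
    The moment representation alluded to can be stated and checked numerically: for the probabilists' Hermite polynomials, He_n(x) = E[(x + iZ)^n] with Z a standard Gaussian. The snippet below verifies this identity by Monte Carlo (a sketch only; the paper works with generalized Hermite polynomials).

        import numpy as np
        from numpy.polynomial.hermite_e import HermiteE

        # Monte Carlo check of He_n(x) = E[(x + i*Z)^n], Z ~ N(0, 1), for the
        # probabilists' Hermite polynomials He_n.
        rng = np.random.default_rng(1)
        Z = rng.normal(size=2_000_000)

        x, n = 1.3, 4
        mc = np.mean((x + 1j * Z) ** n).real     # moment representation, sampled
        exact = HermiteE([0] * n + [1])(x)       # He_4(x) from numpy
        print(f"Monte Carlo {mc:.4f}  vs  He_{n}({x}) = {exact:.4f}")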

  18. Development of source term evaluation method for Korean Next Generation Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Keon Jae; Cheong, Jae Hak; Park, Jin Baek; Kim, Guk Gee [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1997-10-15

    In the former phase, this project investigated several design features of the radioactive waste processing system, methods to predict nuclide concentrations in the primary coolant, the basic concept of the next generation reactor, and safety goals. In this project, several prediction methods for the source term are evaluated together. The detailed contents of this project are: evaluation of models for nuclide concentration in the reactor coolant system; evaluation of primary and secondary coolant concentrations of the reference nuclear power plant (NPP); investigation of prediction parameters for source term evaluation (basic PWR parameters and operational parameters, respectively); radionuclide removal systems and adjustment values of the reference NPP; and suggestion of a source term prediction method for the next generation NPP.

  19. Compression of Probabilistic XML documents

    NARCIS (Netherlands)

    Veldman, Irma

    2009-01-01

    Probabilistic XML (PXML) files resulting from data integration can become extremely large, which is undesired. For XML there are several techniques available to compress the document and since probabilistic XML is in fact (a special form of) XML, it might benefit from these methods even more. In

  20. Severe accident source term reassessment

    International Nuclear Information System (INIS)

    Hazzan, M.J.; Gardner, R.; Warman, E.A.; Jacobs, S.B.

    1987-01-01

    This paper summarizes the status of the reassessment of severe reactor accident source terms, which are defined as the quantity, type, and timing of fission product releases from such accidents. It concentrates on the major results and conclusions of analyses with modern methods for both pressurized water reactors (PWRs) and boiling water reactors (BWRs), and the special case of containment bypass. Some distinctions are drawn between analyses for PWRs and BWRs. In general, the more the matter is examined, the smaller the consequences, or the probability of serious consequences, seem to be. (author)

  1. Source term estimation via monitoring data and its implementation to the RODOS system

    International Nuclear Information System (INIS)

    Bohunova, J.; Duranova, T.

    2000-01-01

    A methodology and computer code for the interpretation of environmental data, i.e. source term assessment, from an on-line environmental monitoring network were developed. The method is based on the conversion of measured dose rates to the source term, i.e. the airborne radioactivity release rate, taking into account real meteorological data and the locations of the monitoring points. The bootstrap estimation methodology and the bipivot method are used to estimate the source term from on-site gamma dose rate monitors. These methods provide an estimate of the mean value of the source term and a confidence interval for it. (author)
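
    The core of such a scheme is a linear relation between the unknown release rate Q and each monitor's dose rate through a precomputed dose-per-unit-release kernel, plus resampling for the confidence interval. The sketch below uses made-up kernels and a plain bootstrap of the mean; the operational method's bipivot estimator and real dispersion modelling are not reproduced.

        import numpy as np

        # Sketch: estimate a release rate Q (Bq/s) from gamma dose rate
        # monitors. Each monitor i reads D_i ~ k_i * Q, where k_i is a
        # dose-rate-per-unit-release kernel from a dispersion/dose calculation
        # for the current weather. Kernels and readings are made up.
        rng = np.random.default_rng(2)
        k = np.array([4.0e-16, 2.5e-16, 1.2e-16, 0.8e-16])   # (Sv/s)/(Bq/s)
        true_Q = 1.0e9                                       # unknown in reality
        dose = k * true_Q * rng.lognormal(0.0, 0.2, k.size)  # noisy readings (Sv/s)

        q_i = dose / k                     # per-monitor release-rate estimates

        # Plain bootstrap of the mean estimate (the operational method uses
        # bootstrap and bipivot estimators on real monitoring data).
        boot = np.array([rng.choice(q_i, q_i.size, replace=True).mean()
                         for _ in range(5000)])
        lo, hi = np.percentile(boot, [2.5, 97.5])
        print(f"Q = {q_i.mean():.3e} Bq/s, 95% CI [{lo:.3e}, {hi:.3e}]")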

  2. Source term estimation for small sized HTRs

    International Nuclear Information System (INIS)

    Moormann, R.

    1992-08-01

    Accidents which have to be considered are core heat-up, reactivity transients, water or air ingress, and primary circuit depressurization. The main effort of this paper is devoted to water/air ingress and depressurization, which require consideration of fission product plateout under normal operation conditions; for the latter it is clearly shown that absorption (penetration) mechanisms are much less important than was sometimes assumed in the past. Source term estimation procedures for core heat-up events are shortly reviewed; reactivity transients are apparently covered by them. Besides a general literature survey, including identification of areas with insufficient knowledge, this paper contains some estimations on the thermomechanical behaviour of fission products in water or air ingress accidents. Typical source term examples are also presented. In an appendix, evaluations of the AVR experiments VAMPYR-I and -II with respect to plateout and fission product filter efficiency are outlined and used for a validation step of the new plateout code SPATRA. (orig.)

  3. Probabilistic Infinite Secret Sharing

    OpenAIRE

    Csirmaz, László

    2013-01-01

    The study of probabilistic secret sharing schemes using arbitrary probability spaces and possibly infinite number of participants lets us investigate abstract properties of such schemes. It highlights important properties, explains why certain definitions work better than others, connects this topic to other branches of mathematics, and might yield new design paradigms. A probabilistic secret sharing scheme is a joint probability distribution of the shares and the secret together with a colle...

  4. A Methodology for Probabilistic Accident Management

    International Nuclear Information System (INIS)

    Munteanu, Ion; Aldemir, Tunc

    2003-01-01

    While techniques have been developed to tackle different tasks in accident management, there have been very few attempts to develop an on-line operator assistance tool for accident management and none that can be found in the literature that uses probabilistic arguments, which are important in today's licensing climate. The state/parameter estimation capability of the dynamic system doctor (DSD) approach is combined with the dynamic event-tree generation capability of the integrated safety assessment (ISA) methodology to address this issue. The DSD uses the cell-to-cell mapping technique for system representation that models the system evolution in terms of probability of transitions in time between sets of user-defined parameter/state variable magnitude intervals (cells) within a user-specified time interval (e.g., data sampling interval). The cell-to-cell transition probabilities are obtained from the given system model. The ISA follows the system dynamics in tree form and branches every time a setpoint for system/operator intervention is exceeded. The combined approach (a) can automatically account for uncertainties in the monitored system state, inputs, and modeling uncertainties through the appropriate choice of the cells, as well as providing a probabilistic measure to rank the likelihood of possible system states in view of these uncertainties; (b) allows flexibility in system representation; (c) yields the lower and upper bounds on the estimated values of state variables/parameters as well as their expected values; and (d) leads to fewer branchings in the dynamic event-tree generation. Using a simple but realistic pressurizer model, the potential use of the DSD-ISA methodology for on-line probabilistic accident management is illustrated
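
    The cell-to-cell mapping ingredient can be sketched in a few lines: partition the state space into cells, estimate the transition matrix by sampling the given system model within each cell, and propagate a probability vector across sampling intervals. The one-dimensional dynamics below are a toy, not the pressurizer model of the paper.

        import numpy as np

        # Cell-to-cell mapping sketch: a 1-D state on [0, 1] with toy dynamics,
        # partitioned into 10 cells; the row-stochastic transition matrix is
        # built by sampling the model within each cell, then a probability
        # vector is propagated across sampling intervals.
        rng = np.random.default_rng(3)
        edges = np.linspace(0.0, 1.0, 11)
        n_cells, dt = len(edges) - 1, 0.1
        drift = lambda x: 0.5 * (0.6 - x)        # toy model, not a pressurizer

        T = np.zeros((n_cells, n_cells))
        for i in range(n_cells):
            xs = rng.uniform(edges[i], edges[i + 1], 2000)   # sample the cell
            xn = xs + drift(xs) * dt + rng.normal(0.0, 0.05, xs.size)
            j = np.digitize(np.clip(xn, 0.0, 1.0 - 1e-9), edges) - 1
            T[i] = np.bincount(j, minlength=n_cells) / xs.size

        p = np.zeros(n_cells); p[0] = 1.0        # start certain in cell 0
        for _ in range(50):
            p = p @ T                            # one sampling interval per step
        print("most likely cell after 50 steps:", int(np.argmax(p)))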

  5. Probabilistic brains: knowns and unknowns

    Science.gov (United States)

    Pouget, Alexandre; Beck, Jeffrey M; Ma, Wei Ji; Latham, Peter E

    2015-01-01

    There is strong behavioral and physiological evidence that the brain both represents probability distributions and performs probabilistic inference. Computational neuroscientists have started to shed light on how these probabilistic representations and computations might be implemented in neural circuits. One particularly appealing aspect of these theories is their generality: they can be used to model a wide range of tasks, from sensory processing to high-level cognition. To date, however, these theories have only been applied to very simple tasks. Here we discuss the challenges that will emerge as researchers start focusing their efforts on real-life computations, with a focus on probabilistic learning, structural learning and approximate inference. PMID:23955561

  6. Probabilistic simple sticker systems

    Science.gov (United States)

    Selvarajoo, Mathuri; Heng, Fong Wan; Sarmin, Nor Haniza; Turaev, Sherzod

    2017-04-01

    A model for DNA computing using the recombination behavior of DNA molecules, known as a sticker system, was introduced by L. Kari, G. Paun, G. Rozenberg, A. Salomaa, and S. Yu in the paper entitled 'DNA computing, sticker systems and universality', Acta Informatica, vol. 35, pp. 401-420, 1998. A sticker system uses the Watson-Crick complementary feature of DNA molecules: starting from incomplete double-stranded sequences, sticking operations are applied iteratively until a complete double-stranded sequence is obtained. It is known that sticker systems with finite sets of axioms and sticker rules generate only regular languages. Hence, different types of restrictions have been considered to increase the computational power of sticker systems. Recently, a variant of restricted sticker systems, called probabilistic sticker systems, has been introduced [4]. In this variant, probabilities are initially associated with the axioms, and the probability of a generated string is computed by multiplying the probabilities of all occurrences of the initial strings in the computation of the string. Strings for the language are selected according to some probabilistic requirements. In this paper, we study fundamental properties of probabilistic simple sticker systems. We prove that the probabilistic enhancement increases the computational power of simple sticker systems.

  7. Formulation of probabilistic models of protein structure in atomic detail using the reference ratio method.

    Science.gov (United States)

    Valentin, Jan B; Andreetta, Christian; Boomsma, Wouter; Bottaro, Sandro; Ferkinghoff-Borg, Jesper; Frellsen, Jes; Mardia, Kanti V; Tian, Pengfei; Hamelryck, Thomas

    2014-02-01

    We propose a method to formulate probabilistic models of protein structure in atomic detail, for a given amino acid sequence, based on Bayesian principles, while retaining a close link to physics. We start from two previously developed probabilistic models of protein structure on a local length scale, which concern the dihedral angles in main chain and side chains, respectively. Conceptually, this constitutes a probabilistic and continuous alternative to the use of discrete fragment and rotamer libraries. The local model is combined with a nonlocal model that involves a small number of energy terms according to a physical force field, and some information on the overall secondary structure content. In this initial study we focus on the formulation of the joint model and the evaluation of the use of an energy vector as a descriptor of a protein's nonlocal structure; hence, we derive the parameters of the nonlocal model from the native structure without loss of generality. The local and nonlocal models are combined using the reference ratio method, which is a well-justified probabilistic construction. For evaluation, we use the resulting joint models to predict the structure of four proteins. The results indicate that the proposed method and the probabilistic models show considerable promise for probabilistic protein structure prediction and related applications.

  8. Trading wind generation from short-term probabilistic forecasts of wind power

    DEFF Research Database (Denmark)

    Pinson, Pierre; Chevallier, Christophe; Kariniotakis, Georges

    2007-01-01

    Due to the fluctuating nature of the wind resource, a wind power producer participating in a liberalized electricity market is subject to penalties related to regulation costs. Accurate forecasts of wind generation are therefore paramount for reducing such penalties and thus maximizing revenue from market participation. Such strategies permit to further increase revenues and thus enhance competitiveness of wind generation compared to other forms of dispatchable generation. This paper formulates a general methodology for deriving optimal bidding strategies based on probabilistic forecasts of wind generation, as well as on modeling of the sensitivity a wind power producer may have to regulation costs. The benefits resulting from the application of these strategies are clearly demonstrated on the test case of the participation of a multi-MW wind farm in the Dutch electricity market over a year.

  9. Deterministic and probabilistic approach to safety analysis

    International Nuclear Information System (INIS)

    Heuser, F.W.

    1980-01-01

    The examples discussed in this paper show that reliability analysis methods can fairly well be applied in order to interpret deterministic safety criteria in quantitative terms. For a further improved extension of applied reliability analysis, it has turned out that the influence of operational and control systems and of component protection devices should be considered in detail with the aid of reliability analysis methods. Of course, an extension of probabilistic analysis must be accompanied by further development of the methods and a broadening of the data base. (orig.)

  10. Probabilistically modeling lava flows with MOLASSES

    Science.gov (United States)

    Richardson, J. A.; Connor, L.; Connor, C.; Gallant, E.

    2017-12-01

    Modeling lava flows through Cellular Automata methods enables a computationally inexpensive means to quickly forecast lava flow paths and ultimate areal extents. We have developed a lava flow simulator, MOLASSES, that forecasts lava flow inundation over an elevation model from a point source eruption. This modular code can be implemented in a deterministic fashion with given user inputs that will produce a single lava flow simulation. MOLASSES can also be implemented in a probabilistic fashion where given user inputs define parameter distributions that are randomly sampled to create many lava flow simulations. This probabilistic approach enables uncertainty in input data to be expressed in the model results, and MOLASSES outputs a probability map of inundation instead of a determined lava flow extent. Since the code is comparatively fast, we use it probabilistically to investigate where potential vents are located that may impact specific sites and areas, as well as the unconditional probability of lava flow inundation of sites or areas from any vent. We have validated the MOLASSES code against community-defined benchmark tests and against the real-world lava flows at Tolbachik (2012-2013) and Pico do Fogo (2014-2015). To determine the efficacy of the MOLASSES simulator at accurately and precisely mimicking the inundation area of real flows, we report goodness of fit using both model sensitivity and the Positive Predictive Value, the latter of which is a Bayesian posterior statistic. Model sensitivity is often used in evaluating lava flow simulators, as it describes how much of the lava flow was successfully modeled by the simulation. We argue that the positive predictive value is equally important in determining how good a simulator is, as it describes the percentage of the simulation space that was actually inundated by lava.
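
    The probabilistic mode of operation described here amounts to wrapping a deterministic flow routine in a Monte Carlo loop and averaging the inundation masks. The sketch below does this with a crude thickness-relaxation automaton on a synthetic DEM (periodic boundaries, made-up parameter distributions); it mimics the spirit of MOLASSES rather than its actual algorithm.

        import numpy as np

        # Monte Carlo wrapper around a crude lava-spreading automaton on a
        # synthetic DEM: each run samples an erupted volume and a residual
        # thickness, and the runs are averaged into an inundation probability
        # map. Periodic boundaries and all parameters are for illustration.
        rng = np.random.default_rng(4)
        n = 60
        yy, xx = np.mgrid[0:n, 0:n]
        dem = 0.05 * xx + 2.0 * np.exp(-((xx - 20) ** 2 + (yy - 30) ** 2) / 150.0)

        def run_flow(dem, vent, volume, residual, cell_area=100.0):
            """One deterministic run: boolean mask of inundated cells."""
            thick = np.zeros_like(dem)
            thick[vent] = volume / cell_area
            for _ in range(400):                        # relaxation sweeps
                excess = np.maximum(thick - residual, 0.0)
                if excess.max() < 1e-6:
                    break
                h = dem + thick
                give = np.zeros_like(dem)
                take = np.zeros_like(dem)
                for d in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    hn = np.roll(h, d, axis=(0, 1))     # neighbor's surface
                    share = 0.25 * excess * (h > hn)    # send lava downhill
                    give += share
                    take += np.roll(share, (-d[0], -d[1]), axis=(0, 1))
                thick += take - give
            return thick > 0.01

        vent, n_runs = (30, 45), 50
        hits = np.zeros((n, n))
        for _ in range(n_runs):
            volume = rng.lognormal(np.log(5.0e4), 0.5)  # m^3, sampled per run
            residual = rng.uniform(0.5, 2.0)            # m, sampled per run
            hits += run_flow(dem, vent, volume, residual)
        prob = hits / n_runs                            # inundation probability map
        print(f"cells with P(inundation) > 0.5: {int((prob > 0.5).sum())}")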

  11. Probabilistic Design and Analysis Framework

    Science.gov (United States)

    Strack, William C.; Nagpal, Vinod K.

    2010-01-01

    PRODAF is a software package designed to aid analysts and designers in conducting probabilistic analysis of components and systems. PRODAF can integrate multiple analysis programs to ease the tedious process of conducting a complex analysis process that requires the use of multiple software packages. The work uses a commercial finite element analysis (FEA) program with modules from NESSUS to conduct a probabilistic analysis of a hypothetical turbine blade, disk, and shaft model. PRODAF applies the response surface method, at the component level, and extrapolates the component-level responses to the system level. Hypothetical components of a gas turbine engine are first deterministically modeled using FEA. Variations in selected geometrical dimensions and loading conditions are analyzed to determine the effects of the stress state within each component. Geometric variations include the cord length and height for the blade, inner radius, outer radius, and thickness, which are varied for the disk. Probabilistic analysis is carried out using developing software packages like System Uncertainty Analysis (SUA) and PRODAF. PRODAF was used with a commercial deterministic FEA program in conjunction with modules from the probabilistic analysis program, NESTEM, to perturb loads and geometries to provide a reliability and sensitivity analysis. PRODAF simplified the handling of data among the various programs involved, and will work with many commercial and open-source deterministic programs, probabilistic programs, or modules.

  12. Probabilistic Seismic Risk Assessment in Manizales, Colombia:Quantifying Losses for Insurance Purposes

    Institute of Scientific and Technical Information of China (English)

    Mario A. Salgado-Gálvez; Gabriel A. Bernal; Daniela Zuloaga; Mabel C. Marulanda; Omar-Darío Cardona; Sebastián Henao

    2017-01-01

    A fully probabilistic seismic risk assessment was developed in Manizales, Colombia, considering assets of different types. The first type includes elements that are part of the water and sewage network, and the second type includes public and private buildings. This assessment required the development of a probabilistic seismic hazard analysis that accounts for the dynamic soil response, assembling high resolution exposure databases, and the development of damage models for different types of elements. The economic appraisal of the exposed assets was developed together with specialists of the water utilities company of Manizales and the city administration. The risk assessment was performed using several Comprehensive Approach to Probabilistic Risk Assessment modules as well as the R-System, obtaining results in terms of traditional metrics such as loss exceedance curve, average annual loss, and probable maximum loss. For the case of pipelines, repair rates were also estimated. The results for the water and sewage network were used in activities related to the expansion and maintenance strategies, as well as for the exploration of financial retention and transfer alternatives using insurance schemes based on technical, probabilistic, and prospective damage and loss estimations. In the case of the buildings, the results were used in the update of the technical premium values of the existing collective insurance scheme.
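
    The metrics named above follow directly from a stochastic event set. A minimal sketch with a made-up four-event set: the average annual loss (AAL) is the rate-weighted sum of event losses, and the loss exceedance curve comes from cumulating occurrence rates over losses sorted in descending order.

        import numpy as np

        # Risk metrics from a hypothetical stochastic event set: each row is
        # (annual occurrence rate, loss in million USD). Not Manizales data.
        events = np.array([(0.20, 5.0), (0.05, 40.0),
                           (0.01, 250.0), (0.002, 900.0)])
        rates, losses = events[:, 0], events[:, 1]

        aal = (rates * losses).sum()           # average annual loss
        order = np.argsort(losses)[::-1]
        exceed = np.cumsum(rates[order])       # annual rate of losses >= l
        for l, nu in zip(losses[order], exceed):
            print(f"loss >= {l:6.1f} MUSD: {nu:.4f}/yr "
                  f"(return period {1.0 / nu:8.0f} yr)")
        print(f"AAL = {aal:.2f} MUSD")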

  13. Probabilistic Programming (Invited Talk)

    OpenAIRE

    Yang, Hongseok

    2017-01-01

    Probabilistic programming refers to the idea of using standard programming constructs for specifying probabilistic models from machine learning and statistics, and employing generic inference algorithms for answering various queries on these models, such as posterior inference and estimation of model evidence. Although this idea itself is not new and was, in fact, explored by several programming-language and statistics researchers in the early 2000, it is only in the last few years that proba...

  14. Topics in Probabilistic Judgment Aggregation

    Science.gov (United States)

    Wang, Guanchun

    2011-01-01

    This dissertation is a compilation of several studies that are united by their relevance to probabilistic judgment aggregation. In the face of complex and uncertain events, panels of judges are frequently consulted to provide probabilistic forecasts, and aggregation of such estimates in groups often yield better results than could have been made…

  15. Development of source term evaluation method for Korean Next Generation Reactor (III)

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Geon Jae; Park, Jin Baek; Lee, Yeong Il; Song, Min Cheonl; Lee, Ho Jin [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1998-06-15

    This project investigated the irradiation characteristics of MOX fuel, methods to predict nuclide concentrations in the primary and secondary coolant for a core containing 100% MOX fuel, and the development of a source term evaluation tool. In this study, several prediction methods for the source term are evaluated. The detailed contents of this project are: evaluation of models for nuclide concentration in the reactor coolant system; evaluation of primary and secondary coolant concentrations of a reference nuclear power plant using purely MOX fuel; and suggestion of a source term prediction method for an NPP with a core using MOX fuel.

  16. A logic for inductive probabilistic reasoning

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2005-01-01

    Inductive probabilistic reasoning is understood as the application of inference patterns that use statistical background information to assign (subjective) probabilities to single events. The simplest such inference pattern is direct inference: from '70% of As are Bs' and 'a is an A' infer that a is a B with probability 0.7. Direct inference is generalized by Jeffrey's rule and the principle of cross-entropy minimization. To adequately formalize inductive probabilistic reasoning is an interesting topic for artificial intelligence, as an autonomous system acting in a complex environment may have to base its actions on a probabilistic model of its environment, and the probabilities needed to form this model can often be obtained by combining statistical background information with particular observations made, i.e., by inductive probabilistic reasoning. In this paper a formal framework...
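
    The two inference patterns named in the abstract are easy to make concrete. The snippet below works through direct inference and a Jeffrey's-rule revision with made-up numbers; it illustrates the patterns themselves, not the paper's formal framework.

        # Direct inference: from "70% of As are Bs" and "a is an A",
        # assign P(a is B) = 0.7 to the single case.
        p_B_given_A = 0.70
        print(f"direct inference: P(a is B) = {p_B_given_A}")

        # Jeffrey's rule: revise P(B) when evidence shifts the probability of
        # A itself, keeping the conditionals fixed. Numbers are made up.
        p_A, p_B_A, p_B_notA = 0.4, 0.7, 0.2
        p_B_old = p_B_A * p_A + p_B_notA * (1.0 - p_A)
        p_A_new = 0.9                     # observation makes A far more likely
        p_B_new = p_B_A * p_A_new + p_B_notA * (1.0 - p_A_new)
        print(f"Jeffrey's rule: P(B) moves from {p_B_old:.2f} to {p_B_new:.2f}")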

  17. A convergence theory for probabilistic metric spaces | Jäger ...

    African Journals Online (AJOL)

    We develop a theory of probabilistic convergence spaces based on Tardiff's neighbourhood systems for probabilistic metric spaces. We show that the resulting category is a topological universe and we characterize a subcategory that is isomorphic to the category of probabilistic metric spaces. Keywords: Probabilistic metric ...

  18. STACE: Source Term Analyses for Containment Evaluations of transport casks

    International Nuclear Information System (INIS)

    Seager, K.D.; Gianoulakis, S.E.; Barrett, P.R.; Rashid, Y.R.; Reardon, P.C.

    1992-01-01

    Following the guidance of ANSI N14.5, the STACE methodology provides a technically defensible means for estimating maximum permissible leakage rates. These containment criteria attempt to reflect the true radiological hazard by performing a detailed examination of the spent fuel, CRUD, and residual contamination contributions to the releasable source term. The spent fuel contribution to the source term has been modeled fairly accurately using the STACE methodology. The structural model predicts the cask drop load history, the mechanical response of the fuel assembly, and the probability of cladding breach. These data are then used to predict the amount of fission gas, volatile species, and fuel fines that are releasable from the cask. There are some areas where data are sparse or lacking (e.g., the quantity and size distribution of fuel rod breaches) in which experimental validation is planned. The CRUD spallation fraction is the major area where no quantitative data have been found; therefore, this also requires experimental validation. In the interim, STACE conservatively assumes a 100% spallation fraction for computing the releasable activity. The source term methodology also conservatively assumes that there is 1 Ci of residual contamination available for release in the transport cask. However, residual contamination is still by far the smallest contributor to the source term activity.

  19. Perspectives on source terms based on early research and development

    International Nuclear Information System (INIS)

    Pressesky, A.J.

    1985-07-01

    This report presents an overview of the key documentation of the research and development programs relevant to the source term issue which were undertaken by the Atomic Energy Commission between 1950 and 1970. The source term is taken to be the amount, composition (physical and chemical), and timing of the projected release of radioactivity to the environment in the hypothetical event of a severe reactor accident in a light water reactor of the type currently being licensed, built and operated. The objective is to illuminate and provide perspectives on (a) the maturity of the technical data base and the analytical methodology, (b) the extent to which remaining conservatisms can be applied to compensate for uncertainties, (c) the purpose for which the technology and methodology will be used, and (d) the need to keep problems and uncertainties in proper perspective. Comments that can provide some context for the difficult programmatic choices to be made are included, and technical considerations that may be inadequately applied or neglected in some current source term calculations are examined. This review has not uncovered any significant technical considerations that have been omitted or are being inadequately treated in current source term analyses, except perhaps the contribution made to in-containment aerosols by coolant comminution upon escape at pressure from the reactor coolant system. 11 refs

  20. Economic Dispatch for Microgrid Containing Electric Vehicles via Probabilistic Modeling: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Yao, Yin; Gao, Wenzhong; Momoh, James; Muljadi, Eduard

    2016-02-11

    In this paper, an economic dispatch model with probabilistic modeling is developed for a microgrid. The electric power supply in a microgrid consists of conventional power plants and renewable energy power plants, such as wind and solar power plants. Because of the fluctuation in the output of solar and wind power plants, an empirical probabilistic model is developed to predict their hourly output. According to different characteristics of wind and solar power plants, the parameters for probabilistic distribution are further adjusted individually for both. On the other hand, with the growing trend in plug-in electric vehicles (PHEVs), an integrated microgrid system must also consider the impact of PHEVs. The charging loads from PHEVs as well as the discharging output via the vehicle-to-grid (V2G) method can greatly affect the economic dispatch for all of the micro energy sources in a microgrid. This paper presents an optimization method for economic dispatch in a microgrid considering conventional power plants, renewable power plants, and PHEVs. The simulation results reveal that PHEVs with V2G capability can be an indispensable supplement in a modern microgrid.
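
    The structure of such a dispatch study can be sketched as a Monte Carlo loop around a small linear program: sample renewable output and PHEV charging from assumed distributions, then cover the residual load at least cost. The unit capacities, costs, and distributions below are all hypothetical, and the V2G fleet is treated simply as another dispatchable resource.

        import numpy as np
        from scipy.optimize import linprog

        # One-hour probabilistic dispatch sketch with two conventional units
        # plus a V2G fleet; all values are hypothetical.
        rng = np.random.default_rng(5)
        load = 120.0                                  # MW
        costs = np.array([30.0, 45.0, 80.0])          # $/MWh: unit 1, unit 2, V2G
        caps = np.array([80.0, 60.0, 20.0])           # MW upper limits

        hourly_costs = []
        for _ in range(1000):                         # probabilistic scenarios
            wind = rng.weibull(2.0) * 15.0            # MW
            solar = max(rng.normal(20.0, 8.0), 0.0)   # MW
            charging = rng.uniform(5.0, 15.0)         # MW of PHEV charging load
            residual = max(load + charging - wind - solar, 0.0)
            # minimize c.x  s.t.  sum(x) == residual,  0 <= x <= caps
            res = linprog(costs, A_eq=np.ones((1, 3)), b_eq=[residual],
                          bounds=list(zip(np.zeros(3), caps)), method="highs")
            hourly_costs.append(res.fun)
        print(f"expected cost {np.mean(hourly_costs):.1f} $/h, "
              f"95th percentile {np.percentile(hourly_costs, 95):.1f} $/h")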

  1. Arbitrage and Hedging in a non probabilistic framework

    OpenAIRE

    Alvarez, Alexander; Ferrando, Sebastian; Olivares, Pablo

    2011-01-01

    The paper studies the concepts of hedging and arbitrage in a non probabilistic framework. It provides conditions for non probabilistic arbitrage based on the topological structure of the trajectory space and makes connections with the usual notion of arbitrage. Several examples illustrate the non probabilistic arbitrage as well as perfect replication of options under continuous and discontinuous trajectories; the results can then be applied in probabilistic models path by path. The approach is r...

  2. Toward a Probabilistic Phenological Model for Wheat Growing Degree Days (GDD)

    Science.gov (United States)

    Rahmani, E.; Hense, A.

    2017-12-01

    Are there deterministic relations between phenological and climate parameters? The answer is surely 'No'. This answer motivated us to solve the problem through probabilistic theories. Thus, we developed a probabilistic phenological model which has the advantage of giving additional information in terms of uncertainty. To that aim, we turned to a statistical analysis named survival analysis. Survival analysis deals with death in biological organisms and failure in mechanical systems. In the survival analysis literature, death or failure is considered as an event. By event, in this research we mean the ripening date of wheat. We will assume only one event in this special case. By time, we mean the growing duration from sowing to ripening as the lifetime of wheat, which is a function of GDD. More precisely, we will try to perform a probabilistic forecast of wheat ripening. The probability value will change between 0 and 1. Here, the survivor function gives the probability that the not-yet-ripened wheat survives longer than a specific time, or will survive to the end of its lifetime as a ripened crop. The survival function at each station is determined by fitting a normal distribution to the GDD as a function of growth duration. Verification of the models obtained is done using the CRPS skill score (CRPSS). The positive values of CRPSS indicate the large superiority of the probabilistic phenological survival model over the deterministic models. These results demonstrate that considering uncertainties in modeling is beneficial, meaningful and necessary. We believe that probabilistic phenological models have the potential to help reduce the vulnerability of agricultural production systems to climate change, thereby increasing food security.
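
    The survivor-function construction described here, fitting a normal distribution to GDD at ripening and reading off P(not yet ripe), takes only a few lines; the GDD sample below is synthetic, standing in for station records.

        import numpy as np
        from scipy.stats import norm

        # Fit a normal distribution to growing degree days (GDD) at ripening
        # and read off the survivor function S(g) = P(not yet ripe at g GDD).
        rng = np.random.default_rng(6)
        gdd_at_ripening = rng.normal(2100.0, 120.0, 30)   # synthetic "records"

        mu, sd = norm.fit(gdd_at_ripening)       # station-specific fit
        for g in (1900, 2100, 2300):
            print(f"P(not ripened by {g} GDD) = {norm.sf(g, mu, sd):.2f}")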

  3. Source term determination from subcritical multiplication measurements at Koral-1 reactor

    International Nuclear Information System (INIS)

    Blazquez, J.B.; Barrado, J.M.

    1978-01-01

    Using an AmBe neutron source, two independent procedures have been established for the zero-power experimental fast reactor Coral-1 in order to measure the source term which appears in the point kinetics equations. In the first one, the source term is measured when the reactor is just critical with the source, by taking advantage of the wide range of the linear approach to critical for Coral-1. In the second one, the measurement is made in a subcritical state by making use of the previously calibrated control rods. Several applications are also included, such as the measurement of the detector dead time, the determination of the reactivity of small samples, and the shape of the neutron importance of the source. (author)
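
    For the subcritical procedure, the standard multiplication relation gives a count rate C = εS/(1 − k_eff), so a rod-calibrated reactivity plus a known detector efficiency yields the source term S. The numbers below are hypothetical, not Coral-1 data, and the relation is the textbook form rather than the paper's exact formulation.

        # Subcritical procedure sketch: in a steady subcritical state the
        # count rate obeys C = eps * S / (1 - k_eff), so a rod-calibrated
        # reactivity and a known detector efficiency give the source term S.
        eps = 2.0e-4            # counts per source neutron (assumed)
        C = 850.0               # measured count rate, counts/s (assumed)
        rho_dollars = -1.5      # reactivity from calibrated rods, $ (assumed)
        beta_eff = 0.0072       # effective delayed neutron fraction (assumed)

        rho = rho_dollars * beta_eff         # reactivity, dk/k
        k_eff = 1.0 / (1.0 - rho)            # from rho = (k - 1) / k
        S = C * (1.0 - k_eff) / eps          # inferred source term, n/s
        print(f"k_eff = {k_eff:.4f}, source term S = {S:.3e} n/s")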

  4. Prospects for probabilistic safety assessment

    International Nuclear Information System (INIS)

    Hirschberg, S.

    1992-01-01

    This article provides some reflections on future developments of Probabilistic Safety Assessment (PSA) in view of the present state of the art and evaluates current trends in the use of PSA for safety management. The main emphasis is on Level 1 PSA, although Level 2 aspects are also highlighted to some extent. As a starting point, the role of PSA is outlined from a historical perspective, demonstrating the rapid expansion of the uses of PSA. In this context the wide spectrum of PSA applications and the associated benefits to the users are in focus. It should be kept in mind, however, that PSA, in spite of its merits, is not a self-standing safety tool. It complements deterministic analysis and thus improves understanding and facilitates prioritization of safety issues. Significant progress in handling PSA limitations - such as reliability data, common-cause failures, human interactions, external events, accident progression, containment performance, and source-term issues - is described. This forms a background for expected future developments of PSA. Among the most important issues on the agenda for the future are PSA scope extensions, methodological improvements and computer code advancements, and full exploitation of the potential benefits of applications to operational safety management. Many PSA uses, if properly exercised, lead to safety improvements as well as major burden reductions. The article provides, in addition, the International Atomic Energy Agency (IAEA) perspective on the topics covered, as reflected in the current PSA programs of the agency. 74 refs., 6 figs., 1 tab

  5. PROBABILISTIC SAFETY ASSESSMENT OF OPERATIONAL ACCIDENTS AT THE WASTE ISOLATION PILOT PLANT

    International Nuclear Information System (INIS)

    Rucker, D.F.

    2000-01-01

    This report presents a probabilistic safety assessment of radioactive doses as consequences from accident scenarios to complement the deterministic assessment presented in the Waste Isolation Pilot Plant (WIPP) Safety Analysis Report (SAR). The International Council of Radiation Protection (ICRP) recommends that both assessments be conducted to ensure that 'an adequate level of safety has been achieved and that no major contributors to risk are overlooked' (ICRP 1993). To that end, the probabilistic assessment for the WIPP accident scenarios addresses the wide range of assumptions, e.g. the range of values representing the radioactive source of an accident, that could possibly have been overlooked by the SAR. Routine releases of radionuclides from the WIPP repository to the environment during waste emplacement operations are expected to be essentially zero. In contrast, potential accidental releases from postulated accident scenarios during waste handling and emplacement could be substantial, which necessitates radiological air monitoring and confinement barriers (DOE 1999). The WIPP Safety Analysis Report (SAR) calculated doses from accidental releases to the on-site (at 100 m from the source) and off-site (at the Exclusive Use Boundary and Site Boundary) public by a deterministic approach. This approach, as demonstrated in the SAR, uses single-point values of key parameters to assess the 50-year, whole-body committed effective dose equivalent (CEDE). The basic assumptions used in the SAR to formulate the CEDE are retained for this report's probabilistic assessment. However, for the probabilistic assessment, single-point parameter values were replaced with probability density functions (PDF) and were sampled over an expected range. Monte Carlo simulations were run, in which 10,000 iterations were performed by randomly selecting one value for each parameter and calculating the dose. Statistical information was then derived from the 10,000 iterations.

  6. PROBABILISTIC SAFETY ASSESSMENT OF OPERATIONAL ACCIDENTS AT THE WASTE ISOLATION PILOT PLANT

    Energy Technology Data Exchange (ETDEWEB)

    Rucker, D.F.

    2000-09-01

    This report presents a probabilistic safety assessment of radioactive doses as consequences from accident scenarios to complement the deterministic assessment presented in the Waste Isolation Pilot Plant (WIPP) Safety Analysis Report (SAR). The International Council of Radiation Protection (ICRP) recommends that both assessments be conducted to ensure that 'an adequate level of safety has been achieved and that no major contributors to risk are overlooked' (ICRP 1993). To that end, the probabilistic assessment for the WIPP accident scenarios addresses the wide range of assumptions, e.g. the range of values representing the radioactive source of an accident, that could possibly have been overlooked by the SAR. Routine releases of radionuclides from the WIPP repository to the environment during waste emplacement operations are expected to be essentially zero. In contrast, potential accidental releases from postulated accident scenarios during waste handling and emplacement could be substantial, which necessitates radiological air monitoring and confinement barriers (DOE 1999). The WIPP Safety Analysis Report (SAR) calculated doses from accidental releases to the on-site (at 100 m from the source) and off-site (at the Exclusive Use Boundary and Site Boundary) public by a deterministic approach. This approach, as demonstrated in the SAR, uses single-point values of key parameters to assess the 50-year, whole-body committed effective dose equivalent (CEDE). The basic assumptions used in the SAR to formulate the CEDE are retained for this report's probabilistic assessment. However, for the probabilistic assessment, single-point parameter values were replaced with probability density functions (PDF) and were sampled over an expected range. Monte Carlo simulations were run, in which 10,000 iterations were performed by randomly selecting one value for each parameter and calculating the dose. Statistical information was then derived from the 10,000 iterations.
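
    The Monte Carlo scheme described in this record, replacing point values with PDFs and sampling 10,000 times, can be sketched with a deliberately simplified inhalation-dose chain; the dose model and every distribution below are illustrative, not the WIPP SAR's.

        import numpy as np

        # Monte Carlo CEDE sketch: point values replaced by PDFs and sampled
        # 10,000 times through a simplified inhalation-dose chain.
        rng = np.random.default_rng(7)
        N = 10_000

        source = rng.triangular(1e9, 5e9, 2e10, N)    # activity at risk (Bq)
        arf = rng.uniform(1e-4, 1e-3, N)              # airborne release fraction
        chi_q = rng.lognormal(np.log(1e-4), 0.7, N)   # dispersion factor (s/m^3)
        breathing = rng.normal(3.3e-4, 0.3e-4, N)     # breathing rate (m^3/s)
        dcf = 1.1e-4                                  # Sv per Bq inhaled (50-yr CEDE)

        cede = source * arf * chi_q * breathing * dcf # Sv per iteration
        print(f"mean CEDE       {cede.mean():.2e} Sv")
        print(f"95th percentile {np.percentile(cede, 95):.2e} Sv")
        print(f"P(CEDE > 1 mSv) {(cede > 1e-3).mean():.3f}")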

  7. When probabilistic seismic hazard climbs volcanoes: the Mt. Etna case, Italy - Part 1: Model components for sources parameterization

    Science.gov (United States)

    Azzaro, Raffaele; Barberi, Graziella; D'Amico, Salvatore; Pace, Bruno; Peruzza, Laura; Tuvè, Tiziana

    2017-11-01

    The volcanic region of Mt. Etna (Sicily, Italy) represents a perfect lab for testing innovative approaches to seismic hazard assessment. This is largely due to the long record of historical and recent observations of seismic and tectonic phenomena, the high quality of various geophysical monitoring systems and, particularly, the rapid geodynamics that clearly demonstrate some seismotectonic processes. We present here the model components and the procedures adopted for defining the seismic sources to be used in a new generation of probabilistic seismic hazard assessment (PSHA), the first results and maps of which are presented in a companion paper, Peruzza et al. (2017). The sources include, with increasing complexity, seismic zones, individual faults and gridded point sources that are obtained by integrating geological field data with long and short earthquake datasets (the historical macroseismic catalogue, which covers about 3 centuries, and a high-quality instrumental location database for the last decades). The analysis of the frequency-magnitude distribution identifies two main fault systems within the volcanic complex featuring different seismic rates that are controlled essentially by volcano-tectonic processes. We discuss the variability of the mean occurrence times of major earthquakes along the main Etnean faults by using an historical approach and a purely geologic method. We derive a magnitude-size scaling relationship specifically for this volcanic area, which has been implemented into a recently developed software tool - FiSH (Pace et al., 2016) - that we use to calculate the characteristic magnitudes and the related mean recurrence times expected for each fault. Results suggest that for the Mt. Etna area the traditional assumptions of uniform and Poissonian seismicity can be relaxed; a time-dependent fault-based modeling, joined with a 3-D imaging of volcano-tectonic sources depicted by the recent instrumental seismicity, can therefore be implemented in PSHA maps

  8. Ecohydrology of agroecosystems: probabilistic description of yield reduction risk under limited water availability

    Science.gov (United States)

    Vico, Giulia; Porporato, Amilcare

    2013-04-01

    Supplemental irrigation represents one of the main strategies to mitigate the effects of climate variability and stabilize yields. Irrigated agriculture currently provides 40% of food production and its relevance is expected to further increase in the near future, in the face of the projected alterations of rainfall patterns and the increase in food, fiber, and biofuel demand. Because of the significant investments and water requirements involved in irrigation, strategic choices are needed to preserve productivity and profitability, while maintaining a sustainable water management - a nontrivial task given the unpredictability of the rainfall forcing. To facilitate decision making under uncertainty, a widely applicable probabilistic framework is proposed. The occurrence of rainfall events and irrigation applications are linked probabilistically to crop development during the growing season and yields at harvest. Based on these linkages, the probability density function of yields and the corresponding probability density function of required irrigation volumes, as well as the probability density function of yields under the most common case of limited water availability, are obtained analytically as a function of irrigation strategy, climate, soil and crop parameters. The full probabilistic description of the frequency of occurrence of yields and water requirements is a crucial tool for decision making under uncertainty, e.g., via expected utility analysis. Furthermore, the knowledge of the probability density function of yield allows us to quantify the yield reduction hydrologic risk. Two risk indices are defined and quantified: the long-term risk index, suitable for long-term irrigation strategy assessment and investment planning, and the real-time risk index, providing a rigorous probabilistic quantification of the emergence of drought conditions during a single growing season in an agricultural setting. Our approach employs relatively few parameters and is thus easily and

  9. Probabilistic studies of accident sequences

    International Nuclear Information System (INIS)

    Villemeur, A.; Berger, J.P.

    1986-01-01

    For several years, Electricite de France has carried out probabilistic assessments of accident sequences for nuclear power plants. In the framework of this program, many methods were developed. As interest in these studies increased and as adapted methods became available, Electricite de France undertook a probabilistic safety assessment of a nuclear power plant [fr]

  10. Convex sets in probabilistic normed spaces

    International Nuclear Information System (INIS)

    Aghajani, Asadollah; Nourouzi, Kourosh

    2008-01-01

    In this paper we obtain some results on convexity in a probabilistic normed space. We also investigate the concept of CSN-closedness and CSN-compactness in a probabilistic normed space and generalize the corresponding results of normed spaces

  11. A Study on Improvement of Algorithm for Source Term Evaluation

    International Nuclear Information System (INIS)

    Park, Jeong Ho; Park, Do Hyung; Lee, Jae Hee

    2010-03-01

    The program developed by KAERI for the source term assessment of radwastes from the advanced nuclear fuel cycle consists of a spent fuel database analysis module, a spent fuel arising projection module, and an automatic characterization module for radwastes from pyroprocessing. To improve the algorithm adopted in the developed program, the following items were carried out: - development of an algorithm to decrease the analysis time for the spent fuel database - development of a setup routine for the analysis procedure - improvement of the interface for the spent fuel arising projection module - optimization of the data management algorithm needed for the massive calculations to estimate source terms of radwastes from the advanced fuel cycle. The program developed through this study is capable of performing source term estimation even when several spent fuel assemblies with different fuel designs, initial enrichments, irradiation histories, discharge burnups, and cooling times are processed at the same time in the pyroprocess. It is expected that this program will be very useful for the design of the unit processes of pyroprocessing and the disposal system.

  12. Conceptual model for deriving the repository source term

    International Nuclear Information System (INIS)

    Alexander, D.H.; Apted, M.J.; Liebetrau, A.M.; Doctor, P.G.; Williford, R.E.; Van Luik, A.E.

    1984-11-01

    Part of a strategy for evaluating the compliance of geologic repositories with federal regulations is a modeling approach that would provide realistic release estimates for a particular configuration of the engineered-barrier system. The objective is to avoid worst-case bounding assumptions that are physically impossible or excessively conservative and to obtain probabilistic estimates of (1) the penetration time for metal barriers and (2) radionuclide-release rates for individually simulated waste packages after penetration has occurred. The conceptual model described in this paper will assume that release rates are explicitly related to such time-dependent processes as mass transfer, dissolution and precipitation, radionuclide decay, and variations in the geochemical environment. The conceptual model will take into account the reduction in the rates of waste-form dissolution and metal corrosion due to a buildup of chemical reaction products. The sorptive properties of the metal-barrier corrosion products in proximity to the waste form surface will also be included. Cumulative releases from the engineered-barrier system will be calculated by summing the releases from a probabilistically generated population of individual waste packages. 14 refs., 7 figs
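
    As a caricature of the final summation step, the sketch below draws a population of simulated waste packages with probabilistic barrier-penetration times and post-penetration release rates, and accumulates the release from the engineered-barrier system; all distributions and parameter values are invented for illustration and are not those of the conceptual model.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    N_PACKAGES = 1_000
    HORIZON = 10_000.0          # simulation horizon [yr] (assumed)
    DECAY = np.log(2) / 30.0    # decay constant for a ~30 yr half-life nuclide (assumed)

    # Metal-barrier penetration time: lognormal with median ~1000 yr (assumed)
    t_pen = rng.lognormal(mean=np.log(1000.0), sigma=0.5, size=N_PACKAGES)
    # Fractional leach rate after penetration [1/yr] (assumed variability)
    k_rel = rng.lognormal(mean=np.log(1e-5), sigma=0.7, size=N_PACKAGES)

    # Per-package cumulative release up to HORIZON: the normalized inventory
    # decays until penetration, then leaches at rate k while still decaying.
    dt = np.clip(HORIZON - t_pen, 0.0, None)
    inv_at_pen = np.exp(-DECAY * t_pen)
    frac_released = k_rel / (k_rel + DECAY) * (1.0 - np.exp(-(k_rel + DECAY) * dt))
    release = inv_at_pen * frac_released

    print(f"total EBS release (normalized units): {release.sum():.4f}")
    print(f"packages penetrated within horizon: {(t_pen < HORIZON).mean():.1%}")
    ```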

  13. Probabilistic Forecasting of the Wave Energy Flux

    DEFF Research Database (Denmark)

    Pinson, Pierre; Reikard, G.; Bidlot, J.-R.

    2012-01-01

    Wave energy will certainly have a significant role to play in the deployment of renewable energy generation capacities. As with wind and solar, probabilistic forecasts of wave power over horizons of a few hours to a few days are required for power system operation as well as trading in electricity markets. [...]% and 70% in terms of the Continuous Ranked Probability Score (CRPS), depending upon the test case and the lead time. It is finally shown that the log-Normal assumption can be seen as acceptable, even though it may be refined in the future.
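
    The CRPS used above can be estimated directly from predictive samples via the energy form CRPS(F, y) = E|X - y| - 0.5 E|X - X'|. A minimal sketch with a hypothetical log-Normal predictive distribution and an invented verifying observation:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def crps_from_samples(samples: np.ndarray, obs: float) -> float:
        """Sample-based CRPS estimator: E|X - y| - 0.5 * E|X - X'|."""
        term1 = np.mean(np.abs(samples - obs))
        term2 = 0.5 * np.mean(np.abs(samples[:, None] - samples[None, :]))
        return term1 - term2

    # Hypothetical log-Normal predictive distribution for wave energy flux [kW/m]
    mu, sigma = np.log(25.0), 0.4                          # assumed parameters
    forecast_samples = rng.lognormal(mu, sigma, size=2_000)

    observed_flux = 30.0                                   # assumed observation
    print(f"CRPS = {crps_from_samples(forecast_samples, observed_flux):.3f} kW/m")
    ```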

  14. Source terms in relation to air cleaning

    International Nuclear Information System (INIS)

    Bernero, R.M.

    1985-01-01

    There are two sets of source terms for consideration in air cleaning: those for routine releases and those for accident releases. With about 1000 reactor-years of commercial operating experience accumulated in the US, there is an excellent data base for routine and expected transient releases. Specifications for air cleaning can be based on this body of experience with confidence. Specifications for air cleaning in accident situations are another matter. Recent investigations of severe accident behavior are offering a new basis for source terms and air cleaning specifications. Reports by many experts in the field describe an accident environment notably different from previous models. It is an atmosphere heavy with aerosols, both radioactive and inert. Temperatures are sometimes very high; radioiodine is typically in the form of cesium iodide aerosol particles; other nuclides, such as tellurium, are also important aerosols. Some of the present air cleaning requirements may be very important in light of these new accident behavior models. Others may be wasteful or even counterproductive. The use of the new data on accident behavior models to promptly reevaluate requirements is discussed.

  15. Source-term reevaluation for US commercial nuclear power reactors: a status report

    International Nuclear Information System (INIS)

    Herzenberg, C.L.; Ball, J.R.; Ramaswami, D.

    1984-12-01

    Only results that had been discussed publicly, had been published in the open literature, or were available in preliminary reports as of September 30, 1984, are included here. More than 20 organizations are participating in source-term programs, which have been undertaken to examine severe accident phenomena in light-water power reactors (including the chemical and physical behavior of fission products under accident conditions), update and reevaluate source terms, and resolve differences between predictions and observations of radiation releases and related phenomena. Results from these source-term activities have been documented in over 100 publications to date

  16. Probabilistic risk analysis and fault trees: Initial discussion of application to identification of risk at a wellhead

    Science.gov (United States)

    Rodak, C.; Silliman, S.

    2012-02-01

    Wellhead protection is of critical importance for managing groundwater resources. While a number of previous authors have addressed questions related to uncertainties in advective capture zones, methods for addressing wellhead protection in the presence of uncertainty in the chemistry of groundwater contaminants, the relationship between land-use and contaminant sources, and the impact on health of the receiving population are limited. It is herein suggested that probabilistic risk analysis (PRA) combined with fault trees (FT) provides a structure whereby chemical transport can be combined with uncertainties in source, chemistry, and health impact to assess the probability of negative health outcomes in the population. As such, PRA-FT provides a new strategy for the identification of areas of probabilistically high human health risk. Application of this approach is demonstrated through a simplified case study involving flow to a well in an unconfined aquifer with heterogeneity in aquifer properties and contaminant sources.
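
    The PRA-FT combination can be made concrete with a toy fault tree in which independent basic events (a contaminant source on the land surface, transport to the well, failure of a treatment barrier, an adverse health response) combine through OR/AND gates into a top-event probability; the event structure and probabilities below are invented for illustration and are not the case study's values.

    ```python
    # Minimal fault-tree sketch: independent basic events under AND/OR gates.
    def p_and(*ps):        # all events must occur
        out = 1.0
        for p in ps:
            out *= p
        return out

    def p_or(*ps):         # at least one event occurs (independence assumed)
        out = 1.0
        for p in ps:
            out *= (1.0 - p)
        return 1.0 - out

    # Hypothetical basic-event probabilities (illustrative only)
    p_source = p_or(0.05, 0.02)     # leaking tank OR agricultural application
    p_transport = 0.30              # contaminant reaches the well capture zone
    p_no_treatment = 0.10           # treatment barrier fails
    p_health = 0.08                 # exposure produces an adverse health effect

    p_top = p_and(p_source, p_transport, p_no_treatment, p_health)
    print(f"P(negative health outcome) = {p_top:.2e}")
    ```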

  17. Bayesian networks for identifying incorrect probabilistic intuitions in a climate trend uncertainty quantification context

    NARCIS (Netherlands)

    Hanea, A.M.; Nane, G.F.; Wielicki, B.A.; Cooke, R.M.

    2018-01-01

    Probabilistic thinking can often be unintuitive. This is the case even for simple problems, let alone the more complex ones arising in climate modelling, where disparate information sources need to be combined. The physical models, the natural variability of systems, the measurement errors and …

  18. A Probabilistic Palimpsest Model of Visual Short-term Memory

    Science.gov (United States)

    Matthey, Loic; Bays, Paul M.; Dayan, Peter

    2015-01-01

    Working memory plays a key role in cognition, and yet its mechanisms remain much debated. Human performance on memory tasks is severely limited; however, the two major classes of theory explaining the limits leave open questions about key issues such as how multiple simultaneously-represented items can be distinguished. We propose a palimpsest model, with the occurrent activity of a single population of neurons coding for several multi-featured items. Using a probabilistic approach to storage and recall, we show how this model can account for many qualitative aspects of existing experimental data. In our account, the underlying nature of a memory item depends entirely on the characteristics of the population representation, and we provide analytical and numerical insights into critical issues such as multiplicity and binding. We consider representations in which information about individual feature values is partially separate from the information about binding that creates single items out of multiple features. An appropriate balance between these two types of information is required to capture fully the different types of error seen in human experimental data. Our model provides the first principled account of misbinding errors. We also suggest a specific set of stimuli designed to elucidate the representations that subjects actually employ. PMID:25611204

  19. Estimating the number of sources in a noisy convolutive mixture using BIC

    DEFF Research Database (Denmark)

    Olsson, Rasmus Kongsgaard; Hansen, Lars Kai

    2004-01-01

    The number of source signals in a noisy convolutive mixture is determined based on the exact log-likelihoods of the candidate models. In (Olsson and Hansen, 2004), a novel probabilistic blind source separator was introduced that is based solely on the time-varying second-order statistics of the sources …
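
    The model-order criterion here is the standard BIC, -2 log L + k log N, minimized over candidate model sizes. The sketch below shows that selection step generically, with a scikit-learn Gaussian mixture standing in for the authors' convolutive separator, whose likelihood is not reproduced here:

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(2)

    # Surrogate data: a 2-D three-component mixture stands in for the
    # separator's likelihood model (illustrative only).
    X = np.vstack([
        rng.normal(loc=[0, 0], scale=0.5, size=(300, 2)),
        rng.normal(loc=[3, 3], scale=0.5, size=(300, 2)),
        rng.normal(loc=[0, 4], scale=0.5, size=(300, 2)),
    ])

    # BIC penalizes model complexity; pick the minimizing model order.
    bics = {}
    for k in range(1, 7):
        gm = GaussianMixture(n_components=k, random_state=0).fit(X)
        bics[k] = gm.bic(X)

    best = min(bics, key=bics.get)
    print({k: round(v, 1) for k, v in bics.items()})
    print(f"estimated number of sources: {best}")
    ```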

  20. Analytical incorporation of fractionation effects in probabilistic treatment planning for intensity-modulated proton therapy.

    Science.gov (United States)

    Wahl, Niklas; Hennig, Philipp; Wieser, Hans-Peter; Bangert, Mark

    2018-04-01

    We show that it is possible to explicitly incorporate fractionation effects into closed-form probabilistic treatment plan analysis and optimization for intensity-modulated proton therapy with analytical probabilistic modeling (APM). We study the impact of different fractionation schemes on the dosimetric uncertainty induced by random and systematic sources of range and setup uncertainty for treatment plans that were optimized with and without consideration of the number of treatment fractions. The APM framework is capable of handling arbitrarily correlated uncertainty models, including systematic and random errors, in the context of fractionation. On this basis, we construct an analytical dose variance computation pipeline that explicitly considers the number of treatment fractions for uncertainty quantitation and minimization during treatment planning. We evaluate the variance computation model in comparison to random sampling of 100 treatments for conventional and probabilistic treatment plans under different fractionation schemes (1, 5, 30 fractions) for an intracranial, a paraspinal and a prostate case. The impact of neglecting the fractionation scheme during treatment planning is investigated by applying treatment plans that were generated with probabilistic optimization for 1 fraction in a higher number of fractions and comparing them to the probabilistic plans optimized under explicit consideration of the number of fractions. APM enables the construction of an analytical variance computation model for dose uncertainty considering fractionation at negligible computational overhead. It is computationally feasible (a) to simultaneously perform a robustness analysis for all possible fraction numbers and (b) to perform a probabilistic treatment plan optimization for a specific fraction number. The incorporation of fractionation assumptions for robustness analysis exposes a dose-to-uncertainty trade-off, i.e., the dose in the organs at risk is increased for a …
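
    The core statistical point, that random errors average out over fractions while systematic errors do not, can be checked with a one-voxel toy model in which dose responds linearly to a setup shift: the accumulated-dose standard deviation is grad * sqrt(n^2 * sigma_sys^2 + n * sigma_rand^2). The sketch below verifies this against sampling; it is not APM, whose machinery is analytical and voxel-resolved, and all numbers are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    N_FRACTIONS = [1, 5, 30]    # fractionation schemes from the study
    SIGMA_SYS = 2.0             # systematic setup error [mm] (assumed)
    SIGMA_RAND = 3.0            # per-fraction random setup error [mm] (assumed)
    GRAD = 0.5                  # assumed local dose gradient [Gy/mm]
    D_FRACTION = 2.0            # nominal dose per fraction [Gy]

    for n in N_FRACTIONS:
        # One systematic shift per treatment, one random shift per fraction
        sys_shift = rng.normal(0.0, SIGMA_SYS, size=100_000)
        rand_shift = rng.normal(0.0, SIGMA_RAND, size=(100_000, n))
        dose = (D_FRACTION + GRAD * (sys_shift[:, None] + rand_shift)).sum(axis=1)
        # Linearized analytical standard deviation of the accumulated dose
        analytic = GRAD * np.sqrt(n**2 * SIGMA_SYS**2 + n * SIGMA_RAND**2)
        print(f"n={n:2d}: MC std = {dose.std():6.2f} Gy, analytic = {analytic:6.2f} Gy")
    ```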

  1. Accident source terms for Light-Water Nuclear Power Plants. Final report

    International Nuclear Information System (INIS)

    Soffer, L.; Burson, S.B.; Ferrell, C.M.; Lee, R.Y.; Ridgely, J.N.

    1995-02-01

    In 1962 the US Atomic Energy Commission published TID-14844, ''Calculation of Distance Factors for Power and Test Reactors'', which specified a release of fission products from the core to the reactor containment for a postulated accident involving ''substantial meltdown of the core''. This ''source term'', the basis for the NRC's Regulatory Guides 1.3 and 1.4, has been used to determine compliance with the NRC's reactor site criteria, 10 CFR Part 100, and to evaluate other important plant performance requirements. During the past 30 years substantial additional information on fission product releases has been developed based on significant severe accident research. This document utilizes this research by providing more realistic estimates of the ''source term'' release into containment, in terms of timing, nuclide types, quantities and chemical form, given a severe core-melt accident. This revised ''source term'' is to be applied to the design of future light water reactors (LWRs). Current LWR licensees may voluntarily propose applications based upon it.

  2. Reassessment of the technical bases for estimating source terms. Draft report for comment

    International Nuclear Information System (INIS)

    Silberberg, M.; Mitchell, J.A.; Meyer, R.O.; Pasedag, W.F.; Ryder, C.P.; Peabody, C.A.; Jankowski, M.W.

    1985-07-01

    NUREG-0956 describes the NRC staff and contractor efforts to reassess and update the agency's analytical procedures for estimating accident source terms for nuclear power plants. The effort included development of a new source term analytical procedure - a set of computer codes - that is intended to replace the methodology of the Reactor Safety Study (WASH-1400) and to be used in reassessing the use of TID-14844 assumptions (10 CFR 100). NUREG-0956 describes the development of these codes, the demonstration of the codes to calculate source terms for specific cases, the peer review of this work, some perspectives on the overall impact of new source terms on plant risks, the plans for related research projects, and the conclusions and recommendations resulting from the effort

  3. Consideration of aging in probabilistic safety assessment

    International Nuclear Information System (INIS)

    Titina, B.; Cepin, M.

    2007-01-01

    Probabilistic safety assessment is a standardised tool for assessing the safety of nuclear power plants. It is a complement to the safety analyses. Standard probabilistic models of safety equipment assume a constant component failure rate. Ageing of systems, structures and components can theoretically be included in a new, age-dependent probabilistic safety assessment, which generally causes the failure rate to be a function of age. New age-dependent probabilistic safety assessment models, which offer explicit calculation of the ageing effects, are developed. Several groups of components that require their own unique models are considered: e.g. operating components and stand-by components. The developed component-level models are inserted into the models of the probabilistic safety assessment so that the ageing effects are evaluated for complete systems. The preliminary results show that the lack of the data necessary for the consideration of ageing leads to highly uncertain models and, consequently, results. (author)

  4. Implications of probabilistic risk assessment

    International Nuclear Information System (INIS)

    Cullingford, M.C.; Shah, S.M.; Gittus, J.H.

    1987-01-01

    Probabilistic risk assessment (PRA) is an analytical process that quantifies the likelihoods, consequences and associated uncertainties of the potential outcomes of postulated events. Starting with planned or normal operation, probabilistic risk assessment covers a wide range of potential accidents and considers the whole plant and the interactions of systems and human actions. Probabilistic risk assessment can be applied in safety decisions in design, licensing and operation of industrial facilities, particularly nuclear power plants. The proceedings include a review of PRA procedures, methods and technical issues in treating uncertainties, operating and licensing issues and future trends. Risk assessments for specific reactor types or components and specific risks (e.g. aircraft crashing onto a reactor) are used to illustrate the points raised. All 52 articles are indexed separately. (U.K.)

  5. Information Uncertainty in Electricity Markets: Introducing Probabilistic Offers

    DEFF Research Database (Denmark)

    Papakonstantinou, Athanasios; Pinson, Pierre

    2016-01-01

    We propose a shift from the current paradigm of electricity markets treating stochastic producers similarly to conventional ones in terms of their offers. We argue that the producers’ offers should be probabilistic to reflect the limited predictability of renewable energy generation, while we should design market mechanisms to accommodate such offers. We argue that the transition from deterministic offers is a natural next step in electricity markets, by analytically proving our proposal’s equivalence with a two-price conventional market.

  6. Branching bisimulation congruence for probabilistic systems

    NARCIS (Netherlands)

    Trcka, N.; Georgievska, S.; Aldini, A.; Baier, C.

    2008-01-01

    The notion of branching bisimulation for the alternating model of probabilistic systems is not a congruence with respect to parallel composition. In this paper we first define another branching bisimulation in the more general model allowing consecutive probabilistic transitions, and we prove that …

  7. Probabilistic seasonal Forecasts to deterministic Farm Leve Decisions: Innovative Approach

    Science.gov (United States)

    Mwangi, M. W.

    2015-12-01

    Climate change and vulnerability are major challenges in ensuring household food security. Climate information services have the potential to cushion rural households from extreme climate risks. However, the probabilistic nature of climate information products is not easily understood by the majority of smallholder farmers. Despite this, climate information has proved to be a valuable climate risk adaptation strategy at the farm level. This calls for innovative ways to help farmers understand and apply climate information services to inform their farm-level decisions. The study endeavored to co-design and test appropriate innovation systems for climate information services uptake and the scale-up necessary for achieving climate risk management. In addition, it determined the conditions necessary to support the effective performance of the proposed innovation system. Data and information sources included a systematic literature review, secondary sources, government statistics, focus group discussions, household surveys and semi-structured interviews. Data was analyzed using both quantitative and qualitative data analysis techniques. Quantitative data was analyzed using the Statistical Package for Social Sciences (SPSS) software. Qualitative data was analyzed using qualitative techniques, which involved establishing the categories and themes, relationships/patterns and conclusions in line with the study objectives. Sustainable livelihoods, reduced household poverty and climate change resilience were the impacts that resulted from the study.

  8. Neo-Deterministic and Probabilistic Seismic Hazard Assessments: a Comparative Analysis

    Science.gov (United States)

    Peresan, Antonella; Magrin, Andrea; Nekrasova, Anastasia; Kossobokov, Vladimir; Panza, Giuliano F.

    2016-04-01

    Objective testing is the key issue towards any reliable seismic hazard assessment (SHA). Different earthquake hazard maps must demonstrate their capability in anticipating ground shaking from future strong earthquakes before being appropriately used for different purposes - such as engineering design, insurance, and emergency management. Quantitative assessment of map performance is an essential step also in the scientific process of their revision and possible improvement. Cross-checking of probabilistic models with available observations and independent physics-based models is recognized as a major validation procedure. The existing maps from classical probabilistic seismic hazard analysis (PSHA), as well as those from the neo-deterministic analysis (NDSHA), which have already been developed for several regions worldwide (including Italy, India and North Africa), are considered to exemplify the possibilities of the cross-comparative analysis in spotting out limits and advantages of different methods. Where the data permit, a comparative analysis versus the documented seismic activity observed in reality is carried out, showing how available observations about past earthquakes can contribute to assessing the performance of the different methods. Neo-deterministic refers to a scenario-based approach, which allows for the consideration of a wide range of possible earthquake sources as the starting point for scenarios constructed via full waveform modeling. The method does not make use of empirical attenuation models (i.e. Ground Motion Prediction Equations, GMPE) and naturally supplies realistic time series of ground shaking (i.e. complete synthetic seismograms), readily applicable to complete engineering analysis and other mitigation actions. The standard NDSHA maps provide reliable envelope estimates of maximum seismic ground motion from a wide set of possible scenario earthquakes, including the largest deterministically or historically defined credible earthquake. In addition …

  9. Pennsylvania Source Term Tracking System

    International Nuclear Information System (INIS)

    1992-08-01

    The Pennsylvania Source Term Tracking System tabulates surveys received from radioactive waste generators in the Commonwealth. Information on radioactive waste is collected each quarter from generators using the Low-Level Radioactive Waste Management Quarterly Report Form (hereafter called the survey) and then entered into the tracking system data base. This personal computer-based tracking system can generate 12 types of tracking reports. The first four sections of this reference manual supply complete instructions for installing and setting up the tracking system on a PC. Section 5 presents instructions for entering quarterly survey data, and Section 6 discusses generating reports. The appendix includes samples of each report.

  10. A mediation model to explain decision making under conditions of risk among adolescents: the role of fluid intelligence and probabilistic reasoning.

    Science.gov (United States)

    Donati, Maria Anna; Panno, Angelo; Chiesi, Francesca; Primi, Caterina

    2014-01-01

    This study tested the mediating role of probabilistic reasoning ability in the relationship between fluid intelligence and advantageous decision making among adolescents in explicit situations of risk - that is, in contexts in which information on the choice options (gains, losses, and probabilities) was explicitly presented at the beginning of the task. Participants were 282 adolescents attending high school (77% males, mean age = 17.3 years). We first measured fluid intelligence and probabilistic reasoning ability. Then, to measure decision making under explicit conditions of risk, participants performed the Game of Dice Task, in which they have to decide among different alternatives that are explicitly linked to a specific amount of gain or loss and have obvious winning probabilities that are stable over time. Analyses showed a significant positive indirect effect of fluid intelligence on advantageous decision making through probabilistic reasoning ability, which acted as a mediator. Specifically, fluid intelligence may enhance the ability to reason in probabilistic terms, which in turn increases the likelihood of advantageous choices when adolescents are confronted with an explicit decisional context. Findings show that in experimental paradigm settings, adolescents are able to make advantageous decisions using cognitive abilities when faced with decisions under explicit risky conditions. This study suggests that interventions designed to promote probabilistic reasoning, for example by incrementing the mathematical prerequisites necessary to reason in probabilistic terms, may have a positive effect on adolescents' decision-making abilities.

  11. Deterministic and Probabilistic Serviceability Assessment of Footbridge Vibrations due to a Single Walker Crossing

    Directory of Open Access Journals (Sweden)

    Cristoforo Demartino

    2018-01-01

    This paper presents a numerical study on the deterministic and probabilistic serviceability assessment of footbridge vibrations due to a single walker crossing. The dynamic response of the footbridge is analyzed by means of modal analysis, considering only the first lateral and vertical modes. Single span footbridges with uniform mass distribution are considered, with different values of the span length, natural frequencies, mass, and structural damping and with different support conditions. The load induced by a single walker crossing the footbridge is modeled as a moving sinusoidal force either in the lateral or in the vertical direction. The variability of the characteristics of the load induced by walkers is modeled using probability distributions taken from the literature defining a Standard Population of walkers. Deterministic and probabilistic approaches were adopted to assess the peak response. Based on the results of the simulations, deterministic and probabilistic vibration serviceability assessment methods are proposed, not requiring numerical analyses. Finally, an example of the application of the proposed method to a truss steel footbridge is presented. The results highlight the advantages of the probabilistic procedure in terms of reliability quantification.
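
    A single-mode version of the probabilistic assessment can be sketched as follows: the first vertical mode is driven by a moving sinusoidal force, walker parameters are sampled from assumed distributions, and a high percentile of the peak acceleration is reported. The structural values and walker statistics below are illustrative placeholders, not the paper's Standard Population.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    rng = np.random.default_rng(4)

    # Illustrative first vertical mode of a footbridge (assumed values)
    L_SPAN = 40.0        # span [m]
    F_N = 2.0            # natural frequency [Hz]
    ZETA = 0.005         # damping ratio
    M_MODAL = 20_000.0   # modal mass [kg]

    def peak_acceleration(weight, f_walk, dlf, speed):
        """Peak modal acceleration for one walker crossing (first mode only)."""
        omega = 2.0 * np.pi * F_N
        t_cross = L_SPAN / speed

        def modal_force(t):
            # Sinusoidal walking force weighted by the mode shape at the walker
            return dlf * weight * np.sin(2.0 * np.pi * f_walk * t) \
                   * np.sin(np.pi * speed * t / L_SPAN)

        def rhs(t, y):
            q, qd = y
            return [qd, modal_force(t) / M_MODAL
                        - 2.0 * ZETA * omega * qd - omega**2 * q]

        sol = solve_ivp(rhs, (0.0, t_cross), [0.0, 0.0],
                        t_eval=np.linspace(0.0, t_cross, 2000), max_step=0.01)
        q, qd = sol.y
        qdd = modal_force(sol.t) / M_MODAL - 2.0 * ZETA * omega * qd - omega**2 * q
        return np.abs(qdd).max()

    # Monte Carlo over a hypothetical population of walkers
    peaks = [peak_acceleration(weight=rng.normal(700.0, 100.0),   # [N]
                               f_walk=rng.normal(1.9, 0.2),       # [Hz]
                               dlf=rng.normal(0.4, 0.05),         # dyn. load factor
                               speed=rng.normal(1.4, 0.2))        # [m/s]
             for _ in range(200)]
    print(f"95th-percentile peak acceleration: {np.quantile(peaks, 0.95):.3f} m/s^2")
    ```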

  12. Grammaticality, Acceptability, and Probability: A Probabilistic View of Linguistic Knowledge.

    Science.gov (United States)

    Lau, Jey Han; Clark, Alexander; Lappin, Shalom

    2017-07-01

    The question of whether humans represent grammatical knowledge as a binary condition on membership in a set of well-formed sentences, or as a probabilistic property, has been the subject of debate among linguists, psychologists, and cognitive scientists for many decades. Acceptability judgments present a serious problem for both classical binary and probabilistic theories of grammaticality. These judgments are gradient in nature, and so cannot be directly accommodated in a binary formal grammar. However, it is also not possible to simply reduce acceptability to probability. The acceptability of a sentence is not the same as the likelihood of its occurrence, which is, in part, determined by factors like sentence length and lexical frequency. In this paper, we present the results of a set of large-scale experiments using crowd-sourced acceptability judgments that demonstrate gradience to be a pervasive feature in acceptability judgments. We then show how one can predict acceptability judgments on the basis of probability by augmenting probabilistic language models with an acceptability measure. This is a function that normalizes probability values to eliminate the confounding factors of length and lexical frequency. We describe a sequence of modeling experiments with unsupervised language models drawn from state-of-the-art machine learning methods in natural language processing. Several of these models achieve very encouraging levels of accuracy in the acceptability prediction task, as measured by the correlation between the acceptability measure scores and mean human acceptability values. We consider the relevance of these results to the debate on the nature of grammatical competence, and we argue that they support the view that linguistic knowledge can be intrinsically probabilistic. Copyright © 2016 Cognitive Science Society, Inc.
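
    One acceptability measure of the kind described in this literature is the syntactic log-odds ratio (SLOR), which normalizes a language model's log-probability by sentence length and unigram (lexical-frequency) log-probability; whether it is the best-performing function in the paper is not claimed here. A minimal sketch with invented scores:

    ```python
    def slor(logp_sentence: float, logp_unigrams: float, n_tokens: int) -> float:
        """Syntactic log-odds ratio: LM log-probability normalized by the
        unigram log-probability and the sentence length."""
        return (logp_sentence - logp_unigrams) / n_tokens

    # Hypothetical language-model scores for two sentences of different length
    # (values invented for illustration).
    print(slor(logp_sentence=-42.0, logp_unigrams=-55.0, n_tokens=10))  # 1.30
    print(slor(logp_sentence=-18.0, logp_unigrams=-20.0, n_tokens=4))   # 0.50
    ```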

  13. Probabilistic approach to mechanisms

    CERN Document Server

    Sandler, BZ

    1984-01-01

    This book discusses the application of probabilistics to the investigation of mechanical systems. The book shows, for example, how random function theory can be applied directly to the investigation of random processes in the deflection of cam profiles, pitch or gear teeth, pressure in pipes, etc. The author also deals with some other technical applications of probabilistic theory, including, amongst others, those relating to pneumatic and hydraulic mechanisms and roller bearings. Many of the aspects are illustrated by examples of applications of the techniques under discussion.

  14. Source term model evaluations for the low-level waste facility performance assessment

    Energy Technology Data Exchange (ETDEWEB)

    Yim, M.S.; Su, S.I. [North Carolina State Univ., Raleigh, NC (United States)]

    1995-12-31

    The estimation of release of radionuclides from various waste forms to the bottom boundary of the waste disposal facility (source term) is one of the most important aspects of LLW facility performance assessment. In this work, several currently used source term models are comparatively evaluated for the release of carbon-14 based on a test case problem. The models compared include PRESTO-EPA-CPG, IMPACTS, DUST and NEFTRAN-II. Major differences in assumptions and approaches between the models are described and key parameters are identified through sensitivity analysis. The source term results from different models are compared and other concerns or suggestions are discussed.

  15. An investigation of the closure problem applied to reactor accident source terms

    International Nuclear Information System (INIS)

    Brearley, I.R.; Nixon, W.; Hayns, M.R.

    1987-01-01

    The closure problem, as considered here, focuses attention on the question of when, in current research programmes, enough will have been learned about the source terms for reactor accident releases. Noting that current research is tending to reduce the estimated magnitude of the aerosol component of accidental atmospheric releases, several possible criteria for closure are suggested. Moreover, using the reactor accident consequence model CRACUK, the effect of gradually reducing the aerosol release fractions of a pressurized water reactor (PWR2) source term (as defined in the WASH-1400 study) is investigated, and the implications of applying the suggested criteria to current source term research are discussed. (author)

  16. Probabilistic machine learning and artificial intelligence.

    Science.gov (United States)

    Ghahramani, Zoubin

    2015-05-28

    How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.

  18. Guided SAR image despeckling with probabilistic non local weights

    Science.gov (United States)

    Gokul, Jithin; Nair, Madhu S.; Rajan, Jeny

    2017-12-01

    SAR images are generally corrupted by granular disturbances called speckle, which make visual analysis and detail extraction a difficult task. Non-local despeckling techniques with probabilistic similarity have been a recent trend in SAR despeckling. To achieve effective speckle suppression without compromising detail preservation, we propose an improvement to the existing Generalized Guided Filter with Bayesian Non-Local Means (GGF-BNLM) method. The proposed method (Guided SAR Image Despeckling with Probabilistic Non-Local Weights) replaces parametric constants based on heuristics in the GGF-BNLM method with dynamically derived values based on the image statistics for weight computation. The proposed changes make the GGF-BNLM method adaptive and, as a result, significant improvement is achieved in terms of performance. Experimental analysis on SAR images shows excellent speckle reduction without compromising feature preservation when compared to the GGF-BNLM method. Results are also compared with other state-of-the-art and classic SAR despeckling techniques to demonstrate the effectiveness of the proposed method.

  19. Use and Communication of Probabilistic Forecasts.

    Science.gov (United States)

    Raftery, Adrian E

    2016-12-01

    Probabilistic forecasts are becoming more and more available. How should they be used and communicated? What are the obstacles to their use in practice? I review experience with five problems where probabilistic forecasting played an important role. This leads me to identify five types of potential users: Low Stakes Users, who don't need probabilistic forecasts; General Assessors, who need an overall idea of the uncertainty in the forecast; Change Assessors, who need to know if a change is out of line with expectations; Risk Avoiders, who wish to limit the risk of an adverse outcome; and Decision Theorists, who quantify their loss function and perform the decision-theoretic calculations. This suggests that it is important to interact with users and to consider their goals. The cognitive research tells us that calibration is important for trust in probability forecasts, and that it is important to match the verbal expression with the task. The cognitive load should be minimized, reducing the probabilistic forecast to a single percentile if appropriate. Probabilities of adverse events and percentiles of the predictive distribution of quantities of interest seem often to be the best way to summarize probabilistic forecasts. Formal decision theory has an important role, but in a limited range of applications.

  1. Source terms: an investigation of uncertainties, magnitudes, and recommendations for research. [PWR; BWR

    Energy Technology Data Exchange (ETDEWEB)

    Levine, S.; Kaiser, G. D.; Arcieri, W. C.; Firstenberg, H.; Fulford, P. J.; Lam, P. S.; Ritzman, R. L.; Schmidt, E. R.

    1982-03-01

    The purpose of this document is to assess the state of knowledge and expert opinions that exist about fission product source terms from potential nuclear power plant accidents. This is so that recommendations can be made for research and analyses which have the potential to reduce the uncertainties in these estimated source terms and to derive improved methods for predicting their magnitudes. The main reasons for writing this report are to indicate the major uncertainties involved in defining realistic source terms that could arise from severe reactor accidents, to determine which factors would have the most significant impact on public risks and emergency planning, and to suggest research and analyses that could result in the reduction of these uncertainties. Source terms used in the conventional consequence calculations in the licensing process are not explicitly addressed.

  2. Probabilistic linguistics

    NARCIS (Netherlands)

    Bod, R.; Heine, B.; Narrog, H.

    2010-01-01

    Probabilistic linguistics takes all linguistic evidence as positive evidence and lets statistics decide. It allows for accurate modelling of gradient phenomena in production and perception, and suggests that rule-like behaviour is no more than a side effect of maximizing probability. This chapter …

  3. Probabilistic Modelling of Information Propagation in Wireless Mobile Ad-Hoc Network

    DEFF Research Database (Denmark)

    Schiøler, Henrik; Hansen, Martin Bøgsted; Schwefel, Hans-Peter

    2005-01-01

    In this paper the dynamics of broadcasting wireless ad-hoc networks is studied through probabilistic modelling. A randomized transmission discipline is assumed in accordance with existing MAC definitions such as WLAN with Decentralized Coordination or IEEE-802.15.4. Message reception is assumed to be governed by node power-down policies and is equivalently assumed to be randomized. Altogether, randomization facilitates a probabilistic model in the shape of an integro-differential equation governing the propagation of information, where Brownian node mobility may be accounted for by including an extra diffusion term. The established model is analyzed for transient behaviour, and a travelling wave solution facilitates expressions for propagation speed as well as parametrized analysis of network reliability and node power consumption. Applications of the developed models for node localization and network …

  4. Probabilistic soft sets and dual probabilistic soft sets in decision making with positive and negative parameters

    Science.gov (United States)

    Fatimah, F.; Rosadi, D.; Hakim, R. B. F.

    2018-03-01

    In this paper, we motivate and introduce probabilistic soft sets and dual probabilistic soft sets for handling decision making problem in the presence of positive and negative parameters. We propose several types of algorithms related to this problem. Our procedures are flexible and adaptable. An example on real data is also given.

  5. Why do probabilistic finite element analysis ?

    CERN Document Server

    Thacker, Ben H

    2008-01-01

    The intention of this book is to provide an introduction to performing probabilistic finite element analysis. As a short guideline, the objective is to inform the reader of the use, benefits and issues associated with performing probabilistic finite element analysis without excessive theory or mathematical detail.

  6. Error Discounting in Probabilistic Category Learning

    Science.gov (United States)

    Craig, Stewart; Lewandowsky, Stephan; Little, Daniel R.

    2011-01-01

    The assumption in some current theories of probabilistic categorization is that people gradually attenuate their learning in response to unavoidable error. However, existing evidence for this error discounting is sparse and open to alternative interpretations. We report 2 probabilistic-categorization experiments in which we investigated error…

  7. The Safety Assessment of OPR-1000 for Station Blackout Applying Combined Deterministic and Probabilistic Procedure

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Dong Gu; Ahn, Seung-Hoon; Cho, Dae-Hyung [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)]

    2015-05-15

    The loss of all offsite and onsite AC power sources to the safety buses of a nuclear power plant is termed station blackout (SBO). However, it does not generally include the loss of available AC power to safety buses fed by station batteries through inverters or by alternate AC sources. Historically, risk analysis results have indicated that SBO was a significant contributor to overall core damage frequency. In this study, the safety assessment of the OPR-1000 nuclear power plant for the SBO accident, a typical beyond-design-basis accident (BDBA) and an important contributor to overall plant risk, is performed by applying the combined deterministic and probabilistic procedure (CDPP). In addition, discussions are made for the reevaluation of the SBO risk at OPR-1000 by eliminating excessive conservatism in the existing PSA. The reference analysis showed that the CDF and CCDP did not meet the acceptable risk, and it was confirmed that the SBO risk should be reevaluated. By estimating the offsite power restoration time appropriately, the SBO risk was reevaluated, and it was finally confirmed that the current OPR-1000 system lies within the acceptable risk against SBO. In addition, it was demonstrated that the proposed CDPP is applicable to the safety assessment of BDBAs in nuclear power plants without significant erosion of the safety margin.

  8. Data assimilation and source term estimation during the early phase of a nuclear accident

    Energy Technology Data Exchange (ETDEWEB)

    Golubenkov, A.; Borodin, R. [SPA Typhoon, Emergency Centre (Russian Federation)]; Sohier, A.; Rojas Palma, C. [Centre de l'Etude de l'Energie Nucleaire, Mol (Belgium)]

    1996-02-01

    The mathematical/physical basis of possible methods to model the source term during an accidental release of radionuclides is discussed. Knowledge of the source term is important in view of optimizing urgent countermeasures for the population. In most cases, however, it will be impossible to assess the release dynamics directly. Therefore methods are under development in which the source term is modelled based on the comparison of off-site monitoring data and model predictions using an atmospheric dispersion model. The degree of agreement between the measured and calculated characteristics of the radioactive contamination of the air and the ground surface is an important criterion in this process. Due to the inherent complexity, some geometrical transformations taking into account space-time discrepancies between observed and modelled contamination fields are defined before the source term is adapted. This work describes the developed algorithms, which are also tested against data from some tracer experiments performed in the past. This method is also used to reconstruct the dynamics of the Chernobyl source term. Finally, this report presents a concept of software to reconstruct a multi-isotopic source term in real time.
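
    If the dispersion model's predicted concentrations are (approximately) linear in the time-binned release rates, the adaptation step can be caricatured as a linear inverse problem solved by non-negative least squares. The source-receptor matrix and all numbers below are invented stand-ins for a dispersion-model run, not the authors' algorithm:

    ```python
    import numpy as np
    from scipy.optimize import nnls

    rng = np.random.default_rng(5)

    # Hypothetical source-receptor matrix: concentration at 8 detectors per
    # unit release in each of 6 time bins (would come from a dispersion model).
    A = rng.lognormal(mean=-2.0, sigma=1.0, size=(8, 6))

    true_release = np.array([0.0, 5.0, 20.0, 10.0, 2.0, 0.0])   # Bq/s (assumed)
    measured = A @ true_release * rng.normal(1.0, 0.1, size=8)  # 10% meas. noise

    estimated, residual = nnls(A, measured)   # non-negative least squares
    print("true     :", true_release)
    print("estimated:", estimated.round(1))
    ```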

  10. Revised reactor accident source terms in the U.S. and implementation for light water reactors

    International Nuclear Information System (INIS)

    Soffer, L.; Lee, J.Y.

    1992-01-01

    Current NRC reactor accident source terms used for licensing are contained in Regulatory Guides 1.3 and 1.4 and specify that 100% of the core inventory of noble gases and 25% of the iodine fission products are assumed to be instantaneously available for release from the containment. The chemical form of the iodine fission products is also assumed to be predominantly elemental iodine (I2). These assumptions have strongly affected present nuclear plant designs. Severe accident research results have confirmed that although the current source term is very substantial and has resulted in a very high level of plant capability, the present source term is no longer compatible with a realistic understanding of severe accidents. The NRC has issued a proposed revision of the reactor accident source terms as part of several regulatory activities to incorporate severe accident insights for future plants. A revision to 10 CFR 100 is also being proposed to specify site criteria directly and to eliminate source terms and doses for site evaluation. Reactor source terms will continue to be important in evaluating plant designs. Although intended primarily for future plants, existing and evolutionary power plants may voluntarily apply revised accident source term insights as well in licensing. The proposed revised accident source terms are presented in terms of fission product composition, magnitude, timing and iodine chemical form. Some implications for light water reactors are discussed. (author)

  11. Recent case studies and advancements in probabilistic risk assessment

    International Nuclear Information System (INIS)

    Garrick, B.J.

    1985-01-01

    During the period from 1977 to 1984, Pickard, Lowe and Garrick, Inc., had the lead in preparing several full-scope probabilistic risk assessments for electric utilities. Five of those studies are discussed from the point of view of advancements and lessons learned. The objective and trend of these studies is toward utilization of the risk models by the plant owners as risk management tools. Advancements that have been made are in the presentation and documentation of the PRAs, generation of more understandable plant-level information, and improvements in methodology to facilitate technology transfer. Specific areas of advancement are in the treatment of such issues as dependent failures, human interaction, and the uncertainty in the source term. Lessons learned cover a wide spectrum and include the importance of plant-specific models for meaningful risk management, the role of external events in risk, the sensitivity of contributors to the choice of risk index, and the very important finding that the public risk is extremely small. The future direction of PRA is to establish less dependence on experts for in-plant application. Computerizing the PRAs such that they can be accessed online and interactively is the key.

  12. Source mechanisms of volcanic tsunamis.

    Science.gov (United States)

    Paris, Raphaël

    2015-10-28

    Volcanic tsunamis are generated by a variety of mechanisms, including volcano-tectonic earthquakes, slope instabilities, pyroclastic flows, underwater explosions, shock waves and caldera collapse. In this review, we focus on the lessons that can be learnt from past events and address the influence of parameters such as volume flux of mass flows, explosion energy or duration of caldera collapse on tsunami generation. The diversity of waves in terms of amplitude, period, form, dispersion, etc. poses difficulties for integration and harmonization of sources to be used for numerical models and probabilistic tsunami hazard maps. In many cases, monitoring and warning of volcanic tsunamis remain challenging (further technical and scientific developments being necessary) and must be coupled with policies of population preparedness. © 2015 The Author(s).

  13. EDF source term reduction project main outcomes and further developments

    International Nuclear Information System (INIS)

    Ranchoux, Gilles; Bonnefon, Julien; Benfarah, Moez; Wintergerst, Matthieu; Gressier, Frederic; Leclercq, Stephanie

    2012-09-01

    Dose reduction is a strategic purpose for EDF, linked to the stakes of nuclear acceptability, regulatory compliance and productivity gains. This consists not only in improving the reactor shutdown organization (time spent in the controlled area, biological shielding, ...) but also in improving the radiological state of the unit and the efficiency of source term reduction operations. Since 2003, EDF has been running an innovative project called 'Source Term Reduction' federating the different EDF research and engineering centers in order to: - contribute to the long-term view on Radiological Protection issues (international feedback analyses), - develop contamination prediction tools (OSCAR software) suitable for industrial needs (operating units and EPR design), - develop scientific models useful for understanding contamination mechanisms to support strategic decision processes, - carry on updating and analyzing contamination measurement feedback on corrosion products (EMECC and CZT campaigns), - carry on with operational support at short or middle term by optimizing startup and shutdown processes and pre-oxidation, and by improving purification efficiency or material characteristics. This paper will show, in a first part, the main 2011 results in occupational exposure (collective and individual dose, RCS index, ...). In a second part, an overview of the main EDF outcomes of the last 3 years in the field of source term reduction will be presented. Future developments extended to contamination issues in EDF NPPs will also be pointed out in this paper. (authors)

  14. Probabilistic Simulation of Multi-Scale Composite Behavior

    Science.gov (United States)

    Chamis, Christos C.

    2012-01-01

    A methodology is developed to computationally assess the non-deterministic composite response at all composite scales (from micro to structural) due to uncertainties in the constituent (fiber and matrix) properties, in the fabrication process, and in structural variables (primitive variables). The methodology is computationally efficient for simulating the probability distributions of composite behavior, such as material properties, laminate and structural responses. By-products of the methodology are probabilistic sensitivities of the composite primitive variables. The methodology has been implemented into the computer codes PICAN (Probabilistic Integrated Composite ANalyzer) and IPACS (Integrated Probabilistic Assessment of Composite Structures). The accuracy and efficiency of this methodology are demonstrated by simulating the uncertainties in typical composite laminates and comparing the results with the Monte Carlo simulation method. Available experimental data of composite laminate behavior at all scales fall within the scatter predicted by PICAN. Multi-scaling is extended to simulate probabilistic thermo-mechanical fatigue and the probabilistic design of a composite redome in order to illustrate its versatility. Results show that probabilistic fatigue can be simulated for different temperature amplitudes and for different cyclic stress magnitudes. Results also show that laminate configurations can be selected to increase the redome reliability by several orders of magnitude without increasing the laminate thickness - a unique feature of structural composites. The age of the cited reference indicates that nothing fundamental has been done since that time.

  15. Method of extracting significant trouble information of nuclear power plants using probabilistic analysis technique

    International Nuclear Information System (INIS)

    Shimada, Yoshio; Miyazaki, Takamasa

    2005-01-01

    In order to analyze and evaluate large amounts of trouble information from overseas nuclear power plants, it is necessary to select information that is significant in terms of both safety and reliability. In this research, a method was developed for efficiently and simply classifying the degrees of importance of components in terms of safety and reliability, paying attention to the root-cause components appearing in the information. Regarding safety, the reactor core damage frequency (CDF), which is used in the probabilistic analysis of a reactor, was used. Regarding reliability, the automatic plant trip probability (APTP), which is used in the probabilistic analysis of automatic reactor trips, was used. These two aspects were reflected in the development of criteria for classifying the degrees of importance of components. By applying these criteria, a simple method of extracting significant trouble information from overseas nuclear power plants was developed. (author)

  16. Backup Sourcing Decisions for Coping with Supply Disruptions under Long-Term Horizons

    Directory of Open Access Journals (Sweden)

    Jing Hou

    2016-01-01

    This paper studies a buyer’s inventory control problem under a long-term horizon. The buyer has one major supplier that is prone to disruption risks and one backup supplier with a higher wholesale price. Two kinds of sourcing methods are available for the buyer: single sourcing with/without contingent supply and dual sourcing. In contingent sourcing, the backup supplier is capacitated and/or has yield uncertainty, whereas in dual sourcing the backup supplier has an incentive to offer output flexibility during disrupted periods. The buyer’s expected cost functions and the optimal base-stock levels are obtained for each sourcing method under the long-term horizon. The effects of three risk parameters - disruption probability, contingent capacity or uncertainty, and backup flexibility - are examined using comparative studies and numerical computations. Four sourcing methods, namely single sourcing with contingent supply, dual sourcing, and single sourcing from either of the two suppliers, are also compared. These findings can be used as a valuable guideline for companies to select an appropriate sourcing strategy under supply disruption risks.
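
    For intuition about how an optimal base-stock level responds to disruption risk, a plain newsvendor-style toy model (not the paper's cost functions) can be simulated: the major supplier fails with some per-period probability, a failed period must cover extra demand before replenishment, and the cost-minimizing base-stock level is found by grid search. All parameters are illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)

    HOLDING = 1.0             # holding cost per unit per period (assumed)
    PENALTY = 9.0             # shortage penalty per unit (assumed)
    MU, SIGMA = 100.0, 20.0   # per-period demand, assumed normal

    def expected_cost(base_stock: float, p_disrupt: float, n: int = 20_000) -> float:
        """Average one-period cost when the major supplier fails with probability
        p_disrupt; a disrupted period must cover two periods of demand."""
        demand = rng.normal(MU, SIGMA, size=n)
        disrupted = rng.random(n) < p_disrupt
        demand = np.where(disrupted, demand + rng.normal(MU, SIGMA, size=n), demand)
        over = np.clip(base_stock - demand, 0.0, None)
        under = np.clip(demand - base_stock, 0.0, None)
        return float(np.mean(HOLDING * over + PENALTY * under))

    grid = np.linspace(80.0, 300.0, 221)
    for p in (0.0, 0.1, 0.3):
        costs = [expected_cost(s, p) for s in grid]
        print(f"p_disrupt={p:.1f}: optimal base stock ~ {grid[int(np.argmin(costs))]:.0f}")
    ```

    In this simplification the optimal level shifts upward with the disruption probability, mirroring the qualitative effect studied in the paper.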

  17. Probabilistic evaluations for CANTUP computer code analysis improvement

    International Nuclear Information System (INIS)

    Florea, S.; Pavelescu, M.

    2004-01-01

    Structural analysis with the finite element method is today a standard way to evaluate and predict the behavior of structural assemblies subjected to harsh conditions, in order to ensure their safety and reliability during operation. A CANDU 600 fuel channel is an example of an assembly working in such conditions, in which, besides corrosive and thermal attack, long-term irradiation interferes, with direct consequences for the evolution of the material properties. This leads inevitably to scatter in the time-dependent material properties, whose dynamic evolution is subject to a great degree of uncertainty. These are the reasons for developing, alongside deterministic evaluations with computer codes, probabilistic and statistical methods to predict the structural component response. This work opens the possibility of extending the deterministic thermomechanical evaluation of fuel channel components to a probabilistic structural mechanics approach, starting from deterministic analyses performed with the CANTUP computer code, which was developed to predict the long-term mechanical behavior of the pressure tube - calandria tube assembly. To this purpose, the structure of the deterministic CANTUP computer code was reviewed. The code was adapted from the LAHEY 77 platform to the Microsoft Developer Studio - Fortran Power Station platform. In order to perform probabilistic evaluations, a part was added to the deterministic code which, using a subroutine from the IMSL library of the Microsoft Developer Studio - Fortran Power Station platform, generates pseudo-random values of a specified variable. A normal distribution around the deterministic value, with a 5% standard deviation, was simulated for the Young's modulus material property in order to verify the statistical calculation of the creep behavior. The tube deflection and effective stresses were the properties subjected to probabilistic evaluation. All the values of these properties obtained for all the values for
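
    The probabilistic extension described above, pseudo-random sampling of the Young's modulus from a normal distribution with 5% standard deviation and propagation to a deflection, can be sketched as follows. The beam formula, dimensions and load are illustrative stand-ins, not CANTUP's pressure tube model.

      import numpy as np

      rng = np.random.default_rng(42)
      E0 = 200e9                              # deterministic Young's modulus [Pa] (assumed)
      E = rng.normal(E0, 0.05 * E0, 50_000)   # normal scatter, 5% standard deviation

      # Stand-in structural response: mid-span deflection of a uniformly
      # loaded, simply supported beam, which scales as 1/E.
      w, L, I = 1_000.0, 6.0, 2.0e-6          # load [N/m], span [m], second moment [m^4]
      deflection = 5 * w * L**4 / (384 * E * I)

      print(f"deflection: mean = {deflection.mean() * 1e3:.2f} mm, "
            f"std = {deflection.std() * 1e3:.2f} mm")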

  18. Reevaluation of HFIR source term: Supplement 2

    International Nuclear Information System (INIS)

    Thomas, W.E.

    1986-11-01

    The HFIR source term has been reevaluated to assess the impact of the increase in core lifetime from 15 to 24 days. Calculations were made to determine the nuclide activities of the iodines, noble gases, and other fission products. The results show that there is no significant change in off-site dose due to the increased fuel cycle for the release scenario postulated in ORNL-3573

  19. Use of probabilistic safety analyses in severe accident management

    International Nuclear Information System (INIS)

    Neogy, P.; Lehner, J.

    1991-01-01

    An important consideration in the development and assessment of severe accident management strategies is that while the strategies are often built on the knowledge base of Probabilistic Safety Analyses (PSA), they must be interpretable and meaningful in terms of the control room indicators. In the following, the relationships between PSA and severe accident management are explored using ex-vessel accident management at a PWR ice-condenser plant as an example. 2 refs., 1 fig., 3 tabs

  20. Probabilistic Cue Combination: Less Is More

    Science.gov (United States)

    Yurovsky, Daniel; Boyer, Ty W.; Smith, Linda B.; Yu, Chen

    2013-01-01

    Learning about the structure of the world requires learning probabilistic relationships: rules in which cues do not predict outcomes with certainty. However, in some cases, the ability to track probabilistic relationships is a handicap, leading adults to perform non-normatively in prediction tasks. For example, in the "dilution effect,"…

  1. Failure analysis of the cement mantle in total hip arthroplasty with an efficient probabilistic method.

    Science.gov (United States)

    Kaymaz, Irfan; Bayrak, Ozgu; Karsan, Orhan; Celik, Ayhan; Alsaran, Akgun

    2014-04-01

    Accurate prediction of the long-term behaviour of cemented hip implants is very important not only for patient comfort but also for the elimination of any revision operation due to implant failure. Therefore, a more realistic computer model was generated and then used for both deterministic and probabilistic analyses of the hip implant in this study. The deterministic failure analysis was carried out for the most common failure states of the cement mantle. On the other hand, most of the design parameters of the cemented hip are inherently uncertain quantities. Therefore, a probabilistic failure analysis was also carried out considering the fatigue failure of the cement mantle, since it is the most critical failure state. However, probabilistic analysis generally requires a large amount of time; thus, a response surface method proposed in this study was used to reduce the computation time for the analysis of the cemented hip implant. The results demonstrate that using an efficient probabilistic approach can significantly reduce the computation time for the failure probability of the cement from several hours to minutes. The results also show that, even though the deterministic failure analyses indicate no failure of the cement mantle and yield high safety factors, the probabilistic analysis predicts a failure probability of the cement mantle of 8%, which must be considered when evaluating the success of cemented hip implants.
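
    The time saving comes from replacing the expensive finite element model with a cheap surrogate: a quadratic response surface is fitted to a handful of model runs, and the Monte Carlo failure estimate is then evaluated on the surrogate. The sketch below uses an analytic stand-in for the expensive model; all parameter values and the strength limit are illustrative assumptions.

      import numpy as np

      rng = np.random.default_rng(7)

      def expensive_stress(E_gpa, load_kn):   # stand-in for a finite element run
          return load_kn * (2.0 + 50.0 / E_gpa)

      # Small design of experiments over cement modulus [GPa] and load [kN]
      E_pts = np.linspace(1.8, 2.6, 5)
      L_pts = np.linspace(0.8, 1.2, 5)
      X = np.array([(e, l) for e in E_pts for l in L_pts])
      y = expensive_stress(X[:, 0], X[:, 1])

      def features(e, l):   # quadratic response surface basis
          return np.column_stack([np.ones_like(e), e, l, e**2, l**2, e * l])

      coef, *_ = np.linalg.lstsq(features(X[:, 0], X[:, 1]), y, rcond=None)

      # Monte Carlo failure estimate on the cheap surrogate
      E_s = rng.normal(2.2, 0.1, 1_000_000)
      L_s = rng.normal(1.0, 0.08, 1_000_000)
      stress = features(E_s, L_s) @ coef
      print(f"P(stress > limit) ~ {np.mean(stress > 28.0):.4f}")   # assumed limit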

  2. Delineating probabilistic species pools in ecology and biogeography

    OpenAIRE

    Karger, Dirk Nikolaus; Cord, Anna F; Kessler, Michael; Kreft, Holger; Kühn, Ingolf; Pompe, Sven; Sandel, Brody; Sarmento Cabral, Juliano; Smith, Adam B; Svenning, Jens-Christian; Tuomisto, Hanna; Weigelt, Patrick; Wesche, Karsten

    2016-01-01

    Aim To provide a mechanistic and probabilistic framework for defining the species pool based on species-specific probabilities of dispersal, environmental suitability and biotic interactions within a specific temporal extent, and to show how probabilistic species pools can help disentangle the geographical structure of different community assembly processes. Innovation Probabilistic species pools provide an improved species pool definition based on probabilities in conjuncti...

  3. LS-APC v1.0: a tuning-free method for the linear inverse problem and its application to source-term determination

    Directory of Open Access Journals (Sweden)

    O. Tichý

    2016-11-01

    Full Text Available Estimation of pollutant releases into the atmosphere is an important problem in the environmental sciences. It is typically formalized as an inverse problem using a linear model that can explain observable quantities (e.g., concentrations or deposition values) as a product of the source-receptor sensitivity (SRS) matrix obtained from an atmospheric transport model multiplied by the unknown source-term vector. Since this problem is typically ill-posed, current state-of-the-art methods are based on regularization of the problem and solution of a formulated optimization problem. This procedure depends on manual settings of uncertainties that are often very poorly quantified, effectively making them tuning parameters. We formulate a probabilistic model that has the same maximum likelihood solution as the conventional method using pre-specified uncertainties. Replacement of the maximum likelihood solution by full Bayesian estimation also allows estimation of all tuning parameters from the measurements. The estimation procedure is based on the variational Bayes approximation, which is evaluated by an iterative algorithm. The resulting method is thus very similar to the conventional approach, but with the possibility to also estimate all tuning parameters from the observations. The proposed algorithm is tested and compared with the standard methods on data from the European Tracer Experiment (ETEX), where advantages of the new method are demonstrated. A MATLAB implementation of the proposed algorithm is available for download.

  4. LS-APC v1.0: a tuning-free method for the linear inverse problem and its application to source-term determination

    Science.gov (United States)

    Tichý, Ondřej; Šmídl, Václav; Hofman, Radek; Stohl, Andreas

    2016-11-01

    Estimation of pollutant releases into the atmosphere is an important problem in the environmental sciences. It is typically formalized as an inverse problem using a linear model that can explain observable quantities (e.g., concentrations or deposition values) as a product of the source-receptor sensitivity (SRS) matrix obtained from an atmospheric transport model multiplied by the unknown source-term vector. Since this problem is typically ill-posed, current state-of-the-art methods are based on regularization of the problem and solution of a formulated optimization problem. This procedure depends on manual settings of uncertainties that are often very poorly quantified, effectively making them tuning parameters. We formulate a probabilistic model that has the same maximum likelihood solution as the conventional method using pre-specified uncertainties. Replacement of the maximum likelihood solution by full Bayesian estimation also allows estimation of all tuning parameters from the measurements. The estimation procedure is based on the variational Bayes approximation, which is evaluated by an iterative algorithm. The resulting method is thus very similar to the conventional approach, but with the possibility to also estimate all tuning parameters from the observations. The proposed algorithm is tested and compared with the standard methods on data from the European Tracer Experiment (ETEX), where advantages of the new method are demonstrated. A MATLAB implementation of the proposed algorithm is available for download.
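
    The conventional formulation that LS-APC builds on can be sketched directly: a Tikhonov-regularized least-squares solve of y = Mx, where the regularization weight alpha is exactly the kind of manually set tuning parameter that LS-APC instead estimates from the data. The synthetic SRS matrix, noise level and alpha below are illustrative assumptions, not the paper's setup.

      import numpy as np

      rng = np.random.default_rng(3)

      # Synthetic SRS matrix M (observations x release intervals) and true source term
      n_obs, n_src = 120, 40
      M = rng.exponential(1.0, (n_obs, n_src))
      x_true = np.zeros(n_src)
      x_true[15:20] = 5.0                          # short release episode
      y = M @ x_true + rng.normal(0, 0.5, n_obs)   # noisy observations

      # Conventional approach: min ||y - Mx||^2 + alpha * ||x||^2,
      # with alpha chosen manually (the tuning parameter in question)
      alpha = 1.0
      x_hat = np.linalg.solve(M.T @ M + alpha * np.eye(n_src), M.T @ y)

      print("true source term :", x_true[13:22].round(2))
      print("estimate         :", x_hat[13:22].round(2))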

  5. Probabilistic Flood Defence Assessment Tools

    Directory of Open Access Journals (Sweden)

    Slomp Robert

    2016-01-01

    institutions managing the flood defences, and not just by a small number of experts in probabilistic assessment. Therefore, data management and use of software are the main issues that have been covered in courses and training in 2016 and 2017. All in all, this is the largest change in the assessment of Dutch flood defences since 1996. In 1996, probabilistic techniques were first introduced to determine hydraulic boundary conditions (water levels and waves (wave height, wave period and direction)) for different return periods. To simplify the process, the assessment continues to consist of a three-step approach, moving from simple decision rules, to the methods for semi-probabilistic assessment, and finally to a fully probabilistic analysis to compare the strength of flood defences with the hydraulic loads. The formal assessment results are thus mainly based on the fully probabilistic analysis and the ultimate limit state of the strength of a flood defence. For complex flood defences, additional models and software were developed. The current Hydra software suite (for policy analysis, formal flood defence assessment and design) will be replaced by the model Ringtoets. New stand-alone software has been developed for revetments, geotechnical analysis and slope stability of the foreshore. Design software and policy analysis software, including the Delta model, will be updated in 2018. A fully probabilistic method results in more precise assessments and more transparency in the process of assessment and reconstruction of flood defences. This is of increasing importance, as large-scale infrastructural projects in a highly urbanized environment are increasingly subject to political and societal pressure to add additional features. For this reason, it is of increasing importance to be able to determine which new feature really adds to flood protection, to quantify how much it adds to the level of flood protection and to evaluate whether it is really worthwhile. Please note: The Netherlands

  6. Consideration of emergency source terms for pebble-bed high temperature gas-cooled reactor

    International Nuclear Information System (INIS)

    Tao, Liu; Jun, Zhao; Jiejuan, Tong; Jianzhu, Cao

    2009-01-01

    Being the last barrier in the nuclear power plant defense-in-depth strategy, emergency planning (EP) is an integrated project. One of the key elements in this process is the selection of emergency source terms. Emergency source terms for light water reactor (LWR) nuclear power plants (NPPs) have been introduced in many technical documents, and emergency planning for advanced NPPs has been attracting attention recently. Commercial deployment of advanced NPPs is under way around the world; a pebble-bed high-temperature gas-cooled reactor (HTGR) power plant, considered a representative of advanced NPPs, is under construction in China. This paper attempts to draw some suggestions from our investigation. The discussion of advanced NPP EP is summarized first, and then the characteristics of the pebble-bed HTGR relating to EP are described. Finally, PSA insights on emergency source term selection and current pebble-bed HTGR emergency source term suggestions are proposed

  7. Evaluation of the LMFBR cover gas source term and synthesis of the associated R and D

    International Nuclear Information System (INIS)

    Balard, F.; Carluec, B.

    1996-01-01

    At the end of the seventies and the beginning of the eighties, a pressing need appeared for experimental results to assess the safety level of LMFBRs. Because of the urgency, analytical studies were not systematically undertaken, and maximum credible cover gas instantaneous source terms (radionuclide core release fractions) were obtained directly from crude interpretations of out-of-pile experiments. Two types of studies and mock-ups were undertaken depending on the timescale of the phenomena: instantaneous source terms (corresponding to an unlikely energetic core disruptive accident, CDA), and delayed ones (tens of minutes to some hours). The experiments performed in this frame are reviewed in this presentation: 1) instantaneous source term: - FAUST experiments: I, Cs, UO2 source terms (FzK, Germany), - FAST experiments: pool depth influence on non-volatile source term (USA), - CARAVELLE experiments: non-volatile source term in SPX1 geometry (CEA, France); 2) delayed source term: - NALA experiments: I, Cs, Sr, UO2 source term (FzK, Germany), - PAVE experiments: I source term (CEA, France), - NACOWA experiments: cover gas aerosol enrichment in I and Cs (FzK, Germany), - other French experiments in the COPACABANA and GULLIVER facilities. The release of volatile fission products is tightly bound to sodium evaporation, and a large part of the fission products is dissolved in the liquid sodium aerosols present in the cover gas. Thus, knowledge of the amount of aerosol released to the cover gas is important for the evaluation of the source term. The maximum credible cover gas instantaneous source terms deduced from the experiments have led to conservative source terms to be taken into account in safety analysis. Nevertheless, modelling attempts of the observed (in-pile or out-of-pile) physico-chemical phenomena have been undertaken for extrapolation to the reactor case. The main topics of this theoretical research are as follows: fission products evaporation in the cover gas (Fz

  8. Probabilistic wind power forecasting based on logarithmic transformation and boundary kernel

    International Nuclear Information System (INIS)

    Zhang, Yao; Wang, Jianxue; Luo, Xu

    2015-01-01

    Highlights: • Quantitative information on the uncertainty of wind power generation. • Kernel density estimator provides non-Gaussian predictive distributions. • Logarithmic transformation reduces the skewness of wind power density. • Boundary kernel method eliminates the density leakage near the boundary. - Abstract: Probabilistic wind power forecasting not only produces the expectation of wind power output, but also gives quantitative information on the associated uncertainty, which is essential for making better decisions about power system and market operations with the increasing penetration of wind power generation. This paper presents a novel kernel density estimator for probabilistic wind power forecasting, addressing two characteristics of wind power which have adverse impacts on forecast accuracy, namely the heavily skewed and double-bounded nature of wind power density. Logarithmic transformation is used to reduce the skewness of wind power density, which improves the effectiveness of the kernel density estimator in the transformed scale. The transformation partially relieves the boundary effect problem of the kernel density estimator caused by the double-bounded nature of wind power density. However, the case study shows that there are still some serious problems of density leakage after the transformation. In order to solve this problem in the transformed scale, a boundary kernel method is employed to eliminate the density leakage at the bounds of the wind power distribution. The improvement of the proposed method over the standard kernel density estimator is demonstrated by short-term probabilistic forecasting results based on data from an actual wind farm. Then, a detailed comparison is carried out between the proposed method and some existing probabilistic forecasting methods
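
    The two ingredients, estimating the density in log scale and correcting for leakage at a bound, can be sketched as follows. For brevity the sketch uses reflection at the upper bound as a simple stand-in for the paper's boundary kernel, and the synthetic power data and bandwidth rule are illustrative assumptions.

      import numpy as np

      rng = np.random.default_rng(5)

      # Synthetic wind power output as a fraction of capacity on (0, 1], right-skewed
      power = np.clip(rng.weibull(1.5, 2000) * 0.3, 1e-4, 1.0)

      z = np.log(power)                          # log transform reduces the skewness
      h = 1.06 * z.std() * len(z) ** (-1 / 5)    # Silverman's rule-of-thumb bandwidth

      def density(x):
          # Gaussian KDE in log scale, reflected at log(1) = 0 to limit leakage
          # past the upper bound, then mapped back by the Jacobian 1/x.
          u = np.log(np.asarray(x))
          k = lambda t: np.exp(-0.5 * t**2) / np.sqrt(2 * np.pi)
          dens_z = (k((u[:, None] - z) / h) + k((u[:, None] + z) / h)).sum(axis=1)
          return dens_z / (len(z) * h) / np.asarray(x)

      for p_val in (0.05, 0.2, 0.5, 0.9):
          print(f"f({p_val:.2f}) = {density([p_val])[0]:.3f}")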

  9. Probabilistic Design

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Burcharth, H. F.

    This chapter describes how partial safety factors can be used in design of vertical wall breakwaters and an example of a code format is presented. The partial safety factors are calibrated on a probabilistic basis. The code calibration process used to calibrate some of the partial safety factors...

  10. Probabilistic Learning by Rodent Grid Cells.

    Science.gov (United States)

    Cheung, Allen

    2016-10-01

    Mounting evidence shows mammalian brains are probabilistic computers, but the specific cells involved remain elusive. Parallel research suggests that grid cells of the mammalian hippocampal formation are fundamental to spatial cognition, but their diverse response properties still defy explanation. No plausible model exists which explains stable grids in darkness for twenty minutes or longer, despite this being one of the first results ever published on grid cells. Similarly, no current explanation can tie together grid fragmentation and grid rescaling, which show very different forms of flexibility in grid responses when the environment is varied. Other properties such as attractor dynamics and grid anisotropy seem to be at odds with one another unless additional properties are assumed, such as a varying velocity gain. Modelling efforts have largely ignored the breadth of response patterns, while also failing to account for the disastrous effects of sensory noise during spatial learning and recall, especially in darkness. Here, published electrophysiological evidence from a range of experiments is reinterpreted using a novel probabilistic learning model, which shows that grid cell responses are accurately predicted by a probabilistic learning process. Diverse response properties of probabilistic grid cells are statistically indistinguishable from rat grid cells across key manipulations. A simple coherent set of probabilistic computations explains stable grid fields in darkness, partial grid rescaling in resized arenas, low-dimensional attractor grid cell dynamics, and grid fragmentation in hairpin mazes. The same computations also reconcile oscillatory dynamics at the single cell level with attractor dynamics at the cell ensemble level. Additionally, a clear functional role for boundary cells is proposed for spatial learning. These findings provide a parsimonious and unified explanation of grid cell function, and implicate grid cells as an accessible neuronal population

  11. A nuclear source term analysis for spacecraft power systems

    International Nuclear Information System (INIS)

    McCulloch, W.H.

    1998-01-01

    All US space missions involving on-board nuclear material must be approved by the Office of the President. To be approved, the mission and the hardware systems must undergo evaluations of the associated nuclear health and safety risk. One part of these evaluations is the characterization of the source terms, i.e., the estimate of the amount, physical form, and location of nuclear material that might be released into the environment in the event of credible accidents. This paper presents a brief overview of the source term analysis by the Interagency Nuclear Safety Review Panel for the NASA Cassini Space Mission launched in October 1997. Included is a description of the Energy Interaction Model, an innovative approach to the analysis of potential releases from high velocity impacts resulting from launch aborts and reentries

  12. PFLOTRAN-RepoTREND Source Term Comparison Summary.

    Energy Technology Data Exchange (ETDEWEB)

    Frederick, Jennifer M

    2018-03-01

    Code inter-comparison studies are useful exercises to verify and benchmark independently developed software to ensure proper function, especially when the software is used to model high-consequence systems which cannot be physically tested in a fully representative environment. This summary describes the results of the first portion of the code inter-comparison between PFLOTRAN and RepoTREND, which compares the radionuclide source term used in a typical performance assessment.

  13. An empirical system for probabilistic seasonal climate prediction

    Science.gov (United States)

    Eden, Jonathan; van Oldenborgh, Geert Jan; Hawkins, Ed; Suckling, Emma

    2016-04-01

    Preparing for episodes with risks of anomalous weather a month to a year ahead is an important challenge for governments, non-governmental organisations, and private companies and is dependent on the availability of reliable forecasts. The majority of operational seasonal forecasts are made using process-based dynamical models, which are complex, computationally challenging and prone to biases. Empirical forecast approaches built on statistical models to represent physical processes offer an alternative to dynamical systems and can provide either a benchmark for comparison or independent supplementary forecasts. Here, we present a simple empirical system based on multiple linear regression for producing probabilistic forecasts of seasonal surface air temperature and precipitation across the globe. The global CO2-equivalent concentration is taken as the primary predictor; subsequent predictors, including large-scale modes of variability in the climate system and local-scale information, are selected on the basis of their physical relationship with the predictand. The focus given to the climate change signal as a source of skill and the probabilistic nature of the forecasts produced constitute a novel approach to global empirical prediction. Hindcasts for the period 1961-2013 are validated against observations using deterministic (correlation of seasonal means) and probabilistic (continuous rank probability skill scores) metrics. Good skill is found in many regions, particularly for surface air temperature and most notably in much of Europe during the spring and summer seasons. For precipitation, skill is generally limited to regions with known El Niño-Southern Oscillation (ENSO) teleconnections. The system is used in a quasi-operational framework to generate empirical seasonal forecasts on a monthly basis.
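
    The core of such an empirical system, a multiple linear regression whose residual spread supplies the probabilistic part of the forecast, can be sketched in a few lines. The synthetic predictors (a CO2-like trend and an ENSO-like index) and all coefficients below are illustrative assumptions.

      import numpy as np

      rng = np.random.default_rng(11)

      # Synthetic training data: seasonal-mean temperature driven by a trend
      # (stand-in for CO2-equivalent concentration) plus an ENSO-like index
      years = np.arange(1961, 2014)
      co2 = 0.02 * (years - 1961)          # assumed predictor, arbitrary units
      enso = rng.normal(0, 1, years.size)
      temp = 0.8 * co2 + 0.4 * enso + rng.normal(0, 0.3, years.size)

      X = np.column_stack([np.ones_like(co2), co2, enso])
      beta, *_ = np.linalg.lstsq(X, temp, rcond=None)
      resid_sd = (temp - X @ beta).std(ddof=X.shape[1])

      # Probabilistic forecast for a new season: Gaussian around the regression
      # mean with the residual spread (parameter uncertainty ignored for brevity)
      x_new = np.array([1.0, 0.02 * (2014 - 1961), 0.5])
      mean = x_new @ beta
      print(f"forecast: {mean:.2f} +/- {1.64 * resid_sd:.2f} (90% interval)")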

  14. Low-level waste disposal performance assessments - Total source-term analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wilhite, E.L.

    1995-12-31

    Disposal of low-level radioactive waste at Department of Energy (DOE) facilities is regulated by DOE. DOE Order 5820.2A establishes policies, guidelines, and minimum requirements for managing radioactive waste. Requirements for disposal of low-level waste emplaced after September 1988 include providing reasonable assurance of meeting stated performance objectives by completing a radiological performance assessment. Recently, the Defense Nuclear Facilities Safety Board issued Recommendation 94-2, "Conformance with Safety Standards at Department of Energy Low-Level Nuclear Waste and Disposal Sites." One of the elements of the recommendation is that low-level waste performance assessments do not include the entire source term because low-level waste emplaced prior to September 1988, as well as other DOE sources of radioactivity in the ground, are excluded. DOE has developed and issued guidance for preliminary assessments of the impact of including the total source term in performance assessments. This paper will present issues resulting from the inclusion of all DOE sources of radioactivity in performance assessments of low-level waste disposal facilities.

  15. Engineering aspects of probabilistic risk assessment

    International Nuclear Information System (INIS)

    vonHerrmann, J.L.; Wood, P.J.

    1984-01-01

    Over the last decade, the use of probabilistic risk assessment (PRA) in the nuclear industry has expanded significantly. In these analyses the probabilities of experiencing certain undesired events (for example, a plant accident which results in damage to the nuclear fuel) are estimated and the consequences of these events are evaluated in terms of some common measure. These probabilities and consequences are then combined to form a representation of the risk associated with the plant studied. In the relatively short history of probabilistic risk assessment of nuclear power plants, the primary motivation for these studies has been the quantitative assessment of public risk associated with a single plant or group of plants. Accordingly, the primary product of most PRAs performed to date has been a 'risk curve' in which the probability (or expected frequency) of exceeding a certain consequence level is plotted against that consequence. The most common goal of these assessments has been to demonstrate the 'acceptability' of the calculated risk by comparison of the resultant risk curve to risk curves associated with other plants or with other societal risks. Presented here are brief descriptions of some alternate applications of PRAs, a discussion of how these other applications compare or contrast with the currently popular uses of PRA, and a discussion of the relative benefits of each

  16. PLOTLIB: a computerized nuclear waste source-term library storage and retrieval system

    International Nuclear Information System (INIS)

    Marshall, J.R.; Nowicki, J.A.

    1978-01-01

    The PLOTLIB code was written to provide computer access to the Nuclear Waste Source-Term Library for those users with little previous computer programming experience. The principles of user orientation, quick accessibility, and versatility were extensively employed in the development of the PLOTLIB code to accomplish this goal. The Nuclear Waste Source-Term Library consists of 16 ORIGEN computer runs incorporating a wide variety of differing light water reactor (LWR) fuel cycles and waste streams. The typical isotopic source-term data consist of information on watts, curies, grams, etc., all of which are compiled as a function of time after reactor discharge and unitized on a per metric ton heavy metal basis. The information retrieval code, PLOTLIB, is used to process source-term information requests into computer plots and/or user-specified output tables. This report will serve both as documentation of the current data library and as an operations manual for the PLOTLIB computer code. The accompanying input description, program listing, and sample problems make this code package an easily understood tool for the various nuclear waste studies under way at the Office of Waste Isolation

  17. Probabilistic Reversible Automata and Quantum Automata

    OpenAIRE

    Golovkins, Marats; Kravtsev, Maksim

    2002-01-01

    To study the relationship between quantum finite automata and probabilistic finite automata, we introduce a notion of probabilistic reversible automata (PRA, or doubly stochastic automata). We find that there is a strong relationship between different possible models of PRA and corresponding models of quantum finite automata. We also propose a classification of reversible finite 1-way automata.

  18. Influence of wind energy forecast in deterministic and probabilistic sizing of reserves

    Energy Technology Data Exchange (ETDEWEB)

    Gil, A.; Torre, M. de la; Dominguez, T.; Rivas, R. [Red Electrica de Espana (REE), Madrid (Spain). Dept. Centro de Control Electrico

    2010-07-01

    One of the challenges in large-scale wind energy integration in electrical systems is coping with wind forecast uncertainties when sizing generation reserves. These reserves must be sized large enough that they do not compromise security of supply or the balance of the system, but economic efficiency must also be kept in mind. This paper describes two methods of sizing spinning reserve that take wind forecast uncertainties into account: a deterministic method using a probabilistic wind forecast, and a probabilistic method using stochastic variables. The deterministic method calculates the spinning reserve needed by adding components, each of them sized to overcome one single uncertainty: demand errors, the largest thermal generation loss and wind forecast errors. The probabilistic method assumes that demand forecast errors, short-term thermal group unavailability and wind forecast errors are independent stochastic variables and calculates the probability density function of the three variables combined. These methods are being used in the case of the Spanish peninsular system, in which wind energy accounted for 14% of the total electrical energy produced in 2009, making it one of the systems with the highest wind penetration levels in the world. (orig.)
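
    The contrast between the two methods can be sketched numerically: the probabilistic method combines the three independent deviations into one distribution and reads off a quantile, while the deterministic method adds individually sized components. All distributions and magnitudes below are illustrative assumptions, not Spanish system data.

      import numpy as np

      rng = np.random.default_rng(2)
      n = 1_000_000

      # Illustrative distributions of the three independent components [MW]
      demand_err = rng.normal(0, 300, n)                 # demand forecast error
      outage = rng.binomial(1, 0.02, n) * 1000           # loss of a 1000 MW unit
      wind_err = rng.normal(0, 0.15, n) * 5000           # 15% error on 5 GW of wind

      total = demand_err + outage + wind_err             # combined deviation

      # Probabilistic sizing: one quantile of the combined distribution
      print(f"reserve for 99.9% coverage: {np.quantile(total, 0.999):,.0f} MW")

      # Deterministic-style sizing: sum of individually sized components
      det = 3 * 300 + 1000 + 1.64 * 0.15 * 5000
      print(f"sum of individual components: {det:,.0f} MW")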

  19. Regulatory Technology Development Plan - Sodium Fast Reactor: Mechanistic Source Term - Trial Calculation

    International Nuclear Information System (INIS)

    Grabaskas, David

    2016-01-01

    The potential release of radioactive material during a plant incident, referred to as the source term, is a vital design metric and will be a major focus of advanced reactor licensing. The U.S. Nuclear Regulatory Commission has stated an expectation for advanced reactor vendors to present a mechanistic assessment of the potential source term in their license applications. The mechanistic source term presents an opportunity for vendors to realistically assess the radiological consequences of an incident, and may allow reduced emergency planning zones and smaller plant sites. However, the development of a mechanistic source term for advanced reactors is not without challenges, as there are often numerous phenomena impacting the transportation and retention of radionuclides. This project sought to evaluate U.S. capabilities regarding the mechanistic assessment of radionuclide release from core damage incidents at metal fueled, pool-type sodium fast reactors (SFRs). The purpose of the analysis was to identify, and prioritize, any gaps regarding computational tools or data necessary for the modeling of radionuclide transport and retention phenomena. To accomplish this task, a parallel-path analysis approach was utilized. One path, led by Argonne and Sandia National Laboratories, sought to perform a mechanistic source term assessment using available codes, data, and models, with the goal of identifying gaps in the current knowledge base. The second path, performed by an independent contractor, performed sensitivity analyses to determine the importance of particular radionuclides and transport phenomena with regard to offsite consequences. The results of the two pathways were combined to prioritize gaps in current capabilities.

  20. Low-level radioactive waste source terms for the 1992 integrated data base

    International Nuclear Information System (INIS)

    Loghry, S.L.; Kibbey, A.H.; Godbee, H.W.; Icenhour, A.S.; DePaoli, S.M.

    1995-01-01

    This technical manual presents updated generic source terms (i.e., unitized amounts and radionuclide compositions) which have been developed for use in the Integrated Data Base (IDB) Program of the U.S. Department of Energy (DOE). These source terms were used in the IDB annual report, Integrated Data Base for 1992: Spent Fuel and Radioactive Waste Inventories, Projections, and Characteristics, DOE/RW-0006, Rev. 8, October 1992. They are useful as a basis for projecting future amounts (volume and radioactivity) of low-level radioactive waste (LLW) shipped for disposal at commercial burial grounds or sent for storage at DOE solid-waste sites. Commercial fuel cycle LLW categories include boiling-water reactor, pressurized-water reactor, fuel fabrication, and uranium hexafluoride (UF6) conversion. Commercial nonfuel cycle LLW includes institutional/industrial (I/I) waste. The LLW from DOE operations is categorized as uranium/thorium, fission product, induced activity, tritium, alpha, and "other". Fuel cycle commercial LLW source terms are normalized on the basis of net electrical output [MW(e)-year], except for UF6 conversion, which is normalized on the basis of heavy metal requirement [metric tons of initial heavy metal]. The nonfuel cycle commercial LLW source term is normalized on the basis of volume (cubic meters) and radioactivity (curies) for each subclass within the I/I category. The DOE LLW is normalized in a manner similar to that for commercial I/I waste. The revised source terms are based on the best available historical data through 1992

  1. Uncertainty propagation in probabilistic risk assessment: A comparative study

    International Nuclear Information System (INIS)

    Ahmed, S.; Metcalf, D.R.; Pegram, J.W.

    1982-01-01

    Three uncertainty propagation techniques generally used in probabilistic risk assessment, namely the method of moments, discrete probability distributions (DPD), and Monte Carlo simulation, are compared, and conclusions are drawn in terms of the accuracy of the results. For small uncertainty in the basic event unavailabilities, the three methods give similar results. For large uncertainty, the method of moments is in error, and the appropriate method is to propagate the uncertainty in discrete form, either by the DPD method without sampling or by Monte Carlo. (orig.)
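
    The comparison can be sketched on a two-event AND gate with lognormal uncertainties: the method of moments reproduces the mean, but only full propagation (here Monte Carlo) reveals the heavy upper tail that matters for large uncertainty. The failure data below are illustrative assumptions.

      import numpy as np

      rng = np.random.default_rng(9)
      n = 1_000_000

      # Two basic-event unavailabilities with lognormal uncertainty (assumed);
      # the system fails when both fail (AND gate), so Q_sys = Q1 * Q2.
      mu1, s1 = np.log(1e-3), 0.8
      mu2, s2 = np.log(5e-3), 1.2

      # Method of moments: for a lognormal, E[Q] = exp(mu + s^2 / 2), and the
      # mean of a product of independent variables is the product of the means.
      mom_mean = np.exp(mu1 + s1**2 / 2) * np.exp(mu2 + s2**2 / 2)

      # Monte Carlo propagation of the full distributions
      q_sys = rng.lognormal(mu1, s1, n) * rng.lognormal(mu2, s2, n)

      print(f"method-of-moments mean : {mom_mean:.3e}")
      print(f"Monte Carlo mean       : {q_sys.mean():.3e}")
      print(f"Monte Carlo 95th pct   : {np.quantile(q_sys, 0.95):.3e}")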

  2. Impact of source terms on distances to which reactor accident consequences occur

    International Nuclear Information System (INIS)

    Ostmeyer, R.M.

    1982-01-01

    Estimates of the distances over which reactor accident consequences might occur are important for development of siting criteria and for emergency response planning. This paper summarizes the results of a series of CRAC2 calculations performed to estimate these distances. Because of the current controversy concerning the magnitude of source terms for severe accidents, the impact of source term reductions upon distance estimates is also examined

  3. Probabilistic uniformities of uniform spaces

    Energy Technology Data Exchange (ETDEWEB)

    Rodriguez Lopez, J.; Romaguera, S.; Sanchis, M.

    2017-07-01

    The theory of metric spaces in the fuzzy context has shown to be an interesting area of study not only from a theoretical point of view but also for its applications. Nevertheless, it is usual to consider these spaces as classical topological or uniform spaces, and there are not too many results about constructing fuzzy topological structures starting from a fuzzy metric. Maybe Höhle was the first to show how to construct a probabilistic uniformity and a Lowen uniformity from a probabilistic pseudometric [Hohle78, Hohle82a]. His method can be directly translated to the context of fuzzy metrics and allows one to characterize the categories of probabilistic uniform spaces or Lowen uniform spaces by means of certain families of fuzzy pseudometrics [RL]. On the other hand, other different fuzzy uniformities can be constructed in a fuzzy metric space: a Hutton [0,1]-quasi-uniformity [GGPV06]; a fuzzifying uniformity [YueShi10], etc. The paper [GGRLRo] gives a study of several methods of endowing a fuzzy pseudometric space with a probabilistic uniformity and a Hutton [0,1]-quasi-uniformity. In 2010, J. Gutiérrez García, S. Romaguera and M. Sanchis [GGRoSanchis10] proved that the category of uniform spaces is isomorphic to a category formed by sets endowed with a fuzzy uniform structure, i.e. a family of fuzzy pseudometrics satisfying certain conditions. We will show here that, by means of this isomorphism, we can obtain several methods to endow a uniform space with a probabilistic uniformity. Furthermore, these constructions allow us to obtain a factorization of some functors introduced in [GGRoSanchis10]. (Author)

  4. Bisimulations meet PCTL equivalences for probabilistic automata

    DEFF Research Database (Denmark)

    Song, Lei; Zhang, Lijun; Godskesen, Jens Chr.

    2013-01-01

    Probabilistic automata (PAs) have been successfully applied in formal verification of concurrent and stochastic systems. Efficient model checking algorithms have been studied, where the most often used logics for expressing properties are based on probabilistic computation tree logic (PCTL) and its...

  5. The long-term problems of contaminated land: Sources, impacts and countermeasures

    International Nuclear Information System (INIS)

    Baes, C.F. III.

    1986-11-01

    This report examines the various sources of radiological land contamination; its extent; its impacts on man, agriculture, and the environment; countermeasures for mitigating exposures; radiological standards; alternatives for achieving land decontamination and cleanup; and possible alternatives for utilizing the land. The major potential sources of extensive long-term land contamination with radionuclides, in order of decreasing extent, are nuclear war, detonation of a single nuclear weapon (e.g., a terrorist act), serious reactor accidents, and nonfission nuclear weapons accidents that disperse the nuclear fuels (termed "broken arrows")

  6. Using Reactive Transport Modeling to Evaluate the Source Term at Yucca Mountain

    Energy Technology Data Exchange (ETDEWEB)

    Y. Chen

    2001-12-19

    The conventional approach of source-term evaluation for performance assessment of nuclear waste repositories uses speciation-solubility modeling tools and assumes that pure phases of radioelements control their solubility. This assumption may not reflect reality, as most radioelements (except for U) may not form their own pure phases. As a result, solubility limits predicted using the conventional approach are several orders of magnitude higher than the concentrations of radioelements measured in spent fuel dissolution experiments. This paper presents the author's attempt at using a non-conventional approach to evaluate the source term of radionuclide release for Yucca Mountain. Based on the general reactive-transport code AREST-CT, a model for spent fuel dissolution and secondary phase precipitation has been constructed. The model accounts for both equilibrium and kinetic reactions. Its predictions have been compared against laboratory experiments and natural analogues. It is found that without calibrations, the simulated results match laboratory and field observations very well in many aspects. More important is the fact that no contradictions between them have been found. This provides confidence in the predictive power of the model. Based on the concept of Np incorporated into uranyl minerals, the model not only predicts a lower Np source-term than that given by conventional Np solubility models, but also produces results which are consistent with laboratory measurements and observations. Moreover, two hypotheses, whether Np enters tertiary uranyl minerals or not, have been tested by comparing model predictions against laboratory observations; the results favor the former. It is concluded that this non-conventional approach of source term evaluation not only eliminates over-conservatism in the conventional solubility approach to some extent, but also gives a realistic representation of the system of interest, which is a prerequisite for truly understanding the long-term

  7. Using Reactive Transport Modeling to Evaluate the Source Term at Yucca Mountain

    International Nuclear Information System (INIS)

    Y. Chen

    2001-01-01

    The conventional approach of source-term evaluation for performance assessment of nuclear waste repositories uses speciation-solubility modeling tools and assumes that pure phases of radioelements control their solubility. This assumption may not reflect reality, as most radioelements (except for U) may not form their own pure phases. As a result, solubility limits predicted using the conventional approach are several orders of magnitude higher than the concentrations of radioelements measured in spent fuel dissolution experiments. This paper presents the author's attempt at using a non-conventional approach to evaluate the source term of radionuclide release for Yucca Mountain. Based on the general reactive-transport code AREST-CT, a model for spent fuel dissolution and secondary phase precipitation has been constructed. The model accounts for both equilibrium and kinetic reactions. Its predictions have been compared against laboratory experiments and natural analogues. It is found that without calibrations, the simulated results match laboratory and field observations very well in many aspects. More important is the fact that no contradictions between them have been found. This provides confidence in the predictive power of the model. Based on the concept of Np incorporated into uranyl minerals, the model not only predicts a lower Np source-term than that given by conventional Np solubility models, but also produces results which are consistent with laboratory measurements and observations. Moreover, two hypotheses, whether Np enters tertiary uranyl minerals or not, have been tested by comparing model predictions against laboratory observations; the results favor the former. It is concluded that this non-conventional approach of source term evaluation not only eliminates over-conservatism in the conventional solubility approach to some extent, but also gives a realistic representation of the system of interest, which is a prerequisite for truly understanding the long-term

  8. Probabilistic safety analysis procedures guide

    International Nuclear Information System (INIS)

    Papazoglou, I.A.; Bari, R.A.; Buslik, A.J.

    1984-01-01

    A procedures guide for the performance of probabilistic safety assessment has been prepared for interim use in the Nuclear Regulatory Commission programs. The probabilistic safety assessment studies performed are intended to produce probabilistic predictive models that can be used and extended by the utilities and by NRC to sharpen the focus of inquiries into a range of issues affecting reactor safety. This guide addresses the determination of the probability (per year) of core damage resulting from accident initiators internal to the plant and from loss of offsite electric power. The scope includes analyses of problem-solving (cognitive) human errors, a determination of the importance of the various core damage accident sequences, and an explicit treatment and display of uncertainties for the key accident sequences. Ultimately, the guide will be augmented to include the plant-specific analysis of in-plant processes (i.e., containment performance) and the risk associated with external accident initiators, as consensus is developed regarding suitable methodologies in these areas. This guide provides the structure of a probabilistic safety study to be performed, and indicates what products of the study are essential for regulatory decision making. Methodology is treated in the guide only to the extent necessary to indicate the range of methods which is acceptable; ample reference is given to alternative methodologies which may be utilized in the performance of the study

  9. Overview of Future of Probabilistic Methods and RMSL Technology and the Probabilistic Methods Education Initiative for the US Army at the SAE G-11 Meeting

    Science.gov (United States)

    Singhal, Surendra N.

    2003-01-01

    The SAE G-11 RMSL Division and Probabilistic Methods Committee meeting, sponsored by the Picatinny Arsenal during March 1-3, 2004 at the Westin Morristown, will report progress on projects for probabilistic assessment of Army systems and launch an initiative for probabilistic education. The meeting features several Army and industry senior executives and an Ivy League professor to provide an industry/government/academia forum to review RMSL technology; reliability and probabilistic technology; reliability-based design methods; software reliability; and maintainability standards. With over 100 members, including members of national/international standing, the mission of the G-11's Probabilistic Methods Committee is to enable and facilitate rapid deployment of probabilistic technology to enhance the competitiveness of our industries through better, faster, greener, smarter, affordable and reliable product development.

  10. Multiobjective optimal allocation problem with probabilistic non ...

    African Journals Online (AJOL)

    This paper considers the optimum compromise allocation in multivariate stratified sampling with a non-linear objective function and a probabilistic non-linear cost constraint. The probabilistic non-linear cost constraint is converted into an equivalent deterministic one by using chance-constrained programming. A numerical ...

  11. Pre-processing for Triangulation of Probabilistic Networks

    NARCIS (Netherlands)

    Bodlaender, H.L.; Koster, A.M.C.A.; Eijkhof, F. van den; Gaag, L.C. van der

    2001-01-01

    The currently most efficient algorithm for inference with a probabilistic network builds upon a triangulation of the network's graph. In this paper, we show that pre-processing can help in finding good triangulations for probabilistic networks, that is, triangulations with a minimal maximum

  12. Strategic Team AI Path Plans: Probabilistic Pathfinding

    Directory of Open Access Journals (Sweden)

    Tng C. H. John

    2008-01-01

    Full Text Available This paper proposes a novel method to generate strategic team AI pathfinding plans for computer games and simulations using probabilistic pathfinding. This method is inspired by genetic algorithms (Russell and Norvig, 2002), in that a fitness function is used to test the quality of the path plans. The method generates high-quality path plans by eliminating the low-quality ones. The path plans are generated by probabilistic pathfinding, and the elimination is done by a fitness test of the path plans. This path plan generation method has the ability to generate variations, or different high-quality paths, which is desired in games to increase replay value. This work is an extension of our earlier work on team AI: probabilistic pathfinding (John et al., 2006). We explore ways to combine probabilistic pathfinding and genetic algorithms to create a new method to generate strategic team AI pathfinding plans.
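
    The generate-then-eliminate loop described above can be sketched in a few lines: many candidate paths are produced by a probabilistic (goal-biased random) walk, scored by a fitness function, and only the fittest survive. The grid, bias weights and fitness metric below are illustrative assumptions, not the paper's implementation.

      import random

      random.seed(4)
      GRID, START, GOAL = 12, (0, 0), (11, 11)
      MOVES = [(1, 0), (0, 1), (-1, 0), (0, -1)]

      def random_path(max_steps=80):
          # Probabilistic pathfinding: random walk biased toward the goal
          pos, path = START, [START]
          for _ in range(max_steps):
              if pos == GOAL:
                  break
              weights = [3 if (d[0] * (GOAL[0] - pos[0]) > 0 or
                               d[1] * (GOAL[1] - pos[1]) > 0) else 1 for d in MOVES]
              d = random.choices(MOVES, weights)[0]
              pos = (min(max(pos[0] + d[0], 0), GRID - 1),
                     min(max(pos[1] + d[1], 0), GRID - 1))
              path.append(pos)
          return path

      def fitness(path):
          # Paths that reach the goal score high; shorter is better (assumed metric)
          return (1000 if path[-1] == GOAL else 0) - len(path)

      plans = sorted((random_path() for _ in range(500)), key=fitness, reverse=True)
      survivors = plans[:5]   # keep a few varied high-quality plans for the team
      print([len(p) for p in survivors])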

  13. Probabilistic Geoacoustic Inversion in Complex Environments

    Science.gov (United States)

    2015-09-30

    Probabilistic Geoacoustic Inversion in Complex Environments. Jan Dettmer, School of Earth and Ocean Sciences, University of Victoria, Victoria BC. ... long-range inversion methods can fail to provide sufficient resolution. For proper quantitative examination of variability, parameter uncertainty must ... The project aims to advance probabilistic geoacoustic inversion methods for complex ocean environments for a range of geoacoustic data types. The work is

  14. Optimization (Alara) and probabilistic exposures: the application of optimization criteria to the control of risks due to exposures of a probabilistic nature

    International Nuclear Information System (INIS)

    Gonzalez, A.J.

    1989-01-01

    The paper describes the application of the principles of optimization recommended by the International Commission on Radiological Protection (ICRP) to the restraint of radiation risks due to exposures that may or may not be incurred and to which a probability of occurrence can be assigned. After describing the concept of probabilistic exposures, it proposes a basis for a converging policy of control for both certain and probabilistic exposures, namely the dose-risk relationship adopted for radiation protection purposes. On that basis, some coherent approaches for dealing with probabilistic exposures, such as the limitation of individual risks, are discussed. The optimization of safety for reducing all risks from probabilistic exposures to as-low-as-reasonably-achievable (ALARA) levels is reviewed in full. The principles of optimization of protection are used as a basic framework, and the relevant factors to be taken into account when moving to probabilistic exposures are presented. The paper also reviews the decision-aiding techniques suitable for performing optimization, with particular emphasis on the multi-attribute utility-analysis technique. Finally, there is a discussion of some practical applications of decision-aiding multi-attribute utility analysis to probabilistic exposures, including the use of probabilistic utilities. In its final outlook, the paper emphasizes the need for standardization and solutions to generic problems if optimization of safety is to be successful

  15. Affective and cognitive factors influencing sensitivity to probabilistic information.

    Science.gov (United States)

    Tyszka, Tadeusz; Sawicki, Przemyslaw

    2011-11-01

    In Study 1, different groups of female students were randomly assigned to one of four probabilistic information formats. Five different levels of probability of a genetic disease in an unborn child were presented to participants (within-subject factor). After the presentation of the probability level, participants were requested to indicate the acceptable level of pain they would tolerate to avoid the disease (in their unborn child), their subjective evaluation of the disease risk, and their subjective evaluation of being worried by this risk. The results of Study 1 confirmed the hypothesis that an experience-based probability format decreases the subjective sense of worry about the disease, thus presumably weakening the tendency to overrate the probability of rare events. Study 2 showed that, for emotionally laden stimuli, the experience-based probability format resulted in higher sensitivity to probability variations than other formats of probabilistic information. These advantages of the experience-based probability format are interpreted in terms of two systems of information processing, the rational deliberative versus the affective experiential, and the principle of stimulus-response compatibility. © 2011 Society for Risk Analysis.

  16. Probabilistic structural analysis of aerospace components using NESSUS

    Science.gov (United States)

    Shiao, Michael C.; Nagpal, Vinod K.; Chamis, Christos C.

    1988-01-01

    Probabilistic structural analysis of a Space Shuttle main engine turbopump blade is conducted using the computer code NESSUS (numerical evaluation of stochastic structures under stress). The goal of the analysis is to derive probabilistic characteristics of blade response given probabilistic descriptions of uncertainties in blade geometry, material properties, and temperature and pressure distributions. Probability densities are derived for critical blade responses. Risk assessment and failure life analysis are conducted assuming different failure models.

  17. The long-term problems of contaminated land: Sources, impacts and countermeasures

    Energy Technology Data Exchange (ETDEWEB)

    Baes, C.F. III

    1986-11-01

    This report examines the various sources of radiological land contamination; its extent; its impacts on man, agriculture, and the environment; countermeasures for mitigating exposures; radiological standards; alternatives for achieving land decontamination and cleanup; and possible alternatives for utilizing the land. The major potential sources of extensive long-term land contamination with radionuclides, in order of decreasing extent, are nuclear war, detonation of a single nuclear weapon (e.g., a terrorist act), serious reactor accidents, and nonfission nuclear weapons accidents that disperse the nuclear fuels (termed "broken arrows").

  18. Probabilistic conditional independence structures

    CERN Document Server

    Studeny, Milan

    2005-01-01

    Probabilistic Conditional Independence Structures provides the mathematical description of probabilistic conditional independence structures; the author uses non-graphical methods of their description, and takes an algebraic approach. The monograph presents the methods of structural imsets and supermodular functions, and deals with independence implication and equivalence of structural imsets. Motivation, mathematical foundations and areas of application are included, and a rough overview of graphical methods is also given. In particular, the author has been careful to use suitable terminology, and presents the work so that it will be understood by both statisticians and researchers in artificial intelligence. The necessary elementary mathematical notions are recalled in an appendix.

  19. An overview-probabilistic safety analysis for research reactors

    International Nuclear Information System (INIS)

    Liu Jinlin; Peng Changhong

    2015-01-01

    Over its long history of application, Probabilistic Safety Analysis (PSA) has proved to be a valuable tool for improving the safety and reliability of power reactors. In China, the 'Nuclear safety and radioactive pollution prevention Twelfth Five-Year Plan and 2020 vision' states clearly that probabilistic safety analysis and ageing evaluation are to be developed for research reactors. Compared with power reactors, research reactors reveal some specific features: lower operating power, lower coolant temperature and pressure, etc. However, their core configurations may be changed very often, and human actions play an important safety role owing to their specific experimental requirements. As a result, it is necessary to conduct PSA for research reactors. This paper discusses the special characteristics related to their structure and operation and the methods to develop research reactor PSA, including initiating event analysis, event tree analysis, fault tree analysis, dependent failure analysis, human reliability analysis and quantification, as well as the evaluation of experimental and external events, through an investigation of various research reactors and their PSAs at home and abroad, to present the current situation and features of research reactor PSAs. (author)

  20. Development of probabilistic seismic hazard analysis for international sites, challenges and guidelines

    Energy Technology Data Exchange (ETDEWEB)

    Fernandez Ares, Antonio, E-mail: antonio.fernandez@rizzoassoc.com [Paul C. Rizzo Associates, Inc., 500 Penn Center Boulevard, Penn Center East, Suite 100, Pittsburgh, PA 15235 (United States); Fatehi, Ali, E-mail: ali.fatehi@rizzoassoc.com [Paul C. Rizzo Associates, Inc., 500 Penn Center Boulevard, Penn Center East, Suite 100, Pittsburgh, PA 15235 (United States)

    2013-06-15

    Research highlights: ► Site-specific seismic hazard study and suggestions for overcoming the challenges inherent in the significant amounts of epistemic uncertainty for sites at remote locations. ► Main aspects of probabilistic seismic hazard analysis (PSHA). ► Regional and site geology in the context of a probabilistic seismic hazard analysis (PSHA), including state-of-the-art ground motion estimation methods and geophysical conditions. ► Senior Seismic Hazard Analysis Committee (SSHAC) process as a means to incorporate the opinions and contributions of the informed scientific community. -- Abstract: This article provides guidance for conducting a site-specific seismic hazard study, giving suggestions for overcoming the challenges inherent in the significant amounts of epistemic uncertainty for sites at remote locations. The text follows the general process of a seismic hazard study, describing both the deterministic and probabilistic approaches. Key and controversial items are identified in the areas of recorded seismicity, seismic sources, magnitude, ground motion models, and local site effects. A case history corresponding to a seismic hazard study in the Middle East for a greenfield site in a remote location is incorporated alongside the development of the recommendations. Other examples of analysis case histories throughout the world are presented as well.

  1. Probabilistic versus deterministic hazard assessment in liquefaction susceptible zones

    Science.gov (United States)

    Daminelli, Rosastella; Gerosa, Daniele; Marcellini, Alberto; Tento, Alberto

    2015-04-01

    Probabilistic seismic hazard assessment (PSHA), usually adopted in the framework of seismic code drafting, is based on a Poissonian description of temporal occurrence, a negative exponential distribution of magnitude, and an attenuation relationship with a log-normal distribution of PGA or response spectrum. The main strength of this approach is that it is presently the standard for the majority of countries, but it has weak points, in particular regarding the physical description of the earthquake phenomenon. Factors that could significantly influence the expected motion at a site, such as site effects and source characteristics like the duration of strong motion and directivity, are not taken into account by PSHA. Deterministic models can better evaluate the ground motion at a site from a physical point of view, but their predictive reliability depends on the degree of knowledge of the source, wave propagation and soil parameters. We compare these two approaches at selected sites affected by the May 2012 Emilia-Romagna and Lombardia earthquake, which caused widespread liquefaction phenomena, unusual for magnitudes below 6. We focus on sites prone to liquefaction because of their soil mechanical parameters and water table levels. Our analysis shows that the choice between deterministic and probabilistic hazard analysis is strongly dependent on site conditions: the looser the soil and the higher the liquefaction potential, the more suitable the deterministic approach. Source characteristics, in particular the duration of strong ground motion, have long been recognized as relevant to inducing liquefaction; unfortunately, a quantitative prediction of these parameters appears very unlikely, dramatically reducing the possibility of their adoption in hazard assessment. Last but not least, economic factors are relevant in the choice of approach. The case history of the 2012 Emilia-Romagna and Lombardia earthquake, with an officially estimated cost of 6 billions
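
    As a point of reference for the abstract above: a minimal sketch of the standard hazard integral implied by the stated assumptions (Poissonian occurrence, negative exponential magnitude distribution, log-normal ground motion), written in generic notation rather than the authors' own:

        \lambda(A > a) \;=\; \sum_{i} \nu_i \int_{m_{\min}}^{m_{\max}} \int_{r} P(A > a \mid m, r)\, f_M(m)\, f_R(r)\; \mathrm{d}r\, \mathrm{d}m,
        \qquad
        f_M(m) \;=\; \frac{\beta\, e^{-\beta (m - m_{\min})}}{1 - e^{-\beta (m_{\max} - m_{\min})}},

        P(A > a \mid m, r) \;=\; 1 - \Phi\!\left(\frac{\ln a - \mu_{\ln A}(m, r)}{\sigma_{\ln A}}\right),
        \qquad
        P(\text{one or more exceedances in } t) \;=\; 1 - e^{-\lambda(A > a)\, t},

    where \nu_i is the activity rate of source i, \Phi is the standard normal CDF, and the log-normal attenuation enters through \mu_{\ln A} and \sigma_{\ln A}.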

  2. Probabilistic assessment of nuclear safety and safeguards

    International Nuclear Information System (INIS)

    Higson, D.J.

    1987-01-01

    Nuclear reactor accidents and diversions of materials from the nuclear fuel cycle are perceived by many people as particularly serious threats to society. Probabilistic assessment is a rational approach to the evaluation of both threats, and may provide a basis for decisions on appropriate actions to control them. Probabilistic methods have become standard tools in the analysis of safety, but there are disagreements on the criteria to be applied when assessing the results of analysis. Probabilistic analysis and assessment of the effectiveness of nuclear material safeguards are still at an early stage of development. (author)

  3. bayesPop: Probabilistic Population Projections

    Directory of Open Access Journals (Sweden)

    Hana Ševčíková

    2016-12-01

    We describe bayesPop, an R package for producing probabilistic population projections for all countries. This uses probabilistic projections of total fertility and life expectancy generated by Bayesian hierarchical models. It produces a sample from the joint posterior predictive distribution of future age- and sex-specific population counts, fertility rates and mortality rates, as well as future numbers of births and deaths. It provides graphical ways of summarizing this information, including trajectory plots and various kinds of probabilistic population pyramids. An expression language is introduced which allows the user to produce the predictive distribution of a wide variety of derived population quantities, such as the median age or the old age dependency ratio. The package produces aggregated projections for sets of countries, such as UN regions or trading blocs. The methodology has been used by the United Nations to produce their most recent official population projections for all countries, published in the World Population Prospects.

  4. bayesPop: Probabilistic Population Projections

    Science.gov (United States)

    Ševčíková, Hana; Raftery, Adrian E.

    2016-01-01

    We describe bayesPop, an R package for producing probabilistic population projections for all countries. This uses probabilistic projections of total fertility and life expectancy generated by Bayesian hierarchical models. It produces a sample from the joint posterior predictive distribution of future age- and sex-specific population counts, fertility rates and mortality rates, as well as future numbers of births and deaths. It provides graphical ways of summarizing this information, including trajectory plots and various kinds of probabilistic population pyramids. An expression language is introduced which allows the user to produce the predictive distribution of a wide variety of derived population quantities, such as the median age or the old age dependency ratio. The package produces aggregated projections for sets of countries, such as UN regions or trading blocs. The methodology has been used by the United Nations to produce their most recent official population projections for all countries, published in the World Population Prospects. PMID:28077933

  5. Recent advances in the source term area within the SARNET European severe accident research network

    International Nuclear Information System (INIS)

    Herranz, L.E.; Haste, T.; Kärkelä, T.

    2015-01-01

    Highlights: • Main achievements of source term research in SARNET are given. • Emphasis on the radiologically important iodine and ruthenium fission products. • Conclusions on FP release, transport in the RCS and containment behaviour. • Significance of large-scale integral experiments for validating the analyses used. • A thorough list of the most recent references on source term research results. - Abstract: Source term has been one of the main research areas addressed within the SARNET network during the 7th EC Framework Programme of EURATOM. The entire source term domain was split into three major areas: the oxidising impact on source term, iodine chemistry in the reactor coolant system and containment, and data and code assessment. The present paper synthesises the main technical outcomes stemming from the SARNET FWP7 project in the area of source term and includes an extensive list of references in which deeper insights on specific issues may be found. In addition, based on an analysis of the current state of the art, an outlook on future source term research is given, in which major changes in the research environment are discussed (i.e., the end of the Phébus FP project, the end of the SARNET projects, and the launch of HORIZON 2020). Most probably, research projects will be streamlined towards release and transport under oxidising conditions, containment chemistry, existing and innovative filtered venting systems, and others. These will be in addition to a number of projects that have been completed or are ongoing under different national and international frameworks, such as VERDON, CHIP and EPICUR, started under the International Source Term Programme (ISTP); the OECD/CSNI programmes BIP, BIP2, STEM, THAI and THAI2; and the French national programme MIRE. The experimental PASSAM project under the 7th EC Framework Programme, focused on source term mitigation systems, is highlighted as a good example of a project addressing potential enhancement of safety systems

  6. Recent advances in the source term area within the SARNET European severe accident research network

    Energy Technology Data Exchange (ETDEWEB)

    Herranz, L.E., E-mail: luisen.herranz@ciemat.es [Centro de Investigaciones Energeticas Medio Ambientales y Tecnologica, CIEMAT, Avda. Complutense 40, E-28040 Madrid (Spain); Haste, T. [Institut de Radioprotection et de Sûreté Nucléaire, IRSN, BP 3, F-13115 St Paul lez Durance Cedex (France); Kärkelä, T. [VTT Technical Research Centre of Finland, P.O. Box 1000, FI-02044 VTT Espoo (Finland)

    2015-07-15

    Highlights: • Main achievements of source term research in SARNET are given. • Emphasis on the radiologically important iodine and ruthenium fission products. • Conclusions on FP release, transport in the RCS and containment behaviour. • Significance of large-scale integral experiments for validating the analyses used. • A thorough list of the most recent references on source term research results. - Abstract: Source term has been one of the main research areas addressed within the SARNET network during the 7th EC Framework Programme of EURATOM. The entire source term domain was split into three major areas: the oxidising impact on source term, iodine chemistry in the reactor coolant system and containment, and data and code assessment. The present paper synthesises the main technical outcomes stemming from the SARNET FWP7 project in the area of source term and includes an extensive list of references in which deeper insights on specific issues may be found. In addition, based on an analysis of the current state of the art, an outlook on future source term research is given, in which major changes in the research environment are discussed (i.e., the end of the Phébus FP project, the end of the SARNET projects, and the launch of HORIZON 2020). Most probably, research projects will be streamlined towards release and transport under oxidising conditions, containment chemistry, existing and innovative filtered venting systems, and others. These will be in addition to a number of projects that have been completed or are ongoing under different national and international frameworks, such as VERDON, CHIP and EPICUR, started under the International Source Term Programme (ISTP); the OECD/CSNI programmes BIP, BIP2, STEM, THAI and THAI2; and the French national programme MIRE. The experimental PASSAM project under the 7th EC Framework Programme, focused on source term mitigation systems, is highlighted as a good example of a project addressing potential enhancement of safety systems

  7. Probabilistic interpretation of data a physicist's approach

    CERN Document Server

    Miller, Guthrie

    2013-01-01

    This book is a physicist's approach to the interpretation of data using Markov Chain Monte Carlo (MCMC). The concepts are derived from first principles using a style of mathematics that quickly elucidates the basic ideas, sometimes with the aid of examples. Probabilistic data interpretation is a straightforward problem involving conditional probability. A prior probability distribution is essential, and examples are given. In this small book (200 pages) the reader is led from the most basic concepts of mathematical probability all the way to parallel processing algorithms for Markov Chain Monte Carlo. Fortran source code (for eigenvalue analysis of finite discrete Markov Chains, for MCMC, and for nonlinear least squares) is included with the supplementary material for this book (available online).

  8. Transmission capacity assessment by probabilistic planning. An approach

    International Nuclear Information System (INIS)

    Lammintausta, M.

    2002-01-01

    The Finnish electricity market participates in the Scandinavian market, Nord Pool. The Finnish market is open to marketers, producers and consumers. All these participants can be seen as customers of the transmission network, which in turn can be considered a market place in which electricity can be sold and bought. The Finnish transmission network is owned and operated by an independent company, Fingrid, which has full responsibility for the Finnish transmission system. The available transfer capacity of a transmission route is traditionally limited by deterministic security constraints. More efficient and flexible network utilisation could be achieved with probabilistic planning methods. This report introduces a simple and practical probabilistic approach to transfer limit and risk assessment. The method is based on economic benefit and risk predictions. It also uses existing deterministic results, and it could be used side by side with the deterministic method. The basic concept and the necessary equations for the expected risks of various market players have been derived for further development. The outage costs, and thereby the risks, of the market participants depend on how the system operator reacts to faults. In the Finnish power system, consumers will usually experience no costs due to faults because of the meshed network and the counter-trade method preferred by the system operator. The costs to producers and dealers are also low because of the counter-trade method. The network company will lose the cost of repair, additional losses and the cost of regulation power because of counter trades. In case power flows are rearranged drastically because of aggressive strategies used in the electricity markets, the only way to fulfil the needs of free markets is for the network operator to buy regulation power for short-term problems and to reinforce the network in long-term situations. The reinforcement is done if the network can not be

  9. A Markov Chain Approach to Probabilistic Swarm Guidance

    Science.gov (United States)

    Acikmese, Behcet; Bayard, David S.

    2012-01-01

    This paper introduces a probabilistic guidance approach for the coordination of swarms of autonomous agents. The main idea is to drive the swarm to a prescribed density distribution in a prescribed region of the configuration space. In its simplest form, the probabilistic approach is completely decentralized and does not require communication or collaboration between agents. Agents make statistically independent probabilistic decisions based solely on their own state, which ultimately guides the swarm to the desired density distribution in the configuration space. In addition to being completely decentralized, the probabilistic guidance approach has a novel autonomous self-repair property: once the desired swarm density distribution is attained, the agents automatically repair any damage to the distribution without collaborating and without any knowledge about the damage.
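
    A minimal illustrative sketch of the idea (not the authors' algorithm or code): a Metropolis-style transition matrix whose stationary distribution is the desired density lets each agent move independently using only its own current bin; all names and values below are hypothetical.

        import numpy as np

        # Desired swarm density over five spatial bins (sums to 1).
        target = np.array([0.1, 0.2, 0.4, 0.2, 0.1])
        n = len(target)

        # Metropolis-style synthesis of a transition matrix whose stationary
        # distribution is `target`; the proposal is uniform over the bins.
        P = np.zeros((n, n))
        for i in range(n):
            for j in range(n):
                if i != j:
                    P[i, j] = (1.0 / n) * min(1.0, target[j] / target[i])
            P[i, i] = 1.0 - P[i].sum()

        # Each agent decides its next bin independently, using only its own
        # current bin -- no communication between agents is needed.
        rng = np.random.default_rng(0)
        agents = rng.integers(0, n, size=2000)
        for _ in range(40):
            agents = np.array([rng.choice(n, p=P[b]) for b in agents])

        print(np.bincount(agents, minlength=n) / agents.size)  # approaches target

    The self-repair property mentioned in the abstract follows naturally in this picture: because the chain's stationary distribution is the target density, any perturbation of the empirical distribution decays as the agents keep sampling transitions.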

  10. Probabilistic reasoning for assembly-based 3D modeling

    KAUST Repository

    Chaudhuri, Siddhartha

    2011-01-01

    Assembly-based modeling is a promising approach to broadening the accessibility of 3D modeling. In assembly-based modeling, new models are assembled from shape components extracted from a database. A key challenge in assembly-based modeling is the identification of relevant components to be presented to the user. In this paper, we introduce a probabilistic reasoning approach to this problem. Given a repository of shapes, our approach learns a probabilistic graphical model that encodes semantic and geometric relationships among shape components. The probabilistic model is used to present components that are semantically and stylistically compatible with the 3D model that is being assembled. Our experiments indicate that the probabilistic model increases the relevance of presented components. © 2011 ACM.

  11. Probabilistic Forecasts of Solar Irradiance by Stochastic Differential Equations

    DEFF Research Database (Denmark)

    Iversen, Jan Emil Banning; Morales González, Juan Miguel; Møller, Jan Kloppenborg

    2014-01-01

    … included in probabilistic forecasts may be paramount for decision makers to efficiently make use of this uncertain and variable generation. In this paper, a stochastic differential equation framework for modeling the uncertainty associated with the solar irradiance point forecast is proposed. This modeling approach allows for characterizing both the interdependence structure of prediction errors of short-term solar irradiance and their predictive distribution. Three different stochastic differential equation models are first fitted to a training data set and subsequently evaluated on a one-year test set …

  12. Use of the LUS in sequence allele designations to facilitate probabilistic genotyping of NGS-based STR typing results.

    Science.gov (United States)

    Just, Rebecca S; Irwin, Jodi A

    2018-05-01

    Some of the expected advantages of next generation sequencing (NGS) for short tandem repeat (STR) typing include enhanced mixture detection and genotype resolution via sequence variation among non-homologous alleles of the same length. However, at the same time that NGS methods for forensic DNA typing have advanced in recent years, many caseworking laboratories have implemented or are transitioning to probabilistic genotyping to assist the interpretation of complex autosomal STR typing results. Current probabilistic software programs are designed for length-based data and were not intended to accommodate sequence strings as the product input. Yet to leverage the benefits of NGS for enhanced genotyping and mixture deconvolution, the sequence variation among same-length products must be utilized in some form. Here, we propose use of the longest uninterrupted stretch (LUS) in allele designations as a simple method to represent sequence variation within the STR repeat regions and facilitate, in the near term, probabilistic interpretation of NGS-based typing results. An examination of published population data indicated that a reference LUS region is straightforward to define for most autosomal STR loci, and that using repeat unit plus LUS length as the allele designator can represent greater than 80% of the alleles detected by sequencing. A proof-of-concept study performed using a freely available probabilistic software program demonstrated that the LUS length can be used in allele designations when a program does not require alleles to be integers, and that utilizing sequence information improves interpretation of both single-source and mixed-contributor STR typing results as compared to using repeat unit information alone. The LUS concept for allele designation maintains the repeat-based allele nomenclature that will permit backward compatibility to extant STR databases, and the LUS lengths themselves will be concordant regardless of the NGS assay or analysis tools
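
    A toy illustration of the LUS idea (a hypothetical helper, not the authors' software): two same-length alleles receive distinct 'count_LUS' labels because their longest uninterrupted stretches of the reference repeat unit differ.

        import re

        def lus_designation(sequence: str, repeat_unit: str) -> str:
            """Allele label 'count_LUS': total repeats of the reference unit
            plus the longest uninterrupted stretch (LUS). Illustrative only;
            real loci need locus-specific LUS regions and nomenclature."""
            runs = [len(m.group(0)) // len(repeat_unit)
                    for m in re.finditer(f"(?:{repeat_unit})+", sequence)]
            total = sum(runs)            # simplistic repeat-unit count
            lus = max(runs, default=0)   # longest uninterrupted stretch
            return f"{total}_{lus}"

        # Two same-length alleles that sequencing can now distinguish:
        print(lus_designation("TCTA" * 5 + "TCTG" + "TCTA" * 4, "TCTA"))  # 9_5
        print(lus_designation("TCTA" * 3 + "TCTG" + "TCTA" * 6, "TCTA"))  # 9_6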

  13. Probabilistic structural analysis methods for select space propulsion system components

    Science.gov (United States)

    Millwater, H. R.; Cruse, T. A.

    1989-01-01

    The Probabilistic Structural Analysis Methods (PSAM) project developed at the Southwest Research Institute integrates state-of-the-art structural analysis techniques with probability theory for the design and analysis of complex large-scale engineering structures. An advanced efficient software system (NESSUS) capable of performing complex probabilistic analysis has been developed. NESSUS contains a number of software components to perform probabilistic analysis of structures. These components include: an expert system, a probabilistic finite element code, a probabilistic boundary element code and a fast probability integrator. The NESSUS software system is shown. An expert system is included to capture and utilize PSAM knowledge and experience. NESSUS/EXPERT is an interactive menu-driven expert system that provides information to assist in the use of the probabilistic finite element code NESSUS/FEM and the fast probability integrator (FPI). The expert system menu structure is summarized. The NESSUS system contains a state-of-the-art nonlinear probabilistic finite element code, NESSUS/FEM, to determine the structural response and sensitivities. A broad range of analysis capabilities and an extensive element library is present.

  14. Probabilistic reasoning in data analysis.

    Science.gov (United States)

    Sirovich, Lawrence

    2011-09-20

    This Teaching Resource provides lecture notes, slides, and a student assignment for a lecture on probabilistic reasoning in the analysis of biological data. General probabilistic frameworks are introduced, and a number of standard probability distributions are described using simple intuitive ideas. Particular attention is focused on random arrivals that are independent of prior history (Markovian events), with an emphasis on waiting times, Poisson processes, and Poisson probability distributions. The use of these various probability distributions is applied to biomedical problems, including several classic experimental studies.
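
    A small numerical illustration of the Markovian-arrivals point above (illustrative values only): history-independent arrivals have exponential waiting times, which in turn imply Poisson-distributed counts per interval.

        import numpy as np

        rng = np.random.default_rng(1)
        rate, t_obs = 4.0, 1000.0   # mean arrivals per unit time; observation span

        # History-independent (Markovian) arrivals: exponential waiting times...
        waits = rng.exponential(1.0 / rate, size=int(2 * rate * t_obs))
        arrivals = np.cumsum(waits)
        arrivals = arrivals[arrivals < t_obs]

        # ...imply Poisson-distributed counts per unit interval.
        counts = np.bincount(arrivals.astype(int), minlength=int(t_obs))
        print(counts.mean(), counts.var())  # both close to `rate`, as for Poisson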

  15. Do probabilistic forecasts lead to better decisions?

    Directory of Open Access Journals (Sweden)

    M. H. Ramos

    2013-06-01

    The last decade has seen growing research in producing probabilistic hydro-meteorological forecasts and increasing their reliability. This followed the promise that, supplied with information about uncertainty, people would take better risk-based decisions. In recent years, therefore, research and operational developments have also started focusing attention on ways of communicating the probabilistic forecasts to decision-makers. Communicating probabilistic forecasts includes preparing tools and products for visualisation, but also requires understanding how decision-makers perceive and use uncertainty information in real time. At the EGU General Assembly 2012, we conducted a laboratory-style experiment in which several cases of flood forecasts and a choice of actions to take were presented as part of a game to participants, who acted as decision-makers. Answers were collected and analysed. In this paper, we present the results of this exercise and discuss if we indeed make better decisions on the basis of probabilistic forecasts.

  16. Medium-Term Probabilistic Forecasting of Extremely Low Prices in Electricity Markets: Application to the Spanish Case

    Directory of Open Access Journals (Sweden)

    Antonio Bello

    2016-03-01

    One of the most relevant challenges that have arisen in electricity markets during the last few years is the emergence of extremely low prices. Trying to predict these events is crucial for market agents in a competitive environment. This paper proposes a novel methodology to simultaneously accomplish point and probabilistic hourly predictions about the appearance of extremely low electricity prices in a medium-term scope. The proposed approach for making real ex ante forecasts consists of a nested compounding of different forecasting techniques, which incorporate Monte Carlo simulation, combined with spatial interpolation techniques. The procedure is based on the statistical identification of the key drivers of the process. Logistic regression for rare events, decision trees, multilayer perceptrons and a hybrid approach, which combines a market equilibrium model with logistic regression, are used. Moreover, this paper assesses whether periodic models in which parameters switch according to the day of the week can be even more accurate. The proposed techniques are compared to a Markov regime-switching model and several naive methods. The proposed methodology empirically demonstrates its effectiveness by achieving promising results on a real case study based on the Spanish electricity market. This approach can provide valuable information for market agents when they face decision-making and risk-management processes. Our findings support the additional benefit of using a hybrid approach for deriving more accurate predictions.
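
    A minimal sketch of the rare-event classification ingredient (synthetic data and hypothetical drivers; not the paper's model or dataset): logistic regression with class weighting to counteract the scarcity of extremely-low-price hours.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(2)
        n = 20000
        # Hypothetical hourly drivers (e.g. demand and wind forecasts).
        X = np.column_stack([rng.normal(size=n), rng.normal(size=n)])
        # Synthetic 'extremely low price' flag: a rare event.
        logits = -4.0 - 1.5 * X[:, 0] + 2.0 * X[:, 1]
        y = rng.random(n) < 1.0 / (1.0 + np.exp(-logits))

        # Weighting counteracts the class imbalance typical of rare events;
        # raw probabilities then need recalibration before direct use.
        clf = LogisticRegression(class_weight="balanced", max_iter=1000).fit(X, y)
        p = clf.predict_proba(X)[:, 1]
        print(f"event rate {y.mean():.3f}, AUC {roc_auc_score(y, p):.3f}")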

  17. Probabilistic-Multiobjective Comparison of User-Defined Operating Rules. Case Study: Hydropower Dam in Spain

    Directory of Open Access Journals (Sweden)

    Paola Bianucci

    2015-03-01

    A useful tool is proposed in this paper to assist dam managers in comparing and selecting suitable operating rules. This procedure is based on well-known multiobjective and probabilistic methodologies, which were jointly applied here to assess and compare flood control strategies in hydropower reservoirs. The procedure consisted of evaluating the operating rules' performance using a simulation fed by a representative and sufficiently large flood event series. These flood events were obtained from a synthetic rainfall series stochastically generated using the RainSimV3 model coupled with a deterministic hydrological model. The performance of the assessed strategies was characterized using probabilistic variables. Finally, evaluation and comparison were conducted by analyzing objective functions which synthesize different aspects of the rules' performance. These objectives were probabilistically defined in terms of risk and expected values. To assess the applicability and flexibility of the tool, it was implemented for a hydropower dam located in Galicia (Northern Spain). This procedure allowed alternative operating rules to be derived which provided a reasonable trade-off between dam safety, flood control, operability and energy production.

  18. Added Value of uncertainty Estimates of SOurce term and Meteorology (AVESOME)

    DEFF Research Database (Denmark)

    Sørensen, Jens Havskov; Schönfeldt, Fredrik; Sigg, Robert

    In the early phase of a nuclear accident, two large sources of uncertainty exist: one related to the source term and one associated with the meteorological data. Operational methods are being developed in AVESOME for quantitative estimation of uncertainties in atmospheric dispersion prediction. … In the recent NKS-B projects MUD, FAUNA and MESO, the implications of meteorological uncertainties for nuclear emergency preparedness and management have been studied. … Uncertainty in atmospheric dispersion model forecasting stemming from both the source term and the meteorological data is examined. Ways to implement the uncertainties of forecasting in DSSs, and the impacts on real-time emergency management, are described. The proposed methodology is feasible for real-time use, e.g. at national meteorological services, thereby adding value to decision support.

  19. Stochastic network interdiction optimization via capacitated network reliability modeling and probabilistic solution discovery

    International Nuclear Information System (INIS)

    Ramirez-Marquez, Jose Emmanuel; Rocco S, Claudio M.

    2009-01-01

    This paper introduces an evolutionary optimization approach that can be readily applied to solve stochastic network interdiction problems (SNIP). The network interdiction problem solved considers the minimization of the cost associated with an interdiction strategy such that the maximum flow that can be transmitted between a source node and a sink node for a fixed network design is greater than or equal to a given reliability requirement. Furthermore, the model assumes that the nominal capacity of each network link and the cost associated with its interdiction can change from link to link, and that such interdiction has a probability of being successful. This version of the SNIP is for the first time modeled as a capacitated network reliability problem, allowing for the implementation of computation and solution techniques previously unavailable. The solution process is based on an evolutionary algorithm that implements: (1) Monte Carlo simulation, to generate potential network interdiction strategies; (2) capacitated network reliability techniques, to analyze the strategies' source-sink flow reliability; and (3) an evolutionary optimization technique, to define, in probabilistic terms, how likely a link is to appear in the final interdiction strategy. Examples for different sizes of networks are used throughout the paper to illustrate the approach
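
    A minimal sketch of ingredient (2), the Monte Carlo evaluation of source-sink flow reliability when links survive probabilistically (illustrative network, not from the paper):

        import networkx as nx
        import numpy as np

        rng = np.random.default_rng(3)

        def flow_reliability(edges, demand, s, t, trials=2000):
            """Monte Carlo estimate of P(max s-t flow >= demand) when each
            link survives (is not successfully interdicted) with its own
            probability."""
            ok = 0
            for _ in range(trials):
                G = nx.DiGraph()
                for u, v, cap, p_up in edges:
                    if rng.random() < p_up:       # link survives this trial
                        G.add_edge(u, v, capacity=cap)
                if G.has_node(s) and G.has_node(t) and \
                   nx.maximum_flow_value(G, s, t) >= demand:
                    ok += 1
            return ok / trials

        # (u, v, capacity, survival probability) -- illustrative values.
        edges = [("s", "a", 10, 0.90), ("s", "b", 8, 0.95),
                 ("a", "t", 7, 0.90), ("b", "t", 9, 0.85), ("a", "b", 4, 0.90)]
        print(flow_reliability(edges, demand=12, s="s", t="t"))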

  20. Probabilistic thread algebra

    NARCIS (Netherlands)

    Bergstra, J.A.; Middelburg, C.A.

    2015-01-01

    We add probabilistic features to basic thread algebra and its extensions with thread-service interaction and strategic interleaving. Here, threads represent the behaviours produced by instruction sequences under execution and services represent the behaviours exhibited by the components of execution

  1. On probabilistic forecasting of wind power time-series

    DEFF Research Database (Denmark)

    Pinson, Pierre

    … power dynamics. In both cases, the model parameters are adaptively and recursively estimated, time-adaptivity being the result of exponential forgetting of past observations. The probabilistic forecasting methodology is applied at the Horns Rev wind farm in Denmark, for 10-minute-ahead probabilistic forecasting of wind power generation. Probabilistic forecasts generated from the proposed methodology clearly have higher skill than those obtained from a classical Gaussian assumption about wind power predictive densities. Corresponding point forecasts also exhibit significantly lower error criteria.
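
    A minimal sketch of recursive estimation with exponential forgetting of past observations, the time-adaptivity mechanism mentioned above (toy data; not the thesis code):

        import numpy as np

        def rls_forgetting(X, y, lam=0.98):
            """Recursive least squares with forgetting factor `lam`: older
            observations are progressively down-weighted, so the estimate
            tracks slowly drifting parameters."""
            n_feat = X.shape[1]
            theta = np.zeros(n_feat)
            P = np.eye(n_feat) * 1e3              # large initial covariance
            for x, target in zip(X, y):
                k = P @ x / (lam + x @ P @ x)     # gain vector
                theta = theta + k * (target - x @ theta)
                P = (P - np.outer(k, x @ P)) / lam
            return theta

        # Toy use: track a drifting linear relation between input and output.
        rng = np.random.default_rng(4)
        X = np.column_stack([np.ones(500), rng.normal(size=500)])
        slope = np.linspace(1.0, 3.0, 500)        # parameter drifts over time
        y = 0.5 * X[:, 0] + slope * X[:, 1] + rng.normal(scale=0.1, size=500)
        print(rls_forgetting(X, y))  # intercept ~0.5, slope near its recent value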

  2. Review of the Brunswick Steam Electric Plant Probabilistic Risk Assessment

    International Nuclear Information System (INIS)

    Sattison, M.B.; Davis, P.R.; Satterwhite, D.G.; Gilmore, W.E.; Gregg, R.E.

    1989-11-01

    A review of the Brunswick Steam Electric Plant Probabilistic Risk Assessment was conducted with the objective of confirming the safety perspectives brought to light by the probabilistic risk assessment. The scope of the review included the entire Level I probabilistic risk assessment, including external events. This is consistent with the scope of the probabilistic risk assessment. The review included an assessment of the assumptions, methods, models, and data used in the study. 47 refs., 14 figs., 15 tabs

  3. Deliverable D74.2. Probabilistic analysis methods for support structures

    DEFF Research Database (Denmark)

    Gintautas, Tomas

    2018-01-01

    Relevant description: Report describing the probabilistic analysis for offshore substructures and the results attained, including comparison with experimental data and with conventional design. Specific targets: 1) estimate the current reliability level of support structures; 2) development of a basis for probabilistic calculations and evaluation of reliability for offshore support structures (substructures); 3) development of a probabilistic model for stiffness and strength of soil parameters and for modeling geotechnical load-bearing capacity; 4) comparison between probabilistic analysis and deterministic …

  4. A common fixed point for operators in probabilistic normed spaces

    International Nuclear Information System (INIS)

    Ghaemi, M.B.; Lafuerza-Guillen, Bernardo; Razani, A.

    2009-01-01

    Probabilistic metric spaces were introduced by Karl Menger. Alsina, Schweizer and Sklar gave a general definition of a probabilistic normed space based on the definition of Menger [Alsina C, Schweizer B, Sklar A. On the definition of a probabilistic normed space. Aequationes Math 1993;46:91-8]. Here, we consider the equicontinuity of a class of linear operators in probabilistic normed spaces and, finally, a common fixed point theorem is proved. An application to quantum mechanics is considered.

  5. Probabilistic blind deconvolution of non-stationary sources

    DEFF Research Database (Denmark)

    Olsson, Rasmus Kongsgaard; Hansen, Lars Kai

    2004-01-01

    We solve a class of blind signal separation problems using a constrained linear Gaussian model. The observed signal is modelled by a convolutive mixture of colored noise signals with additive white noise. We derive a time-domain EM algorithm, 'KaBSS', which estimates the source signals

  6. Utilization of probabilistic methods for evaluating the safety of PWRs built in France

    International Nuclear Information System (INIS)

    Queniart, D.; Brisbois, J.; Lanore, J.M.

    1985-01-01

    Firstly, it is recalled that, in France, PWRs are designed on a deterministic basis by studying the consequences of a limited number of conventional incidents whose estimated frequency is specified in order-of-magnitude terms and for which it is shown that the consequences, for each category of frequency, predominate over those of the other situations in the same category. These situations are called dimensioning situations. The paper then describes the use made of probabilistic methods. External attacks and loss of redundant systems are examined in particular. A probabilistic approach is in fact well suited to the evaluation of risks due, among other things, to aircraft crashes and the industrial environment. Analysis of the reliability of redundant systems has shown that, in the light of the overall risk assessment objective, their loss should be examined with a view to instituting counteraction to reduce the risks associated with such loss (particularly the introduction of special control procedures). Probabilistic methods are used to evaluate the effectiveness of the counteraction proposed and such a study has been carried out for total loss of electric power supply. Finally, the probabilistic study of hazard initiated post factum by the French safety authorities for the standardized 900 MW(e) power units is described. The study, which is not yet complete, will serve as the basis for a permanent safety analysis tool taking into account control procedures and the total operating experience acquired using these power units. (author)

  7. Design parameters and source terms: Volume 1, Design parameters: Revision 0

    International Nuclear Information System (INIS)

    1987-10-01

    The Design Parameters and Source Terms Document was prepared in accordance with DOE request and to provide data for the environmental impact study to be performed in the future for the Deaf Smith County, Texas site for a nuclear waste repository in salt. This document updates a previous unpublished report by Stearns Catalytic Corporation (SCC), entitled ''Design Parameters and Source Terms for a Two-Phase Repository in Salt,'' 1985, to the level of the Site Characterization Plan - Conceptual Design Report. The previous unpublished SCC Study identifies the data needs for the Environmental Assessment effort for seven possible Salt Repository sites

  8. A History of Probabilistic Inductive Logic Programming

    Directory of Open Access Journals (Sweden)

    Fabrizio eRiguzzi

    2014-09-01

    The field of Probabilistic Logic Programming (PLP) has seen significant advances in the last 20 years, with many proposals for languages that combine probability with logic programming. Since the start, the problem of learning probabilistic logic programs has been the focus of much attention. Learning these programs represents a whole subfield of Inductive Logic Programming (ILP). In Probabilistic ILP (PILP) two problems are considered: learning the parameters of a program given the structure (the rules), and learning both the structure and the parameters. Usually structure learning systems use parameter learning as a subroutine. In this article we present an overview of PILP and discuss the main results.

  9. Probabilistic assessment of faults

    International Nuclear Information System (INIS)

    Foden, R.W.

    1987-01-01

    Probabilistic safety analysis (PSA) is the process by which the probability (or frequency of occurrence) of reactor fault conditions which could lead to unacceptable consequences is assessed. The basic objective of a PSA is to allow a judgement to be made as to whether or not the principal probabilistic requirement is satisfied. It also gives insights into the reliability of the plant which can be used to identify possible improvements. This is explained in the article. The scope of a PSA and the PSA performed by the National Nuclear Corporation (NNC) for the Heysham II and Torness AGRs and Sizewell-B PWR are discussed. The NNC methods for hazards, common cause failure and operator error are mentioned. (UK)

  10. Probabilistic methods in combinatorial analysis

    CERN Document Server

    Sachkov, Vladimir N

    2014-01-01

    This 1997 work explores the role of probabilistic methods for solving combinatorial problems. These methods not only provide the means of efficiently using such notions as characteristic and generating functions, the moment method and so on but also let us use the powerful technique of limit theorems. The basic objects under investigation are nonnegative matrices, partitions and mappings of finite sets, with special emphasis on permutations and graphs, and equivalence classes specified on sequences of finite length consisting of elements of partially ordered sets; these specify the probabilist

  11. Probabilistic coding of quantum states

    International Nuclear Information System (INIS)

    Grudka, Andrzej; Wojcik, Antoni; Czechlewski, Mikolaj

    2006-01-01

    We discuss the properties of probabilistic coding of two qubits to one qutrit and generalize the scheme to higher dimensions. We show that the protocol preserves the entanglement between the qubits to be encoded and the environment and can also be applied to mixed states. We present a protocol that enables encoding of n qudits to one qudit of dimension smaller than the Hilbert space of the original system and then allows probabilistic but error-free decoding of any subset of k qudits. We give a formula for the probability of successful decoding

  12. Probabilistic Modeling of Timber Structures

    DEFF Research Database (Denmark)

    Köhler, J.D.; Sørensen, John Dalsgaard; Faber, Michael Havbro

    2005-01-01

    The present paper contains a proposal for the probabilistic modeling of timber material properties. It is produced in the context of the Probabilistic Model Code (PMC) of the Joint Committee on Structural Safety (JCSS) and of the COST action E24 'Reliability of Timber Structures'. The present proposal is based on discussions and comments from participants of the COST E24 action and the members of the JCSS. The paper contains a description of the basic reference properties for timber strength parameters and ultimate limit state equations for components and connections. The recommended …

  13. Application of probabilistic precipitation forecasts from a ...

    African Journals Online (AJOL)

    2014-02-14

    Application of probabilistic precipitation forecasts from a deterministic model … The aim of this paper is to investigate the increase in the lead-time of flash flood warnings of the SAFFG using probabilistic precipitation forecasts … The procedure is applied to a real flash flood event and the ensemble-based …

  14. Probabilistic consequence model of accidental or intentional chemical releases.

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Y.-S.; Samsa, M. E.; Folga, S. M.; Hartmann, H. M.

    2008-06-02

    In this work, general methodologies for evaluating the impacts of large-scale toxic chemical releases are proposed. The potential numbers of injuries and fatalities, the numbers of hospital beds, and the geographical areas rendered unusable during and some time after the occurrence and passage of a toxic plume are estimated on a probabilistic basis. To arrive at these estimates, historical accidental release data, maximum stored volumes, and meteorological data were used as inputs into the SLAB accidental chemical release model. Toxic gas footprints from the model were overlaid onto detailed population and hospital distribution data for a given region to estimate potential impacts. Output results are in the form of a generic statistical distribution of injuries and fatalities associated with specific toxic chemicals and regions of the United States. In addition, indoor hazards were estimated, so the model can provide contingency plans for either shelter-in-place or evacuation when an accident occurs. The stochastic distributions of injuries and fatalities are being used in a U.S. Department of Homeland Security-sponsored decision support system as source terms for a Monte Carlo simulation that evaluates potential measures for mitigating terrorist threats. This information can also be used to support the formulation of evacuation plans and to estimate damage and cleanup costs.
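
    A heavily simplified sketch of the probabilistic consequence idea: Monte Carlo sampling of release and weather inputs through a toy ground-level Gaussian-plume footprint. The plume formula is a crude stand-in for the SLAB model used by the authors, and all coefficients are illustrative.

        import numpy as np

        rng = np.random.default_rng(5)

        def footprint_area(q, u, threshold):
            """Ground-level area (m^2) where a toy Gaussian plume from a
            ground source exceeds a toxic threshold (kg/m^3)."""
            x = np.linspace(50.0, 20000.0, 400)     # downwind distance (m)
            sy, sz = 0.08 * x, 0.06 * x             # simple dispersion growth
            c0 = q / (np.pi * u * sy * sz)          # centreline concentration
            ratio = np.maximum(c0 / threshold, 1.0)
            half_width = sy * np.sqrt(2.0 * np.log(ratio))  # 0 where c0 <= thr
            return float(np.sum(2.0 * half_width) * (x[1] - x[0]))

        # Monte Carlo over uncertain release rate (kg/s) and wind speed (m/s),
        # mirroring the probabilistic treatment of accident inputs.
        areas = [footprint_area(rng.lognormal(2.0, 1.0),
                                rng.uniform(1.0, 10.0), 1e-4)
                 for _ in range(1000)]
        print(np.percentile(areas, [50, 90, 99]))   # footprint-area distribution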

  15. Coarse Grid Modeling of Turbine Film Cooling Flows Using Volumetric Source Terms

    Science.gov (United States)

    Heidmann, James D.; Hunter, Scott D.

    2001-01-01

    The recent trend in numerical modeling of turbine film cooling flows has been toward higher fidelity grids and more complex geometries. This trend has been enabled by the rapid increase in computing power available to researchers. However, the turbine design community requires fast turnaround time in its design computations, rendering these comprehensive simulations ineffective in the design cycle. The present study describes a methodology for implementing a volumetric source term distribution in a coarse grid calculation that can model the small-scale and three-dimensional effects present in turbine film cooling flows. This model could be implemented in turbine design codes or in multistage turbomachinery codes such as APNASA, where the computational grid size may be larger than the film hole size. Detailed computations of a single row of 35 deg round holes on a flat plate have been obtained for blowing ratios of 0.5, 0.8, and 1.0, and density ratios of 1.0 and 2.0 using a multiblock grid system to resolve the flows on both sides of the plate as well as inside the hole itself. These detailed flow fields were spatially averaged to generate a field of volumetric source terms for each conservative flow variable. Solutions were also obtained using three coarse grids having streamwise and spanwise grid spacings of 3d, 1d, and d/3. These coarse grid solutions used the integrated hole exit mass, momentum, energy, and turbulence quantities from the detailed solutions as volumetric source terms. It is shown that a uniform source term addition over a distance from the wall on the order of the hole diameter is able to predict adiabatic film effectiveness better than a near-wall source term model, while strictly enforcing correct values of integrated boundary layer quantities.
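
    A schematic sketch of the volumetric source term idea (hypothetical numbers; not the study's code): integrated hole-exit quantities from a detailed solution are spread uniformly over coarse-grid cells within about one hole diameter of the wall, rather than concentrated in the near-wall cell.

        import numpy as np

        # Coarse grid: cell sizes in streamwise x and wall-normal y.
        dx, dy, d = 0.003, 0.0005, 0.001     # spacings and hole diameter (m)
        nx, ny = 40, 20
        source_mass = np.zeros((nx, ny))     # per-cell mass source (kg/s)

        # Integrated hole-exit quantity from a detailed solution (assumed).
        mdot_hole = 1.0e-4                   # coolant mass flow per hole (kg/s)
        x_hole = 0.06                        # hole location (m)

        # Spread the hole flux uniformly over cells within ~1 diameter of the
        # wall, echoing the finding that a distributed source predicts film
        # effectiveness better than a purely near-wall source.
        i = int(x_hole / dx)
        j_max = max(1, int(d / dy))          # cells spanning one diameter
        source_mass[i, :j_max] = mdot_hole / j_max

        # These per-cell sources would be added to the coarse-grid continuity
        # equation; momentum, energy and turbulence sources are built likewise.
        print(source_mass[i, :j_max + 2])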

  16. Probabilistic Analysis Methods for Hybrid Ventilation

    DEFF Research Database (Denmark)

    Brohus, Henrik; Frier, Christian; Heiselberg, Per

    This paper discusses a general approach for the application of probabilistic analysis methods in the design of ventilation systems. The aims and scope of probabilistic versus deterministic methods are addressed with special emphasis on hybrid ventilation systems. A preliminary application...... of stochastic differential equations is presented comprising a general heat balance for an arbitrary number of loads and zones in a building to determine the thermal behaviour under random conditions....

  17. Probabilistic Price Forecasting for Day-Ahead and Intraday Markets: Beyond the Statistical Model

    Directory of Open Access Journals (Sweden)

    José R. Andrade

    2017-10-01

    Forecasting the hourly spot price of day-ahead and intraday markets is particularly challenging in electric power systems characterized by a high installed capacity of renewable energy technologies. In particular, periods with low and high price levels are difficult to predict due to a limited number of representative cases in the historical dataset, which leads to forecast bias problems and wide forecast intervals. Moreover, these markets also require the inclusion of multiple explanatory variables, which increases the complexity of the model without guaranteeing a forecasting skill improvement. This paper explores information from daily futures contract trading and forecasts of the daily average spot price to correct point and probabilistic forecasting bias. It also shows that an adequate choice of explanatory variables and the use of simple models like linear quantile regression can lead to highly accurate spot price point and probabilistic forecasts. In terms of point forecasts, the mean absolute error was 3.03 €/MWh for the day-ahead market and a maximum value of 2.53 €/MWh was obtained for intraday session 6. The probabilistic forecast results show sharp forecast intervals and deviations from perfect calibration below 7% for all market sessions.
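
    A minimal sketch of the linear quantile regression ingredient on synthetic data (hypothetical drivers; not the paper's dataset or feature set): one regression per probability level yields a predictive distribution of the spot price.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(6)
        n = 2000
        # Hypothetical drivers: demand forecast, renewables forecast,
        # futures-based forecast of the daily average price.
        X = sm.add_constant(np.column_stack([rng.normal(size=n),
                                             rng.normal(size=n),
                                             rng.normal(size=n)]))
        price = 40 + 8 * X[:, 1] - 10 * X[:, 2] + 3 * X[:, 3] \
                + rng.normal(scale=5, size=n)

        # One linear quantile regression per level approximates the full
        # predictive distribution of the hourly spot price.
        for q in (0.05, 0.5, 0.95):
            fit = sm.QuantReg(price, X).fit(q=q)
            print(q, fit.predict(X[:1]))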

  18. Seismic Hazard characterization study using an earthquake source with Probabilistic Seismic Hazard Analysis (PSHA) method in the Northern of Sumatra

    International Nuclear Information System (INIS)

    Yahya, A.; Palupi, M. I. R.; Suharsono

    2016-01-01

    The Sumatra region is one of the earthquake-prone areas in Indonesia because it lies on an active tectonic zone. In 2004, an earthquake with a moment magnitude of 9.2 occurred off the coast, about 160 km west of Nanggroe Aceh Darussalam, triggering a tsunami. These events caused many casualties and heavy material losses, especially in the provinces of Nanggroe Aceh Darussalam and North Sumatra. To minimize the impact of earthquake disasters, a fundamental assessment of the earthquake hazard in the region is needed. The stages of the research include a literature study, the collection and processing of seismic data, seismic source characterization and analysis of the earthquake hazard by probabilistic methods (PSHA), using an earthquake catalogue from 1907 through 2014. The earthquake hazard is represented by the values of Peak Ground Acceleration (PGA) and Spectral Acceleration (SA) at periods of 0.2 and 1 second on bedrock, presented in the form of maps with a return period of 2475 years and as earthquake hazard curves for the cities of Medan and Banda Aceh. (paper)

  19. Convex models and probabilistic approach of nonlinear fatigue failure

    International Nuclear Information System (INIS)

    Qiu Zhiping; Lin Qiang; Wang Xiaojun

    2008-01-01

    This paper is concerned with the nonlinear fatigue failure problem with uncertainties in structural systems. In the present study, in order to solve the nonlinear problem by convex models, the theory of ellipsoidal algebra, together with ideas from interval analysis, is applied. By means of the inclusion-monotonicity property of ellipsoidal functions, the nonlinear fatigue failure problem with uncertainties can be solved. A numerical example of a 25-bar truss structure is given to illustrate the efficiency of the presented method in comparison with the probabilistic approach

  20. Undecidability of model-checking branching-time properties of stateless probabilistic pushdown process

    OpenAIRE

    Lin, T.

    2014-01-01

    In this paper, we settle a problem in probabilistic verification of infinite-state processes (specifically, probabilistic pushdown processes). We show that model checking stateless probabilistic pushdown processes (pBPA) against probabilistic computational tree logic (PCTL) is undecidable.

  1. Probabilistic safety assessment as a standpoint for decision making

    International Nuclear Information System (INIS)

    Cepin, M.

    2001-01-01

    This paper focuses on the role of probabilistic safety assessment in decision-making. The prerequisites for using the results of probabilistic safety assessment and the criteria for decision-making based on probabilistic safety assessment are discussed. The decision-making process is described; it provides an evaluation of the risk impact of the issue under investigation. Selected examples which highlight the described process are discussed. (authors)

  2. A global empirical system for probabilistic seasonal climate prediction

    Science.gov (United States)

    Eden, J. M.; van Oldenborgh, G. J.; Hawkins, E.; Suckling, E. B.

    2015-12-01

    Preparing for episodes with risks of anomalous weather a month to a year ahead is an important challenge for governments, non-governmental organisations, and private companies and is dependent on the availability of reliable forecasts. The majority of operational seasonal forecasts are made using process-based dynamical models, which are complex, computationally challenging and prone to biases. Empirical forecast approaches built on statistical models to represent physical processes offer an alternative to dynamical systems and can provide either a benchmark for comparison or independent supplementary forecasts. Here, we present a simple empirical system based on multiple linear regression for producing probabilistic forecasts of seasonal surface air temperature and precipitation across the globe. The global CO2-equivalent concentration is taken as the primary predictor; subsequent predictors, including large-scale modes of variability in the climate system and local-scale information, are selected on the basis of their physical relationship with the predictand. The focus given to the climate change signal as a source of skill and the probabilistic nature of the forecasts produced constitute a novel approach to global empirical prediction. Hindcasts for the period 1961-2013 are validated against observations using deterministic (correlation of seasonal means) and probabilistic (continuous rank probability skill scores) metrics. Good skill is found in many regions, particularly for surface air temperature and most notably in much of Europe during the spring and summer seasons. For precipitation, skill is generally limited to regions with known El Niño-Southern Oscillation (ENSO) teleconnections. The system is used in a quasi-operational framework to generate empirical seasonal forecasts on a monthly basis.
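
    A minimal sketch of the regression-based probabilistic forecast idea on synthetic data (a CO2-equivalent trend plus one mode of variability; all values are illustrative, not the system's configuration):

        import numpy as np

        rng = np.random.default_rng(7)
        years = np.arange(1961, 2014)
        co2 = 330 + 1.7 * (years - 1961)              # CO2-equivalent proxy
        enso = rng.normal(size=years.size)            # e.g. an ENSO index
        temp = 0.02 * (co2 - co2.mean()) + 0.3 * enso \
               + rng.normal(scale=0.2, size=years.size)  # synthetic predictand

        # Multiple linear regression: trend (CO2-eq) + mode of variability.
        A = np.column_stack([np.ones_like(co2), co2, enso])
        coef, *_ = np.linalg.lstsq(A, temp, rcond=None)
        resid_sd = np.std(temp - A @ coef, ddof=A.shape[1])

        # Probabilistic forecast for a new season: Gaussian around the
        # regression prediction, with spread taken from the residuals.
        x_new = np.array([1.0, co2[-1] + 1.7, 0.8])
        mu = x_new @ coef
        print(f"forecast ~ N({mu:.2f}, {resid_sd:.2f}^2)")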

  3. Uncertainty estimation in nuclear power plant probabilistic safety assessment

    International Nuclear Information System (INIS)

    Guarro, S.B.; Cummings, G.E.

    1989-01-01

    Probabilistic Risk Assessment (PRA) was introduced in the nuclear industry and the nuclear regulatory process in 1975 with the publication of the Reactor Safety Study by the U.S. Nuclear Regulatory Commission. Almost fifteen years later, the state of the art in this field has been expanded and sharpened in many areas, and about thirty-five plant-specific PRAs (Probabilistic Risk Assessments) have been performed by the nuclear utility companies or by the U.S. Nuclear Regulatory Commission. Among the areas where the most evident progress has been made in PRA and PSA (Probabilistic Safety Assessment, as these studies are more commonly referred to in the international community outside the U.S.) is the development of a consistent framework for the identification of sources of uncertainty and the estimation of their magnitude as it impacts various risk measures. Techniques to propagate uncertainty in reliability data through the risk models and display its effect on the top-level risk estimates were developed in the early PRAs. The Seismic Safety Margin Research Program (SSMRP) study was the first major risk study to develop an approach to deal explicitly with uncertainty in risk estimates introduced not only by uncertainty in component reliability data, but by the incomplete state of knowledge of the assessor(s) with regard to basic phenomena that may trigger and drive a severe accident. More recently, NUREG-1150, another major study of reactor risk sponsored by the NRC, has expanded risk uncertainty estimation and analysis into the realm of model uncertainty related to the relatively poorly known post-core-melt phenomena which determine the behavior of the molten core and of the reactor containment structures

  4. A two-stage inexact joint-probabilistic programming method for air quality management under uncertainty.

    Science.gov (United States)

    Lv, Y; Huang, G H; Li, Y P; Yang, Z F; Sun, W

    2011-03-01

    A two-stage inexact joint-probabilistic programming (TIJP) method is developed for planning a regional air quality management system with multiple pollutants and multiple sources. The TIJP method incorporates the techniques of two-stage stochastic programming, joint-probabilistic constraint programming and interval mathematical programming, where uncertainties expressed as probability distributions and interval values can be addressed. Moreover, it can not only examine the risk of violating joint-probability constraints, but also account for economic penalties as corrective measures against any infeasibility. The developed TIJP method is applied to a case study of a regional air pollution control problem, where the air quality index (AQI) is introduced for evaluation of the integrated air quality management system associated with multiple pollutants. The joint-probability exists in the environmental constraints for AQI, such that individual probabilistic constraints for each pollutant can be efficiently incorporated within the TIJP model. The results indicate that useful solutions for air quality management practices have been generated; they can help decision makers to identify desired pollution abatement strategies with minimized system cost and maximized environmental efficiency. Copyright © 2010 Elsevier Ltd. All rights reserved.
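
    A minimal sketch of one ingredient, a joint-probabilistic (chance) constraint reduced to a conservative deterministic equivalent via a Bonferroni risk split (toy numbers; not the TIJP model itself):

        import numpy as np
        from scipy.optimize import linprog
        from scipy.stats import norm

        # Two pollutant-load constraints with Gaussian random capacities
        # b_i ~ N(mu_i, sd_i); both must hold JOINTLY with probability 0.9.
        mu = np.array([100.0, 80.0])
        sd = np.array([10.0, 8.0])
        A = np.array([[2.0, 1.0], [1.0, 3.0]])   # activity -> pollutant load
        c = np.array([5.0, 4.0])                 # value of each activity

        # Bonferroni split: give each row a risk of (1 - 0.9) / 2 so the pair
        # holds jointly with probability >= 0.9; each chance constraint then
        # has the deterministic equivalent A_i x <= quantile(b_i).
        alpha_each = (1.0 - 0.9) / 2.0
        b_det = mu + sd * norm.ppf(alpha_each)   # tightened capacities

        res = linprog(-c, A_ub=A, b_ub=b_det, bounds=[(0.0, None)] * 2)
        print(res.x, -res.fun)                   # optimal activities and value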

  5. Probabilistic and machine learning-based retrieval approaches for biomedical dataset retrieval

    Science.gov (United States)

    Karisani, Payam; Qin, Zhaohui S; Agichtein, Eugene

    2018-01-01

    The bioCADDIE dataset retrieval challenge brought together different approaches to retrieval of biomedical datasets relevant to a user's query, expressed as a text description of a needed dataset. We describe experiments in applying a data-driven, machine learning-based approach to biomedical dataset retrieval as part of this challenge. We report on a series of experiments carried out to evaluate the performance of both probabilistic and machine learning-driven techniques from information retrieval, as applied to this challenge. Our experiments with probabilistic information retrieval methods, such as query term weight optimization, automatic query expansion and simulated user relevance feedback, demonstrate that automatically boosting the weights of important keywords in a verbose query is more effective than other methods. We also show that although there is a rich space of potential representations and features available in this domain, machine learning-based re-ranking models are not able to improve on probabilistic information retrieval techniques with the currently available training data. The models and algorithms presented in this paper can serve as a viable implementation of a search engine to provide access to biomedical datasets. The retrieval performance is expected to be further improved by using additional training data that is created by expert annotation, or gathered through usage logs, clicks and other processes during natural operation of the system. Database URL: https://github.com/emory-irlab/biocaddie
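
    A minimal sketch of the keyword-boosting idea in a TF-IDF retrieval setting (toy corpus and hand-picked boost weights; not the challenge system or its tuned parameters):

        import numpy as np
        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.metrics.pairwise import cosine_similarity

        docs = ["RNA-seq expression dataset for human liver cancer samples",
                "proteomics mass spectrometry dataset of yeast cell cycle",
                "microarray gene expression data from mouse liver tissue"]
        query = "I am looking for gene expression datasets in liver tissue"

        vec = TfidfVectorizer().fit(docs + [query])
        D = vec.transform(docs)
        q = vec.transform([query]).toarray()[0]

        # Boost the weights of a few important query keywords -- the strategy
        # the paper found most effective for verbose dataset queries.
        boost = {"liver": 3.0, "expression": 2.0}
        for term, w in boost.items():
            idx = vec.vocabulary_.get(term)
            if idx is not None:
                q[idx] *= w

        scores = cosine_similarity(D, q.reshape(1, -1)).ravel()
        print(np.argsort(-scores), scores)   # documents ranked by boosted query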

  6. Identification of failure type in corroded pipelines: a bayesian probabilistic approach.

    Science.gov (United States)

    Breton, T; Sanchez-Gheno, J C; Alamilla, J L; Alvarez-Ramirez, J

    2010-07-15

    Spillover of hazardous materials from transport pipelines can lead to catastrophic events with serious and dangerous environmental impacts, potential fire events and human fatalities. The problem is more serious for large pipelines when the construction material is under environmental corrosion conditions, as in the petroleum and gas industries. In this context, predictive models can provide a suitable framework for risk evaluation, maintenance policies and substitution procedure design oriented towards reducing increased hazards. This work proposes a Bayesian probabilistic approach to identify and predict the type of failure (leakage or rupture) for steel pipelines under realistic corroding conditions. In the first step of the modeling process, the mechanical performance of the pipe is considered to establish the conditions under which either leakage or rupture failure can occur. In the second step, experimental burst tests are used to introduce a mean probabilistic boundary defining a region where the type of failure is uncertain. In the boundary vicinity, the failure discrimination is carried out with a probabilistic model in which the events are considered as random variables. In turn, the model parameters are estimated with the available experimental data and contrasted with a real catastrophic event, showing good discrimination capacity. The results are discussed in terms of policies oriented to the inspection and maintenance of large-size pipelines in the oil and gas industry. 2010 Elsevier B.V. All rights reserved.
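
    A minimal sketch of the Bayesian discrimination step (hypothetical distributions and prior, not the paper's burst-test data): given class-conditional distributions of a mechanical index, Bayes' rule yields the probability that an observed failure is a rupture rather than a leak.

        import numpy as np
        from scipy.stats import norm

        # Hypothetical summaries: a scalar mechanical index x separates
        # leakage from rupture only imperfectly near the boundary.
        mu_leak, sd_leak = 0.8, 0.15   # index distribution given leakage
        mu_rupt, sd_rupt = 1.2, 0.20   # index distribution given rupture
        p_leak = 0.7                   # prior from historical failures

        def p_rupture_given_x(x):
            """Bayes' rule on the two failure-type hypotheses."""
            like_l = norm.pdf(x, mu_leak, sd_leak) * p_leak
            like_r = norm.pdf(x, mu_rupt, sd_rupt) * (1 - p_leak)
            return like_r / (like_l + like_r)

        for x in (0.8, 1.0, 1.2):
            print(x, round(p_rupture_given_x(x), 3))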

  7. NRC source term assessment for incident response dose projections

    International Nuclear Information System (INIS)

    Easley, P.; Pasedag, W.

    1984-01-01

    The NRC provides advice and assistance to licensees and State and local authorities in responding to accidents. The TACT code supports this function by providing source term projections for two situations during early (15 to 60 minutes) accident response: (1) Core/containment damage is indicated, but there are no measured releases. Quantification of a predicted release permits emergency response before people are exposed. With TACT, response personnel can estimate releases based on fuel and cladding conditions, coolant boundary and containment integrity, and the operability of mitigative systems. For this type of estimate, TACT is intermediate between default assumptions and time-consuming mechanistic codes. (2) A combination of plant status and limited release data is available. For this situation, iteration between predictions based on known conditions and the measured releases gives reasonable confidence in supplemental source term information otherwise unavailable: nuclide mix, releases not monitored, and trending or abrupt changes. The assumptions and models used in TACT, and examples of its use, are given in this paper

  8. Probabilistic seismic hazard assessment for Point Lepreau Generating Station

    Energy Technology Data Exchange (ETDEWEB)

    Mullin, D. [New Brunswick Power Corp., Point Lepreau Generating Station, Lepreau, New Brunswick (Canada); Lavine, A. [AMEC Foster Wheeler Environment and Infrastructure Americas, Oakland, California (United States); Egan, J. [SAGE Engineers, Oakland, California (United States)

    2015-09-15

    A Probabilistic Seismic Hazard Assessment (PSHA) has been performed for the Point Lepreau Generating Station (PLGS). The objective is to provide characterization of the earthquake ground shaking that will be used to evaluate seismic safety. The assessment is based on the current state of knowledge of the informed scientific and engineering community regarding earthquake hazards in the site region, and includes two primary components: a seismic source model and a ground motion model. This paper provides the methodology and results of the PLGS PSHA. The implications of the updated hazard information for site safety are discussed in a separate paper. (author)

  9. Probabilistic seismic hazard assessment for Point Lepreau Generating Station

    Energy Technology Data Exchange (ETDEWEB)

    Mullin, D., E-mail: dmullin@nbpower.com [New Brunswick Power Corporation, Point Lepreau Generating Station, Point Lepreau, NB (Canada); Lavine, A., E-mail: alexis.lavine@amecfw.com [AMEC Foster Wheeler Environment & Infrastructure Americas, Oakland, CA (United States); Egan, J., E-mail: jegan@sageengineers.com [SAGE Engineers, Oakland, CA (United States)

    2015-07-01

    A Probabilistic Seismic Hazard Assessment (PSHA) has been performed for the Point Lepreau Generating Station (PLGS). The objective is to provide characterization of the earthquake ground shaking that will be used to evaluate seismic safety. The assessment is based on the current state of knowledge of the informed scientific and engineering community regarding earthquake hazards in the site region, and includes two primary components: a seismic source model and a ground motion model. This paper provides the methodology and results of the PLGS PSHA. The implications of the updated hazard information for site safety are discussed in a separate paper. (author)
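
    The combination of a seismic source model with a ground motion model can be illustrated with a single-source hazard integral: the annual rate of exceeding a ground motion level is the event rate times the probability, integrated over magnitude, that an event produces shaking above that level. The sketch below uses a made-up toy attenuation relation and hypothetical source parameters, not the PLGS models.

        import numpy as np
        from scipy.stats import norm

        # Hypothetical single-source inputs; a real PSHA sums over many sources.
        nu = 0.05                  # annual rate of earthquakes with M >= m_min
        b_value = 1.0              # Gutenberg-Richter b-value
        m_min, m_max = 5.0, 7.5
        r_km = 30.0                # site-to-source distance
        sigma_ln = 0.6             # aleatory sigma of ln(PGA)

        mags, dm = np.linspace(m_min, m_max, 400, retstep=True)
        beta = b_value * np.log(10)
        f_m = beta * np.exp(-beta * (mags - m_min)) / (1 - np.exp(-beta * (m_max - m_min)))

        def ln_median_pga(m, r):
            # Toy attenuation relation (assumed): ln PGA[g] = -3.5 + m - 1.2 ln r
            return -3.5 + 1.0 * m - 1.2 * np.log(r)

        def annual_rate_exceeding(a_g):
            p_exc = norm.sf((np.log(a_g) - ln_median_pga(mags, r_km)) / sigma_ln)
            return nu * np.sum(p_exc * f_m) * dm

        for a in (0.1, 0.2, 0.4):
            print(f"PGA > {a:g} g: {annual_rate_exceeding(a):.2e} per year")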

  10. Applying probabilistic methods for assessments and calculations for accident prevention

    International Nuclear Information System (INIS)

    Anon.

    1984-01-01

    The guidelines for the prevention of accidents require plant design-specific and radioecological calculations to be made in order to show that maximum acceptable exposure values will not be exceeded in case of an accident. For this purpose, the main parameters affecting the accident scenario have to be determined by probabilistic methods. This offers the advantage that parameters can be quantified on the basis of unambiguous and realistic criteria, and the degree of conservatism of the final results can be stated. (DG) [de

  11. Stormwater Tank Performance: Design and Management Criteria for Capture Tanks Using a Continuous Simulation and a Semi-Probabilistic Analytical Approach

    Directory of Open Access Journals (Sweden)

    Flavio De Martino

    2013-10-01

    Stormwater tank performance significantly depends on management practices. This paper proposes a procedure to assess tank efficiency in terms of volume and pollutant concentration using four different capture tank management protocols. The comparison of the efficiency results reveals that, as expected, a combined bypass/stormwater tank system achieves better results than a tank alone. The management practices tested for the tank-only systems provide notably different efficiency results. The practice of emptying immediately after the end of the event exhibits significant levels of efficiency and operational advantages. All other configurations exhibit either significant operational problems or very low performance. The continuous simulation and the semi-probabilistic approach are compared for the best tank management practice. The semi-probabilistic approach is based on a Weibull probabilistic model of the main characteristics of the rainfall process, from which efficiency indexes were established. The comparison with continuous simulations shows the reliability of the probabilistic approach, even though the latter is certainly very site-sensitive.
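
    A minimal sketch of the semi-probabilistic idea follows, assuming Weibull-distributed event runoff volumes and the best-performing protocol of emptying immediately after each event; all parameter values are hypothetical.

        import numpy as np

        rng = np.random.default_rng(0)

        # Weibull model of event runoff volume (parameters hypothetical)
        shape, scale = 0.8, 12.0
        volumes = scale * rng.weibull(shape, size=100_000)   # mm over the catchment

        tank_mm = 10.0   # storage as equivalent runoff depth (assumed)
        captured = np.minimum(volumes, tank_mm)   # tank emptied before every event

        print(f"long-run volumetric efficiency ~ {captured.sum() / volumes.sum():.2f}")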

  12. Selected source term topics. Report to CSNI by an OECD/NEA Group of experts

    International Nuclear Information System (INIS)

    1987-04-01

    CSNI Report 136 summarizes the results of the work performed by the Group of Experts on the Source Term and Environmental Consequences (PWG4) during the period extending from 1983 to 1986. This report is complementary to Part 1, 'Technical Status of the Source Term' of CSNI Report 135, 'Report to CSNI on Source Term Assessment, Containment atmosphere control systems, and accident consequences'; it considers in detail a number of very specific issues thought to be important in the source term area. It consists of: an executive summary (prepared by the Chairman of the Group), a section on conclusions and recommendations, and five technical chapters (fission product chemistry in the primary circuit of a LWR during severe accidents; resuspension/re-entrainment of aerosols in LWRs following a meltdown accident; iodine chemistry under severe accident conditions; effects of combustion, steam explosions and pressurized melt ejection on fission product behaviour; radionuclide removal by pool scrubbing), a technical annex and two appendices

  13. Analysis of truncation limit in probabilistic safety assessment

    International Nuclear Information System (INIS)

    Cepin, Marko

    2005-01-01

    A truncation limit defines the boundary between what is considered in the probabilistic safety assessment and what is neglected. The truncation limit in focus here is the minimal cut set contribution below which cut sets are discarded. A new method was developed which defines the truncation limit in probabilistic safety assessment. The method specifies truncation limits more stringently than existing documents dealing with truncation criteria in probabilistic safety assessment do. The results of this paper indicate that the truncation limits for more complex probabilistic safety assessments, which consist of a larger number of basic events, should be more severe than presently recommended in existing documents if more accuracy is desired. The truncation limits defined by the new method reduce the relative errors of importance measures and produce more accurate results for probabilistic safety assessment applications. The reduced relative errors of importance measures can prevent situations in which the acceptability of a change to equipment under investigation according to RG 1.174 would shift from the region where changes can be accepted to the region where they cannot, if the results were recalculated with a smaller truncation limit.
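
    The effect of the cutoff can be reproduced with synthetic cut sets: sum the rare-event approximation with and without truncation and observe the relative error. The sketch below uses randomly generated cut set probabilities, not a real PSA model.

        import numpy as np

        rng = np.random.default_rng(1)
        # Synthetic minimal cut set probabilities spanning several decades
        mcs = 10.0 ** rng.uniform(-12, -4, size=50_000)
        exact = mcs.sum()   # rare-event approximation over all cut sets

        for cutoff in (1e-6, 1e-8, 1e-10, 1e-12):
            kept = mcs[mcs >= cutoff]
            rel_err = (exact - kept.sum()) / exact
            print(f"cutoff {cutoff:.0e}: {kept.size:5d} cut sets kept, "
                  f"relative error {rel_err:.2e}")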

  14. Development of the methodology for application of revised source term to operating nuclear power plants in Korea

    International Nuclear Information System (INIS)

    Kang, M.S.; Kang, P.; Kang, C.S.; Moon, J.H.

    2004-01-01

    Considering the current trend of applying the revised source term proposed by NUREG-1465 to nuclear power plants in the U.S., it is expected that the revised source term will be applied to Korean operating nuclear power plants in the near future, even though the exact time cannot be estimated. To meet future technical demands, it is necessary to prepare the technical infrastructure, including the related regulatory requirements, in advance. In this research, therefore, the aim is to develop a methodology for applying the revised source term to operating nuclear power plants in Korea. Several principles were established for developing the application methodologies. First, it is not necessary to modify the existing regulations about the source term (i.e., no back-fitting of operating nuclear plants is required). Second, if a pertinent margin of safety is guaranteed, the revised source term suggested by NUREG-1465 may be fully applied. Finally, a part of the revised source term could be selected for application based on technical feasibility. As the results of this research, several methodologies for applying the revised source term to Korean operating nuclear power plants have been developed, which include: 1) selective (or limited) application, using only some of the characteristics of the revised source term, such as the release timing of fission products and the chemical form of radio-iodine; and 2) full application, using all the characteristics of the revised source term. The developed methodologies were applied to Ulchin units 3 and 4 and their application feasibility was reviewed. The results of this research can be used either as a manual for establishing the plan and procedure for applying the revised source term to domestic nuclear plants, from the utility's viewpoint, or as a technical basis for revising the related regulations, from the regulatory body's viewpoint. The application of revised source term to operating nuclear

  15. Probabilistic vs linear blending approaches to shared control for wheelchair driving.

    Science.gov (United States)

    Ezeh, Chinemelu; Trautman, Pete; Devigne, Louise; Bureau, Valentin; Babel, Marie; Carlson, Tom

    2017-07-01

    Some people with severe mobility impairments are unable to operate powered wheelchairs reliably and effectively, using commercially available interfaces. This has sparked a body of research into "smart wheelchairs", which assist users to drive safely and create opportunities for them to use alternative interfaces. Various "shared control" techniques have been proposed to provide an appropriate level of assistance that is satisfactory and acceptable to the user. Most shared control techniques employ a traditional strategy called linear blending (LB), where the user's commands and wheelchair's autonomous commands are combined in some proportion. In this paper, however, we implement a more generalised form of shared control called probabilistic shared control (PSC). This probabilistic formulation improves the accuracy of modelling the interaction between the user and the wheelchair by taking into account uncertainty in the interaction. In this paper, we demonstrate the practical success of PSC over LB in terms of safety, particularly for novice users.
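
    The difference between the two formulations can be sketched for a single command: LB mixes commands with a fixed gain, whereas a PSC-style fusion weights each command by its (assumed Gaussian) uncertainty. This is a schematic reading of the two approaches, not the authors' implementation.

        import numpy as np

        def linear_blend(u_user, u_auto, k=0.5):
            # LB: fixed-proportion mix of user and autonomous commands
            return k * u_user + (1 - k) * u_auto

        def probabilistic_blend(u_user, var_user, u_auto, var_auto):
            # PSC-style fusion sketch: treat both commands as Gaussian estimates,
            # so the noisier source automatically receives less weight.
            w_u, w_a = 1.0 / var_user, 1.0 / var_auto
            return (w_u * u_user + w_a * u_auto) / (w_u + w_a)

        u_user = np.array([0.9, 0.1])    # hypothetical joystick command (v, omega)
        u_auto = np.array([0.5, -0.2])   # hypothetical autonomous command
        print(linear_blend(u_user, u_auto))
        print(probabilistic_blend(u_user, 0.4, u_auto, 0.05))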

  16. Limited probabilistic risk assessment applications in plant backfitting

    International Nuclear Information System (INIS)

    Desaedeleer, G.

    1987-01-01

    Plant backfitting programs are defined on the basis of deterministic (e.g. Systematic Evaluation Program) or probabilistic (e.g. Probabilistic Risk Assessment) approaches. Each approach provides valuable assets in defining the program and has its own advantages and disadvantages. Ideally one should combine the strong points of each approach. This chapter summarizes actual experience gained from combinations of deterministic and probabilistic approaches to define and implement PWR backfitting programs. Such combinations involve limited applications of probabilistic techniques and are illustrated for upgrading fluid systems. These evaluations allow sound and rational optimization of system upgrades. However, the boundaries of the reliability analysis need to be clearly defined, and system reliability may have to go beyond classical boundaries (e.g. identification of weak links in support systems). Also, implementing upgrades on a system-by-system basis is not necessarily cost-effective. (author)

  17. Probabilistic analysis of a materially nonlinear structure

    Science.gov (United States)

    Millwater, H. R.; Wu, Y.-T.; Fossum, A. F.

    1990-01-01

    A probabilistic finite element program is used to perform probabilistic analysis of a materially nonlinear structure. The program used in this study is NESSUS (Numerical Evaluation of Stochastic Structure Under Stress), under development at Southwest Research Institute. The cumulative distribution function (CDF) of the radial stress of a thick-walled cylinder under internal pressure is computed and compared with the analytical solution. In addition, sensitivity factors showing the relative importance of the input random variables are calculated. Significant plasticity is present in this problem and has a pronounced effect on the probabilistic results. The random input variables are the material yield stress and internal pressure with Weibull and normal distributions, respectively. The results verify the ability of NESSUS to compute the CDF and sensitivity factors of a materially nonlinear structure. In addition, the ability of the Advanced Mean Value (AMV) procedure to assess the probabilistic behavior of structures which exhibit a highly nonlinear response is shown. Thus, the AMV procedure can be applied with confidence to other structures which exhibit nonlinear behavior.
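
    A flavor of such an analysis can be given with plain Monte Carlo on a closed-form stand-in: the elastic Lamé solution for a thick-walled cylinder, with normally distributed pressure and Weibull-distributed yield stress as in the paper's setup. The geometry and distribution parameters are hypothetical, and the sketch omits the plasticity that NESSUS and the AMV procedure actually handle.

        import numpy as np

        rng = np.random.default_rng(2)
        a, b = 0.1, 0.2   # inner/outer radii, m (assumed geometry)

        # Random inputs as in the paper's setup (all numbers hypothetical):
        p = rng.normal(150e6, 15e6, 100_000)            # internal pressure, Pa
        sigma_y = 420e6 * rng.weibull(12.0, 100_000)    # yield stress, Pa

        # Elastic Lamé solution as a simplified stand-in response; the NESSUS
        # study models the plastic response, which this sketch omits.
        r = 0.15
        sigma_r = p * a**2 / (b**2 - a**2) * (1 - b**2 / r**2)   # radial stress at r
        tresca = 2 * p * b**2 / (b**2 - a**2)   # stress intensity at the inner wall

        print(f"median radial stress at r = {r} m: {np.median(sigma_r) / 1e6:.1f} MPa")
        print(f"P(onset of yield) ~ {(tresca > sigma_y).mean():.3f}")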

  18. Probabilistic dual heuristic programming-based adaptive critic

    Science.gov (United States)

    Herzallah, Randa

    2010-02-01

    Adaptive critic (AC) methods have common roots as generalisations of dynamic programming for neural reinforcement learning approaches. Since they approximate the dynamic programming solutions, they are potentially suitable for learning in noisy, non-linear and non-stationary environments. In this study, a novel probabilistic dual heuristic programming (DHP)-based AC controller is proposed. Distinct from current approaches, the proposed probabilistic DHP AC method takes uncertainties of the forward model and inverse controller into consideration. Therefore, it is suitable for deterministic and stochastic control problems characterised by functional uncertainty. Theoretical development of the proposed method is validated by analytically evaluating the correct value of the cost function which satisfies the Bellman equation in a linear quadratic control problem. The target value of the probabilistic critic network is then calculated and shown to be equal to the analytically derived correct value. Full derivation of the Riccati solution for this non-standard stochastic linear quadratic control problem is also provided. Moreover, the performance of the proposed probabilistic controller is demonstrated on linear and non-linear control examples.

  19. Probabilistic Load Flow

    DEFF Research Database (Denmark)

    Chen, Peiyuan; Chen, Zhe; Bak-Jensen, Birgitte

    2008-01-01

    This paper reviews the development of the probabilistic load flow (PLF) techniques. Applications of the PLF techniques in different areas of power system steady-state analysis are also discussed. The purpose of the review is to identify different available PLF techniques and their corresponding...

  20. Probabilistic inversion in priority setting of emerging zoonoses.

    NARCIS (Netherlands)

    Kurowicka, D.; Bucura, C.; Cooke, R.; Havelaar, A.H.

    2010-01-01

    This article presents a methodology for applying probabilistic inversion in combination with expert judgment to a priority setting problem. Experts rank scenarios according to severity. A linear multi-criteria analysis model underlying the expert preferences is posited. Using probabilistic inversion, a

  1. Regulatory Technology Development Plan - Sodium Fast Reactor: Mechanistic Source Term – Trial Calculation

    Energy Technology Data Exchange (ETDEWEB)

    Grabaskas, David [Argonne National Lab. (ANL), Argonne, IL (United States). Nuclear Engineering Division; Bucknor, Matthew [Argonne National Lab. (ANL), Argonne, IL (United States). Nuclear Engineering Division; Jerden, James [Argonne National Lab. (ANL), Argonne, IL (United States). Nuclear Engineering Division; Brunett, Acacia J. [Argonne National Lab. (ANL), Argonne, IL (United States). Nuclear Engineering Division; Denman, Matthew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Nuclear Engineering Division; Clark, Andrew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Nuclear Engineering Division; Denning, Richard S. [Consultant, Columbus, OH (United States)

    2016-10-01

    The potential release of radioactive material during a plant incident, referred to as the source term, is a vital design metric and will be a major focus of advanced reactor licensing. The U.S. Nuclear Regulatory Commission has stated an expectation for advanced reactor vendors to present a mechanistic assessment of the potential source term in their license applications. The mechanistic source term presents an opportunity for vendors to realistically assess the radiological consequences of an incident, and may allow reduced emergency planning zones and smaller plant sites. However, the development of a mechanistic source term for advanced reactors is not without challenges, as there are often numerous phenomena impacting the transportation and retention of radionuclides. This project sought to evaluate U.S. capabilities regarding the mechanistic assessment of radionuclide release from core damage incidents at metal fueled, pool-type sodium fast reactors (SFRs). The purpose of the analysis was to identify, and prioritize, any gaps regarding computational tools or data necessary for the modeling of radionuclide transport and retention phenomena. To accomplish this task, a parallel-path analysis approach was utilized. One path, led by Argonne and Sandia National Laboratories, sought to perform a mechanistic source term assessment using available codes, data, and models, with the goal to identify gaps in the current knowledge base. The second path, performed by an independent contractor, performed sensitivity analyses to determine the importance of particular radionuclides and transport phenomena with regard to offsite consequences. The results of the two pathways were combined to prioritize gaps in current capabilities.

  2. Source-term model for the SYVAC3-NSURE performance assessment code

    International Nuclear Information System (INIS)

    Rowat, J.H.; Rattan, D.S.; Dolinar, G.M.

    1996-11-01

    Radionuclide contaminants in wastes emplaced in disposal facilities will not remain in those facilities indefinitely. Engineered barriers will eventually degrade, allowing radioactivity to escape from the vault. The radionuclide release rate from a low-level radioactive waste (LLRW) disposal facility, the source term, is a key component in the performance assessment of the disposal system. This report describes the source-term model that has been implemented in Ver. 1.03 of the SYVAC3-NSURE (Systems Variability Analysis Code generation 3-Near Surface Repository) code. NSURE is a performance assessment code that evaluates the impact of near-surface disposal of LLRW through the groundwater pathway. The source-term model described here was developed for the Intrusion Resistant Underground Structure (IRUS) disposal facility, which is a vault that is to be located in the unsaturated overburden at AECL's Chalk River Laboratories. The processes included in the vault model are roof and waste package performance, and diffusion, advection and sorption of radionuclides in the vault backfill. The model presented here was developed for the IRUS vault; however, it is applicable to other near-surface disposal facilities. (author). 40 refs., 6 figs
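
    The flavor of a vault source-term model can be conveyed with a single-nuclide hand calculation: after waste-package failure, the nuclide advects through the sorbing backfill at a retarded velocity while decaying. The sketch below is a simplified stand-in with hypothetical parameters, not the NSURE model.

        import numpy as np

        # Single-nuclide stand-in with hypothetical parameters (not NSURE itself)
        lam = np.log(2) / 30.2     # Cs-137 decay constant, 1/yr
        darcy = 0.05               # infiltration through the vault, m/yr (assumed)
        porosity, thickness = 0.3, 2.0
        retardation = 50.0         # sorption retardation factor (assumed)
        t_fail = 100.0             # waste-package failure time, yr (assumed)

        v = darcy / porosity / retardation    # retarded pore-water velocity, m/yr
        t_exit = t_fail + thickness / v       # first arrival at the vault boundary

        print(f"first arrival ~ {t_exit:.0f} yr; "
              f"surviving activity fraction ~ {np.exp(-lam * t_exit):.1e}")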

  3. Probabilistic distributions of pin gaps within a wire-spaced fuel subassembly and sensitivities of the related uncertainties to pin gap

    International Nuclear Information System (INIS)

    Sakai, K.; Hishida, H.

    1978-01-01

    Probabilistic fuel pin gap distributions within a wire-spaced fuel subassembly and sensitivities of the related uncertainties to fuel pin gaps are discussed. The analyses consist mainly of expressing a local fuel pin gap in terms of sensitivity functions of the related uncertainties and calculating the corresponding probabilistic distribution through taking all the possible combinations of the distribution of uncertainties. The results of illustrative calculations show that with the reliability level of 0.9987, the maximum deviation of the pin gap at the cladding hot spot of a center fuel subassembly is 8.05% from its nominal value and the corresponding probabilistic pin gap distribution is shifted to the narrower side due to the external confinement of a pin bundle with a wrapper tube. (Auth.)

  4. Considerations about source term now used aiming to emergency planning

    International Nuclear Information System (INIS)

    Austregesilo Filho, H.

    1987-01-01

    The applicability of source terms in parametric studies for improving the external emergency plan for the Angra-I reactor is presented. The source term is defined as the quantity of radioactive material available for release to the environment in case of a severe accident in a nuclear power plant. The following hypotheses are adopted: should an accident occur, 100% of the noble gases, 50% of the halogens and 1% of the solid fission products contained in the reactor core are released immediately into the containment building; radioactivity is released to the environment at a constant rate of 0.1% in mass per day; the actuation of release-mitigating systems, such as the containment spray or the filtered air recirculation system, is not credited; and the release occurs at ground level. (M.C.K.) [pt
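
    Under the stated assumptions, the environmental release follows directly from the containment inventory and the 0.1% per day leak rate. The sketch below integrates a first-order leakage approximation over an assumed release duration, with hypothetical core inventories and with radioactive decay neglected.

        import numpy as np

        # (hypothetical core inventory in Bq, release fraction per the assumptions)
        core = {"noble gases": (1.0e18, 1.00),
                "halogens":    (3.0e18, 0.50),
                "solid f.p.":  (1.0e19, 0.01)}

        leak_rate = 1.0e-3   # containment leakage: 0.1% of the inventory per day
        days = 2.0           # integration period (assumed); decay is neglected

        for group, (inventory, fraction) in core.items():
            in_containment = inventory * fraction
            released = in_containment * (1 - np.exp(-leak_rate * days))  # first-order leak
            print(f"{group}: release over {days:g} d ~ {released:.2e} Bq")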

  5. A high-resolution probabilistic in vivo atlas of human subcortical brain nuclei.

    Science.gov (United States)

    Pauli, Wolfgang M; Nili, Amanda N; Tyszka, J Michael

    2018-04-17

    Recent advances in magnetic resonance imaging methods, including data acquisition, pre-processing and analysis, have benefited research on the contributions of subcortical brain nuclei to human cognition and behavior. At the same time, these developments have led to an increasing need for a high-resolution probabilistic in vivo anatomical atlas of subcortical nuclei. In order to address this need, we constructed high spatial resolution, three-dimensional templates, using high-accuracy diffeomorphic registration of T1- and T2-weighted structural images from 168 typical adults between 22 and 35 years old. In these templates, many tissue boundaries are clearly visible, which would otherwise be impossible to delineate in data from individual studies. The resulting delineations of subcortical nuclei complement current histology-based atlases. We further created a companion library of software tools for atlas development, to offer an open and evolving resource for the creation of a crowd-sourced in vivo probabilistic anatomical atlas of the human brain.

  6. A note on variational multiscale methods for high-contrast heterogeneous porous media flows with rough source terms

    KAUST Repository

    Calo, Victor M.

    2011-09-01

    In this short note, we discuss variational multiscale methods for solving porous media flows in high-contrast heterogeneous media with rough source terms. Our objective is to separate, as much as possible, subgrid effects induced by the media properties from those due to heterogeneous source terms. For this reason, enriched coarse spaces designed for high-contrast multiscale problems are used to represent the effects of heterogeneities of the media. Furthermore, rough source terms are captured via auxiliary correction equations that appear in the formulation of variational multiscale methods [23]. These auxiliary equations are localized and one can use additive or multiplicative constructions for the subgrid corrections as discussed in the current paper. Our preliminary numerical results show that one can capture the effects due to both spatial heterogeneities in the coefficients (such as permeability field) and source terms (e.g., due to singular well terms) in one iteration. We test the cases for both smooth source terms and rough source terms and show that with the multiplicative correction, the numerical approximations are more accurate compared to the additive correction.

  7. The NUREG-1150 probabilistic risk assessment for the Sequoyah nuclear plant

    International Nuclear Information System (INIS)

    Gregory, J.J.; Breeding, R.J.; Higgins, S.J.; Shiver, A.W.; Helton, J.C.; Murfin, W.B.

    1992-01-01

    This paper summarizes the findings of the probabilistic risk assessment (PRA) for Unit 1 of the Sequoyah Nuclear Plant performed in support of NUREG-1150. The emphasis is on the 'back-end' analyses, the accident progression, source term, and consequence analyses, and the risk results obtained when the results of these analyses are combined with the accident frequency analysis. The results of this PRA indicate that the offsite risk from internal initiating events at Sequoyah is quite low with respect to the safety goals. The containment appears likely to withstand the loads that might be placed upon it if the reactor vessel fails. A good portion of the risk, in this analysis, comes from initiating events which bypass the containment. These events are estimated to have a relatively low frequency of occurrence, but their consequences are quite large. Other events that contribute to offsite risk involve early containment failures that occur during degradation of the core or near the time of vessel breach. Considerable uncertainty is associated with the risk estimates produced in this analysis. Offsite risk from external initiating events was not included in this analysis. (orig.)

  8. Probabilistic cloning of three symmetric states

    International Nuclear Information System (INIS)

    Jimenez, O.; Bergou, J.; Delgado, A.

    2010-01-01

    We study the probabilistic cloning of three symmetric states. These states are defined by a single complex quantity, the inner product among them. We show that three different probabilistic cloning machines are necessary to optimally clone all possible families of three symmetric states. We also show that the optimal cloning probability of generating M copies out of one original can be cast as the quotient between the success probability of unambiguously discriminating one and M copies of symmetric states.
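
    Taking the abstract's quotient statement at face value, and assuming the N-copy unambiguous-discrimination probability takes the two-state form 1 - |s|^N (used here purely to make the quotient concrete), the cloning probability can be tabulated as below.

        def p_ud(s: float, copies: int) -> float:
            # Assumed form: unambiguous-discrimination success probability
            # 1 - |s|**N for N copies (the two-state equiprobable result,
            # used here only for illustration).
            return 1.0 - abs(s) ** copies

        def p_clone(s: float, m: int) -> float:
            # The abstract's statement: the optimal cloning probability equals the
            # quotient of the one-copy and M-copy discrimination probabilities.
            return p_ud(s, 1) / p_ud(s, m)

        for s in (0.2, 0.5, 0.8):
            print(f"inner product {s}: P(1 -> 3 copies) = {p_clone(s, 3):.3f}")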

  9. Probabilistic Design of Wave Energy Devices

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Kofoed, Jens Peter; Ferreira, C.B.

    2011-01-01

    Wave energy has a large potential for contributing significantly to production of renewable energy. However, the wave energy sector is still not able to deliver cost-competitive and reliable solutions. But the sector has already demonstrated several proofs of concept. The design of wave energy... devices is a new and expanding technical area with no tradition for probabilistic design; in fact, very few full-scale devices have been built to date, so it can be said that no design tradition really exists in this area. For this reason it is considered to be of great importance to develop... and advocate for a probabilistic design approach, as it is assumed (in other areas this has been demonstrated) that this leads to more economical designs compared to designs based on deterministic methods. In the present paper a general framework for probabilistic design and reliability analysis of wave energy...

  10. Reliability data update using condition monitoring and prognostics in probabilistic safety assessment

    Directory of Open Access Journals (Sweden)

    Hyeonmin Kim

    2015-03-01

    Probabilistic safety assessment (PSA) has had a significant role in quantitative decision-making by finding design and operational vulnerabilities and evaluating cost-benefit in improving such weak points. In particular, it has been widely used as the core methodology for risk-informed applications (RIAs). Even though the nature of PSA seeks realistic results, there are still “conservative” aspects. One of the sources for the conservatism is the assumptions of safety analysis and the estimation of failure frequency. Surveillance, diagnosis, and prognosis (SDP), utilizing massive databases and information technology, is worth highlighting in terms of its capability for alleviating the conservatism in conventional PSA. This article provides enabling techniques to solidify a method to provide time- and condition-dependent risks by integrating a conventional PSA model with condition monitoring and prognostics techniques. We will discuss how to integrate the results with frequency of initiating events (IEs) and probability of basic events (BEs). Two illustrative examples will be introduced: (1) how the failure probability of a passive system can be evaluated under different plant conditions and (2) how the IE frequency for a steam generator tube rupture (SGTR) can be updated in terms of operating time. We expect that the proposed model can take the role of an annunciator to show the variation of core damage frequency (CDF) depending on operational conditions.

  11. Reliability data update using condition monitoring and prognostics in probabilistic safety assessment

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyeon Min; Lee, Sang Hwan; Park, Jun Seok; Kim, Hyung Dae; Chang, Yoon Suk; Heo, Gyun Young [Dept. of Nuclear Engineering, Kyung Hee University, Yongin (Korea, Republic of)

    2015-03-15

    Probabilistic safety assessment (PSA) has had a significant role in quantitative decision making by finding design and operational vulnerabilities and evaluating cost-benefit in improving such weak points. In particular, it has been widely used as the core methodology for risk-informed applications (RIAs). Even though the nature of PSA seeks realistic results, there are still 'conservative' aspects. One of the sources for the conservatism is the assumptions of safety analysis and the estimation of failure frequency. Surveillance, diagnosis, and prognosis (SDP), utilizing massive databases and information technology, is worth highlighting in terms of its capability for alleviating the conservatism in conventional PSA. This article provides enabling techniques to solidify a method to provide time and condition-dependent risks by integrating a conventional PSA model with condition monitoring and prognostics techniques. We will discuss how to integrate the results with frequency of initiating events (IEs) and probability of basic events (BEs). Two illustrative examples will be introduced: (1) how the failure probability of a passive system can be evaluated under different plant conditions and (2) how the IE frequency for a steam generator tube rupture (SGTR) can be updated in terms of operating time. We expect that the proposed model can take a role of annunciator to show the variation of core damage frequency (CDF) depending on operational conditions.
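
    The frequency update described in example (2) is commonly done with a conjugate gamma-Poisson model, sketched below with hypothetical prior parameters and evidence; the paper's actual integration with condition monitoring is richer than this.

        # Conjugate gamma-Poisson update of an initiating-event frequency
        # (a standard PSA technique; prior and evidence values are hypothetical).
        alpha0, beta0 = 0.5, 100.0    # gamma prior: pseudo-events / pseudo reactor-years
        events, years = 0, 25         # monitored evidence: no SGTR in 25 reactor-years

        alpha, beta = alpha0 + events, beta0 + years
        print(f"prior mean {alpha0 / beta0:.2e} /yr -> posterior mean {alpha / beta:.2e} /yr")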

  12. Some thoughts on the future of probabilistic structural design of nuclear components

    International Nuclear Information System (INIS)

    Stancampiano, P.A.

    1978-01-01

    This paper presents some views on the future role of probabilistic methods in the structural design of nuclear components. The existing deterministic design approach is discussed and compared to the probabilistic approach. Some of the objections to both deterministic and probabilistic design are listed. Extensive research and development activities are required to mature the probabilistic approach sufficiently to make it cost-effective and competitive with current deterministic design practices. The required research activities deal with probabilistic methods development, more realistic causal failure mode model development, and statistical data model development. A quasi-probabilistic structural design approach is recommended which accounts for the random error in the design models. (Auth.)

  13. EVALUATION OF MILITARY ACTIVITY IMPACT ON HUMANS THROUGH A PROBABILISTIC ECOLOGICAL RISK ASSESSMENT. EXAMPLE OF A FORMER MISSILE BASE.

    Directory of Open Access Journals (Sweden)

    Sergiy OREL

    2015-10-01

    This article provides a methodology for assessing environmental factors after the termination of military activity, using a former missile base as an example. The assessment of environmental conditions is performed by evaluating the risks posed to human health by the hazardous chemicals in underground and surface water sources and in soil. Moreover, by conducting both deterministic and probabilistic risk assessments, the article shows that the probabilistic assessment provides more accurate and better-qualified information for decision-making on the use of environmental protection measures, which often saves the financial and material resources needed for their implementation.

  14. Probabilistic seismic hazard assessment. Gentilly 2

    International Nuclear Information System (INIS)

    1996-03-01

    Results of this probabilistic seismic hazard assessment were determined using a suite of conservative assumptions. The intent of this study was to perform a limited hazard assessment that incorporated a range of technically defensible input parameters. To best achieve this goal, the inputs selected for the hazard assessment tended to be conservative with respect to the selection of attenuation models and seismicity parameters. Seismic hazard estimates at Gentilly 2 were most affected by the selection of the attenuation model. Alternative definitions of seismic source zones had a relatively small impact on seismic hazard. A St. Lawrence Rift model including a maximum magnitude of m_b 7.2 in the zone containing the site had little effect on the hazard estimate relative to other seismic source zonation models. Mean annual probabilities of exceeding the design peak ground acceleration and the design response spectrum for the Gentilly 2 site were computed to lie in the range of 0.001 to 0.0001. This hazard result falls well within the range determined to be acceptable for nuclear reactor sites located throughout the eastern United States. (author) 34 refs., 6 tabs., 28 figs

  15. Cooperation in an evolutionary prisoner’s dilemma game with probabilistic strategies

    International Nuclear Information System (INIS)

    Li Haihong; Dai Qionglin; Cheng Hongyan; Yang Junzhong

    2012-01-01

    Highlights: ► Probabilistic strategies are introduced in place of pure C/D in the PDG. ► The strategy patterns depend on interaction structures and updating rules. ► There exists an optimal increment of the probabilistic strategy. - Abstract: In this work, we investigate an evolutionary prisoner's dilemma game in structured populations with probabilistic strategies instead of the pure strategies of cooperation and defection. We explore the model in detail by considering different strategy update rules and different population structures. We find that the distribution of probabilistic strategy patterns depends on both the interaction structures and the updating rules. We also find that, when an individual updates her strategy by increasing or decreasing her probabilistic strategy by a certain amount towards that of her opponent, there exists an optimal increment of the probabilistic strategy at which the cooperator frequency reaches its maximum.
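
    A minimal version of such a model can be simulated directly: players on a ring carry a cooperation probability and shift it by a bounded increment towards a better-performing neighbour. The payoff matrix, update rule and parameter values below are assumptions for illustration, not the paper's exact setup.

        import numpy as np

        rng = np.random.default_rng(7)
        N, b, delta, steps = 200, 1.2, 0.1, 20_000   # players, temptation, increment (assumed)
        p = rng.random(N)                            # cooperation probabilities on a ring

        def expected_payoff(i):
            # Weak PD payoffs R=1, T=b, S=P=0 (an assumed, common choice): against
            # a neighbour cooperating with probability q, player i expects
            # q * (p[i] + (1 - p[i]) * b).
            left, right = p[(i - 1) % N], p[(i + 1) % N]
            return sum(q * (p[i] + (1 - p[i]) * b) for q in (left, right))

        for _ in range(steps):
            i = int(rng.integers(N))
            j = (i + rng.choice((-1, 1))) % N        # compare with a random neighbour
            if expected_payoff(j) > expected_payoff(i):
                p[i] += np.clip(p[j] - p[i], -delta, delta)   # bounded step towards opponent

        print(f"mean cooperation probability after updating: {p.mean():.2f}")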

  16. Probabilistic liver atlas construction.

    Science.gov (United States)

    Dura, Esther; Domingo, Juan; Ayala, Guillermo; Marti-Bonmati, Luis; Goceri, E

    2017-01-13

    Anatomical atlases are 3D volumes or shapes representing an organ or structure of the human body. They contain either the prototypical shape of the object of interest together with other shapes representing its statistical variations (statistical atlas) or a probability map of belonging to the object (probabilistic atlas). Probabilistic atlases are mostly built with simple estimations involving only the data at each spatial location. A new method for probabilistic atlas construction that uses a generalized linear model is proposed. This method aims to improve the estimation of the probability of being covered by the liver. Furthermore, all methods to build an atlas involve prior coregistration of the available sample of shapes. The influence of the geometrical transformation adopted for registration on the quality of the final atlas has not been sufficiently investigated. The ability of an atlas to adapt to a new case is one of the most important quality criteria that should be taken into account. The presented experiments show that some methods for atlas construction are severely affected by the preceding coregistration step. We show the good performance of the new approach. Furthermore, the results suggest that extremely flexible registration methods are not always beneficial, since they can reduce the variability of the atlas and hence its ability to give sensible probability values when used as an aid in the segmentation of new cases.
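
    The baseline "simple estimation" the paper improves upon is just the voxelwise relative frequency of coverage across coregistered segmentations. The sketch below builds such a frequency map from synthetic stand-in masks; the paper's contribution replaces this estimator with a generalized linear model.

        import numpy as np

        rng = np.random.default_rng(3)

        # Synthetic stand-ins for coregistered binary liver masks: 20 noisy
        # spheres on a 32^3 grid (real atlases use expert segmentations).
        axes = np.linspace(-1, 1, 32)
        grid = np.stack(np.meshgrid(axes, axes, axes, indexing="ij"))
        masks = []
        for _ in range(20):
            centre = rng.normal(0.0, 0.05, size=(3, 1, 1, 1))
            radius = 0.6 + rng.normal(0.0, 0.03)
            masks.append(np.linalg.norm(grid - centre, axis=0) < radius)

        # Voxelwise relative frequency of coverage; the paper replaces this
        # simple estimator with a generalized linear model.
        prob_atlas = np.mean(masks, axis=0)
        print(f"voxels always covered: {(prob_atlas == 1).mean():.3f}")
        print(f"voxels ever covered:   {(prob_atlas > 0).mean():.3f}")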

  17. Probabilistic evaluation of scenarios in long-term safety analyses. Results of the project ISIBEL; Probabilistische Bewertung von Szenarien in Langzeitsicherheitsanalysen. Ergebnisse des Vorhabens ISIBEL

    Energy Technology Data Exchange (ETDEWEB)

    Buhmann, Dieter; Becker, Dirk-Alexander; Laggiard, Eduardo; Ruebel, Andre; Spiessl, Sabine; Wolf, Jens

    2016-07-15

    In the frame of the project ISIBEL, deterministic analyses of the radiological consequences of several possible developments of the final repository were performed (VSG: preliminary safety analysis of the Gorleben site). The report describes the probabilistic evaluation of the VSG scenarios using uncertainty and sensitivity analyses. It was shown that probabilistic analyses are important for evaluating the influence of uncertainties. The translation of the selected scenarios into computational cases and the modeling parameters used are discussed.

  18. Application of probabilistic seismic hazard models with special calculation for the waste storage sites in Egypt

    International Nuclear Information System (INIS)

    Othman, A.A.; El-Hemamy, S.T.

    2000-01-01

    Probabilistic strong motion maps of Egypt are derived by applying Gumbel models and the likelihood method to 8 earthquake source zones in Egypt and adjacent regions. Peak horizontal acceleration is mapped. Seismic data are collected from the Helwan Catalog (1900-1997), the regional catalog of earthquakes from the International Seismological Center (ISC, 1910-1993) and earthquake data reports of the US Department of International Geological Survey (USCGS, 1900-1994). Isoseismal maps are also available for some events which occurred in Egypt. Some earthquake source zones are well defined on the basis of both tectonics and average seismicity rates, but a lack of understanding of the near-field effects of large earthquakes prohibits accurate estimates of ground motion in their vicinity. Some source zones have no large-scale crustal features or zones of weakness that can explain the seismicity and must, therefore, be defined simply as concentrations of seismic activity with no geological or geophysical controls on the boundaries. Other source zones lack information on low-magnitude seismicity that would be representative of longer periods of time. Comparisons of the new probabilistic ground motion estimates in Egypt with equivalent estimates made in 1990 have been carried out. The new ground motion estimates are used to produce a new peak ground acceleration map to replace the 1990 peak acceleration zoning maps in the building code of Egypt. (author)
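
    A Gumbel-based exceedance estimate can be sketched by fitting annual maxima of peak acceleration and converting the annual exceedance probability into a design-life probability. The data and threshold below are synthetic placeholders for the catalogue data.

        import numpy as np
        from scipy.stats import gumbel_r

        rng = np.random.default_rng(4)
        # Synthetic annual-maximum PGA values (g) standing in for catalogue data
        annual_max = gumbel_r.rvs(loc=0.03, scale=0.02, size=90, random_state=rng)

        loc, scale = gumbel_r.fit(annual_max)
        a = 0.15
        p_annual = gumbel_r.sf(a, loc, scale)
        print(f"P(annual max PGA > {a} g) = {p_annual:.2e}; "
              f"50-yr exceedance = {1 - (1 - p_annual) ** 50:.1%}")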

  19. Probabilistic safety assessment goals in Canada

    International Nuclear Information System (INIS)

    Snell, V.G.

    1986-01-01

    CANDU safety philosophy, both in design and in licensing, has always had a strong bias towards quantitative, probabilistically based goals derived from comparative safety. Formal probabilistic safety assessment began in Canada as a design tool. The influence of this later carried over into the definition of the deterministic safety guidelines used in CANDU licensing. Design goals were further developed which extended the consequence/frequency spectrum of 'acceptable' events from the two points defined by the deterministic single/dual failure analysis to a line passing through lower and higher frequencies. Since these were design tools, a complete risk summation was not necessary, allowing a cutoff at low event frequencies while preserving the identification of the most significant safety-related events. These goals gave a logical framework for making decisions on implementing design changes proposed as a result of the Probabilistic Safety Analysis. Performing this analysis became a regulatory requirement, and the design goals remained the framework under which it was submitted. Recently, there have been initiatives to incorporate more detailed probabilistic safety goals into the regulatory process in Canada. These range from far-reaching safety optimization across society to initiatives aimed at the nuclear industry only. The effectiveness of the latter is minor at very low and very high event frequencies; at medium frequencies, a justification against expenditures per life saved in other industries should be part of the goal setting.

  20. Probabilistic Design of Coastal Flood Defences in Vietnam

    NARCIS (Netherlands)

    Mai Van, C.

    2010-01-01

    This study further develops the method of probabilistic design and addresses a knowledge gap in its application regarding safety and reliability, risk assessment and risk evaluation in the field of flood defences. The thesis discusses: - a generic probabilistic design framework for assessing flood

  1. Survey and evaluation of inherent safety characteristics and passive safety systems for use in probabilistic safety analyses

    International Nuclear Information System (INIS)

    Wetzel, N.; Scharfe, A.

    1998-01-01

    The present report examines the possibilities and limits of probabilistic safety analysis for evaluating passive safety systems and inherent safety characteristics. The inherent safety characteristics are based on physical principles that, together with the safety systems, lead to no damage. A probabilistic evaluation of the inherent safety characteristics is not made. An inventory of passive safety systems of established nuclear power plant types in the Federal Republic of Germany was drawn up. The treatment of passive safety systems in the analyses of established nuclear power plant types was examined. The analysis showed that the passive mode of operation was always assumed to be successful; a probabilistic evaluation was not performed. The unavailability of a passive safety system was determined by the failure of the active components that are necessary to activate it. To evaluate the passive safety features in new nuclear power plant concepts, the AP600 from Westinghouse, the SBWR from General Electric and the SWR 600 from Siemens were selected. Among these three reactor concepts, the SWR 600 is especially attractive because its safety features need no energy sources and no instrumentation. First approaches for assessing the reliability of passively operating systems are summarized. Generally it can be established that the core melt frequency for the passive concepts AP600 and SBWR compares favourably with the probabilistic objectives for the European Pressurized Water Reactor (EPR). Among the passive concepts, the SWR 600 is particularly interesting: its passive systems need no energy sources and instrumentation, while the plant retains active operational systems and active safety equipment. Siemens argues that with this concept the frequency of a core melt will be two orders of magnitude lower than for conventional reactors. (orig.) [de

  2. Probabilistic Teleportation via Quantum Channel with Partial Information

    Directory of Open Access Journals (Sweden)

    Desheng Liu

    2015-06-01

    Two novel schemes are proposed to teleport an unknown two-level quantum state probabilistically when the sender and the receiver each have only partial information about the quantum channel. This is distinct from previous schemes for probabilistic teleportation, in which either the sender or the receiver has complete information about the quantum channel. Theoretical analysis proves that these schemes are straightforward, efficient and cost-saving. The concrete realization procedures of our schemes are presented in detail, and the results show that our proposals could extend the application range of probabilistic teleportation.

  3. Probabilistic assessment of pressure vessel and piping reliability

    International Nuclear Information System (INIS)

    Sundararajan, C.

    1986-01-01

    The paper presents a critical review of the state-of-the-art in probabilistic assessment of pressure vessel and piping reliability. First the differences in assessing the reliability directly from historical failure data and indirectly by a probabilistic analysis of the failure phenomenon are discussed and the advantages and disadvantages are pointed out. The rest of the paper deals with the latter approach of reliability assessment. Methods of probabilistic reliability assessment are described and major projects where these methods are applied for pressure vessel and piping problems are discussed. An extensive list of references is provided at the end of the paper

  4. Study on the estimation of probabilistic effective dose. Committed effective dose from intake of marine products using Oceanic General Circulation Model

    International Nuclear Information System (INIS)

    Nakano, Masanao

    2007-01-01

    Worldwide environmental protection is demanded by the public. Long-term environmental assessment of releases from nuclear fuel cycle facilities to the aquatic environment is also becoming more important for utilizing nuclear energy more efficiently. Evaluation of long-term risk, not only in Japan but also in neighboring countries, is considered necessary for the development of the nuclear power industry. The author successfully simulated the distribution of radionuclides in seawater and seabed sediment produced by atmospheric nuclear tests using LAMER (Long-term Assessment ModEl for Radioactivity in the oceans). Part of LAMER calculates the advection-diffusion-scavenging processes for radionuclides in the oceans and the Japan Sea in conjunction with an Oceanic General Circulation Model (OGCM) and was validated. In the other part of LAMER, the author attempts to calculate the probabilistic effective dose, as suggested by ICRP, from intake of marine products contaminated by atmospheric nuclear tests, using the Monte Carlo method. Depending on the deviation of each parameter, the 95th percentile of the probabilistic effective dose was calculated to be about half of the 95th percentile of the deterministic effective dose in a pro forma calculation. The probabilistic assessment gives a realistic value for the dose assessment of a nuclear fuel cycle facility. (author)
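
    The Monte Carlo step can be sketched as sampling the ingestion-dose factors and reading off percentiles. The lognormal parameters below are hypothetical; the Cs-137 dose coefficient is the ICRP-style adult ingestion value of about 1.3e-8 Sv/Bq.

        import numpy as np

        rng = np.random.default_rng(5)
        n = 100_000

        # Hypothetical lognormal inputs for a single nuclide pathway:
        conc = rng.lognormal(np.log(0.1), 0.5, n)     # Cs-137 in fish, Bq/kg
        intake = rng.lognormal(np.log(30.0), 0.4, n)  # consumption, kg/yr
        dcf = 1.3e-8                                  # ingestion dose coefficient, Sv/Bq

        dose = conc * intake * dcf   # committed effective dose per year of intake
        print(f"deterministic point estimate: {0.1 * 30.0 * dcf:.2e} Sv")
        print(f"95th percentile (probabilistic): {np.percentile(dose, 95):.2e} Sv")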

  5. Corroborating a new probabilistic seismic hazard assessment for greater Tokyo from historical intensity observations

    Science.gov (United States)

    Bozkurt, S.; Stein, R.; Toda, S.

    2006-12-01

    The long recorded history of earthquakes in Japan affords an opportunity to forecast seismic shaking exclusively from past observations of shaking. For this we analyzed 10,000 intensity observations recorded during AD 1600-2000 in a 350 x 350 km area centered on Tokyo in a Geographic Information System. A frequency-intensity curve is found for each 5 x 5 km cell, and from this the probability of exceeding any intensity level can be estimated. The principal benefits of this approach are that it builds the fewest possible assumptions into a probabilistic seismic forecast and that it includes site and source effects without imposing this behavior; we do not need to know the size or location of any earthquake or the location and slip rate of any fault. The cost is that we must abandon any attempt to make a time-dependent forecast, which could be quite different. We believe the method is suitable for many applications of probabilistic seismic hazard assessment, and for other regions. The two key assumptions are that the slope of the observed frequency-intensity relation at every site is the same, and that the 400-year record is long enough to encompass the full range of seismic behavior. Tests we conducted suggest that both assumptions are sound. The resulting 30-year probability of IJMA>=6 shaking (roughly equivalent to PGA>=0.9 g or MMI=IX-X) is 30-40% in Tokyo, Kawasaki, and Yokohama, and 10-15% in Chiba and Tsukuba, the range reflecting spatial variability and curve-fitting alternatives. The strongest shaking is forecast along the margins of Tokyo Bay, within the river sediments extending northwest from Tokyo, and at coastal sites near the plate boundary faults. We also produce long-term exceedance maps of peak ground acceleration for building code regulations, and short-term hazard maps associated with hypothetical catastrophe bonds. Our results for greater Tokyo resemble our independent Poisson probability developed from conventional seismic hazard analysis, as well as
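
    Per cell, the method reduces to fitting a frequency-intensity line and converting the extrapolated annual rate into a Poisson probability over an exposure window. The counts below are invented for illustration, not observations from the study.

        import numpy as np

        # Hypothetical counts of shaking at or above each JMA intensity in one
        # 5 x 5 km cell over a 400-year record.
        intensity = np.array([3.0, 4.0, 5.0, 6.0])
        counts = np.array([120, 30, 7, 1])
        years = 400.0

        rate = counts / years                    # annual rate of shaking >= I
        slope, intercept = np.polyfit(intensity, np.log10(rate), 1)

        lam6 = 10 ** (intercept + slope * 6.0)   # fitted annual rate of I >= 6
        print(f"30-yr Poisson probability of I >= 6: {1 - np.exp(-lam6 * 30):.1%}")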

  6. Probabilistic diffusion tractography of the optic radiations and visual function in preterm infants at term equivalent age.

    Science.gov (United States)

    Bassi, Laura; Ricci, Daniela; Volzone, Anna; Allsop, Joanna M; Srinivasan, Latha; Pai, Aakash; Ribes, Carmen; Ramenghi, Luca A; Mercuri, Eugenio; Mosca, Fabio; Edwards, A David; Cowan, Frances M; Rutherford, Mary A; Counsell, Serena J

    2008-02-01

    Children born prematurely have a high incidence of visual disorders which cannot always be explained by focal retinal or brain lesions. The aim of this study was to test the hypothesis that visual function in preterm infants is related to the microstructural development of white matter in the optic radiations. We used diffusion tensor imaging (DTI) with probabilistic diffusion tractography to delineate the optic radiations at term equivalent age and compared the fractional anisotropy (FA) to a contemporaneous evaluation of visual function. Thirty-seven preterm infants (19 male) born at median (range) 28+4 (24+1-32+3) weeks gestational age were examined at a post-menstrual age of 42 (39+6-43) weeks. MRI and DTI were acquired on a 3 Tesla MR system with DTI obtained in 15 non-collinear directions with a b value of 750 s/mm2. Tracts were generated from a seed mask placed in the white matter lateral to the lateral geniculate nucleus and mean FA values of these tracts were determined. Visual assessment was performed using a battery of nine items assessing different aspects of visual abilities. Ten infants had evidence of cerebral lesions on conventional MRI. Multiple regression analysis demonstrated that the visual assessment score was independently correlated with FA values, but not with gestational age at birth, post-menstrual age at scan or the presence of lesions on conventional MRI. The occurrence of mild retinopathy of prematurity did not affect the FA measures or visual scores. We then performed a secondary analysis using tract-based spatial statistics to determine whether global brain white matter development was related to visual function and found that only FA in the optic radiations was correlated with the visual assessment score. Our results suggest that in preterm infants at term equivalent age visual function is directly related to the development of white matter in the optic radiations.

  7. Probabilistic calculation of dose commitment from uranium mill tailings

    International Nuclear Information System (INIS)

    1983-10-01

    The report discusses in a general way considerations of uncertainty in relation to probabilistic modelling. An example of a probabilistic calculation applied to the behaviour of uranium mill tailings is given

  8. Application of Bayesian network to the probabilistic risk assessment of nuclear waste disposal

    International Nuclear Information System (INIS)

    Lee, Chang-Ju; Lee, Kun Jai

    2006-01-01

    The scenario in a risk analysis can be defined as the propagating feature of a specific initiating event which can lead to a wide range of undesirable consequences. If we take various scenarios into consideration, the risk analysis becomes more complex than it would be without them. Many risk analyses have been performed to estimate a risk profile under both uncertain future states of hazard sources and undesirable scenarios. Unfortunately, for specific systems such as a radioactive waste disposal facility, the behaviour of future scenarios can hardly be predicted without a special reasoning process, so we cannot estimate their risk with a traditional risk analysis methodology alone. Moreover, we believe that the sources of uncertainty about future states can be reduced pertinently by setting up dependency relationships interrelating the geological, hydrological, and ecological aspects of the site with all the scenarios. The current methodology of uncertainty analysis for waste disposal facilities therefore needs to be revisited in this light. In order to consider the effects predicted from an evolution of the environmental conditions of waste disposal facilities, this paper proposes a quantitative assessment framework integrating the inference process of a Bayesian network into traditional probabilistic risk analysis. We developed and verified an approximate probabilistic inference program for the specific Bayesian network using a bounded-variance likelihood weighting algorithm. Ultimately, specific models, including a model for uncertainty propagation of relevant parameters, were developed with a comparison of variable-specific effects due to the occurrence of diverse altered evolution scenarios (AESs). After providing supporting information to obtain a variety of quantitative expectations about the dependency relationships between domain variables and AESs, we could connect the results of probabilistic inference from the Bayesian network with the consequence
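
    Likelihood weighting itself is easy to sketch on a miniature two-node network (scenario -> contamination): samples are drawn for the non-evidence variables and weighted by the likelihood of the observed evidence. The structure and probabilities below are hypothetical, and the paper's bounded-variance variant adds variance control not shown here.

        import random

        P_SCENARIO = 0.1                        # prior P(altered evolution scenario), assumed
        P_CONTAM = {True: 0.6, False: 0.05}     # P(contamination | scenario), assumed

        def lw_posterior(n=200_000, seed=6):
            """P(scenario | contamination observed) by likelihood weighting."""
            random.seed(seed)
            num = den = 0.0
            for _ in range(n):
                scenario = random.random() < P_SCENARIO   # sample non-evidence variable
                w = P_CONTAM[scenario]                    # weight by evidence likelihood
                den += w
                if scenario:
                    num += w
            return num / den

        print(f"P(scenario | contamination) ~ {lw_posterior():.3f}  (exact: 0.571)")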

  9. Chiefly Symmetric: Results on the Scalability of Probabilistic Model Checking for Operating-System Code

    Directory of Open Access Journals (Sweden)

    Marcus Völp

    2012-11-01

    Reliability in terms of functional properties from the safety-liveness spectrum is an indispensable requirement of low-level operating-system (OS) code. However, with ever more complex and thus less predictable hardware, quantitative and probabilistic guarantees become more and more important. Probabilistic model checking is one technique to automatically obtain these guarantees. First experiences with the automated quantitative analysis of low-level operating-system code confirm the expectation that the naive probabilistic model checking approach rapidly reaches its limits as the number of processes increases. This paper reports on our work in progress to tackle the state explosion problem for low-level OS code caused by the exponential blow-up of the model size when the number of processes grows. We studied the symmetry reduction approach and carried out our experiments with a simple test-and-test-and-set lock case study as a representative example for a wide range of protocols with natural inter-process dependencies and long-run properties. We quickly see a state-space explosion for scenarios where inter-process dependencies are insignificant. However, once inter-process dependencies dominate the picture, models with a hundred or more processes can be constructed and analysed.

  10. Probabilistic safety goals. Phase 3 - Status report

    Energy Technology Data Exchange (ETDEWEB)

    Holmberg, J.-E. (VTT (Finland)); Knochenhauer, M. (Relcon Scandpower AB, Sundbyberg (Sweden))

    2009-07-15

    The first phase of the project (2006) described the status, concepts and history of probabilistic safety goals for nuclear power plants. The second and third phases (2007-2008) have provided guidance related to the resolution of some of the problems identified, and resulted in a common understanding regarding the definition of safety goals. The basic aim of phase 3 (2009) has been to increase the scope and level of detail of the project, and to start preparations of a guidance document. Based on the conclusions from the previous project phases, the following issues have been covered: 1) Extension of international overview. Analysis of results from the questionnaire performed within the ongoing OECD/NEA WGRISK activity on probabilistic safety criteria, including participation in the preparation of the working report for OECD/NEA/WGRISK (to be finalised in phase 4). 2) Use of subsidiary criteria and relations between these (to be finalised in phase 4). 3) Numerical criteria when using probabilistic analyses in support of deterministic safety analysis (to be finalised in phase 4). 4) Guidance for the formulation, application and interpretation of probabilistic safety criteria (to be finalised in phase 4). (LN)

  11. Probabilistic safety goals. Phase 3 - Status report

    International Nuclear Information System (INIS)

    Holmberg, J.-E.; Knochenhauer, M.

    2009-07-01

    The first phase of the project (2006) described the status, concepts and history of probabilistic safety goals for nuclear power plants. The second and third phases (2007-2008) have provided guidance related to the resolution of some of the problems identified, and resulted in a common understanding regarding the definition of safety goals. The basic aim of phase 3 (2009) has been to increase the scope and level of detail of the project, and to start preparations of a guidance document. Based on the conclusions from the previous project phases, the following issues have been covered: 1) Extension of international overview. Analysis of results from the questionnaire performed within the ongoing OECD/NEA WGRISK activity on probabilistic safety criteria, including participation in the preparation of the working report for OECD/NEA/WGRISK (to be finalised in phase 4). 2) Use of subsidiary criteria and relations between these (to be finalised in phase 4). 3) Numerical criteria when using probabilistic analyses in support of deterministic safety analysis (to be finalised in phase 4). 4) Guidance for the formulation, application and interpretation of probabilistic safety criteria (to be finalised in phase 4). (LN)

  12. Use of probabilistic studies in the analysis of modifications of French nuclear power plants

    International Nuclear Information System (INIS)

    Gros, G.; Milhem, J.M.

    1985-11-01

    The 900 MWe pressurized water reactors were designed on a deterministic basis. It appeared that some safety systems had a non-negligible probability of failure and that their total failure could, in the short term, lead to severe consequences. This situation led Electricite de France to propose complementary measures (control procedures and associated modifications). To judge the efficiency of such measures, the safety authorities considered it advisable to rely on the probabilistic studies developed by the Department of Safety Analysis of the C.E.A. The contribution of such studies, both to the choice of modifications, by providing information on the weak points, and to the judgement of the efficiency of these modifications, by probabilistic estimation of the core-melt frequency, is illustrated with the example of the electric power supplies [fr
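
    A hedged sketch of the kind of arithmetic such probabilistic estimations involve; the event frequency, failure probabilities, and train counts below are invented for illustration and do not come from the EDF/C.E.A. studies:

```python
# Illustrative only: quantify how a modification (an extra redundant
# emergency power train) changes the frequency of total loss of power.
demand_freq = 0.1        # loss-of-offsite-power events per reactor-year (assumed)
p_train_fails = 1e-2     # failure probability of one emergency train on demand (assumed)

def total_loss_freq(n_trains: int) -> float:
    """Frequency of losing all emergency trains, assuming independent failures."""
    return demand_freq * p_train_fails ** n_trains

before = total_loss_freq(2)   # original design: two trains
after = total_loss_freq(3)    # modification: add a third train
print(f"before: {before:.1e}/yr, after: {after:.1e}/yr, reduction x{before/after:.0f}")
```

    In a real study, common-cause failures would cap the apparent gain from added redundancy, which is exactly the kind of effect a full probabilistic model is needed to capture.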

  13. Centrifugal Filtration System for Severe Accident Source Term Treatment

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Shu Chang; Yim, Man Sung [KAIST, Daejeon (Korea, Republic of)]

    2016-05-15

    The objective of this paper is to present the conceptual design of a filtration system that can be used to process the airborne severe accident source term. Reactor containment may lose its structural integrity due to over-pressurization during a severe accident, which can lead to uncontrolled radioactive releases to the environment. To prevent the dispersion of such releases, several ways of capturing or mitigating radioactive source term releases are under investigation at KAIST. These technologies are based on concepts such as a vortex-like air curtain, a chemical spray, and a suction arm. The radioactive material captured by these systems would require treatment before release to the environment. For current filtration systems in the nuclear industry, the IAEA (NS-G-1.10, para. 4.143) lists sand filters, multi-venturi scrubbers, high efficiency particulate arresting (HEPA) filters, charcoal, and combinations of the above. The requirements of the scenario for applying this technology near the containment of an NPP site, together with the environmental constraints, were analyzed for use in the design of the centrifugal filtration system.

  14. Uncertainty analysis methods for quantification of source terms using a large computer code

    International Nuclear Information System (INIS)

    Han, Seok Jung

    1997-02-01

    Quantification of uncertainties in the source term estimations by a large computer code, such as MELCOR or MAAP, is an essential part of current probabilistic safety assessments (PSAs). The main objectives of the present study are (1) to investigate the applicability of a combined procedure of the response surface method (RSM), based on inputs determined from a statistical design, and the Latin hypercube sampling (LHS) technique for the uncertainty analysis of CsI release fractions under a hypothetical severe accident sequence of a station blackout at the Young-Gwang nuclear power plant, using the MAAP3.0B code as a benchmark problem; and (2) to propose a new measure of uncertainty importance based on distributional sensitivity analysis. On the basis of the results obtained in the present work, the RSM is recommended as the principal tool for an overall uncertainty analysis in source term quantification, while the LHS is used in the calculation of standardized regression coefficients (SRC) and standardized rank regression coefficients (SRRC) to determine the subset of the most important input parameters in the final screening step and to check the cumulative distribution functions (cdfs) obtained by the RSM. Verification of the response surface model for sufficient accuracy is a prerequisite for the reliability of the final results obtained by the combined procedure proposed in the present work. In the present study a new measure has been developed that utilizes the metric distance between cumulative distribution functions (cdfs). The measure has been evaluated for three different cases of distributions in order to assess its characteristics: in the first and second cases the distributions are known analytical distributions, while in the third case the distribution is unknown. The first case is given by symmetric analytical distributions. The second case consists of two asymmetric distributions whose skewness is non-zero
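
    A minimal sketch of the LHS-plus-regression screening step described above, using a toy response in place of a MAAP/MELCOR calculation (variable ranges and the response function are invented for illustration):

```python
import numpy as np
from scipy.stats import qmc, rankdata

rng = np.random.default_rng(0)
sampler = qmc.LatinHypercube(d=3, seed=0)
u = sampler.random(n=200)                                   # uniform [0,1) LHS design
x = qmc.scale(u, l_bounds=[0.1, 300.0, 1.0], u_bounds=[1.0, 900.0, 10.0])

# Toy response standing in for a CsI release fraction calculation.
y = 0.5 * x[:, 0] + 1e-4 * x[:, 1] + 0.01 * x[:, 2] + rng.normal(0, 0.01, 200)

def src(x, y):
    """Standardized regression coefficients via least squares on scaled data."""
    xs = (x - x.mean(0)) / x.std(0)
    ys = (y - y.mean()) / y.std()
    beta, *_ = np.linalg.lstsq(xs, ys, rcond=None)
    return beta

print("SRC :", np.round(src(x, y), 3))
# SRRC: the same computation on rank-transformed data, robust to monotone nonlinearity.
print("SRRC:", np.round(src(rankdata(x, axis=0), rankdata(y)), 3))
```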

  15. Development and Application of Level 2 Probabilistic Safety Assessment for Nuclear Power Plants. Specific Safety Guide

    International Nuclear Information System (INIS)

    2010-01-01

    The objective of this Safety Guide is to provide recommendations for meeting the IAEA safety requirements in performing or managing a level 2 probabilistic safety assessment (PSA) project for a nuclear power plant; thus it complements the Safety Guide on level 1 PSA. One of the aims of this Safety Guide is to promote a standard framework, standard terms and a standard set of documents for level 2 PSAs to facilitate regulatory and external peer review of their results. It describes all elements of the level 2 PSA that need to be carried out if the starting point is a fully comprehensive level 1 PSA. Contents: 1. Introduction; 2. PSA project management and organization; 3. Identification of design aspects important to severe accidents and acquisition of information; 4. Interface with level 1 PSA: Grouping of sequences; 5. Accident progression and containment analysis; 6. Source terms for severe accidents; 7. Documentation of the analysis: Presentation and interpretation of results; 8. Use and applications of the PSA; Annex I: Example of a typical schedule for a level 2 PSA; Annex II: Computer codes for simulation of severe accidents; Annex III: Sample outline of documentation for a level 2 PSA study.

  16. Simplified application of probabilistic safety analysis in nuclear power plants by means of artificial neural networks

    International Nuclear Information System (INIS)

    Oehmgen, T.; Knorr, J.

    2004-01-01

    Probabilistic safety analyses (PSA) are conducted to assess the balanced nature of plant design in terms of technical safety and the administrative management of plant operation in nuclear power plants. In the evaluation presented in this article of the operating experience accumulated at two nuclear power plants, all failures are consistently traced back to the plant-media and component levels, respectively, for the calculation of reliability coefficients. Moreover, the use of neural networks for probabilistic calculations is examined, and the results are verified on the basis of test examples. Calculations with a trained neural network are very easy to carry out, as a kind of 'black box'; such a system could, for instance, be used in plant maintenance. (orig.) [de
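
    As an illustration of the 'black box' idea, the following sketch trains a small neural network to reproduce the unavailability of a simple two-out-of-three system from component failure probabilities (the system model and training setup are assumptions for the example, not the article's actual plant data):

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
p = rng.uniform(0.0, 0.1, size=(5000, 3))   # component failure probabilities

def system_unavailability(p):
    """Exact 2-out-of-3 failure probability (at least two components fail)."""
    p1, p2, p3 = p[:, 0], p[:, 1], p[:, 2]
    return (p1 * p2 * (1 - p3) + p1 * p3 * (1 - p2)
            + p2 * p3 * (1 - p1) + p1 * p2 * p3)

y = system_unavailability(p)
model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
model.fit(p, y)   # the trained net now acts as the 'black box'

query = np.array([[0.02, 0.05, 0.01]])
print("NN estimate :", model.predict(query)[0])
print("exact value :", system_unavailability(query)[0])
```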

  17. Reliability Evaluation and Probabilistic Design of Coastal Structures

    DEFF Research Database (Denmark)

    Burcharth, H. F.

    1993-01-01

    Conventional design practice for coastal structures is deterministic in nature and is based on the concept of a design load, which should not exceed the resistance (carrying capacity) of the structure. The design load is usually defined on a probabilistic basis as a characteristic value of the load, e.g. the expectation (mean) value of the 100-year return period event, however, often without consideration of the involved uncertainties. The resistance is in most cases defined in terms of the load which causes a certain design impact or damage to the structure and is not given as an ultimate force or deformation. This is because most of the available design formulae only give the relationship between wave characteristics and structural response, e.g. in terms of run-up, overtopping, armour layer damage etc. An example is the Hudson formula for armour layer stability. Almost all such design...
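
    The two ingredients mentioned above, a deterministic response formula and a return-period design load, can be combined in a few lines (parameter values are assumptions for the example, not design guidance):

```python
# Hudson formula for required armour stone mass, plus the probability that
# the design (100-year) wave is exceeded during the structure's lifetime.
rho_s, rho_w = 2650.0, 1025.0   # stone / seawater density, kg/m^3 (assumed)
H = 5.0                          # design wave height, m (e.g. 100-year event)
KD = 4.0                         # Hudson stability coefficient (rough quarry stone)
cot_alpha = 1.5                  # cotangent of the slope angle

delta = rho_s / rho_w - 1.0
M50 = rho_s * H**3 / (KD * delta**3 * cot_alpha)   # required median stone mass, kg
print(f"Hudson median armour mass: {M50 / 1000:.1f} t")

# Encounter probability: chance the T-year event occurs at least once in L years.
T, L = 100, 50
p_encounter = 1.0 - (1.0 - 1.0 / T) ** L
print(f"P(exceed {T}-yr wave in {L} yrs) = {p_encounter:.0%}")   # ~39%
```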

  18. SHEDS-HT: an integrated probabilistic exposure model for prioritizing exposures to chemicals with near-field and dietary sources.

    Science.gov (United States)

    Isaacs, Kristin K; Glen, W Graham; Egeghy, Peter; Goldsmith, Michael-Rock; Smith, Luther; Vallero, Daniel; Brooks, Raina; Grulke, Christopher M; Özkaynak, Halûk

    2014-11-04

    United States Environmental Protection Agency (USEPA) researchers are developing a strategy for high-throughput (HT) exposure-based prioritization of chemicals under the ExpoCast program. These novel modeling approaches for evaluating chemicals based on their potential for biologically relevant human exposures will inform toxicity testing and prioritization for chemical risk assessment. Based on probabilistic methods and algorithms developed for The Stochastic Human Exposure and Dose Simulation Model for Multimedia, Multipathway Chemicals (SHEDS-MM), a new mechanistic modeling approach has been developed to accommodate high-throughput (HT) assessment of exposure potential. In this SHEDS-HT model, the residential and dietary modules of SHEDS-MM have been operationally modified to reduce the user burden, input data demands, and run times of the higher-tier model, while maintaining critical features and inputs that influence exposure. The model has been implemented in R; the modeling framework links chemicals to consumer product categories or food groups (and thus exposure scenarios) to predict HT exposures and intake doses. Initially, SHEDS-HT has been applied to 2507 organic chemicals associated with consumer products and agricultural pesticides. These evaluations employ data from recent USEPA efforts to characterize usage (prevalence, frequency, and magnitude), chemical composition, and exposure scenarios for a wide range of consumer products. In modeling indirect exposures from near-field sources, SHEDS-HT employs a fugacity-based module to estimate concentrations in indoor environmental media. The concentration estimates, along with relevant exposure factors and human activity data, are then used by the model to rapidly generate probabilistic population distributions of near-field indirect exposures via dermal, nondietary ingestion, and inhalation pathways. Pathway-specific estimates of near-field direct exposures from consumer products are also modeled
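
    A minimal sketch of the probabilistic population-exposure idea, with a lognormal stand-in for the fugacity module's concentration output (all distributions and parameters are invented for illustration and are not SHEDS-HT's actual inputs):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000  # simulated individuals

# Assumed indoor air concentration from a near-field source (mg/m^3),
# standing in for the fugacity-based module's output.
c_air = rng.lognormal(mean=np.log(1e-3), sigma=0.8, size=n)
inhal_rate = rng.lognormal(np.log(16.0), 0.2, n)     # m^3/day
body_weight = rng.lognormal(np.log(70.0), 0.25, n)   # kg
exposure_freq = rng.uniform(0.1, 1.0, n)             # fraction of days exposed

# Population distribution of inhalation intake dose, mg/kg/day.
dose = c_air * inhal_rate * exposure_freq / body_weight

for q in (50, 95, 99):
    print(f"P{q}: {np.percentile(dose, q):.2e} mg/kg/day")
```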

  19. Documentation design for probabilistic risk assessment

    International Nuclear Information System (INIS)

    Parkinson, W.J.; von Herrmann, J.L.

    1985-01-01

    This paper describes a framework for the documentation design of probabilistic risk assessments (PRA) and is based on the EPRI document NP-3470, 'Documentation Design for Probabilistic Risk Assessment'. The goals for PRA documentation are stated. Four audiences that PRA documentation must satisfy are identified, and the documents consistent with the needs of these audiences are discussed, i.e., the Summary Report, the Executive Summary, the Main Report, and the Appendices. The authors recommend the documentation specifications discussed herein as guides rather than rigid definitions.

  20. Statistical learning and probabilistic prediction in music cognition: mechanisms of stylistic enculturation.

    Science.gov (United States)

    Pearce, Marcus T

    2018-05-11

    Music perception depends on internal psychological models derived through exposure to a musical culture. It is hypothesized that this musical enculturation depends on two cognitive processes: (1) statistical learning, in which listeners acquire internal cognitive models of statistical regularities present in the music to which they are exposed; and (2) probabilistic prediction based on these learned models that enables listeners to organize and process their mental representations of music. To corroborate these hypotheses, I review research that uses a computational model of probabilistic prediction based on statistical learning (the information dynamics of music (IDyOM) model) to simulate data from empirical studies of human listeners. The results show that a broad range of psychological processes involved in music perception (expectation, emotion, memory, similarity, segmentation, and meter) can be understood in terms of a single, underlying process of probabilistic prediction using learned statistical models. Furthermore, IDyOM simulations of listeners from different musical cultures demonstrate that statistical learning can plausibly predict causal effects of differential cultural exposure to musical styles, providing a quantitative model of cultural distance. Understanding the neural basis of musical enculturation will benefit from close coordination between empirical neuroimaging and computational modeling of underlying mechanisms, as outlined here. © 2018 The Authors. Annals of the New York Academy of Sciences published by Wiley Periodicals, Inc. on behalf of New York Academy of Sciences.
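
    A toy sketch of the two hypothesized processes, statistical learning followed by probabilistic prediction, in the spirit of (but far simpler than) IDyOM; the melody and the bigram model are invented for illustration:

```python
import math
from collections import Counter, defaultdict

def train_bigram(pitches):
    """Statistical learning: count next-note frequencies per context."""
    counts = defaultdict(Counter)
    for prev, nxt in zip(pitches, pitches[1:]):
        counts[prev][nxt] += 1
    return counts

def information_content(counts, prev, nxt, alphabet_size, k=1.0):
    """Probabilistic prediction: -log2 P(next | prev) with add-k smoothing."""
    c = counts[prev]
    p = (c[nxt] + k) / (sum(c.values()) + k * alphabet_size)
    return -math.log2(p)

melody = [60, 62, 64, 65, 64, 62, 60, 62, 64, 64, 65, 67]  # MIDI pitches
model = train_bigram(melody)
alphabet = len(set(melody))

# A stylistically expected continuation is less surprising than an unseen one.
print(information_content(model, 64, 65, alphabet))  # lower IC (~1.6 bits)
print(information_content(model, 64, 60, alphabet))  # higher IC (~3.2 bits)
```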