Probabilistic Event Categorization
Wiebe, Janyce; Bruce, Rebecca; Duan, Lei
1997-01-01
This paper describes the automation of a new text categorization task. The categories assigned in this task are more syntactically, semantically, and contextually complex than those typically assigned by fully automatic systems that process unseen test data. Our system for assigning these categories is a probabilistic classifier, developed with a recent method for formulating a probabilistic model from a predefined set of potential features. This paper focuses on feature selection. It presents a number of fully automatic features. It identifies and evaluates various approaches to organizing collocational properties into features, and presents the results of experiments covarying type of organization and type of property. We find that one organization is not best for all kinds of properties, so this is an experimental parameter worth investigating in NLP systems. In addition, the results suggest a way to take advantage of properties that are low frequency but strongly indicative of a class. The problems of rec...
MGR External Events Hazards Analysis
Energy Technology Data Exchange (ETDEWEB)
L. Booth
1999-11-06
The purpose and objective of this analysis is to apply an external events Hazards Analysis (HA) to the License Application Design Selection Enhanced Design Alternative II (LADS EDA II design; Reference 8.32). The output of the HA is called a Hazards List (HL). This analysis supersedes the external hazards portion of Rev. 00 of the PHA (Reference 8.1). The PHA for internal events will also be updated to the LADS EDA II design, but under a separate analysis. Like the PHA methodology, the HA methodology provides a systematic method to identify potential hazards during the 100-year Monitored Geologic Repository (MGR) operating period, updated to reflect the EDA II design. The resulting events on the HL are candidates that may have potential radiological consequences as determined during Design Basis Events (DBEs) analyses. Therefore, the HL that results from this analysis will undergo further screening and analysis based on the criteria that apply during the performance of DBE analyses.
Maximizing Statistical Power When Verifying Probabilistic Forecasts of Hydrometeorological Events
DeChant, C. M.; Moradkhani, H.
2014-12-01
Hydrometeorological events (i.e. floods, droughts, precipitation) are increasingly being forecasted probabilistically, owing to the uncertainties in the underlying causes of the phenomenon. In these forecasts, the probability of the event, over some lead time, is estimated based on some model simulations or predictive indicators. By issuing probabilistic forecasts, agencies may communicate the uncertainty in the event occurring. Assuming that the assigned probability of the event is correct, which is referred to as a reliable forecast, the end user may perform some risk management based on the potential damages resulting from the event. Alternatively, an unreliable forecast may give false impressions of the actual risk, leading to improper decision making when protecting resources from extreme events. Due to this requisite for reliable forecasts to perform effective risk management, this study takes a renewed look at reliability assessment in event forecasts. Illustrative experiments will be presented, showing deficiencies in the commonly available approaches (Brier Score, Reliability Diagram). Overall, it is shown that the conventional reliability assessment techniques do not maximize the ability to distinguish between a reliable and unreliable forecast. In this regard, a theoretical formulation of the probabilistic event forecast verification framework will be presented. From this analysis, hypothesis testing with the Poisson-Binomial distribution is the most exact model available for the verification framework, and therefore maximizes one's ability to distinguish between a reliable and unreliable forecast. Application of this verification system was also examined within a real forecasting case study, highlighting the additional statistical power provided with the use of the Poisson-Binomial distribution.
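The Poisson-Binomial verification idea described above can be sketched in a few lines; this is a minimal illustration on hypothetical forecast data, not the authors' implementation. If a set of forecasts with stated probabilities p_i is reliable, the number of events that actually occur follows a Poisson-Binomial distribution, so the observed event count can be tested against that distribution:

```python
def poisson_binomial_pmf(probs):
    """PMF of the number of successes among independent, non-identical
    Bernoulli trials, computed by dynamic programming."""
    pmf = [1.0]
    for p in probs:
        nxt = [0.0] * (len(pmf) + 1)
        for k, mass in enumerate(pmf):
            nxt[k] += mass * (1.0 - p)   # this trial fails
            nxt[k + 1] += mass * p       # this trial succeeds
        pmf = nxt
    return pmf

def reliability_p_value(probs, observed_count):
    """Two-sided p-value: total probability of all counts at least as
    unlikely as the observed one, under the reliability hypothesis."""
    pmf = poisson_binomial_pmf(probs)
    threshold = pmf[observed_count]
    return sum(mass for mass in pmf if mass <= threshold)

# Hypothetical example: ten forecasts, each issued with probability 0.2.
# Observing 9 events is strong evidence the forecasts were unreliable.
forecasts = [0.2] * 10
print(reliability_p_value(forecasts, 9))
```

A small p-value rejects the reliability hypothesis; with identical probabilities the distribution reduces to the ordinary binomial case.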
Model Learning for Probabilistic Simulation on Rare Events and Scenarios
2015-03-06
observed data only. This is a problem called covariate shift in statistics. We need to calibrate the probability to avoid the underestimation of our...fall records causing the events/scenarios. This is called a covariate shift problem in statistics. We need to calibrate the probability to avoid the...their causes and consequences. Their probabilities are also quantitatively provided based on the mathematically rigorous and probabilistic inference
Implementation of external hazards in Probabilistic Safety Assessment for nuclear power plants
Kumar, Manorma; Klug, Joakim; Raimond, Emmanuel
2015-04-01
The paper focuses on the implementation of external hazards in probabilistic safety assessment (PSA) methods for extreme external hazards, mainly seismic events, flooding, meteorological hazards (e.g. storm, extreme temperature, snow pack), biological infestation, lightning hazards, accidental aircraft crash, and man-made hazards including natural external fire and external explosion. This includes a discussion of good practices for the implementation of external hazards in Level 1 PSA, with a perspective of developing extended PSA and introducing relevant modelling of external hazards into an existing Level 1 PSA. This paper is associated with the European project ASAMPSAE (www.asampsa.eu), which gathers more than 30 organizations (industry, research, safety control) from Europe, the US and Japan, and which aims at identifying meaningful practices to extend the scope and quality of the existing probabilistic safety analyses developed for nuclear power plants.
Severe accident risks from external events
Institute of Scientific and Technical Information of China (English)
Randall O Gauntt
2013-01-01
This paper reviews the early development of design requirements for seismic events in the USA's early nuclear electric generating fleet. Notable safety studies, including WASH-1400, the Sandia Siting Study and the NUREG-1150 probabilistic risk study, are briefly reviewed in terms of their relevance to extreme accidents arising from seismic and other severe accident initiators. Specific characteristics of severe accidents in nuclear power plants (NPPs) are reviewed along with present-day state-of-the-art analysis methodologies (Methods for Estimation of Leakages and Consequences of Releases (MELCOR) and the MELCOR Accident Consequence Code System (MACCS)) that are used to evaluate severe accidents and to optimize mitigative and protective actions against such accidents. It is the aim of this paper to make nuclear operating nations aware of the risks that accompany a much needed energy resource and to identify some of the tools, techniques and landmark safety studies that serve to make the technology safer and to maintain vigilance and an adequate safety culture for the responsible management of this valuable but unforgiving technology.
Probabilistic modelling of flood events using the entropy copula
Li, Fan; Zheng, Qian
2016-11-01
The estimation of flood frequency is vital for flood control strategies and hydraulic structure design. Generating synthetic flood events according to the statistical properties of observations is one plausible method of analyzing flood frequency. Due to the statistical dependence among the flood event variables (i.e. the flood peak, volume and duration), a multidimensional joint probability estimation is required. Recently, the copula method has been widely used to construct multivariable dependence structures; however, the copula family must be chosen before application, and the choice is sometimes rather subjective. The entropy copula, a new copula family employed in this research, provides a way to avoid this relatively subjective step by combining the theories of copula and entropy. The analysis shows the effectiveness of the entropy copula for probabilistic modelling of the flood events at two hydrological gauges, and a comparison of accuracy with popular copulas was made. The Gibbs sampling technique was applied for trivariate flood event simulation in order to mitigate the computational difficulties of extending directly to three dimensions. The simulation results indicate that the entropy copula is a simple and effective copula family for trivariate flood simulation.
Probabilistic delay differential equation modeling of event-related potentials.
Ostwald, Dirk; Starke, Ludger
2016-08-01
"Dynamic causal models" (DCMs) are a promising approach in the analysis of functional neuroimaging data due to their biophysical interpretability and their consolidation of functional-segregative and functional-integrative propositions. In this theoretical note we are concerned with the DCM framework for electroencephalographically recorded event-related potentials (ERP-DCM). Intuitively, ERP-DCM combines deterministic dynamical neural mass models with dipole-based EEG forward models to describe the event-related scalp potential time-series over the entire electrode space. Since its inception, ERP-DCM has been successfully employed to capture the neural underpinnings of a wide range of neurocognitive phenomena. However, in spite of its empirical popularity, the technical literature on ERP-DCM remains somewhat patchy. A number of previous communications have detailed certain aspects of the approach, but no unified and coherent documentation exists. With this technical note, we aim to close this gap and to increase the technical accessibility of ERP-DCM. Specifically, this note makes the following novel contributions: firstly, we provide a unified and coherent review of the mathematical machinery of the latent and forward models constituting ERP-DCM by formulating the approach as a probabilistic latent delay differential equation model. Secondly, we emphasize the probabilistic nature of the model and its variational Bayesian inversion scheme by explicitly deriving the variational free energy function in terms of both the likelihood expectation and variance parameters. Thirdly, we detail and validate the estimation of the model with a special focus on the explicit form of the variational free energy function and introduce a conventional nonlinear optimization scheme for its maximization. Finally, we identify and discuss a number of computational issues which may be addressed in the future development of the approach.
Generating Random Earthquake Events for Probabilistic Tsunami Hazard Assessment
LeVeque, Randall J.; Waagan, Knut; González, Frank I.; Rim, Donsub; Lin, Guang
2016-12-01
To perform probabilistic tsunami hazard assessment for subduction zone earthquakes, it is necessary to start with a catalog of possible future events along with the annual probability of occurrence, or a probability distribution of such events that can be easily sampled. For near-field events, the distribution of slip on the fault can have a significant effect on the resulting tsunami. We present an approach to defining a probability distribution based on subdividing the fault geometry into many subfaults and prescribing a desired covariance matrix relating slip on one subfault to slip on any other subfault. The eigenvalues and eigenvectors of this matrix are then used to define a Karhunen-Loève expansion for random slip patterns. This is similar to a spectral representation of random slip based on Fourier series but conforms to a general fault geometry. We show that only a few terms in this series are needed to represent the features of the slip distribution that are most important in tsunami generation, first with a simple one-dimensional example where slip varies only in the down-dip direction and then on a portion of the Cascadia Subduction Zone.
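The Karhunen-Loève construction described above can be sketched as follows. This is a minimal one-dimensional illustration with a hypothetical exponential covariance between subfaults, not the authors' Cascadia model; a random slip pattern is built as the mean plus a truncated sum of eigenmodes of the prescribed covariance matrix:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D fault: n subfaults in the down-dip direction with an
# exponentially decaying covariance between slip on any two subfaults.
n = 50
x = np.linspace(0.0, 1.0, n)
corr_length = 0.2
C = np.exp(-np.abs(x[:, None] - x[None, :]) / corr_length)

# Karhunen-Loeve expansion: the eigenpairs of the covariance matrix give
# orthogonal slip modes; random slip = mean + sum_k sqrt(lam_k) z_k v_k.
lam, V = np.linalg.eigh(C)          # eigh returns ascending eigenvalues
lam, V = lam[::-1], V[:, ::-1]      # put the largest modes first

def random_slip(mean_slip, n_terms, z=None):
    """Draw one random slip pattern from a truncated KL expansion."""
    if z is None:
        z = rng.standard_normal(n_terms)
    return mean_slip + V[:, :n_terms] @ (np.sqrt(lam[:n_terms]) * z)

slip = random_slip(mean_slip=5.0, n_terms=10)
print(slip.shape)  # one slip value per subfault
```

Because the eigenvalues decay quickly, a handful of terms already reproduces most of the prescribed covariance, which is the property the paper exploits for tsunami generation.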
Probabilistic forecast of daily areal precipitation focusing on extreme events
Bliefernicht, J.; Bárdossy, A.
2007-04-01
A dynamical downscaling scheme is usually used to provide a short-range flood forecasting system with highly resolved precipitation fields. Unfortunately, a single forecast from this scheme has high uncertainty concerning intensity and location, especially during extreme events. Alternatively, statistical downscaling techniques like the analogue method can be used, which can supply probabilistic forecasts. However, the performance of the analogue method is affected by the similarity criterion used to identify similar weather situations. To investigate this issue, three different similarity measures are tested in this work: the Euclidean distance (1), the Pearson correlation (2) and a combination of both measures (3). The predictor variables are geopotential height at the 1000 and 700 hPa levels and specific humidity fluxes at the 700 hPa level, derived from the NCEP/NCAR reanalysis project. The study is performed for three mesoscale catchments located in the Rhine basin in Germany. It is validated by a jackknife method for a period of 44 years (1958-2001). The ranked probability skill score, the Brier skill score, the Heidke skill score and the confidence interval of the Cramer association coefficient are calculated to evaluate the system for extreme events. The results show that the combined similarity measure yields the best results in predicting extreme events. However, the confidence interval of the Cramer coefficient indicates that this improvement is only significant relative to the Pearson correlation, not to the Euclidean distance. Furthermore, the performance of the presented forecasting system is very low during summer, and new predictors have to be tested to overcome this problem.
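The analogue method and the similarity measures compared above can be sketched as follows. The archive, field size and parameter values here are hypothetical, chosen only to show how a similarity criterion turns a predictor field into an ensemble forecast:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical archive: 1000 historical days, each with a flattened
# predictor field (e.g. geopotential height anomalies) and the observed
# areal precipitation for a catchment on that day.
archive_fields = rng.standard_normal((1000, 64))
archive_precip = rng.gamma(shape=2.0, scale=3.0, size=1000)

def euclidean_similarity(target, fields):
    # Negated distance so that larger always means "more similar".
    return -np.linalg.norm(fields - target, axis=1)

def pearson_similarity(target, fields):
    t = target - target.mean()
    f = fields - fields.mean(axis=1, keepdims=True)
    return (f @ t) / (np.linalg.norm(f, axis=1) * np.linalg.norm(t))

def analogue_forecast(target, similarity, k=30):
    """Probabilistic forecast: the empirical distribution of precipitation
    observed on the k most similar historical days (the analogues)."""
    scores = similarity(target, archive_fields)
    analogues = np.argsort(scores)[-k:]
    return archive_precip[analogues]

today = rng.standard_normal(64)
members = analogue_forecast(today, euclidean_similarity)
print(len(members), "members; P(precip > 10) =", float((members > 10.0).mean()))
```

Swapping the `similarity` argument reproduces the comparison in the abstract; a combined measure could rank days by a weighted sum of both scores.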
Advanced Reactor Passive System Reliability Demonstration Analysis for an External Event
Energy Technology Data Exchange (ETDEWEB)
Bucknor, Matthew D.; Grabaskas, David; Brunett, Acacia J.; Grelle, Austin
2016-01-01
Many advanced reactor designs rely on passive systems to fulfill safety functions during accident sequences. These systems depend heavily on boundary conditions to induce a motive force, meaning the system can fail to operate as intended due to deviations in boundary conditions, rather than as the result of physical failures. Furthermore, passive systems may operate in intermediate or degraded modes. These factors make passive system operation difficult to characterize within a traditional probabilistic framework that only recognizes discrete operating modes and does not allow for the explicit consideration of time-dependent boundary conditions. Argonne National Laboratory has been examining various methodologies for assessing passive system reliability within a probabilistic risk assessment for a station blackout event at an advanced small modular reactor. This paper provides an overview of a passive system reliability demonstration analysis for an external event. Centering on an earthquake with the possibility of site flooding, the analysis focuses on the behavior of the passive reactor cavity cooling system following potential physical damage and system flooding. The assessment approach seeks to combine mechanistic and simulation-based methods to leverage the benefits of the simulation-based approach without the need to substantially deviate from conventional probabilistic risk assessment techniques. While this study is presented as only an example analysis, the results appear to demonstrate a high level of reliability for the reactor cavity cooling system (and the reactor system in general) to the postulated transient event.
Advanced Reactor Passive System Reliability Demonstration Analysis for an External Event
Directory of Open Access Journals (Sweden)
Matthew Bucknor
2017-03-01
Many advanced reactor designs rely on passive systems to fulfill safety functions during accident sequences. These systems depend heavily on boundary conditions to induce a motive force, meaning the system can fail to operate as intended because of deviations in boundary conditions, rather than as the result of physical failures. Furthermore, passive systems may operate in intermediate or degraded modes. These factors make passive system operation difficult to characterize within a traditional probabilistic framework that only recognizes discrete operating modes and does not allow for the explicit consideration of time-dependent boundary conditions. Argonne National Laboratory has been examining various methodologies for assessing passive system reliability within a probabilistic risk assessment for a station blackout event at an advanced small modular reactor. This paper provides an overview of a passive system reliability demonstration analysis for an external event. Considering an earthquake with the possibility of site flooding, the analysis focuses on the behavior of the passive Reactor Cavity Cooling System following potential physical damage and system flooding. The assessment approach seeks to combine mechanistic and simulation-based methods to leverage the benefits of the simulation-based approach without the need to substantially deviate from conventional probabilistic risk assessment techniques. Although this study is presented as only an example analysis, the results appear to demonstrate a high level of reliability of the Reactor Cavity Cooling System (and the reactor system in general) for the postulated transient event.
Advanced reactor passive system reliability demonstration analysis for an external event
Energy Technology Data Exchange (ETDEWEB)
Bucknor, Matthew; Grabaskas, David; Brunett, Acacia J.; Grelle, Austin [Argonne National Laboratory, Argonne (United States)
2017-03-15
Many advanced reactor designs rely on passive systems to fulfill safety functions during accident sequences. These systems depend heavily on boundary conditions to induce a motive force, meaning the system can fail to operate as intended because of deviations in boundary conditions, rather than as the result of physical failures. Furthermore, passive systems may operate in intermediate or degraded modes. These factors make passive system operation difficult to characterize within a traditional probabilistic framework that only recognizes discrete operating modes and does not allow for the explicit consideration of time-dependent boundary conditions. Argonne National Laboratory has been examining various methodologies for assessing passive system reliability within a probabilistic risk assessment for a station blackout event at an advanced small modular reactor. This paper provides an overview of a passive system reliability demonstration analysis for an external event. Considering an earthquake with the possibility of site flooding, the analysis focuses on the behavior of the passive Reactor Cavity Cooling System following potential physical damage and system flooding. The assessment approach seeks to combine mechanistic and simulation-based methods to leverage the benefits of the simulation-based approach without the need to substantially deviate from conventional probabilistic risk assessment techniques. Although this study is presented as only an example analysis, the results appear to demonstrate a high level of reliability of the Reactor Cavity Cooling System (and the reactor system in general) for the postulated transient event.
Analysis of core damage frequency due to external events at the DOE (Department of Energy) N-Reactor
Energy Technology Data Exchange (ETDEWEB)
Lambright, J.A.; Bohn, M.P.; Daniel, S.L. (Sandia National Labs., Albuquerque, NM (USA)); Baxter, J.T. (Westinghouse Hanford Co., Richland, WA (USA)); Johnson, J.J.; Ravindra, M.K.; Hashimoto, P.O.; Mraz, M.J.; Tong, W.H.; Conoscente, J.P. (EQE, Inc., San Francisco, CA (USA)); Brosseau, D.A. (ERCE, Inc., Albuquerque, NM (USA))
1990-11-01
A complete external events probabilistic risk assessment has been performed for the N-Reactor power plant, making full use of all insights gained during the past ten years' developments in risk assessment methodologies. A detailed screening analysis was performed which showed that all external events had negligible contribution to core damage frequency except fires, seismic events, and external flooding. A limited-scope analysis of the external flooding risk indicated that it is not a major risk contributor. Detailed analyses of the fire and seismic risks resulted in total (mean) core damage frequencies of 1.96E-05 and 4.60E-05 per reactor year, respectively. Detailed uncertainty analyses were performed for both fire and seismic risks. These results show that the core damage frequency profile for these events is comparable to that found for existing commercial power plants if proposed fixes are completed as part of the restart program. 108 refs., 85 figs., 80 tabs.
Non-stationary probabilistic characterization of drought events
Bonaccorso, Brunella; Cancelliere, Antonino
2016-04-01
Probabilistic characterization of droughts is an essential step for designing and implementing appropriate mitigation strategies. Traditionally, probabilistic characterization of droughts has been carried out assuming stationarity of the underlying hydrological series. In particular, under the stationary framework, probability distributions and moments of hydrological processes are assumed to be invariant with time. However, many studies in the past decades have highlighted the presence of non-stationary patterns (such as trends or shifts) in hydrological records, calling the stationarity paradigm into question. Regardless of the causes (either anthropogenic or natural), the need arises to develop new statistical concepts and tools able to deal with such non-stationarity. In the present work, an analytical framework for deriving probabilities and return periods of droughts, assuming non-stationarity in the underlying hydrological series, is developed. In particular, exact and approximate analytical expressions for the moments and probability distributions of drought characteristics (i.e. length and accumulated deficit) are derived as a function of the non-stationary probability distribution of the hydrological process under investigation, as well as of the threshold level. Furthermore, capitalizing on previous developments in the statistical and climate change literature, the concept of return period is revisited to take into account non-stationarity, as well as the multivariate nature of droughts, which requires considering several characteristics simultaneously. The derived expressions are applied to several precipitation series in Sicily, Italy, exhibiting trends. Results indicate the feasibility of the proposed methodology for computing probabilities and return periods of drought characteristics in a non-stationary context.
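One common way the return period is generalized to non-stationary settings in this literature is as the expected waiting time until the next exceedance; this is shown here only as background (the paper derives its own exact and approximate expressions, which may differ):

```latex
% Expected waiting time until the first exceedance when the exceedance
% probability p_t varies with time (non-stationary case):
T \;=\; 1 + \sum_{x=1}^{\infty} \prod_{t=1}^{x} \left(1 - p_t\right),
% which reduces to the familiar stationary result when p_t = p for all t:
T \;=\; 1 + \sum_{x=1}^{\infty} (1-p)^x \;=\; \frac{1}{p}.
```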
Probabilistic events in shock driven multiphase hydrodynamic instabilities
Black, Wolfgang; Denissen, Nick; McFarland, Jacob
2016-11-01
Multiphase flows are an important and complex topic of research with a rich parameter space. Historically, many simplifications and assumptions have been made to allow simulation techniques to be applied to these systems. Some common assumptions include no particle-particle effects, evenly distributed particle fields, no phase change, or even constant particle radii. For some flows these assumptions may be applicable, but as the systems undergo complex accelerations and eventually become turbulent, these multiphase parameters can create significant effects. Through the use of FLAG, a multiphysics hydrodynamics code developed at Los Alamos National Laboratory, these assumptions can be relaxed or eliminated to increase fidelity and guide the development of experiments. This talk will build on our previous work utilizing simulations of the shock driven multiphase instability with a new investigation into a greater parameter space provided by additional multiphase effects, including a probabilistic particle field, various particle radii, and particle-particle effects on the evolution of commonly studied interfaces. Los Alamos National Laboratory LA-UR-16-25652.
Energy Technology Data Exchange (ETDEWEB)
Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Boardman, J. [AEA Technology (United Kingdom); Jones, J.A. [National Radiological Protection Board (United Kingdom); Harper, F.T.; Young, M.L. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)
1997-12-01
The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA deposited material and external dose models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on deposited material and external doses, (4) short biographies of the experts, and (5) the aggregated results of their responses.
Using multiple data sets to populate probabilistic volcanic event trees
Newhall, C.G.; Pallister, John S.
2014-01-01
The key parameters one needs to forecast outcomes of volcanic unrest are hidden kilometers beneath the Earth’s surface, and volcanic systems are so complex that there will invariably be stochastic elements in the evolution of any unrest. Fortunately, there is sufficient regularity in behaviour that some, perhaps many, eruptions can be forecast with enough certainty for populations to be evacuated and kept safe. Volcanologists charged with forecasting eruptions must try to understand each volcanic system well enough that unrest can be interpreted in terms of pre-eruptive process, but must simultaneously recognize and convey uncertainties in their assessments. We have found that the use of event trees helps to focus discussion, integrate data from multiple sources, reach consensus among scientists about both pre-eruptive process and uncertainties and, in some cases, explain all of this to officials. Figure 1 shows a generic volcanic event tree from Newhall and Hoblitt (2002) that can be modified as needed for each specific volcano. This paper reviews how we and our colleagues have used such trees during a number of volcanic crises worldwide, for rapid hazard assessments in situations in which more formal expert elicitations could not be conducted. We describe how multiple data sets can be used to estimate probabilities at each node and branch. We also present case histories of probability estimation during crises, how the estimates were used by public officials, and some suggestions for future improvements.
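The mechanics of an event tree are simple enough to sketch directly; the tree below is hypothetical and simplified (generic unrest-to-eruption nodes, invented probabilities), intended only to show how per-node branch probabilities multiply along a path and how estimates from several data sets can be blended at a node:

```python
def combine_estimates(estimates, weights):
    """Blend probability estimates from multiple data sets (e.g. monitoring,
    the geologic record, analogue volcanoes) into one node probability."""
    total = sum(weights)
    return sum(p * w for p, w in zip(estimates, weights)) / total

# Hypothetical event tree: each entry is the conditional probability of
# taking that branch given its parent node.
tree = {
    "unrest": 1.0,
    "unrest/magmatic": 0.6,
    "unrest/magmatic/eruption": combine_estimates([0.6, 0.4, 0.5],
                                                  [2.0, 1.0, 1.0]),
    "unrest/magmatic/eruption/explosive": 0.3,
}

def outcome_probability(path):
    """Probability of an outcome = product of branch probabilities
    along its path from the root."""
    parts = path.split("/")
    prob = 1.0
    for i in range(1, len(parts) + 1):
        prob *= tree["/".join(parts[:i])]
    return prob

print(outcome_probability("unrest/magmatic/eruption/explosive"))
```

In a real crisis each node's probability would be revised as new monitoring data arrive, and sibling branches at each node should sum to one.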
Event simultaneity does not eliminate age deficits in implicit probabilistic sequence learning.
Forman-Alberti, Alissa B; Seaman, Kendra L; Howard, Darlene V; Howard, James H
2014-01-01
Recent studies have shown age-related deficits in learning subtle probabilistic sequential relationships. However, virtually all sequence learning studies have displayed successive events one at a time. Here we used a modified Triplets Learning Task to investigate whether an age deficit occurs even when sequentially presented predictive events remain in view simultaneously. Twelve young and 12 older adults observed two cue events and responded to a target event on each of a series of trials. All three events remained in view until the subject responded. Unbeknownst to participants, the first cue predicted one of four targets on 80% of the trials. Learning was indicated by faster and more accurate responding to these high-probability targets than to low-probability targets. Results revealed age deficits in sequence learning even with this simultaneous display, suggesting that age differences are not due solely to general processing declines, but rather reflect an age-related deficit in associative learning.
Probabilistic Swinging Door Algorithm as Applied to Photovoltaic Power Ramping Event Detection
Energy Technology Data Exchange (ETDEWEB)
Florita, Anthony; Zhang, Jie; Brancucci Martinez-Anido, Carlo; Hodge, Bri-Mathias; Cui, Mingjian
2015-10-02
Photovoltaic (PV) power generation experiences power ramping events due to cloud interference. Depending on the extent of PV aggregation and local grid features, such power variability can be constructive or destructive to measures of uncertainty regarding renewable power generation; however, it directly influences contingency planning, production costs, and the overall reliable operation of power systems. For enhanced power system flexibility, and to help mitigate the negative impacts of power ramping, it is desirable to analyze events in a probabilistic fashion so that degrees of belief concerning system states and forecastability are better captured and uncertainty is explicitly quantified. A probabilistic swinging door algorithm is developed and presented in this paper. It is then applied to a solar data set of PV power generation. The probabilistic swinging door algorithm builds on results from the original swinging door algorithm, first used for data compression in trend logging, and it is described by two uncertain parameters: (i) e, the threshold sensitivity to a given ramp, and (ii) s, the residual of the piecewise linear ramps. These two parameters determine the distribution of ramps and capture the uncertainty in PV power generation.
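The deterministic swinging door algorithm on which the probabilistic version builds can be sketched as follows. The toy power trace is invented, and `epsilon` plays the role of the threshold parameter e above; the paper's contribution, not shown here, is to treat e and the residual s as uncertain rather than fixed:

```python
def swinging_door(values, epsilon):
    """Deterministic swinging door compression: segment a series into
    piecewise linear ramps such that every point lies within +/- epsilon
    of the line joining each segment's endpoints."""
    segments = []              # list of (start_index, end_index) ramps
    start = 0
    up_slope = float("inf")    # tightest upper "door" slope so far
    low_slope = float("-inf")  # tightest lower "door" slope so far
    for i in range(1, len(values)):
        dt = i - start
        up_slope = min(up_slope, (values[i] + epsilon - values[start]) / dt)
        low_slope = max(low_slope, (values[i] - epsilon - values[start]) / dt)
        if low_slope > up_slope:          # doors crossed: close the segment
            segments.append((start, i - 1))
            start = i - 1                 # pivot at the last covered point
            dt = i - start
            up_slope = (values[i] + epsilon - values[start]) / dt
            low_slope = (values[i] - epsilon - values[start]) / dt
    segments.append((start, len(values) - 1))
    return segments

# Toy PV power trace: flat, then an up-ramp, then flat again.
power = [1, 1, 1, 2, 4, 6, 8, 8, 8, 8]
print(swinging_door(power, epsilon=0.5))  # three ramp segments
```

Each returned segment is one candidate ramp event; sampling `epsilon` from a distribution instead of fixing it yields a distribution over segmentations, which is the probabilistic reading described in the abstract.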
Alzbutas, Robertas
2015-04-01
In general, Emergency Planning Zones (EPZ) are defined, and plant site and arrangement structures are designed, to minimize the potential for natural and manmade hazards external to the plant to affect the plant's safety-related functions, which can affect the nearby population and environment. This may include consideration of extreme winds, fires, flooding, aircraft crash, seismic activity, etc. Thus the design basis for the plant and site is deeply related to the effects of any postulated external events and to the limitation of the plant's capability to cope with accidents, i.e. to perform safety functions. It has been observed that Probabilistic Safety Assessment (PSA) methodologies dealing with EPZ and extreme external events have not reached the same level of maturity as those for severe internal events. As a prime example of an advanced reactor and a new Nuclear Power Plant (NPP) with enhanced safety, the International Reactor Innovative and Secure (IRIS) and the site selection for a new NPP in Lithuania are considered in this work. In the Safety-by-Design™ approach used, the PSA obviously played a key role; therefore a preliminary IRIS PSA was developed along with the design. For the design and pre-licensing process of IRIS, the external events analysis included both qualitative evaluation and quantitative assessment. As a result of the preliminary qualitative analyses, the external events chosen for more detailed quantitative scoping evaluation were high winds and tornadoes, aircraft crash, and seismic events. For the site selection in Lithuania, a detailed site evaluation process was performed and related to the EPZ and risk zoning considerations. In general, applying the quantitative assessment, bounding site characteristics could be used in order to
Mohamad, Mustafa A.; Cousins, Will; Sapsis, Themistoklis P.
2016-10-01
We consider the problem of the probabilistic quantification of dynamical systems that have heavy-tailed characteristics. These heavy-tailed features are associated with rare transient responses due to the occurrence of internal instabilities. Systems with these properties can be found in a variety of areas including mechanics, fluids, and waves. Here we develop a computational method, a probabilistic decomposition-synthesis technique, that takes into account the nature of internal instabilities to inexpensively determine the non-Gaussian probability density function for any arbitrary quantity of interest. Our approach relies on the decomposition of the statistics into a 'non-extreme core', typically Gaussian, and a heavy-tailed component. This decomposition is in full correspondence with a partition of the phase space into a 'stable' region where we have no internal instabilities, and a region where non-linear instabilities lead to rare transitions with high probability. We quantify the statistics in the stable region using a Gaussian approximation approach, while the non-Gaussian distribution associated with the intermittently unstable regions of phase space is inexpensively computed through order-reduction methods that take into account the strongly nonlinear character of the dynamics. The probabilistic information in the two domains is analytically synthesized through a total probability argument. The proposed approach allows for the accurate quantification of non-Gaussian tails at more than 10 standard deviations, at a fraction of the cost associated with direct Monte-Carlo simulations. We demonstrate the probabilistic decomposition-synthesis method for rare events for two dynamical systems exhibiting extreme events: a two-degree-of-freedom system of nonlinearly coupled oscillators, and a nonlinear envelope equation characterizing the propagation of unidirectional water waves.
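The total-probability synthesis step can be sketched numerically: the full response pdf is a mixture of the conditional pdf in the stable regime and the conditional pdf in the rare-transition regime, weighted by the probability of occupying each region of phase space. The Gaussian core, the exponential-tail form, and the weights below are illustrative assumptions, not the paper's fitted components.

```python
import numpy as np

def synthesized_pdf(q, p_unstable=0.05, mu=0.0, sigma=1.0, tail_scale=3.0):
    """Total-probability synthesis: p(q) = P(stable) p(q|stable) + P(unstable) p(q|unstable).

    p(q|stable) is a Gaussian core; p(q|unstable) is a heavy (Laplace-like)
    tail component. All parameter values here are illustrative.
    """
    p_stable = 1.0 - p_unstable
    core = np.exp(-0.5 * ((q - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))
    tail = np.exp(-np.abs(q - mu) / tail_scale) / (2.0 * tail_scale)
    return p_stable * core + p_unstable * tail

q = np.linspace(-40.0, 40.0, 200001)
pdf = synthesized_pdf(q)
mass = np.sum(pdf) * (q[1] - q[0])  # should integrate to ~1
```

The mixture's tail at, say, 10 standard deviations is orders of magnitude heavier than the Gaussian core alone, which is the qualitative behaviour the decomposition is designed to capture.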
The use of incomplete global data for probabilistic event trees: challenges and strategies
Ogburn, Sarah; Harpel, Chris; Pesicek, Jeremy; Wellik, Jay; Wright, Heather; Pallister, John
2016-04-01
To prevent volcanic crises from becoming disasters, the USGS-USAID Volcano Disaster Assistance Program (VDAP) helps foreign counterparts to assess volcanic unrest, activity, and hazards before and during crises. Bayesian event trees are frequently used to facilitate discussion, reach consensus, evaluate uncertainty, and produce probabilistic forecasts of volcanic activity. VDAP uses a "method of multiple data sets" (Newhall & Pallister 2014), which combines conceptual and physical models of volcanic processes, current monitoring data, patterns of prior occurrence, and expert judgement from multiple disciplines to assign probabilities for each node of an event tree. The global volcanic record is used to inform our conceptual models, improve uncertainty estimates by leveraging larger datasets, and fill in gaps where local information is sparse. For example, event trees for the recent Sinabung, Indonesia, eruption relied not only upon local monitoring data streams, but also on the global frequency-magnitude (VEI) distribution of eruptions. A variety of databases are used, including the Smithsonian Institution's Global Volcanism Program (GVP) database, WOVOdat, GeoDIVA, DomeHaz, and FlowDat. Inhomogeneity and incompleteness of the global record present challenges for the use of such data in event trees, resulting in large and difficult-to-quantify uncertainties. Under-recording of small events, lack of documentation of 'failed eruptions', and variability of geophysical monitoring data streams present particular problems. This contribution seeks to: (1) review VDAP's use of global data for probabilistic event tree creation; (2) summarize the problems presented by under-recording, spatial and temporal inhomogeneity, and incompleteness of the global record; (3) highlight ways to compensate for these effects, such as the development of hierarchical models to borrow strength from the global record while retaining local information, and the use of ranges in expert judgements to assess
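The arithmetic behind an event-tree forecast is simple: the probability of an outcome branch is the product of the conditional probabilities assigned at each node along it. A minimal sketch, with entirely hypothetical node names and probabilities (not VDAP's values):

```python
from math import prod

# Hypothetical conditional probabilities at successive event-tree nodes
# (unrest is magmatic -> eruption occurs -> eruption reaches VEI >= 3).
nodes = {
    "magmatic_given_unrest": 0.6,
    "eruption_given_magmatic": 0.4,
    "vei3plus_given_eruption": 0.2,
}

def branch_probability(conditionals):
    """P(outcome) = product of the conditional probabilities along the branch."""
    return prod(conditionals)

p_large_eruption = branch_probability(nodes.values())  # 0.6 * 0.4 * 0.2
```

In practice each node probability would itself carry an uncertainty range elicited from experts and informed by the global record, which is where the incompleteness problems discussed above enter.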
External inverse-Compton emission from jetted tidal disruption events
Lu, Wenbin
2016-01-01
The recent discoveries of Swift J1644+57 and J2058+05 show that tidal disruption events (TDEs) can launch relativistic jets. Super-Eddington accretion produces a strong radiation field of order Eddington luminosity. In a jetted TDE, electrons in the jet will inverse-Compton scatter the external radiation field from the accretion disk and wind. Motivated by observations of thermal optical-UV spectra in Swift J2058+05 and several other TDEs, we assume the spectrum of the external radiation field intercepted by the relativistic jet to be blackbody. Hot electrons in the jet scatter this thermal radiation and produce luminosities 10^45-10^48 erg/s in the X/gamma-ray band. This model of thermal plus inverse-Compton radiation is applied to Swift J2058+05. First, we show that the blackbody component in the optical-UV spectrum most likely has its origin in the super-Eddington wind from the disk. Then, using the observed blackbody component as the external radiation field, we show that the X-ray luminosity and spectrum...
Willemen, T; Varon, C; Dorado, A Caicedo; Haex, B; Vander Sloten, J; Van Huffel, S
2015-10-01
Current clinical standards to assess sleep and its disorders lack either accuracy or user-friendliness. They are therefore difficult to use in cost-effective population-wide screening or long-term objective follow-up after diagnosis. In order to fill this gap, the use of cardiac and respiratory information was evaluated for discrimination between different sleep stages, and for detection of apneic breathing. Alternative probabilistic visual representations were also presented, referred to as the hypnocorrogram and apneacorrogram. Analysis was performed on the UCD sleep apnea database, available on Physionet. The presence of apneic events proved to have a significant impact on the performance of a cardiac and respiratory based algorithm for sleep stage classification. WAKE versus SLEEP discrimination resulted in a kappa value of κ = 0.0439, while REM versus NREM resulted in κ = 0.298 and light sleep (N1N2) versus deep sleep (N3) in κ = 0.339. The high proportion of hypopneic events led to poor detection of apneic breathing, resulting in a kappa value of κ = 0.272. While the probabilistic representations allow classifier output to be put in perspective, further improvements would be necessary to make the classifier reliable for use on patients with sleep apnea.
Development of transient initiating event frequencies for use in probabilistic risk assessments
Energy Technology Data Exchange (ETDEWEB)
Mackowiak, D.P.; Gentillon, C.D.; Smith, K.L.
1985-05-01
Transient initiating event frequencies are an essential input to the analysis process of a nuclear power plant probabilistic risk assessment. These frequencies describe events causing or requiring scrams. This report documents an effort to validate and update from other sources a computer-based data file developed by the Electric Power Research Institute (EPRI) describing such events at 52 United States commercial nuclear power plants. Operating information from the United States Nuclear Regulatory Commission on 24 additional plants from their date of commercial operation has been combined with the EPRI data, and the entire data base has been updated to add 1980 through 1983 events for all 76 plants. The validity of the EPRI data and data analysis methodology and the adequacy of the EPRI transient categories are examined. New transient initiating event frequencies are derived from the expanded data base using the EPRI transient categories and data display methods. Upper bounds for these frequencies are also provided. Additional analyses explore changes in the dominant transients, changes in transient outage times and their impact on plant operation, and the effects of power level and scheduled scrams on transient event frequencies. A more rigorous data analysis methodology is developed to encourage further refinement of the transient initiating event frequencies derived herein. Updating the transient event data base resulted in approximately 2400 events being added to EPRI's approximately 3000-event data file. The resulting frequency estimates were in most cases lower than those reported by EPRI, but no significant order-of-magnitude changes were noted. The average number of transients per year for the combined data base is 8.5 for pressurized water reactors and 7.4 for boiling water reactors.
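The basic frequency estimate described here is events divided by accumulated reactor-years of operating experience, with an upper bound reported alongside the point estimate. The sketch below uses a generic normal-approximation upper bound for a Poisson rate, not the report's own bounding method, and the event counts are illustrative.

```python
import math

def transient_frequency(n_events, reactor_years):
    """Point estimate of an initiating-event frequency: events per reactor-year."""
    return n_events / reactor_years

def upper_bound_95(n_events, reactor_years):
    """Rough one-sided 95% upper bound for a Poisson rate (normal approximation).

    A generic approximation for illustration, not the report's bound.
    """
    return (n_events + 1.645 * math.sqrt(n_events)) / reactor_years

# Illustrative numbers: 425 transients over 50 reactor-years -> 8.5 per year
freq = transient_frequency(425, 50.0)
bound = upper_bound_95(425, 50.0)
```

With large event counts, as in the combined ~5400-event data base here, the upper bound sits close to the point estimate; for sparse categories the gap widens, which is why bounds are worth reporting separately.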
Mohamad, Mustafa A
2015-01-01
In this work, we consider systems that are subjected to intermittent instabilities due to external stochastic excitation. These intermittent instabilities, though rare, have a large impact on the probabilistic response of the system and give rise to heavy-tailed probability distributions. By making appropriate assumptions on the form of these instabilities, which are valid for a broad range of systems, we formulate a method for the analytical approximation of the probability distribution function (pdf) of the system response (both the main probability mass and the heavy-tail structure). In particular, this method relies on conditioning the probability density of the response on the occurrence of an instability and the separate analysis of the two states of the system, the unstable and stable state. In the stable regime we employ steady state assumptions, which lead to the derivation of the conditional response pdf using standard methods for random dynamical systems. The unstable regime is inherently transient...
External inverse-Compton emission from jetted tidal disruption events
Lu, Wenbin; Kumar, Pawan
2016-05-01
The recent discoveries of Sw J1644+57 and Sw J2058+05 show that tidal disruption events (TDEs) can launch relativistic jets. Super-Eddington accretion produces a strong radiation field of order Eddington luminosity. In a jetted TDE, electrons in the jet will inverse-Compton scatter the photons from the accretion disc and wind (external radiation field). Motivated by observations of thermal optical-UV spectra in Sw J2058+05 and several other TDEs, we assume the spectrum of the external radiation field intercepted by the relativistic jet to be blackbody. Hot electrons in the jet scatter this thermal radiation and produce luminosities 10^45-10^48 erg s^-1 in the X/γ-ray band. This model of thermal plus inverse-Compton radiation is applied to Sw J2058+05. First, we show that the blackbody component in the optical-UV spectrum most likely has its origin in the super-Eddington wind from the disc. Then, using the observed blackbody component as the external radiation field, we show that the X-ray luminosity and spectrum are consistent with the inverse-Compton emission, under the following conditions: (1) the jet Lorentz factor is Γ ≃ 5-10; (2) electrons in the jet have a power-law distribution dN_e/dγ_e ∝ γ_e^{-p} with γmin ˜ 1 and p = 2.4; (3) the wind is mildly relativistic (Lorentz factor ≳ 1.5) and has isotropic-equivalent mass-loss rate ˜ 5 M⊙ yr^-1. We describe the implications for jet composition and the radius where jet energy is converted to radiation.
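The scaling at the heart of an external inverse-Compton model is the textbook single-electron IC power, P = (4/3) σ_T c γ_e² β² u, applied to an external radiation field whose comoving energy density is boosted by roughly Γ² for a relativistic jet. The sketch below evaluates that generic scaling in cgs units with illustrative parameters; it is not the authors' full radiative model.

```python
SIGMA_T = 6.652e-25  # Thomson cross-section [cm^2]
C = 2.998e10         # speed of light [cm/s]

def ic_power_per_electron(gamma_e, u_rad):
    """Single-electron inverse-Compton power in the Thomson regime:
    P = (4/3) sigma_T c gamma_e^2 beta^2 u."""
    beta2 = 1.0 - 1.0 / gamma_e**2
    return (4.0 / 3.0) * SIGMA_T * C * gamma_e**2 * beta2 * u_rad

def comoving_external_density(u_lab, lorentz_gamma):
    """External field energy density in the jet frame, u' ~ Gamma^2 u
    (order-of-magnitude boost for an isotropic external field)."""
    return lorentz_gamma**2 * u_lab

# Illustrative numbers only: u ~ 1 erg/cm^3 thermal field, jet Gamma ~ 7
u_prime = comoving_external_density(1.0, 7.0)
p_ic = ic_power_per_electron(2.0, u_prime)  # power for a gamma_e ~ 2 electron
```

Multiplying by the total number of radiating electrons and the beaming factor is what turns this per-electron power into the 10^45-10^48 erg s^-1 luminosities quoted in the abstract.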
Energy Technology Data Exchange (ETDEWEB)
Budnitz, R.J.; Lambert, H.E. (Future Resources Associates, Inc., Berkeley, CA (USA))
1990-01-01
The discipline of probabilistic risk analysis (PRA) has become so mature in recent years that it is now being used routinely to assist decision-making throughout the nuclear industry. This includes decision-making that affects design, construction, operation, maintenance, and regulation. Unfortunately, not all sub-areas within the larger discipline of PRA are equally "mature," and therefore the many different types of engineering insights from PRA are not all equally reliable. 93 refs., 4 figs., 1 tab.
Scheingraber, Christoph; Käser, Martin; Allmann, Alexander
2017-04-01
Probabilistic seismic risk analysis (PSRA) is a well-established method for modelling loss from earthquake events. In the insurance industry, it is widely employed for probabilistic modelling of loss to a distributed portfolio. In this context, precise exposure locations are often unknown, which results in considerable loss uncertainty. The treatment of exposure uncertainty has already been identified as an area where PSRA would benefit from increased research attention. However, so far, epistemic location uncertainty has not been the focus of much research. We propose a new framework for efficient treatment of location uncertainty. To demonstrate the usefulness of this novel method, a large number of synthetic portfolios resembling real-world portfolios are systematically analyzed. We investigate the effect of portfolio characteristics such as value distribution, portfolio size, or proportion of risk items with unknown coordinates on loss variability. Several sampling criteria to increase the computational efficiency of the framework are proposed and put into the wider context of well-established Monte-Carlo variance reduction techniques. The performance of each of the proposed criteria is analyzed.
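One common way to treat unknown coordinates is to sample a candidate location for each risk item from its admissible region (for example, uniformly over a postal-code area) on every Monte Carlo realization, and read the location-induced spread off the resulting portfolio-loss distribution. The rectangular regions and toy loss function below are stand-in assumptions, not the authors' framework.

```python
import random

random.seed(0)

def sample_location(region):
    """Draw a candidate coordinate uniformly from a rectangular admissible region."""
    (xmin, xmax), (ymin, ymax) = region
    return random.uniform(xmin, xmax), random.uniform(ymin, ymax)

def toy_loss(x, y):
    """Hypothetical loss ratio rising toward the origin (a stand-in hazard footprint)."""
    return 1.0 / (1.0 + x * x + y * y)

def portfolio_losses(items, n_sims=2000):
    """items: list of (value, admissible region). One simulated portfolio loss per run."""
    losses = []
    for _ in range(n_sims):
        total = 0.0
        for value, region in items:
            x, y = sample_location(region)
            total += value * toy_loss(x, y)
        losses.append(total)
    return losses

# Two hypothetical risk items with uncertain locations
items = [(100.0, ((0, 5), (0, 5))), (50.0, ((1, 2), (1, 2)))]
losses = portfolio_losses(items)
mean_loss = sum(losses) / len(losses)
```

The variance-reduction criteria mentioned in the abstract would replace the naive uniform draws above with smarter sampling of the same admissible regions.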
Development of Simplified Probabilistic Risk Assessment Model for Seismic Initiating Event
Energy Technology Data Exchange (ETDEWEB)
S. Khericha; R. Buell; S. Sancaktar; M. Gonzalez; F. Ferrante
2012-06-01
This paper discusses a simplified method to evaluate seismic risk using a methodology built on dividing the seismic intensity spectrum into multiple discrete bins. The seismic probabilistic risk assessment model uses the Nuclear Regulatory Commission's (NRC's) full power Standardized Plant Analysis Risk (SPAR) model as the starting point for development. The seismic PRA models are integrated with their respective internal events at-power SPAR model. This is accomplished by combining the modified system fault trees from the full power SPAR model with seismic event tree logic. The peak ground acceleration is divided into five bins. The g-value for each bin is estimated using the geometric mean of the lower and upper values of that particular bin, and the associated frequency for each bin is estimated by taking the difference between the upper and lower values of that bin. Component fragilities are calculated for each bin using plant data, if available, or generic values of median peak ground acceleration and uncertainty values for the components. For human reliability analysis (HRA), the SPAR HRA (SPAR-H) method is used, which requires the analysts to complete relatively straightforward worksheets that include the performance shaping factors (PSFs). The results are then used to estimate human error probabilities (HEPs) of interest. This work is expected to improve the NRC's ability to include seismic hazards in risk assessments for operational events in support of the reactor oversight program (e.g., significance determination process).
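The binning arithmetic described above is easy to reproduce: each PGA bin gets a representative g-value equal to the geometric mean of its bounds, and its annual frequency is the difference of the hazard curve's exceedance frequencies at those bounds; component fragility at that g-value then follows a lognormal curve with median Am and log-standard-deviation β. The bin bounds, hazard frequencies, and fragility parameters below are illustrative values, not plant data.

```python
import math

def bin_g_value(a_lo, a_hi):
    """Representative PGA for a bin: geometric mean of its bounds [g]."""
    return math.sqrt(a_lo * a_hi)

def bin_frequency(f_exceed_lo, f_exceed_hi):
    """Annual frequency of the bin: difference of exceedance frequencies at its bounds."""
    return f_exceed_lo - f_exceed_hi

def lognormal_fragility(a, a_median, beta):
    """P(failure | PGA = a) for a lognormal fragility with median a_median, log-std beta."""
    z = math.log(a / a_median) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Illustrative bin: 0.3-0.5 g, with hazard-curve exceedance frequencies at its bounds
g = bin_g_value(0.3, 0.5)                  # ~0.387 g
freq = bin_frequency(1.0e-4, 4.0e-5)       # annual frequency of shaking in this bin
p_fail = lognormal_fragility(g, 0.6, 0.4)  # component failure probability at the bin g-value
```

Summing freq × conditional core damage probability over all five bins is what links this seismic logic back into the SPAR event trees.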
Energy Technology Data Exchange (ETDEWEB)
Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Boardman, J. [AEA Technology (United Kingdom); Jones, J.A. [National Radiological Protection Board (United Kingdom); Harper, F.T.; Young, M.L. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)
1997-12-01
The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences from the accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA deposited material and external dose models.
Seamless Level 2/Level 3 probabilistic risk assessment using dynamic event tree analysis
Osborn, Douglas Matthew
The current approach to Level 2 and Level 3 probabilistic risk assessment (PRA) using the conventional event-tree/fault-tree methodology requires pre-specification of event order occurrence, which may vary significantly in the presence of uncertainties. Manual preparation of input data to evaluate the possible scenarios arising from these uncertainties may also lead to errors from faulty or incomplete input preparation, and their execution using serial runs may lead to computational challenges. A methodology has been developed for Level 2 analysis using dynamic event trees (DETs) that removes these limitations with systematic and mechanized quantification of the impact of aleatory uncertainties on possible consequences and their likelihoods. The methodology is implemented using the Analysis of Dynamic Accident Progression Trees (ADAPT) software. For the purposes of this work, aleatory uncertainties are defined as those arising from the stochastic nature of the processes under consideration, such as the variability of weather, in which the probability of weather patterns is predictable but the conditions at the time of the accident are a matter of chance. Epistemic uncertainties are regarded as those arising from the uncertainty in the model (system code) input parameters (e.g., friction or heat transfer correlation parameters). This work conducts a seamless Level 2/3 PRA using a DET analysis. The research helps to quantify and potentially reduce the magnitude of the source term uncertainty currently experienced in Level 3 PRA. Current techniques have been demonstrated with aleatory uncertainties for environmental releases of radioactive materials. This research incorporates epistemic and aleatory uncertainties in a phenomenologically consistent manner through use of DETs. The DETs were determined using the ADAPT framework and linking ADAPT with MELCOR, MELMACCS, and the MELCOR Accident Consequence Code System, Version 2. Aleatory and epistemic uncertainties incorporated
Statistical and Probabilistic Extensions to Ground Operations' Discrete Event Simulation Modeling
Trocine, Linda; Cummings, Nicholas H.; Bazzana, Ashley M.; Rychlik, Nathan; LeCroy, Kenneth L.; Cates, Grant R.
2010-01-01
NASA's human exploration initiatives will invest in technologies, public/private partnerships, and infrastructure, paving the way for the expansion of human civilization into the solar system and beyond. As it has been for the past half century, the Kennedy Space Center will be the embarkation point for humankind's journey into the cosmos. Functioning as a next generation space launch complex, Kennedy's launch pads, integration facilities, processing areas, and launch and recovery ranges will bustle with the activities of the world's space transportation providers. In developing this complex, KSC teams work through the potential operational scenarios: conducting trade studies, planning and budgeting for expensive and limited resources, and simulating alternative operational schemes. Numerous tools, among them discrete event simulation (DES), were matured during the Constellation Program to conduct such analyses with the purpose of optimizing the launch complex for maximum efficiency, safety, and flexibility while minimizing life cycle costs. Discrete event simulation is a computer-based modeling technique for complex and dynamic systems where the state of the system changes at discrete points in time and whose inputs may include random variables. DES is used to assess timelines and throughput, and to support operability studies and contingency analyses. It is applicable to any space launch campaign and informs decision-makers of the effects of varying numbers of expensive resources and the impact of off-nominal scenarios on measures of performance. In order to develop representative DES models, methods were adopted, exploited, or created to extend traditional uses of DES. The Delphi method was adopted and utilized for task duration estimation. DES software was exploited for probabilistic event variation. A roll-up process was developed and used to reuse models and model elements in other, less detailed models. The DES team continues to innovate and expand
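Delphi-estimated task durations enter the simulation as random variables; a common encoding is a triangular distribution over each task's (optimistic, most-likely, pessimistic) estimate. A minimal sketch of Monte Carlo over a serial processing flow with probabilistic durations follows; the task names and estimates are made up, not KSC data.

```python
import random

random.seed(42)

# (optimistic, most likely, pessimistic) duration estimates in days -- hypothetical
TASKS = {
    "offload_and_transport": (1.0, 2.0, 4.0),
    "vehicle_integration":   (5.0, 8.0, 14.0),
    "pad_operations":        (2.0, 3.0, 6.0),
}

def simulate_campaign(tasks, n_runs=5000):
    """Monte Carlo over a serial task sequence; returns total-duration samples."""
    totals = []
    for _ in range(n_runs):
        # random.triangular(low, high, mode) draws one probabilistic duration per task
        total = sum(random.triangular(lo, hi, mode) for lo, mode, hi in tasks.values())
        totals.append(total)
    return totals

totals = simulate_campaign(TASKS)
mean_days = sum(totals) / len(totals)
```

A full DES would add resource contention and parallel flows on top of this, but the duration-sampling core is the same.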
Probabilistic mapping of urban flood risk: Application to extreme events in Surat, India
Ramirez, Jorge; Rajasekar, Umamaheshwaran; Coulthard, Tom; Keiler, Margreth
2016-04-01
Surat, India is a coastal city that lies on the banks of the river Tapti and is located downstream from the Ukai dam. Given Surat's geographic location, the population of five million people is repeatedly exposed to flooding caused by high tide combined with large emergency dam releases into the Tapti river. In 2006 such a flood event occurred when intense rainfall in the Tapti catchment caused a dam release near 25,000 m^3 s^-1 and flooded 90% of the city. A first step towards strengthening resilience in Surat requires a robust method for mapping potential flood risk that considers the uncertainty in future dam releases. In this study we develop many combinations of dam release magnitude and duration for the Ukai dam. We then use these dam releases to drive a two-dimensional flood model (CAESAR-Lisflood) of Surat that also considers tidal effects. Our flood model of Surat utilizes fine spatial resolution (30 m) topography produced from an extensive differential global positioning system survey and measurements of river cross-sections. Within the city we have modelled scenarios that include extreme conditions with near-maximum dam release levels (e.g. 1:250 year flood) and high tides. Results from all scenarios have been summarized into probabilistic flood risk maps for Surat. These maps are currently being integrated within the city disaster management plan for taking both mitigation and adaptation measures for different scenarios of flooding.
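Summarizing many scenario runs into a probabilistic flood map reduces, per grid cell, to the fraction of scenarios (optionally weighted by scenario likelihood) in which modelled depth exceeds a hazard threshold. The tiny depth grids and weights below are synthetic stand-ins for CAESAR-Lisflood output, used only to show the mechanics.

```python
import numpy as np

def exceedance_probability_map(depth_grids, weights, threshold=0.5):
    """Per-cell probability that flood depth exceeds `threshold` (metres).

    depth_grids: (n_scenarios, ny, nx) array of modelled depths
    weights:     scenario likelihoods summing to 1
    """
    weights = np.asarray(weights)[:, None, None]
    return np.sum(weights * (np.asarray(depth_grids) > threshold), axis=0)

# Three synthetic 2x2 depth grids standing in for flood-model scenarios
grids = np.array([
    [[0.0, 0.8], [1.2, 0.1]],
    [[0.0, 0.6], [0.4, 0.0]],
    [[0.2, 1.5], [2.0, 0.9]],
])
p_map = exceedance_probability_map(grids, weights=[0.5, 0.3, 0.2])
```

A cell flooded in every scenario gets probability 1; one never flooded gets 0; the interesting planning information is in the cells between.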
Keen, A. S.; Lynett, P. J.; Ayca, A.
2016-12-01
Because of the damage resulting from the 2010 Chile and 2011 Japanese tele-tsunamis, the tsunami risk to the small craft marinas in California has become an important concern. The talk will outline an assessment tool which can be used to assess the tsunami hazard to small craft harbors. The methodology is based on the demand and structural capacity of the floating dock system, composed of floating docks/fingers and moored vessels. The structural demand is determined using a Monte Carlo methodology. Monte Carlo methodology is a probabilistic computational tool where the governing equations might be well known, but the independent variables of the input (demand) as well as the resisting structural components (capacity) may not be completely known. The Monte Carlo approach draws each variable from its distribution to generate a single computation; the process then repeats hundreds or thousands of times. The numerical model "Method of Splitting Tsunamis" (MOST) has been used to determine the inputs for the small craft harbors within California. Hydrodynamic model results of current speed, direction, and surface elevation were incorporated via the drag equations to provide the basis of the demand term. To determine the capacities, an inspection program was developed to identify common features of structural components. A total of six harbors have been inspected, ranging from Crescent City in Northern California to Oceanside Harbor in Southern California. Results from the inspection program were used to develop component capacity tables which incorporated the basic specifications of each component (e.g. bolt size and configuration) and a reduction factor (which accounts for the component's reduction in capacity with age) to estimate in situ capacities. Like the demand term, these capacities are added probabilistically into the model. To date the model has been applied to Santa Cruz Harbor as well as Noyo River. Once
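The core of such a Monte Carlo assessment is a repeated demand-versus-capacity comparison: draw a hydrodynamic load and an in-situ component capacity from their distributions and count the fraction of draws in which demand exceeds capacity. The distribution shapes, parameters, and age-reduction factor below are illustrative assumptions, not the inspection-program values.

```python
import random

random.seed(1)

def failure_probability(n_trials=20000):
    """Fraction of Monte Carlo trials in which drag demand exceeds component capacity.

    Demand: lognormal load standing in for current-drag forces (hypothetical parameters).
    Capacity: normal, with a 0.8 age-reduction factor applied (hypothetical).
    """
    failures = 0
    for _ in range(n_trials):
        demand = random.lognormvariate(0.0, 0.5)   # load, hypothetical units
        capacity = 0.8 * random.gauss(3.0, 0.6)    # aged in-situ capacity
        if demand > capacity:
            failures += 1
    return failures / n_trials

p_fail = failure_probability()
```

Running this per component (cleats, bolts, pile guides) and aggregating over the dock system yields the harbor-level fragility the tool is after.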
Energy Technology Data Exchange (ETDEWEB)
Chen, J.T.; Connell, E.; Chokshi, N. [NRC, Washington, DC (United States)] [and others]
1997-02-01
As a result of the U.S. Nuclear Regulatory Commission (USNRC) initiated Individual Plant Examination of External Events (IPEEE) program, every operating nuclear power reactor in the United States has performed an assessment of severe accidents due to external events. This paper provides a summary of the preliminary insights gained through the review of 24 IPEEE submittals.
Kindermans, Pieter-Jan; Verschore, Hannes; Schrauwen, Benjamin
2013-10-01
In recent years, in an attempt to maximize performance, machine learning approaches for event-related potential (ERP) spelling have become more and more complex. In this paper, we have taken a step back, as we wanted to improve performance without building an overly complex model that cannot be used by the community. Our research resulted in a unified probabilistic model for ERP spelling, which is based on only three assumptions and incorporates language information. On top of that, the probabilistic nature of our classifier yields a natural dynamic stopping strategy. Furthermore, our method uses the same parameters across 25 subjects from three different datasets. We show that our classifier, when enhanced with language models and dynamic stopping, improves the spelling speed and accuracy drastically. Additionally, we would like to point out that as our model is entirely probabilistic, it can easily be used as the foundation for complex systems in future work. All our experiments are executed on publicly available datasets to allow for future comparison with similar techniques.
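Dynamic stopping falls out naturally once the speller maintains a posterior over candidate letters: each flash updates the posterior with that flash's evidence likelihood, a language model supplies the prior, and stimulation halts as soon as the leading letter's posterior clears a confidence threshold. The alphabet, prior, and likelihoods below are made-up numbers, not the paper's model.

```python
def update_posterior(prior, likelihoods):
    """One Bayesian update: posterior is proportional to prior x likelihood, renormalized."""
    unnorm = {c: prior[c] * likelihoods[c] for c in prior}
    z = sum(unnorm.values())
    return {c: v / z for c, v in unnorm.items()}

def spell_with_dynamic_stopping(lm_prior, evidence_stream, threshold=0.9):
    """Stop presenting flashes as soon as one letter's posterior exceeds `threshold`."""
    posterior = dict(lm_prior)
    best, n_flashes = max(posterior, key=posterior.get), 0
    for n_flashes, likelihoods in enumerate(evidence_stream, start=1):
        posterior = update_posterior(posterior, likelihoods)
        best = max(posterior, key=posterior.get)
        if posterior[best] >= threshold:
            break
    return best, n_flashes, posterior

# Hypothetical 3-letter alphabet with a language-model prior favouring "a",
# and repeated flashes whose evidence also favours "a"
prior = {"a": 0.5, "b": 0.3, "c": 0.2}
evidence = [{"a": 0.8, "b": 0.3, "c": 0.3}] * 5
letter, flashes, post = spell_with_dynamic_stopping(prior, evidence)
```

The trade-off the paper reports is visible even here: stronger priors or cleaner evidence push the posterior past the threshold in fewer flashes, which is exactly what raises spelling speed.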
The Impact of External Events on the Emergence of Collective States of Economic Sentiment
Hohnisch, M; Pittnauer, S; Hohnisch, Martin; Stauffer, Dietrich; Pittnauer, Sabine
2006-01-01
We investigate the impact of the environment (i.e. the impact of socio-political and socio-economic exogenous events) on the emergence of ordered phases of locally interacting individual economic sentiment variables (consumer confidence, business confidence, etc.). The sentiment field is modeled as a (non-critical) Ising field with nearest-neighbor interactions on a (two-dimensional) square lattice. The environment is modeled as an external "field of events", randomly fluctuating over time, stochastically impacting the Ising field of individual variables. The external events can be frequent or rare, and have a lasting or a non-lasting impact. The field is not homogeneous, as individual actors might fail to perceive external events. We find that if events are sufficiently "strong" and/or perceived by a sufficiently large proportion of agents, collective states of pessimism/optimism cannot occur, even for strong inter-agent interactions.
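A minimal version of this sentiment field is a 2D Ising model in which each site also feels an external "event" field, perceived only by a fraction of agents. Below is a bare Metropolis sketch; the lattice size, temperature, field strength, and perception probability are illustrative choices, not the paper's parameters.

```python
import math
import random

random.seed(7)

L = 20          # lattice side
T = 2.0         # temperature, below the 2D critical value ~2.269 (ordered regime)
H = 0.5         # strength of the external "event" field, illustrative
PERCEIVE = 0.7  # fraction of agents that perceive the event, illustrative

spins = [[random.choice((-1, 1)) for _ in range(L)] for _ in range(L)]
# Inhomogeneous event field: zero at sites whose agents fail to perceive the event
field = [[H if random.random() < PERCEIVE else 0.0 for _ in range(L)] for _ in range(L)]

def metropolis_sweep(spins, field):
    """One Metropolis sweep: flip each spin with probability min(1, exp(-dE/T))."""
    for i in range(L):
        for j in range(L):
            s = spins[i][j]
            nn = (spins[(i + 1) % L][j] + spins[(i - 1) % L][j]
                  + spins[i][(j + 1) % L] + spins[i][(j - 1) % L])
            d_e = 2.0 * s * (nn + field[i][j])  # energy cost of flipping s
            if d_e <= 0 or random.random() < math.exp(-d_e / T):
                spins[i][j] = -s

for _ in range(200):
    metropolis_sweep(spins, field)
magnetization = sum(sum(row) for row in spins) / (L * L)
```

Making the field fluctuate in sign and in time, as the paper does, is what lets sufficiently strong or widely perceived events keep the field from settling into a collective optimistic or pessimistic phase.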
Mohamad, Mustafa A; Sapsis, Themistoklis P
2015-01-01
We consider the problem of probabilistic quantification of dynamical systems that have heavy-tailed characteristics. These heavy-tailed features are associated with rare transient responses due to the occurrence of internal instabilities. Here we develop a computational method, a probabilistic decomposition-synthesis technique, that takes into account the nature of internal instabilities to inexpensively determine the non-Gaussian probability density function for any arbitrary quantity of interest. Our approach relies on the decomposition of the statistics into a `non-extreme core', typically Gaussian, and a heavy-tailed component. This decomposition is in full correspondence with a partition of the phase space into a `stable' region where we have no internal instabilities, and a region where non-linear instabilities lead to rare transitions with high probability. We quantify the statistics in the stable region using a Gaussian approximation approach, while the non-Gaussian distributions associated with the i...
Sanchez, Yadira M.; Lambert, Sharon F.; Cooley-Strickland, Michele
2013-01-01
African American youth residing in low income urban neighborhoods are at increased risk of experiencing negative life events in multiple domains, increasing their risk for internalizing and externalizing behaviors. However, little is known about youth's differential responses to life event stress, or protective processes and coping strategies for…
Omira, Rachid; Baptista, Maria Ana; Matias, Luis
2015-04-01
This study constitutes the first assessment of probabilistic tsunami inundation in the NE Atlantic region, using an event-tree approach. It aims to develop a probabilistic tsunami inundation approach for the NE Atlantic coast with an application to two test sites of the ASTARTE project, Tangier, Morocco and Sines, Portugal. Only tsunamis of tectonic origin are considered here, taking into account near-, regional-, and far-field sources. The multidisciplinary approach proposed here consists of an event-tree method that gathers seismic hazard assessment, tsunami numerical modelling, and statistical methods. It also presents a treatment of uncertainties related to source location and tidal stage, in order to derive the likelihood of tsunami flood occurrence and exceedance of a specific near-shore wave height during a given return period. We derive high-resolution probabilistic maximum wave heights and flood distributions for both test sites, Tangier and Sines, considering 100-, 500-, and 1000-year return periods. We find that the probability that a maximum wave height exceeds 1 m somewhere along the Sines coasts reaches about 55% for a 100-year return period, and is up to 100% for a 1000-year return period. Along the Tangier coast, the probability of inundation occurrence (flow depth > 0 m) is up to 45% for a 100-year return period and reaches 96% in some near-shore coastal locations for a 500-year return period. Acknowledgements: This work is funded by project ASTARTE - Assessment, STrategy And Risk Reduction for Tsunamis in Europe. Grant 603839, 7th FP (ENV.2013.6.4-3).
Omira, R.; Matias, L.; Baptista, M. A.
2016-12-01
This study constitutes a preliminary assessment of probabilistic tsunami inundation in the NE Atlantic region. We developed an event-tree approach to calculate the likelihood of tsunami flood occurrence and exceedance of a specific near-shore wave height for a given exposure time. Only tsunamis of tectonic origin are considered here, taking into account local, regional, and far-field sources. The approach used here consists of an event-tree method that gathers probability models for seismic sources, tsunami numerical modeling, and statistical methods. It also includes a treatment of aleatoric uncertainties related to source location and tidal stage. Epistemic uncertainties are not addressed in this study. The methodology is applied to the coastal test-site of Sines located in the NE Atlantic coast of Portugal. We derive probabilistic high-resolution maximum wave amplitudes and flood distributions for the study test-site considering 100- and 500-year exposure times. We find that the probability that maximum wave amplitude exceeds 1 m somewhere along the Sines coasts reaches about 60 % for an exposure time of 100 years and is up to 97 % for an exposure time of 500 years. The probability of inundation occurrence (flow depth >0 m) varies between 10 % and 57 %, and from 20 % up to 95 % for 100- and 500-year exposure times, respectively. No validation has been performed here with historical tsunamis. This paper illustrates a methodology through a case study, which is not an operational assessment.
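The relation between exceedance probability and exposure time follows from the usual Poisson assumption: if the hazard level is exceeded at a mean annual rate λ, then P(at least one exceedance within T years) = 1 − exp(−λT). This generic relation (not the paper's full event-tree computation) is easy to check:

```python
import math

def exceedance_probability(annual_rate, exposure_years):
    """Poisson probability of at least one exceedance within the exposure time."""
    return 1.0 - math.exp(-annual_rate * exposure_years)

# A hazard level with a 100-year return period (rate 1/100 per year)
p_100 = exceedance_probability(1.0 / 100.0, 100.0)  # ~0.63 over a 100-year exposure
p_500 = exceedance_probability(1.0 / 100.0, 500.0)  # ~0.99 over a 500-year exposure
```

This is why the reported probabilities climb steeply with exposure time: even a modest annual rate compounds toward certainty over centuries.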
Lawrence, C.; Lin, L.; Lisiecki, L. E.; Khider, D.
2014-12-01
The broad goal of this presentation is to demonstrate the utility of probabilistic generative models for capturing investigators' knowledge of geological processes and proxy data in order to draw statistical inferences about unobserved paleoclimatological events. We illustrate how this approach forces investigators to be explicit about their assumptions, and how probability theory yields results that are a mathematical consequence of these assumptions and the data. We illustrate these ideas with the HMM-Match model, which infers common times of sediment deposition in two records and the uncertainty in these inferences in the form of confidence bands. HMM-Match models the sedimentation processes that led to proxy data measured in marine sediment cores. This Bayesian model has three components: 1) a generative probabilistic model that proceeds from the underlying geophysical and geochemical events, specifically the sedimentation events, to the generation of the proxy data (Sedimentation ---> Proxy Data); 2) a recursive algorithm that reverses the logic of the model to yield inference about the unobserved sedimentation events and the associated alignment of the records based on proxy data (Proxy Data ---> Sedimentation (Alignment)); 3) an expectation-maximization algorithm for estimating two unknown parameters. We applied HMM-Match to align 35 Late Pleistocene records to a global benthic d18O stack and found that the mean width of the 95% confidence intervals varies between 3 and 23 kyr depending on the resolution and noisiness of the core's d18O signal. Confidence bands within individual cores also vary greatly, ranging from ~0 to >40 kyr. Results from this algorithm will allow researchers to examine the robustness of their conclusions with respect to alignment uncertainty. Figure 1 shows the confidence bands for one low-resolution record.
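The recursive inference step in such hidden Markov models rests on the forward recursion, which can be sketched in a few lines. This is a generic textbook forward algorithm, not the HMM-Match implementation, and the matrices below are illustrative:

```python
import numpy as np

def forward(obs, A, B, pi):
    """Forward recursion of a hidden Markov model: returns the likelihood
    P(observations) by summing alpha over states at the last time step."""
    alpha = pi * B[:, obs[0]]              # initialization
    for t in obs[1:]:
        alpha = (alpha @ A) * B[:, t]      # induction step
    return alpha.sum()

# Toy 2-state model (all numbers illustrative):
A  = np.array([[0.7, 0.3], [0.4, 0.6]])   # state transition matrix
B  = np.array([[0.9, 0.1], [0.2, 0.8]])   # emission matrix
pi = np.array([0.5, 0.5])                 # initial state distribution
likelihood = forward([0, 1, 0], A, B, pi)
```

Real alignment models scale this recursion with normalization at each step and pair it with a backward pass to obtain the posterior alignment uncertainty reported as confidence bands.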
Shastri, Hiteshri; Ghosh, Subimal; Karmakar, Subhankar
2017-02-01
Forecasting of extreme precipitation events at a regional scale is of high importance due to their severe impacts on society. The impacts are stronger in urban regions due to high flood potential as well as high population density, which leads to high vulnerability. Although global weather forecasting models have improved significantly, they are still not adequate at a regional scale (e.g., for an urban region), suffering from high false-alarm rates and low detection. There is a need to improve weather forecast skill at a local scale with probabilistic outcomes. Here we develop a methodology based on quantile regression, where reliably simulated variables from the Global Forecast System are used as predictors and different quantiles of rainfall are generated corresponding to that set of predictors. We apply this method to a flood-prone coastal city of India, Mumbai, which has experienced severe floods in recent years. We find significant improvements in the forecast, with high detection and skill scores. We apply the methodology to 10 ensemble members of the Global Ensemble Forecast System and find a reduction in ensemble uncertainty of precipitation across realizations with respect to that of the original precipitation forecasts. We validate our model for the monsoon seasons of 2006 and 2007, which are independent of the training/calibration data set used in the study. We find promising results and emphasize the value of implementing such data-driven methods for better probabilistic forecasts at an urban scale, primarily for early flood warning.
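Quantile regression fits each rainfall quantile by minimizing the pinball (quantile) loss. A sketch of that loss on synthetic, rainfall-like data, checking that the empirical 0.9 quantile scores better than a shifted guess (the sample and the quantile level are illustrative, not the Mumbai data):

```python
import numpy as np

def pinball_loss(y_true, y_pred, q):
    """Pinball (quantile) loss: the asymmetric penalty that is minimized,
    in expectation, by the q-th conditional quantile."""
    err = y_true - y_pred
    return float(np.mean(np.maximum(q * err, (q - 1) * err)))

rng = np.random.default_rng(0)
y = rng.gamma(shape=2.0, scale=10.0, size=5000)  # skewed, rainfall-like sample
q90 = np.quantile(y, 0.9)

# The empirical 0.9 quantile should score no worse than an over-prediction:
loss_at_q90  = pinball_loss(y, q90, 0.9)
loss_shifted = pinball_loss(y, q90 + 10.0, 0.9)
```

In the actual method the constant prediction is replaced by a function of the large-scale predictors, but the asymmetric loss is what makes the fitted output a quantile rather than a mean.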
Energy Technology Data Exchange (ETDEWEB)
Pfister, A.; Goossen, C.; Coogler, K.; Gorgemans, J. [Westinghouse Electric Company LLC, 1000 Westinghouse Drive, Cranberry Township, PA 16066 (United States)
2012-07-01
Both the International Atomic Energy Agency (IAEA) and the U.S. Nuclear Regulatory Commission (NRC) require existing and new nuclear power plants to conduct plant assessments to demonstrate the unit's ability to withstand external hazards. The events that occurred at the Fukushima Dai-ichi nuclear power station demonstrated the importance of designing a nuclear power plant with the ability to protect the plant against extreme external hazards. The innovative design of the AP1000® nuclear power plant provides unparalleled protection against catastrophic external events which can lead to extensive infrastructure damage and place the plant in an extended abnormal situation. The AP1000 plant is an 1100-MWe pressurized water reactor with passive safety features and extensive plant simplifications that enhance construction, operation, maintenance and safety. The plant's compact safety-related footprint and the protection provided by its robust nuclear island structures prevent significant damage to systems, structures, and components required to safely shut down the plant and maintain core and spent fuel pool cooling and containment integrity following extreme external events. The AP1000 nuclear power plant has been extensively analyzed and reviewed to demonstrate that its nuclear island design and plant layout provide protection against both design basis and extreme beyond-design-basis external hazards such as extreme seismic events, external flooding that exceeds the maximum probable flood limit, and malicious aircraft impact. The AP1000 nuclear power plant uses fail-safe passive features to mitigate design basis accidents. The passive safety systems are designed to function without safety-grade support systems (such as AC power, component cooling water, service water, compressed air or HVAC). The plant has been designed to protect systems, structures, and components critical to placing the reactor in a safe shutdown condition within the steel
Kreimeyer, Kory; Menschik, David; Winiecki, Scott; Paul, Wendy; Barash, Faith; Woo, Emily Jane; Alimchandani, Meghna; Arya, Deepa; Zinderman, Craig; Forshee, Richard; Botsis, Taxiarchis
2017-07-01
Duplicate case reports in spontaneous adverse event reporting systems pose a challenge for medical reviewers to efficiently perform individual and aggregate safety analyses. Duplicate cases can bias data mining by generating spurious signals of disproportional reporting of product-adverse event pairs. We have developed a probabilistic record linkage algorithm for identifying duplicate cases in the US Vaccine Adverse Event Reporting System (VAERS) and the US Food and Drug Administration Adverse Event Reporting System (FAERS). In addition to using structured field data, the algorithm incorporates the non-structured narrative text of adverse event reports by examining clinical and temporal information extracted by the Event-based Text-mining of Health Electronic Records system, a natural language processing tool. The final component of the algorithm is a novel duplicate confidence value that is calculated by a rule-based empirical approach that looks for similarities in a number of criteria between two case reports. For VAERS, the algorithm identified 77% of known duplicate pairs with a precision (or positive predictive value) of 95%. For FAERS, it identified 13% of known duplicate pairs with a precision of 100%. The textual information did not improve the algorithm's automated classification for VAERS or FAERS. The empirical duplicate confidence value increased performance on both VAERS and FAERS, mainly by reducing the occurrence of false-positives. The algorithm was shown to be effective at identifying pre-linked duplicate VAERS reports. The narrative text was not shown to be a key component in the automated detection evaluation; however, it is essential for supporting the semi-automated approach that is likely to be deployed at the Food and Drug Administration, where medical reviewers will perform some manual review of the most highly ranked reports identified by the algorithm.
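Probabilistic record linkage of this kind commonly scores candidate pairs with Fellegi-Sunter style log-likelihood-ratio match weights. A minimal sketch of that scoring idea, not the VAERS/FAERS algorithm itself (the fields and m/u probabilities below are hypothetical):

```python
import math

def match_weight(fields_agree, m_probs, u_probs):
    """Fellegi-Sunter style score: sum log2(m/u) for agreeing fields and
    log2((1-m)/(1-u)) for disagreeing fields; high scores suggest duplicates."""
    score = 0.0
    for agree, m, u in zip(fields_agree, m_probs, u_probs):
        score += math.log2(m / u) if agree else math.log2((1 - m) / (1 - u))
    return score

# Illustrative m/u probabilities for three fields (age, sex, onset date):
m = [0.95, 0.99, 0.90]   # P(field agrees | true duplicate pair)
u = [0.05, 0.50, 0.01]   # P(field agrees | non-duplicate pair)
dup_like    = match_weight([True, True, True], m, u)
nondup_like = match_weight([False, True, False], m, u)
```

Pairs scoring above a chosen threshold would then be ranked for review, analogous to the confidence value the abstract describes layering on top of the probabilistic score.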
Robbins, J. C.
2016-10-01
Large and numerous landslides can result in widespread impacts, which are felt particularly strongly in the largely subsistence-orientated communities residing in the most landslide-prone areas of Papua New Guinea (PNG). Understanding the characteristics of rainfall preceding these landslide events is essential for the development of appropriate early warning systems and forecasting models. Relationships between rainfall and landslides are frequently complex, and uncertainties tend to be amplified by inconsistent and incomplete landslide catalogues and sparse rainfall data availability. To address some of these uncertainties, a modified Bayesian technique has been used, in conjunction with the multiple time frames method, to produce thresholds of landslide probability associated with rainfall events of specific magnitude and duration. Satellite-derived precipitation estimates have been used to derive representative rainfall accumulations and intensities over a range of different rainfall durations (5, 10, 15, 30, 45, 60, 75 and 90 days) for rainfall events which resulted in landslides and those which did not. Of the two parameter combinations (accumulation-duration and intensity-duration) analysed, rainfall accumulation and duration provide the best scope for identifying probabilistic thresholds for use in landslide warning and forecasting in PNG. Analysis of historical events and rainfall characteristics indicates that both high-accumulation (>250 mm), shorter-duration rainfall events and longer-duration (75 days), high-accumulation (>1200 mm) rainfall events are more likely to lead to moderate- to high-impact landslides. This analysis has produced the first proxy probability thresholds for landslides in PNG, and their application within an early warning framework is discussed.
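When estimated from relative frequencies, the Bayesian step behind such probability thresholds reduces to a simple counting calculation. A minimal sketch (all counts are illustrative, not values from the PNG catalogue):

```python
def landslide_probability(n_rain_with_slides, n_rain_events,
                          n_slide_days, n_days):
    """Bayes' rule with relative-frequency estimates:
    P(L | R) = P(R | L) * P(L) / P(R), each term estimated from counts
    for one rainfall accumulation-duration bin."""
    p_l = n_slide_days / n_days            # prior P(landslide day)
    p_r = n_rain_events / n_days           # P(rainfall event in this bin)
    p_r_given_l = n_rain_with_slides / n_slide_days
    return p_r_given_l * p_l / p_r

# Illustrative counts over a 10-year (3650-day) record for one bin:
p = landslide_probability(n_rain_with_slides=8, n_rain_events=40,
                          n_slide_days=20, n_days=3650)
```

With full counts available, the expression collapses to the conditional frequency of landslides among rainfall events in the bin (here 8/40 = 0.2); the Bayesian form matters when the terms are estimated or modified separately.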
Apical External Root Resorption and Repair in Orthodontic Tooth Movement: Biological Events
Directory of Open Access Journals (Sweden)
Liviu Feller
2016-01-01
Some degree of external root resorption is a frequent, unpredictable, and unavoidable consequence of orthodontic tooth movement, mediated by odontoclasts/cementoclasts originating from circulating precursor cells in the periodontal ligament. Its pathogenesis involves mechanical forces initiating complex interactions between signalling pathways activated by various biological agents. Resorption of cementum is regulated by mechanisms similar to those controlling osteoclastogenesis and bone resorption. Following root resorption there is repair by cellular cementum, but the factors mediating the transition from resorption to repair are not clear. In this paper we review some of the biological events associated with orthodontically induced external root resorption.
New Applications of Relational Event Algebra to Fuzzy Quantification and Probabilistic Reasoning
2002-04-30
R. Goodman et al., Information Sciences (2002) 87-96: "… boolean conditional events is obtained, compatible with the above improved consistency criterion." © 2002 Elsevier Science Inc. All rights reserved.
A Probabilistic Framework for Risk Analysis of Widespread Flood Events: A Proof-of-Concept Study.
Schneeberger, Klaus; Huttenlau, Matthias; Winter, Benjamin; Steinberger, Thomas; Achleitner, Stefan; Stötter, Johann
2017-07-27
This article presents a flood risk analysis model that considers the spatially heterogeneous nature of flood events. The basic concept of this approach is to generate a large sample of flood events that can be regarded as a temporal extrapolation of flood events. These are combined with cumulative flood impact indicators, such as building damages, to derive time series of damages for risk estimation. To this end, a multivariate modeling procedure able to take into account the spatial characteristics of flooding, the regionalization method top-kriging, and three different impact indicators are combined in a model chain. Eventually, the expected annual flood impact (e.g., expected annual damages) and the flood impact associated with a low probability of occurrence are determined for a study area. The risk model has the potential to augment the understanding of flood risk in a region and thereby contribute to enhanced risk management by, for example, risk analysts, policymakers, or insurance companies. The modeling framework was successfully applied in a proof-of-concept exercise in Vorarlberg (Austria). The results of the case study show that risk analysis has to be based on spatially heterogeneous flood events in order to estimate flood risk adequately. © 2017 Society for Risk Analysis.
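The final risk-estimation step, from a long synthetic series of annual damages to an expected annual damage and a low-probability impact, can be sketched as follows. The damage model below is an arbitrary heavy-tailed stand-in, not the study's impact indicators:

```python
import numpy as np

def risk_summary(annual_damages, return_period=100):
    """Expected annual damage (EAD) and the damage level exceeded on average
    once every `return_period` years, from a series of annual damage totals."""
    ead = float(np.mean(annual_damages))
    extreme = float(np.quantile(annual_damages, 1 - 1 / return_period))
    return ead, extreme

rng = np.random.default_rng(42)
# Synthetic 10,000-year series: Poisson event counts per year,
# heavy-tailed (Pareto) damage per event, in currency units.
years = 10_000
damages = np.array([rng.pareto(2.5, rng.poisson(0.8)).sum() * 1e6
                    for _ in range(years)])
ead, d100 = risk_summary(damages, return_period=100)
```

Because the damage distribution is skewed, the 100-year damage is far above the expected annual damage, which is why both quantities are reported in such analyses.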
3D Simulation of External Flooding Events for the RISMC Pathway
Energy Technology Data Exchange (ETDEWEB)
Prescott, Steven [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Sampath, Ramprasad [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis [Idaho National Lab. (INL), Idaho Falls, ID (United States); Lin, Linyu [Idaho National Lab. (INL), Idaho Falls, ID (United States)
2015-09-01
Incorporating 3D simulations as part of the Risk-Informed Safety Margins Characterization (RISMC) Toolkit allows analysts to obtain a more complete picture of complex system behavior for events including external plant hazards. External events such as flooding have become more important recently; they can, however, be analyzed with existing, validated physics-simulation toolkits. In this report, we describe these approaches specific to flooding-based analysis using a method called Smoothed Particle Hydrodynamics. The theory, validation, and example applications of the 3D flooding simulation are described. Integrating these 3D simulation methods into computational risk analysis provides a spatial/visual aspect to the design, improves the realism of results, and provides visual understanding that supports validation of the flooding analysis.
Energy Technology Data Exchange (ETDEWEB)
Rubin, A.M.; Chen, J.T.; Chokshi, N. [Nuclear Regulatory Commission, Washington, DC (United States); Nowlen, S.P.; Bohn, M.P. [Sandia National Labs., Albuquerque, NM (United States); Sewell, R.; Kazarians, M.; Lambright, J. [Energy Research, Inc., Rockville, MD (United States)
1998-03-01
As a result of the US Nuclear Regulatory Commission (USNRC) initiated Individual Plant Examination of External Events (IPEEE) program, virtually every operating commercial nuclear power reactor in the US has performed an assessment of severe accident risk due to external events. To date, the USNRC staff has received 63 IPEEE submittals and will receive an additional 11 by mid-1998. Currently, 49 IPEEE submittals are under various stages of review. This paper is based on the information available for those 41 plants for which at least preliminary Technical Evaluation Reports have been prepared by the review teams. The goal of the review is to ascertain whether the licensee's IPEEE process is capable of identifying external-event-induced severe accident vulnerabilities and cost-effective safety improvements to either eliminate or reduce the impact of these vulnerabilities. The review does not, however, attempt to validate or verify the results of the licensee's IPEEE. The primary objective of this paper is to provide an update on the preliminary perspectives and insights gained from the IPEEE process.
Procedure for conducting probabilistic safety assessment: level 1 full power internal event analysis
Energy Technology Data Exchange (ETDEWEB)
Jung, Won Dae; Lee, Y. H.; Hwang, M. J. [and others
2003-07-01
This report provides guidance on conducting a Level I PSA for internal events in NPPs, based on the method and procedure used in the PSA for the design of the Korea Standard Nuclear Plants (KSNPs). A Level I PSA delineates the accident sequences leading to core damage and estimates their frequencies. It has been directly used for assessing and modifying system safety and reliability as a key and basic part of PSA. Level I PSA also provides insights into design weaknesses and into ways of preventing core damage, which in most cases is the precursor to major accidents. Level I PSA has therefore been used as the essential technical basis for risk-informed applications in NPPs. The report covers six major procedural steps for a Level I PSA: plant familiarization, initiating event analysis, event tree analysis, system fault tree analysis, reliability data analysis, and accident sequence quantification. The report is intended to assist technical persons performing Level I PSAs for NPPs. A particular aim is to promote a standardized framework, terminology, and form of documentation for PSAs. The report should also be useful for managers or regulatory staff involved in risk-informed regulation, and for conducting PSAs in other industries.
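The accident sequence quantification step typically combines an initiating-event frequency with minimal cut sets of basic-event probabilities under the rare-event approximation. A minimal sketch (the initiating-event frequency and basic-event probabilities are illustrative, not from any plant model):

```python
def cutset_frequency(initiating_freq, cutsets):
    """Rare-event approximation: sequence frequency = initiating-event
    frequency times the sum, over minimal cut sets, of the product of the
    basic-event probabilities in each cut set."""
    total = 0.0
    for cutset in cutsets:
        prod = 1.0
        for p in cutset:
            prod *= p        # basic events assumed independent
        total += prod
    return initiating_freq * total

# Illustrative: an initiating event at 0.1/yr and two minimal cut sets.
freq = cutset_frequency(0.1, [[1e-3, 2e-2],   # pump fails AND valve fails
                              [5e-4]])        # common-cause failure alone
```

Here the single-event cut set dominates (5e-4 versus 2e-5), illustrating why cut set screening focuses attention on the few dominant contributors.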
Gupta, Rahul; Audhkhasi, Kartik; Lee, Sungbok; Narayanan, Shrikanth
2017-01-01
Non-verbal communication involves encoding, transmission and decoding of non-lexical cues and is realized using vocal (e.g. prosody) or visual (e.g. gaze, body language) channels during conversation. These cues perform the function of maintaining conversational flow, expressing emotions, and marking personality and interpersonal attitude. In particular, non-verbal cues in speech such as paralanguage and non-verbal vocal events (e.g. laughter, sighs, cries) are used to nuance meaning and convey emotions, mood and attitude. For instance, laughter is associated with affective expressions while fillers (e.g. um, ah) are used to hold the floor during a conversation. In this paper we present an automatic non-verbal vocal event detection system focusing on the detection of laughter and fillers. We extend our system presented during the Interspeech 2013 Social Signals Sub-challenge (the winning entry in the challenge) for frame-wise event detection and test several schemes for incorporating local context during detection. Specifically, we incorporate context at two separate levels in our system: (i) the raw frame-wise features and (ii) the output decisions. Furthermore, our system processes the output probabilities based on a few heuristic rules in order to reduce erroneous frame-based predictions. Our overall system achieves an Area Under the Receiver Operating Characteristic curve of 95.3% for detecting laughter and 90.4% for fillers on the test set drawn from the data specifications of the Interspeech 2013 Social Signals Sub-challenge. We perform further analysis to understand the interrelation between the features and the obtained results. Specifically, we conduct a feature sensitivity analysis and correlate it with each feature's stand-alone performance. The observations suggest that the trained system is more sensitive to features carrying higher discriminability, with implications for better system design. PMID:28713197
Biass, Sebastien; Bonadonna, Costanza; di Traglia, Federico; Pistolesi, Marco; Rosi, Mauro; Lestuzzi, Pierino
2016-05-01
A first probabilistic scenario-based hazard assessment for tephra fallout is presented for La Fossa volcano (Vulcano Island, Italy) and subsequently used to assess the impact on the built environment. Eruption scenarios are based upon the stratigraphy produced by the last 1000 years of activity at Vulcano and include long-lasting Vulcanian and sub-Plinian eruptions. A new method is proposed to quantify the evolution through time of the hazard associated with pulsatory Vulcanian eruptions lasting from weeks to years, and the increase in hazard related to typical rainfall events around Sicily is also accounted for. The impact assessment on the roofs is performed by combining a field characterization of the buildings with the composite European vulnerability curves for typical roofing stocks. Results show that a sub-Plinian eruption of VEI 2 is not likely to affect buildings, whereas a sub-Plinian eruption of VEI 3 results in 90 % of the building stock having a ≥12 % probability of collapse. The hazard related to long-lasting Vulcanian eruptions evolves through time, and our analysis shows that the town of Il Piano, located downwind of the preferential wind patterns, is likely to reach critical tephra accumulations for roof collapse 5-9 months after the onset of the eruption. If no cleaning measures are taken, half of the building stock has a probability >20 % of suffering roof collapse.
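Roof-collapse probabilities of the kind reported above are commonly read off a vulnerability (fragility) curve, often modeled as a lognormal CDF of tephra load. A sketch under that assumption; the median capacity and dispersion below are illustrative, not the European curves used in the study:

```python
import math

def collapse_probability(load_kpa, median_kpa, beta):
    """Probability of roof collapse at a given tephra load, using a
    lognormal fragility curve with median capacity `median_kpa` and
    logarithmic standard deviation `beta`."""
    if load_kpa <= 0:
        return 0.0
    z = (math.log(load_kpa) - math.log(median_kpa)) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))  # standard normal CDF

# Illustrative weak-roof class: median collapse load 3 kPa, beta = 0.4.
p_at_median = collapse_probability(3.0, 3.0, 0.4)   # 0.5 by construction
p_low  = collapse_probability(1.0, 3.0, 0.4)
p_high = collapse_probability(6.0, 3.0, 0.4)
```

Combining such a curve with the probabilistic tephra accumulation at each building yields the collapse probabilities quoted for the building stock.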
OVERVIEW OF THE ACTIVITIES OF THE NUCLEAR ENERGY AGENCY WORKING GROUP ON EXTERNAL EVENTS
Energy Technology Data Exchange (ETDEWEB)
Nakoski, John A.; Smith, Curtis L.; Kim, Min Kyu
2016-10-01
The Organisation for Economic Co-operation and Development (OECD) Nuclear Energy Agency (NEA) has established a Working Group on External Events (WGEV) that provides a forum for subject matter experts from the nuclear industry and regulators to improve the understanding and treatment of external hazards, support the continued safety performance of nuclear installations, and improve the effectiveness of regulatory practices in NEA member countries. This report describes the ongoing work of the WGEV. That work includes the collection of information and the conduct of a workshop on severe weather and storm surge that brought together a diverse group of subject matter experts to identify commendable practices for treating severe weather and storm surge in regulatory and operational decision-making. Other WGEV work includes science-based screening of external events that are factored into decisions on the safe operation of nuclear facilities, and identification of commendable practices and knowledge gaps on riverine flooding.
iROCS: Integrated accident management framework for coping with beyond-design-basis external events
Energy Technology Data Exchange (ETDEWEB)
Kim, Jaewhan; Park, Soo-Yong; Ahn, Kwang-Il, E-mail: kiahn@kaeri.re.kr; Yang, Joon-Eon
2016-03-15
Highlights: • An integrated mitigating strategy to cope with extreme external events, iROCS, is proposed. • The strategy aims to preserve the integrity of the reactor vessel as well as core cooling. • A case study for an extreme damage state is performed to assess the effectiveness and feasibility of candidate mitigation strategies under an extreme event. - Abstract: The Fukushima Daiichi accident induced by the Great East Japan earthquake and tsunami on March 11, 2011, poses a new challenge to the nuclear community, especially from an accident management viewpoint. This paper presents a new accident management framework called the integrated RObust Coping Strategy (iROCS) to cope with beyond-design-basis external events (BDBEEs). The iROCS approach is characterized by classification of the various plant damage conditions (PDCs) that might result from BDBEEs and corresponding integrated coping strategies for each PDC, aiming to maintain and restore core cooling (i.e., to prevent core damage) and to maintain the integrity of the reactor pressure vessel if it is judged that core damage may not be preventable in view of plant conditions. A case study for an extreme damage condition showed that candidate accident management strategies should be evaluated for effectiveness and feasibility against the accident scenarios and extreme damage conditions of the site, especially when employing mobile or portable equipment under BDBEEs within the limited time available to achieve desired goals such as prevention of core damage and reactor vessel failure.
Liu, Chuang; Zhan, Xiu-Xiu; Zhang, Zi-Ke; Sun, Gui-Quan; Hui, Pak Ming
2015-11-01
Recently, information transmission models motivated by classical epidemic propagation have been applied to a wide range of social systems; they generally assume that information transmits mainly among individuals via peer-to-peer interactions on social networks. In this paper, we consider one more channel through which users get information: out-of-social-network influence. Empirical analyses of the diffusion of eight typical events on a very large micro-blogging system, Sina Weibo, show that external influence has a significant impact on information spreading alongside social activities. In addition, we propose a theoretical model to interpret the spreading process via both internal and external channels, considering three essential properties: (i) memory effect; (ii) role of spreaders; and (iii) non-redundancy of contacts. Experimental and mathematical results indicate that information indeed spreads much more quickly and broadly under the mutual effects of internal and external influences. More importantly, the present model reveals that event characteristics largely determine the essential spreading patterns once the network structure is established. The results may shed some light on the in-depth understanding of the underlying dynamics of information transmission on real social networks.
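The internal-plus-external spreading mechanism can be sketched as a toy discrete-time simulation. The network, probabilities, and seeding below are illustrative; this is not the authors' model, which additionally includes memory and contact non-redundancy effects:

```python
import random

def spread(adj, p_internal, p_external, steps, seed=1):
    """Discrete-time information spread: an uninformed node becomes informed
    either via an out-of-network channel (prob. p_external per step) or via
    an informed neighbor (prob. p_internal per contact per step)."""
    rng = random.Random(seed)
    informed = {0}                       # node 0 seeds the cascade
    for _ in range(steps):
        new = set()
        for node, neighbors in adj.items():
            if node in informed:
                continue
            if rng.random() < p_external:
                new.add(node)            # external (out-of-network) channel
            elif any(nb in informed and rng.random() < p_internal
                     for nb in neighbors):
                new.add(node)            # internal (peer-to-peer) channel
        informed |= new                  # synchronous update
    return informed

# Toy ring network of 20 nodes:
ring = {i: [(i - 1) % 20, (i + 1) % 20] for i in range(20)}
informed = spread(ring, p_internal=0.3, p_external=0.05, steps=30)
```

Setting `p_external = 0` recovers a purely peer-to-peer cascade, which on sparse networks spreads markedly more slowly, matching the qualitative finding above.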
March-Llanes, Jaume; Marqués-Feixa, Laia; Mezquita, Laura; Fañanás, Lourdes; Moya-Higueras, Jorge
2017-05-13
The main objective of the present research was to analyze the relations between stressful life events and the externalizing and internalizing spectra of psychopathology using meta-analytical procedures. After removing duplicates, a total of 373 papers were found in a literature search using several bibliographic databases, including PsycINFO, Medline, Scopus, and Web of Science. Twenty-seven studies were selected for the meta-analysis after applying different inclusion and exclusion criteria in different phases. The statistical procedure used a random/mixed-effects model based on the correlations found in the studies. Significant positive correlations were found in cross-sectional and longitudinal studies. A transactional effect was thus found in the present study: stressful life events could be a cause, but also a consequence, of psychopathological spectra. The level of controllability of the life events did not affect the results. Special attention should be given to the usage of stressful life events in gene-environment interaction and correlation studies, and also for clinical purposes.
Directory of Open Access Journals (Sweden)
S. Rolinski
2014-06-01
Extreme meteorological events will most likely occur more often with climate change, leading to a further acceleration of climate change through potentially devastating effects on terrestrial ecosystems. But not all extreme meteorological events lead to extreme ecosystem responses. Unlike most current studies, we therefore focus on pre-defined hazardous ecosystem behaviour and the identification of coinciding meteorological conditions, instead of on expected ecosystem damage for a pre-defined meteorological event. We use a simple probabilistic risk assessment based on time series of ecosystem behaviour and meteorological conditions. Given the risk assessment terminology, vulnerability and risk for the previously defined hazard are thus estimated on the basis of observed hazardous ecosystem behaviour. We first adapt this generic approach to extreme responses of terrestrial ecosystems to drought and high temperatures, defining the hazard as a negative net biome productivity over a 12-month period. Further, we show an instructive application for two selected sites using data for 1981–2010; we then apply the method on a pan-European scale, addressing the 1981–2010 period and future projections for 2071–2100, both based on numerical modelling results (LPJmL for ecosystem behaviour; REMO-SRES A1B for climate). Our site-specific results demonstrate the applicability of the proposed method, using the SPEI index to describe the meteorological condition. They also provide examples of their interpretation: in the case of vulnerability to drought for Spain, the expected value of the SPEI is 0.4 lower for hazardous than for non-hazardous ecosystem behaviour, while Northern Germany is non-vulnerable, the expected drought index value for hazard observations there relating to wetter conditions than for the non-hazard observations. The pan-European assessment shows that significant results could be obtained for large areas within Europe. For 2071
Morgeson, Frederick P
2005-05-01
Relatively little empirical research has been conducted on external leaders of self-managing teams. The integration of functional leadership theory with research on team routines suggests that leaders can intervene in teams in several different ways, and the effectiveness of this intervention depends on the nature of the events the team encounters. External team leaders from 3 organizations first described a series of events (N=117), and leaders and team members then completed surveys to quantitatively describe the events. Results indicated that leader preparation and supportive coaching were positively related to team perceptions of leader effectiveness, with preparation becoming more strongly related to effectiveness as event novelty increased. More active leader intervention activities (active coaching and sense making) were negatively related to satisfaction with leadership yet were positively related to effectiveness as events became more disruptive.
Energy Technology Data Exchange (ETDEWEB)
Kang, D. J.; Kim, K. Y.; Yang, J. E
2001-03-01
In this study, for the major safety systems of Ulchin Units 3/4, we quantify the risk associated with changes of the AOT and with PM during power operation, to identify the effects on the results of external events PSA when plant changes such as allowed outage time are requested. The systems for which the risks of a change of allowed outage time are quantified are the High Pressure Safety Injection System (HPSIS), the Containment Spray System (CSS), and the Emergency Diesel Generator (EDG). The systems for which the risks of PM during power operation are quantified are the Low Pressure Safety Injection System (LPSIS), CSS, EDG, and the Essential Service Water System (ESWS). The following conclusions can be drawn from this study: 1) The increase of core damage frequency (ΔCDF) for the change of AOT and the conditional core damage probability (CCDP) for on-line PM of each system are quantified differently according to whether only internal events or only external events are considered. 2) Quantification of risk including both internal and external events is expected to be advantageous for the licensee of an NPP if the regulatory acceptance criteria for technical specification changes are set up relatively, but disadvantageous if the acceptance criteria are set up absolutely. 3) Quantification of only the fire event is expected to be sufficient when quantification of the external events PSA model is required for plant changes of Korea Standard NPPs. 4) Quantification of the increase of core damage frequency and the incremental conditional core damage probability for technical specification changes is expected to be unnecessary if the quantification results considering only internal events are below the regulatory acceptance criteria and the external events PSA results are not greatly affected by the system availability. However, it is expected that the quantification of risk considering external events
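The risk metrics used in such technical-specification evaluations relate through a simple rate-times-duration calculation: the incremental conditional core damage probability (ICCDP) for one outage is the CDF increase while the equipment is out, multiplied by the fraction of a year the outage lasts. A sketch with hypothetical numbers (not Ulchin values):

```python
def iccdp(baseline_cdf, outage_cdf, outage_hours):
    """Incremental conditional core damage probability for one outage:
    (CDF with equipment out - baseline CDF) integrated over the outage,
    with CDFs in events per year."""
    hours_per_year = 8760.0
    return (outage_cdf - baseline_cdf) * outage_hours / hours_per_year

# Hypothetical values: baseline CDF 2e-5/yr, CDF with one EDG out 8e-5/yr,
# and a 72-hour allowed outage time.
risk = iccdp(baseline_cdf=2e-5, outage_cdf=8e-5, outage_hours=72.0)
```

Whether such an increment is acceptable then depends on the regulatory criterion, which is exactly the relative-versus-absolute distinction drawn in the conclusions above.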
Moeyaert, Mariola; Ugille, Maaike; Ferron, John M; Beretvas, S Natasha; Van den Noortgate, Wim
2013-06-01
In this study, we focus on a three-level meta-analysis for combining data from studies using multiple-baseline across-participants designs. A complicating factor in such designs is that results might be biased if the dependent variable is affected by external events that are not explicitly modeled, such as the illness of a teacher, an exciting class activity, or the presence of a foreign observer. In multiple-baseline designs, external effects can become apparent if they simultaneously have an effect on the outcome score(s) of the participants within a study. This study presents a method for adjusting the three-level model for external events and evaluates the appropriateness of the modified model using a simulation study; we also illustrate the new approach with real data sets. The results indicate that ignoring an external event effect results in biased estimates of the treatment effects, especially when only a small number of studies and measurement occasions are involved. The mean squared error, as well as the standard error and coverage proportion of the effect estimates, is improved with the modified model. Moreover, the adjusted model results in less biased variance estimates. If there is no external event effect, we find no differences in results between the modified and unmodified models.
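The omitted-variable bias that motivates the adjustment can be shown in a stripped-down, single-participant sketch (a deliberate simplification of the three-level model; effect sizes are hypothetical and the data are noise-free so the bias is exact):

```python
import numpy as np

# Hedged sketch: bias in the treatment effect when an external event is ignored.
# Noise-free toy data for one participant; all effect sizes are hypothetical.
T = np.arange(20)
treat = (T >= 10).astype(float)       # treatment starts at occasion 10
event = (T >= 15).astype(float)       # external event (e.g. an exciting class activity)
y = 2.0 + 1.5 * treat + 1.0 * event   # true treatment effect = 1.5

X_full = np.column_stack([np.ones(20), treat, event])   # event explicitly modeled
X_miss = np.column_stack([np.ones(20), treat])          # event ignored

beta_full, *_ = np.linalg.lstsq(X_full, y, rcond=None)
beta_miss, *_ = np.linalg.lstsq(X_miss, y, rcond=None)

print(beta_full[1])  # recovers the true effect, 1.5
print(beta_miss[1])  # 2.0: biased upward because the event overlaps the treatment phase
```

Because the external event falls entirely inside the treatment phase, the unadjusted model absorbs it into the treatment estimate, exactly the mechanism the abstract describes.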
Energy Technology Data Exchange (ETDEWEB)
Canadell, F.; Aleman, A.; Beltran, F.; Pifarre, D.; Hernandez, H.; Gasca, C.
2011-07-01
Within the process of maintaining and updating the risk analysis of the Asco NPP, this paper presents results from the review of the plant's vulnerability study against severe accidents caused by external events (Individual Plant Examination of External Events, IPEEE).
Zhou, Qing; Wang, Yun; Deng, Xianli; Eisenberg, Nancy; Wolchik, Sharlene A.; Tein, Jenn-Yun
2008-01-01
The relations of parenting and temperament (effortful control and anger/frustration) to children's externalizing problems were examined in a 3.8-year longitudinal study of 425 native Chinese children (6-9 years) from Beijing. Children's experience of negative life events and coping efficacy were examined as mediators in the parenting- and…
Directory of Open Access Journals (Sweden)
J. Dietrich
2008-03-01
Flood forecasts are essential to issue reliable flood warnings and to initiate flood control measures on time. The accuracy and the lead time of the predictions for head waters primarily depend on the meteorological forecasts. Ensemble forecasts are a means of framing the uncertainty of the potential future development of the hydro-meteorological situation.
This contribution presents a flood management strategy based on probabilistic hydrological forecasts driven by operational meteorological ensemble prediction systems. The meteorological ensemble forecasts are transformed into discharge ensemble forecasts by a rainfall-runoff model. Exceedance probabilities for critical discharge values and probabilistic maps of inundation areas can be computed and presented to decision makers. These results can support decision makers in issuing flood alerts. The flood management system integrates ensemble forecasts with different spatial resolution and different lead times. The hydrological models are controlled in an adaptive way, mainly depending on the lead time of the forecast, the expected magnitude of the flood event and the availability of measured data.
The aforementioned flood forecast techniques have been applied to a case study. The Mulde River Basin (South-Eastern Germany, Czech Republic) has often been affected by severe flood events, including local flash floods. Hindcasts for the large-scale extreme flood in August 2002 have been computed using meteorological predictions from both the COSMO-LEPS ensemble prediction system and the deterministic COSMO-DE local model. The temporal evolution of (a) the meteorological forecast uncertainty and (b) the probability of exceeding flood alert levels is discussed. Results from the hindcast simulations demonstrate that the systems would have predicted a high probability of an extreme flood event had they already been operational in 2002. COSMO-LEPS showed a reasonably good
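Computing exceedance probabilities for critical discharge values from an ensemble, as described above, reduces to counting members; a hedged sketch with hypothetical discharge values:

```python
import numpy as np

# Hedged sketch of turning a discharge ensemble into an exceedance probability
# for a flood alert level; member values below are hypothetical (m^3/s).
def exceedance_probability(ensemble, threshold):
    """Fraction of ensemble members whose peak discharge exceeds the alert level."""
    peaks = ensemble.max(axis=1)          # peak discharge per member
    return float((peaks > threshold).mean())

# 5 hypothetical members x 4 lead times
ens = np.array([
    [120.0, 340.0, 510.0, 480.0],
    [100.0, 290.0, 430.0, 400.0],
    [140.0, 360.0, 470.0, 450.0],
    [ 90.0, 250.0, 380.0, 360.0],
    [150.0, 380.0, 520.0, 500.0],
])
alert_level = 450.0
print(exceedance_probability(ens, alert_level))  # 3 of 5 members exceed -> 0.6
```

Such member fractions are what a decision maker sees when probabilistic alert maps are issued.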
Energy Technology Data Exchange (ETDEWEB)
Martins, Eduardo Ferraz
2015-04-01
The study of projects for highly complex installations involves robust modeling, supported by conceptual and mathematical tools, to systematically investigate and structure the different risk scenarios that can lead to unwanted events arising from equipment failures or human errors. In the context of classical modeling, Probabilistic Safety Analysis (PSA) seeks to provide qualitative and quantitative information about the particularities of a design and its operational facilities, including the identification of factors or scenarios that contribute to risk and the consequent comparison of options for increasing safety. In this context, the aim of the thesis is to develop an innovative hybrid instrument (CPP-HI), based on the integrated modeling of Failure Mode and Effect Analysis (FMEA) techniques, concepts of Human Reliability Analysis, and Probabilistic Composition of Preferences (PCP). In support of the modeling and validation of the CPP-HI, a simulation was performed for the initiating event 'Loss of External Electric Power' (PEEE) in a nuclear power plant. The results were simulated in a virtual environment (sensitivity analysis) and are robust for the study of Human Reliability Analysis (HRA) in the context of PSA. (author)
Wakker, P. P.; Thaler, R.H.; Tversky, A.
1997-01-01
Probabilistic insurance is an insurance policy involving a small probability that the consumer will not be reimbursed. Survey data suggest that people dislike probabilistic insurance and demand more than a 20% reduction in the premium to compensate for a 1% default risk. While these preferences are intuitively appealing they are difficult to reconcile with expected utility theory. Under highly plausible assumptions about the utility function, willingness to pay for probabilistic i...
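The expected-utility argument can be checked numerically. A hedged toy sketch, assuming CRRA utility and a stylized contract in which, with default probability r, the claim goes unpaid and the premium is refunded (one variant discussed in this literature); all parameter values are illustrative:

```python
# Hedged sketch: under expected utility, the premium discount needed to accept a
# 1% default risk is close to 1%, far below the >20% people demand in surveys.
# Stylized contract and CRRA utility; parameter values are illustrative.
def u(x, gamma=2.0):
    return x ** (1 - gamma) / (1 - gamma)

w, loss, p, r = 100_000.0, 20_000.0, 0.01, 0.01
premium = p * loss  # actuarially fair premium for the sure contract

def eu_full(y):
    return u(w - y)

def eu_prob(y):
    # With prob p*r the loss occurs, the claim is unpaid and the premium refunded.
    return (1 - p) * u(w - y) + p * (1 - r) * u(w - y) + p * r * u(w - loss)

# Bisect for the discount d that makes the consumer indifferent.
lo, hi = 0.0, 1.0
for _ in range(60):
    d = 0.5 * (lo + hi)
    if eu_prob((1 - d) * premium) < eu_full(premium):
        lo = d  # discount too small: probabilistic contract still worse
    else:
        hi = d
print(d)  # roughly 1%, nowhere near the 20% reduction respondents demand
```

The gap between this computed discount and the surveyed 20% is exactly the tension with expected utility theory that the paper examines.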
Wakker, P.P.; Thaler, R.H.; Tversky, A.
1997-01-01
Probabilistic insurance is an insurance policy involving a small probability that the consumer will not be reimbursed. Survey data suggest that people dislike probabilistic insurance and demand more than a 20% reduction in premium to compensate for a 1% default risk. These observations cannot be
P.P. Wakker (Peter); R.H. Thaler (Richard); A. Tversky (Amos)
1997-01-01
Probabilistic insurance is an insurance policy involving a small probability that the consumer will not be reimbursed. Survey data suggest that people dislike probabilistic insurance and demand more than a 20% reduction in the premium to compensate for a 1% default risk. While these
DEFF Research Database (Denmark)
Jensen, Finn Verner; Lauritzen, Steffen Lilholt
2001-01-01
This article describes the basic ideas and algorithms behind specification and inference in probabilistic networks based on directed acyclic graphs, undirected graphs, and chain graphs.
P.P. Wakker (Peter); R.H. Thaler (Richard); A. Tversky (Amos)
1997-01-01
Probabilistic insurance is an insurance policy involving a small probability that the consumer will not be reimbursed. Survey data suggest that people dislike probabilistic insurance and demand more than a 20% reduction in the premium to compensate for a 1% default risk. While these pref
Energy Technology Data Exchange (ETDEWEB)
Aleman, A.; Canadell, F.; Beltran, F.; Pifarre, D.; Hernandez, H.; Gasca, C.
2012-07-01
During the risk analysis update of the Asco NPP (2010), a review has been carried out of the plant's vulnerabilities against severe accidents caused by external events (Individual Plant Examination of External Events, IPEEE). The assessment has included analysis of accidents in nearby industrial and military facilities and transportation accidents (i.e., rail, road, and aircraft impact), release of hazardous materials on site, external flooding, turbine missiles, and strong winds. (Author)
Common Difficulties with Probabilistic Reasoning.
Hope, Jack A.; Kelly, Ivan W.
1983-01-01
Several common errors reflecting difficulties in probabilistic reasoning are identified, relating to ambiguity, previous outcomes, sampling, unusual events, and estimating. Knowledge of these mistakes and interpretations may help mathematics teachers understand the thought processes of their students. (MNS)
Mera, R. J.; Allen, M. R.; Mote, P.; Ekwurzel, B.; Frumhoff, P. C.; Rupp, D. E.
2015-12-01
Heat waves in the western US have become progressively more severe due to increasing relative humidity and nighttime temperatures, raising the health risks of vulnerable portions of the population, including Latino farmworkers in California's Central Valley and other socioeconomically disadvantaged communities. Recent research has shown greenhouse gas emissions doubled the risk of the hottest summer days during the 2000s in the Central Valley, increasing public health risks and costs, and raising the question of which parties are responsible for paying these costs. It has been argued that these costs should not be taken up solely by the general public through taxation, but that additional parties can be considered, including multinational corporations who have extracted and marketed a large proportion of carbon-based fuels. Here, we apply probabilistic event attribution (PEA) to assess the contribution of emissions traced to the world's 90 largest major industrial carbon producers to the severity and frequency of these extreme heat events. Our research uses very large ensembles of regional climate model simulations to calculate fractional attribution of policy-relevant extreme heat variables. We compare a full forcings world with observed greenhouse gases, sea surface temperatures and sea ice extent to a counter-factual world devoid of carbon pollution from major industrial carbon producers. The results show a discernible fraction of record-setting summer temperatures in the western US during the 2000s can be attributed to emissions sourced from major carbon producers.
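Fractional attribution in PEA is conventionally expressed through the fraction of attributable risk, FAR = 1 - p0/p1, comparing event probabilities in the counterfactual and full-forcings ensembles. A hedged sketch with hypothetical exceedance counts, not the study's results:

```python
# Hedged sketch of fractional attribution as used in probabilistic event
# attribution (PEA); the exceedance counts below are hypothetical.
def fraction_attributable_risk(p_counterfactual, p_full):
    """FAR = 1 - p0/p1: fraction of event risk attributable to the forcing."""
    return 1.0 - p_counterfactual / p_full

# Hypothetical ensemble exceedance frequencies for a record-setting heat index:
p1 = 120 / 1000   # full-forcings ensemble members exceeding the threshold
p0 = 40 / 1000    # counterfactual ensemble without the producers' emissions
print(fraction_attributable_risk(p0, p1))  # about 0.67
```

The same ratio also yields the risk ratio p1/p0 (here 3), another metric commonly reported in attribution studies.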
Rusgiyarto, Ferry; Sjafruddin, Ade; Frazila, Russ Bona; Suprayogi
2017-06-01
Increasing container traffic and land acquisition problems for terminal expansion lead to the use of an external yard in a port buffer area. This condition affects terminal performance because the road connecting the terminal and the external yard is also used by non-container traffic. A location choice problem is considered to address this condition, but previous research has not taken into account the stochastic nature of container arrival rates and service times. A bi-level programming framework is used to find the optimum location configuration. In the lower level, there is a problem in constructing an equation that correlates terminal operation and the road, owing to equilibria with different time cycles. Containers move from the quay to the terminal gate on a daily time scale, whereas they move from the terminal gate to the external yard through the road on a time scale of minutes. If the equation is formulated as an hourly equilibrium, it cannot capture the container movement characteristics in the terminal; if it is formulated as a daily equilibrium, it cannot capture the traffic movement characteristics on the road. This problem can be addressed using a simulation model. A discrete event simulation model is used to simulate import container flow processes in the container terminal and the external yard. The optimum location configuration in the upper level is a combinatorial problem, which is solved by a full enumeration approach. The objective function of the external yard location model is to minimize user transport cost (or time) and to maximize operator benefit. A numerical experiment was run for the scenario assumption of two container handling ways, three external yards, and a thirty-day simulation period. Jakarta International Container Terminal (JICT) container characteristics data were used for the simulation. Based on five runs with 5, 10, 15, 20, and 30 repetitions, operation of one of the three available external yards (external yard
Suciu, Dan; Koch, Christoph
2011-01-01
Probabilistic databases are databases where the value of some attributes or the presence of some records are uncertain and known only with some probability. Applications in many areas such as information extraction, RFID and scientific data management, data cleaning, data integration, and financial risk assessment produce large volumes of uncertain data, which are best modeled and processed by a probabilistic database. This book presents the state of the art in representation formalisms and query processing techniques for probabilistic data. It starts by discussing the basic principles for rep
Directory of Open Access Journals (Sweden)
Annekathrin Schacht
BACKGROUND: A crucial question for understanding sentence comprehension is the openness of syntactic and semantic processes for other sources of information. Using event-related potentials in a dual task paradigm, we had previously found that sentence processing takes into consideration task relevant sentence-external semantic but not syntactic information. In that study, internal and external information both varied within the same linguistic domain-either semantic or syntactic. Here we investigated whether across-domain sentence-external information would impact within-sentence processing. METHODOLOGY: In one condition, adjectives within visually presented sentences of the structure [Det]-[Noun]-[Adjective]-[Verb] were semantically correct or incorrect. Simultaneously with the noun, auditory adjectives were presented that morphosyntactically matched or mismatched the visual adjectives with respect to gender. FINDINGS: As expected, semantic violations within the sentence elicited N400 and P600 components in the ERP. However, these components were not modulated by syntactic matching of the sentence-external auditory adjective. In a second condition, syntactic within-sentence correctness-variations were combined with semantic matching variations between the auditory and the visual adjective. Here, syntactic within-sentence violations elicited a LAN and a P600 that did not interact with semantic matching of the auditory adjective. However, semantic mismatching of the latter elicited a frontocentral positivity, presumably related to an increase in discourse level complexity. CONCLUSION: The current findings underscore the open versus algorithmic nature of semantic and syntactic processing, respectively, during sentence comprehension.
Energy Technology Data Exchange (ETDEWEB)
Alonso Martin, J. I.; Setien, L. M.; Ayllon, J. C.
2011-07-01
In the realization of Probabilistic Safety Assessments, the frequencies of initiating events have generally been obtained from generic data (usually international databases), from the operational experience of each plant, or from a combination of generic and specific data using Bayesian analysis. These frequencies, obtained as numerical data, are multiplied by the failure probabilities of the mitigating systems, yielding the frequency of each minimal cut set that leads to core damage.
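The Bayesian combination of generic and plant-specific data mentioned here is commonly done with a conjugate gamma-Poisson update; a hedged sketch with illustrative numbers, not values from the assessment:

```python
# Hedged sketch of a Bayesian update for an initiating-event frequency:
# a gamma prior (summarizing generic data) is updated with plant-specific
# operating experience; all numbers are illustrative.
alpha, beta = 0.5, 10.0   # generic prior: mean alpha/beta = 0.05 events/yr
events, years = 1, 25     # plant-specific experience: 1 event in 25 years

alpha_post = alpha + events    # conjugate gamma-Poisson update
beta_post = beta + years
posterior_mean = alpha_post / beta_post
print(posterior_mean)          # updated initiating-event frequency (per year)
```

The posterior mean (here about 0.043/yr) is the plant-informed frequency that then multiplies the system failure probabilities in the cut-set quantification.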
Douven, Igor; Horsten, Leon; Romeijn, Jan-Willem
2010-01-01
Until now, antirealists have offered sketches of a theory of truth, at best. In this paper, we present a probabilist account of antirealist truth in some formal detail, and we assess its ability to deal with the problems that are standardly taken to beset antirealism.
DEFF Research Database (Denmark)
Sørensen, John Dalsgaard; Burcharth, H. F.
This chapter describes how partial safety factors can be used in design of vertical wall breakwaters and an example of a code format is presented. The partial safety factors are calibrated on a probabilistic basis. The code calibration process used to calibrate some of the partial safety factors...
Bod, R.; Heine, B.; Narrog, H.
2010-01-01
Probabilistic linguistics takes all linguistic evidence as positive evidence and lets statistics decide. It allows for accurate modelling of gradient phenomena in production and perception, and suggests that rule-like behaviour is no more than a side effect of maximizing probability. This chapter
Lorito, Stefano; Selva, Jacopo; Basili, Roberto; Romano, Fabrizio; Tiberti, Mara Monica; Piatanesi, Alessio
2014-05-01
Probabilistic Tsunami Hazard Analysis (PTHA) rests on computationally demanding numerical simulations of the tsunami generation and propagation up to the inundated coastline. We here focus on tsunamis generated by the co-seismic sea floor displacement, which constitute the vast majority of the observed tsunami events, i.e. on Seismic PTHA (SPTHA). For incorporating the full expected seismic source variability, aiming at a complete SPTHA, a very large number of numerical tsunami scenarios is typically needed, especially for complex tectonic contexts, where SPTHA is not dominated by large subduction earthquakes only. Here, we propose a viable approach for reducing the number of simulations for a given set of input earthquakes representing the modelled aleatory uncertainties of the seismic rupture parameters. Our approach is based on a preliminary analysis of the SPTHA of maximum offshore wave height (HMax) at a given target location, and assuming computationally cheap linear propagation. We start with defining an operational SPTHA framework in which we then introduce a simplified Event Tree approach, combined with a Green's functions approach, for obtaining a first controlled sampling and reduction of the effective source parameter space size. We then apply a two-stage filtering procedure to the 'linear' SPTHA results. The first filter identifies and discards all the sources producing a negligible contribution at the target location, for example the smallest earthquakes or those directing most of tsunami energy elsewhere. The second filter performs a cluster analysis aimed at selecting groups of source parameters producing comparable HMax profiles for each earthquake magnitude at the given test site. We thus select a limited set of sources that is subsequently used for calculating 'nonlinear' probabilistic inundation maps at the target location. We find that the optimal subset of simulations needed for inundation calculations can be obtained basing on just the
Takahara, Shogo; Iijima, Masashi; Yoneda, Minoru; Shimada, Yoko
2017-09-08
Dose assessment is an important issue from the viewpoints of protecting people from radiation exposure and managing postaccident situations adequately. However, the radiation doses received by people cannot be determined with complete accuracy because of the uncertainties and the variability associated with any process of defining individual characteristics and in the dose assessment process itself. In this study, a dose assessment model was developed based on measurements and surveys of individual doses and relevant contributors (i.e., ambient dose rates and behavior patterns) in Fukushima City for four population groups: Fukushima City Office staff, Senior Citizens' Club, Contractors' Association, and Agricultural Cooperative. In addition, probabilistic assessments were performed for these population groups by considering the spatial variability of contamination and interpopulation differences resulting from behavior patterns. As a result of comparison with the actual measurements, the assessment results for participants from the Fukushima City Office agreed with the measured values, thereby validating the model and the approach. Although the assessment results obtained for the Senior Citizens' Club and the Agricultural Cooperative differ partly from the measured values, by addressing further considerations in terms of dose reduction effects due to decontamination and the impact of additional exposure sources in agricultural fields, these results can be improved. By contrast, the measurements obtained for the participants from the Contractors' Association were not reproduced well in the present study. To assess the doses to this group, further investigations of association members' work activities and the related dose reduction effects are needed. © 2017 Society for Risk Analysis.
Ohba, Masamichi; Nohara, Daisuke; Kadokura, Shinji
2016-04-01
Severe storms or other extreme weather events can interrupt the spinning of wind turbines on a large scale, causing unexpected "wind ramp events". In this study, we present an application of self-organizing maps (SOMs) for climatological attribution of wind ramp events and their probabilistic prediction. The SOM is an automatic data-mining clustering technique, which allows us to summarize a high-dimensional data space in terms of a set of reference vectors. The SOM is applied to analyze and connect the relationship between atmospheric patterns over Japan and wind power generation. The SOM is employed on sea level pressure derived from the JRA55 reanalysis over the target area (the Tohoku region in Japan), whereby a two-dimensional lattice of weather patterns (WPs) classified during the 1977-2013 period is obtained. For comparison with the atmospheric data, long-term wind power generation is reconstructed by using the high-resolution surface observation network AMeDAS (Automated Meteorological Data Acquisition System) in Japan. Our analysis extracts seven typical WPs, which are linked to frequent occurrences of wind ramp events. Probabilistic forecasts of wind power generation and ramps are conducted by using the obtained SOM. The probabilities are derived from the multiple SOM lattices based on the matching of output from the TIGGE multi-model global forecast to the WPs on the lattices. Since this method effectively takes care of the empirical uncertainties in the historical data, wind power generation and ramps are probabilistically forecast from the output of global models. The forecasts of wind power generation and ramp events show relatively good skill scores under the downscaling technique. It is expected that the results of this study provide better guidance to the user community and contribute to the future development of a system operation model for the transmission grid operator.
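A minimal, from-scratch sketch of SOM training of the kind described (not the JRA55/AMeDAS pipeline); the lattice size and data are toy values:

```python
import numpy as np

# Hedged, from-scratch sketch of self-organizing map (SOM) training, as used
# to cluster pressure patterns into weather types; data here are random toys.
rng = np.random.default_rng(0)
n_rows, n_cols, dim = 3, 3, 8          # toy 3x3 lattice of 8-dim reference vectors
weights = rng.normal(size=(n_rows * n_cols, dim))
grid = np.array([(i, j) for i in range(n_rows) for j in range(n_cols)], dtype=float)
data = rng.normal(size=(200, dim))     # stand-in for pressure-field samples

for epoch in range(20):
    lr = 0.5 * (1 - epoch / 20)            # decaying learning rate
    sigma = 1.5 * (1 - epoch / 20) + 0.3   # shrinking neighborhood radius
    for x in data:
        bmu = np.argmin(np.sum((weights - x) ** 2, axis=1))  # best-matching unit
        d2 = np.sum((grid - grid[bmu]) ** 2, axis=1)         # lattice distances
        h = np.exp(-d2 / (2 * sigma ** 2))                   # neighborhood function
        weights += lr * h[:, None] * (x - weights)           # pull units toward x

# Each sample maps to a lattice node (a "weather pattern"); counts give WP frequencies.
labels = np.argmin(((data[:, None, :] - weights[None, :, :]) ** 2).sum(-1), axis=1)
print(np.bincount(labels, minlength=n_rows * n_cols))
```

In the study's setting, each lattice node would correspond to a candidate weather pattern, and ramp probabilities are read off from how ensemble forecast members map onto those nodes.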
Energy Technology Data Exchange (ETDEWEB)
Kim, Jaewhan; Ahn, Kwang Il [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Kim, Inn Seock [ISSA Technology, Inc., Germantown (United States)
2013-10-15
An extended loss of all AC power occurred at the Fukushima Daiichi nuclear power plant on March 11, 2011, caused by a large earthquake and subsequent tsunamis. This event led to loss of reactor core cooling and containment integrity functions at several units of the site, ultimately resulting in a large release of radioactive materials into the environment. Extreme events, or beyond-design-basis external events (BDBEEs), as occurred at the Fukushima Daiichi plant, may threaten plant safety by disabling critical safety functions of nuclear power plants for an extended period. Therefore, coping strategies need to be developed to further enhance nuclear safety by maintaining or restoring core cooling and containment integrity for BDBEEs. This paper reviews the mitigation strategies adopted in the U. S. EDMG and FLEX approaches, and proposes an integrated strategic approach to cope with BDBEEs by extending the concepts of EDMG and FLEX. The proposed integrated coping strategies include operation strategies for specific accident conditions, extension or revision of emergency operating procedures (EOPs), integration between EOPs and severe accident mitigation guidelines (SAMG), and so on. The extended coping strategies provide a comprehensive mitigation approach including restoration of RCS inventory and pressure control as well as the mitigation strategies of the U. S. EDMG and FLEX. More detailed strategies will be developed in the near future following an evaluation of the various accident mitigation strategies being implemented worldwide in the aftermath of the Fukushima accident.
Schmidt, F.; Liu, S.
2016-12-01
Source water quality plays an important role for the safety of drinking water and early detection of its contamination is vital to taking appropriate countermeasures. However, compared to drinking water, it is more difficult to detect contamination events because its environment is less controlled and numerous natural causes contribute to a high variability of the background values. In this project, Artificial Neural Networks (ANNs) and a Contamination Event Detection Process (CED Process) were used to identify events in river water. The ANN models the response of basic water quality sensors obtained in laboratory experiments in an off-line learning stage and continuously forecasts future values of the time line in an on-line forecasting step. During this second stage, the CED Process compares the forecast to the measured value and classifies it as regular background or event value, which modifies the ANN's continuous learning and influences its forecasts. In addition to this basic setup, external information is fed to the CED Process: A so-called Operator Input (OI) is provided to inform about unusual water quality levels that are unrelated to the presence of contamination, for example due to cooling water discharge from a nearby power plant. This study's primary goal is to evaluate how well the OI fits into the design of the combined forecasting ANN and CED Process and to understand its effects on the online forecasting stage. To test this, data from laboratory experiments conducted previously at the School of Environment, Tsinghua University, have been used to perform simulations highlighting features and drawbacks of this method. Applying the OI has been shown to have a positive influence on the ANN's ability to handle a sudden change in background values, which is unrelated to contamination. However, it might also mask the presence of an event, an issue that underlines the necessity to have several instances of the algorithm run in parallel. Other difficulties
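The forecast-then-classify step can be sketched with a simple stand-in for the ANN; here a one-step persistence forecast and a fixed residual tolerance are assumptions for illustration, not the study's model:

```python
import numpy as np

# Hedged sketch of the forecast-vs-measurement classification step of a
# contamination event detection (CED) process. A persistence forecast stands
# in for the ANN; residuals beyond a tolerance are flagged as candidate events.
# An operator input could suppress flags during known benign anomalies
# (e.g. a scheduled cooling-water discharge).
def classify(series, tol):
    forecasts = series[:-1]                  # persistence forecast of the next value
    residuals = np.abs(series[1:] - forecasts)
    return residuals > tol                   # True = candidate event value

# Hypothetical sensor background with a brief excursion:
background = np.array([5.0, 5.1, 5.0, 4.9, 5.1, 8.2, 8.1, 5.0])
flags = classify(background, tol=1.0)
print(flags)   # the jump to 8.2 and the drop back are flagged
```

In the actual system the forecaster is the continuously learning ANN and the flagged values feed back into its training, but the compare-and-classify structure is the same.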
Diaz, Julia M.; Hansel, Colleen M.; Apprill, Amy; Brighi, Caterina; Zhang, Tong; Weber, Laura; McNally, Sean; Xun, Liping
2016-12-01
The reactive oxygen species superoxide (O2.-) is both beneficial and detrimental to life. Within corals, superoxide may contribute to pathogen resistance but also bleaching, the loss of essential algal symbionts. Yet, the role of superoxide in coral health and physiology is not completely understood owing to a lack of direct in situ observations. By conducting field measurements of superoxide produced by corals during a bleaching event, we show substantial species-specific variation in external superoxide levels, which reflect the balance of production and degradation processes. Extracellular superoxide concentrations are independent of light, algal symbiont abundance and bleaching status, but depend on coral species and bacterial community composition. Furthermore, coral-derived superoxide concentrations ranged from levels below bulk seawater up to ~120 nM, some of the highest superoxide concentrations observed in marine systems. Overall, these results unveil the ability of corals and/or their microbiomes to regulate superoxide in their immediate surroundings, which suggests species-specific roles of superoxide in coral health and physiology.
Advanced Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Technical Exchange Meeting
Energy Technology Data Exchange (ETDEWEB)
Smith, Curtis [Idaho National Lab. (INL), Idaho Falls, ID (United States)
2013-09-01
During FY13, the INL developed an advanced SMR PRA framework which has been described in the report Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Detailed Technical Framework Specification, INL/EXT-13-28974 (April 2013). In this framework, the following areas are considered:
- Probabilistic models to provide information specific to advanced SMRs
- Representation of specific SMR design issues such as having co-located modules and passive safety features
- Use of modern open-source and readily available analysis methods
- Internal and external events resulting in impacts to safety
- All-hazards considerations
- Methods to support the identification of design vulnerabilities
- Mechanistic and probabilistic data needs to support modeling and tools
In order to describe this framework more fully and obtain feedback on the proposed approaches, the INL hosted a technical exchange meeting during August 2013. This report describes the outcomes of that meeting.
Directory of Open Access Journals (Sweden)
Mikaël Cozic
2016-11-01
The modeling of awareness and unawareness is a significant topic in the doxastic logic literature, where it is usually tackled in terms of full belief operators. The present paper aims at a treatment in terms of partial belief operators. It draws upon the modal probabilistic logic that was introduced by Aumann (1999) at the semantic level, and then axiomatized by Heifetz and Mongin (2001). The paper embodies in this framework those properties of unawareness that have been highlighted in the seminal paper by Modica and Rustichini (1999). Their paper deals with full belief, but we argue that the properties in question also apply to partial belief. Our main result is a (soundness and) completeness theorem that reunites the two strands, modal and probabilistic, of doxastic logic.
Development of Quantitative Framework for Event Significance Evaluation
Energy Technology Data Exchange (ETDEWEB)
Lee, Durk Hun; Kim, Min Chull [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of); Kim, Inn Seock [ISSA Technology, Maryland (United States)
2010-10-15
There is an increasing trend toward quantitative evaluation of the safety significance of operational events using Probabilistic Safety Assessment (PSA) techniques. An integrated framework for the evaluation of event significance has been developed by the Korea Institute of Nuclear Safety (KINS), which consists of an assessment hierarchy and a number of matrices. The safety significance of various events, e.g., internal or external initiating events that occurred during at-power or shutdown conditions, can be quantitatively analyzed using this framework, and the events then rated according to their significance. This paper briefly describes the basic concept of the integrated quantitative framework for evaluation of event significance, focusing on the assessment hierarchy
Probabilistic reasoning in data analysis.
Sirovich, Lawrence
2011-09-20
This Teaching Resource provides lecture notes, slides, and a student assignment for a lecture on probabilistic reasoning in the analysis of biological data. General probabilistic frameworks are introduced, and a number of standard probability distributions are described using simple intuitive ideas. Particular attention is focused on random arrivals that are independent of prior history (Markovian events), with an emphasis on waiting times, Poisson processes, and Poisson probability distributions. The use of these various probability distributions is applied to biomedical problems, including several classic experimental studies.
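The Markovian-arrival ideas highlighted in these notes can be illustrated directly; a brief sketch using only the standard library:

```python
import math

# Hedged sketch of history-independent (Markovian) arrivals: Poisson counts
# and exponential waiting times, as covered in the lecture notes.
def poisson_pmf(k, lam):
    """P(N = k) for a Poisson-distributed count with mean lam = rate * time."""
    return math.exp(-lam) * lam ** k / math.factorial(k)

def waiting_time_cdf(t, rate):
    """P(first arrival <= t): waiting times are exponential with the same rate."""
    return 1.0 - math.exp(-rate * t)

lam = 3.0  # e.g. 3 arrivals expected per unit time
print(sum(poisson_pmf(k, lam) for k in range(50)))  # pmf sums to ~1
print(poisson_pmf(0, lam))                          # P(no arrivals) = e^{-lam}
print(waiting_time_cdf(1.0, 3.0))                   # = 1 - P(no arrivals in t = 1)
```

The last two lines make the key connection explicit: the exponential waiting-time distribution is just the complement of the Poisson probability of zero arrivals.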
Probabilistic Tsunami Hazard Analysis
Thio, H. K.; Ichinose, G. A.; Somerville, P. G.; Polet, J.
2006-12-01
The recent tsunami disaster caused by the 2004 Sumatra-Andaman earthquake has focused our attention on the hazard posed by large earthquakes that occur under water, in particular subduction zone earthquakes, and the tsunamis that they generate. Even though these kinds of events are rare, the very large loss of life and material destruction caused by this earthquake warrant a significant effort towards the mitigation of the tsunami hazard. For ground motion hazard, Probabilistic Seismic Hazard Analysis (PSHA) has become a standard practice in the evaluation and mitigation of seismic hazard to populations, in particular with respect to structures, infrastructure, and lifelines. Its ability to condense the complexities and variability of seismic activity into a manageable set of parameters greatly facilitates the design of effective seismic-resistant buildings as well as the planning of infrastructure projects. Probabilistic Tsunami Hazard Analysis (PTHA) achieves the same goal for hazards posed by tsunami. There are great advantages to implementing such a method to evaluate the total risk (seismic and tsunami) to coastal communities. The method that we have developed is based on the traditional PSHA and is therefore completely consistent with standard seismic practice. Because of the strong dependence of tsunami wave heights on bathymetry, we use a full waveform tsunami computation in lieu of the attenuation relations that are common in PSHA. By pre-computing and storing the tsunami waveforms at points along the coast generated for sets of subfaults that comprise larger earthquake faults, we can efficiently synthesize tsunami waveforms for any slip distribution on those faults by summing the individual subfault tsunami waveforms (weighted by their slip). This efficiency makes it feasible to use Green's function summation in lieu of attenuation relations to provide very accurate estimates of tsunami height for probabilistic calculations, where one typically computes
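The Green's function summation described here is a linear superposition of precomputed unit-slip subfault waveforms; a hedged sketch with synthetic toy waveforms, not real bathymetry-dependent computations:

```python
import numpy as np

# Hedged sketch of Green's-function summation for tsunami waveforms:
# precomputed unit-slip subfault waveforms at one coastal point are combined
# linearly, weighted by the slip of a scenario. Waveform values are synthetic.
n_subfaults, n_samples = 4, 6
rng = np.random.default_rng(1)
unit_waveforms = rng.normal(size=(n_subfaults, n_samples))  # precomputed for unit slip
slip = np.array([0.0, 2.0, 3.0, 1.0])                       # scenario slip distribution (m)

scenario_waveform = slip @ unit_waveforms   # linear superposition over subfaults
peak_height = scenario_waveform.max()       # e.g. a maximum-wave-height proxy
print(scenario_waveform)
print(peak_height)
```

Because the combination is linear, any new slip distribution costs only a weighted sum over the stored waveforms, which is what makes the probabilistic sweep over many scenarios tractable.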
Implications of probabilistic risk assessment
Energy Technology Data Exchange (ETDEWEB)
Cullingford, M.C.; Shah, S.M.; Gittus, J.H. (eds.)
1987-01-01
Probabilistic risk assessment (PRA) is an analytical process that quantifies the likelihoods, consequences and associated uncertainties of the potential outcomes of postulated events. Starting with planned or normal operation, probabilistic risk assessment covers a wide range of potential accidents and considers the whole plant and the interactions of systems and human actions. Probabilistic risk assessment can be applied in safety decisions in design, licensing and operation of industrial facilities, particularly nuclear power plants. The proceedings include a review of PRA procedures, methods and technical issues in treating uncertainties, operating and licensing issues and future trends. Risk assessments for specific reactor types or components and specific risks (e.g. an aircraft crashing onto a reactor) are used to illustrate the points raised. All 52 articles are indexed separately. (U.K.).
Gilmore, Casey S; Malone, Stephen M; Bernat, Edward M; Iacono, William G
2010-01-01
P3 amplitude reduction (P3-AR) is associated with biological vulnerability to a spectrum of externalizing disorders, such as ADHD, conduct disorder, and substance use disorders. P3, however, is generally characterized as a broad activation involving multiple neurophysiological processes. One approach to separating P3-related processes is time-frequency (TF) analysis. The current study used a novel PCA-based TF analysis method to investigate relationships between P3, its associated TF components, and externalizing in a community-based sample of adolescent males. Results showed that 1) alone, P3 and each TF-PCA derived component could successfully discriminate diagnostic groups from controls, and 2) delta components in specific time ranges accounted for variance beyond that accounted for by P3. One delta component was associated with all diagnostic groups, suggesting it may represent a more parsimonious endophenotype for externalizing than P3-AR.
Institute of Scientific and Technical Information of China (English)
胡睿; 王石; 吴锦昌; 沈丹青; 吴朝霞
2014-01-01
Objective: To apply probabilistic safety assessment to analyze and evaluate the risk of radiation misadministration in external beam radiotherapy, so as to establish and strengthen control and management of the radiotherapy process and continuously improve quality assurance and quality control. Methods: A flow chart and a process tree were built for the whole radiotherapy process, and a decision tree model was used to determine the critical control points in the process; a risk assessment chart was prepared, and four cases of potential misadministration hazards were analyzed. Results: The whole process was divided into 22 tasks in 3 functional areas, covering 15 branches, 59 key operations and 11 critical control points. The enumerated misadministration hazards showed a clear correlation with the critical control points. Conclusions: Probabilistic safety assessment strengthens the analysis, management and control of misadministration risk, provides a basis for developing and improving radiotherapy process control management, and offers a prospective method for future multidisciplinary, high-level quality management of radiotherapy.
Wijnhoven, Fons; Bloemen, Oscar
2014-01-01
Many publications in sentiment mining provide new techniques for improved accuracy in extracting features and corresponding sentiments in texts. For the external validity of these sentiment reports, i.e., the applicability of the results to target audiences, it is important to well analyze data of t
Schweizer, B
2005-01-01
Topics include special classes of probabilistic metric spaces, topologies, and several related structures, such as probabilistic normed and inner-product spaces. 1983 edition, updated with 3 new appendixes. Includes 17 illustrations.
Probabilistic Concurrent Kleene Algebra
Directory of Open Access Journals (Sweden)
Annabelle McIver
2013-06-01
Full Text Available We provide an extension of concurrent Kleene algebras to account for probabilistic properties. The algebra yields a unified framework containing nondeterminism, concurrency and probability and is sound with respect to the set of probabilistic automata modulo probabilistic simulation. We use the resulting algebra to generalise the algebraic formulation of a variant of Jones' rely/guarantee calculus.
Ignorability in Statistical and Probabilistic Inference
DEFF Research Database (Denmark)
Jaeger, Manfred
2005-01-01
When dealing with incomplete data in statistical learning, or incomplete observations in probabilistic inference, one needs to distinguish the fact that a certain event is observed from the fact that the observed event has happened. Since the modeling and computational complexities entailed...
Directory of Open Access Journals (Sweden)
Igor V. Karyakin
2016-02-01
Full Text Available The 9th ARRCN Symposium 2015 was held during 21–25 October 2015 at the Novotel Hotel, Chumphon, Thailand, one of the most favored travel destinations in Asia. The 10th ARRCN Symposium 2017 will be held during October 2017 in Davao, Philippines. The International Symposium on the Montagu's Harrier (Circus pygargus), «The Montagu's Harrier in Europe. Status. Threats. Protection», organized by the environmental organization «Landesbund für Vogelschutz in Bayern e.V.» (LBV), was held on November 20–22, 2015 in Germany. The location of this event was the city of Würzburg in Bavaria.
A logic for inductive probabilistic reasoning
DEFF Research Database (Denmark)
Jaeger, Manfred
2005-01-01
Inductive probabilistic reasoning is understood as the application of inference patterns that use statistical background information to assign (subjective) probabilities to single events. The simplest such inference pattern is direct inference: from "70% of As are Bs" and "a is an A" infer...... that a is a B with probability 0.7. Direct inference is generalized by Jeffrey's rule and the principle of cross-entropy minimization. To adequately formalize inductive probabilistic reasoning is an interesting topic for artificial intelligence, as an autonomous system acting in a complex environment may have...... to base its actions on a probabilistic model of its environment, and the probabilities needed to form this model can often be obtained by combining statistical background information with particular observations made, i.e., by inductive probabilistic reasoning. In this paper a formal framework...
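The two inference patterns named in this abstract are easy to make concrete. The sketch below is illustrative only: the statistics (70%, 10%) and the uncertain-evidence weights are invented, and Jeffrey's rule is shown in its standard textbook form, P'(B) = Σᵢ P(B | Aᵢ) · P'(Aᵢ).

```python
# Direct inference: from "70% of As are Bs" and "a is an A",
# assign probability 0.7 to "a is a B".
def direct_inference(p_b_given_a):
    return p_b_given_a

# Jeffrey's rule generalizes this to uncertain evidence about which
# cell of a partition the individual falls into:
#   P'(B) = sum_i P(B | A_i) * P'(A_i)
def jeffrey_update(p_b_given, new_partition_probs):
    return sum(p_b_given[a] * p for a, p in new_partition_probs.items())

assert direct_inference(0.7) == 0.7

# Invented background statistics: 70% of As are Bs, 10% of non-As are Bs.
p_b_given = {"A": 0.7, "not_A": 0.1}

# Uncertain observation: 80% sure the individual is an A.
print(jeffrey_update(p_b_given, {"A": 0.8, "not_A": 0.2}))  # 0.7*0.8 + 0.1*0.2 = 0.58
```

When the evidence becomes certain (P'(A) = 1), Jeffrey's rule collapses back to direct inference, which is exactly the generalization relationship the abstract describes.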
Probabilistic Algorithms in Robotics
Thrun, Sebastian
2000-01-01
This article describes a methodology for programming robots known as probabilistic robotics. The probabilistic paradigm pays tribute to the inherent uncertainty in robot perception, relying on explicit representations of uncertainty when determining what to do. This article surveys some of the progress in the field, using in-depth examples to illustrate some of the nuts and bolts of the basic approach. My central conjecture is that the probabilistic approach to robotics scales better to compl...
Probabilistic liver atlas construction
Dura, Esther; Domingo, Juan; Ayala, Guillermo; Marti-Bonmati, Luis; Goceri, E.
2017-01-01
Background Anatomical atlases are 3D volumes or shapes representing an organ or structure of the human body. They contain either the prototypical shape of the object of interest together with other shapes representing its statistical variations (statistical atlas) or a probability map of belonging to the object (probabilistic atlas). Probabilistic atlases are mostly built with simple estimations only involving the data at each spatial location. Results A new method for probabilistic atlas con...
Probabilistic Logical Characterization
DEFF Research Database (Denmark)
Hermanns, Holger; Parma, Augusto; Segala, Roberto;
2011-01-01
Probabilistic automata exhibit both probabilistic and non-deterministic choice. They are therefore a powerful semantic foundation for modeling concurrent systems with random phenomena arising in many applications ranging from artificial intelligence, security, and systems biology to performance modeling. Several variations of bisimulation and simulation relations have proved to be useful as means to abstract and compare different automata. This paper develops a taxonomy of logical characterizations of these relations on image-finite and image-infinite probabilistic automata....
Rodriguez, Esequiel; Weiss, Dana A; Ferretti, Max; Wang, Hong; Menshenia, Julia; Risbridger, Gail; Handelsman, David; Cunha, Gerald; Baskin, Laurence
2012-10-01
The objective of this study was to perform a comprehensive morphologic analysis of developing mouse external genitalia (ExG) and to determine specific sexual differentiation features that are responsive to androgens or estrogens. To eliminate sex steroid signaling postnatally, male and female mice were gonadectomized on the day of birth, and then injected intraperitoneally every other day with DES (200 ng/g), DHT (1 μg/g), or oil. On postnatal day 10, male and female ExG were dissected, fixed, embedded, serially sectioned and analyzed. We identified 10 sexually dimorphic anatomical features indicative of normal penile and clitoral differentiation in intact mice. Several (but not all) penile features were impaired or abolished as a result of neonatal castration. Those penile features remaining after neonatal castration were completely abolished, with attendant clitoral development, in androgen receptor (AR) mutant male mice (X(Tfm)/Y and X/Y AR-null) in which AR signaling is absent both pre- and postnatally. Administration of DHT to neonatally castrated males restored development of all 10 masculine features to almost normal levels. Neonatal ovariectomy of female mice had little effect on clitoral development, whereas treatment of ovariectomized female mice with DHT induced partial masculinization of the clitoris. Administration of DES to neonatally gonadectomized male and female mice elicited a spectrum of developmental abnormalities. These studies demonstrate that the presence or absence of androgen prenatally specifies penile versus clitoral identity. Differentiated penile features emerge postnatally and are sensitive to and dependent upon prenatal or pre- and postnatal androgen. Emergence of differentiated clitoral features occurs postnatally in either intact or ovariectomized females. It is likely that each penile and clitoral feature has a unique time-course of hormonal dependency/sensitivity.
Energy Technology Data Exchange (ETDEWEB)
Riyadi, Eko H., E-mail: e.riyadi@bapeten.go.id [Center for Regulatory Assessment of Nuclear Installation and Materials, Nuclear Energy Regulatory Agency (BAPETEN), Jl. Gajah Mada 8 Jakarta 10120 (Indonesia)
2014-09-30
An initiating event is defined as any event, either internal or external to the nuclear power plant (NPP), that perturbs the steady-state operation of the plant, if operating, thereby initiating an abnormal event such as a transient or a loss of coolant accident (LOCA) within the NPP. These initiating events trigger sequences of events that challenge plant control and safety systems, whose failure could potentially lead to core damage or a large early release. Selection of initiating events consists of two steps: first, definition of possible events, e.g., through a comprehensive engineering evaluation and the construction of a top-level logic model; second, grouping of identified initiating events by the safety function to be performed or by combinations of system responses. The purpose of this paper is therefore to discuss initiating event identification in the event tree development process and to review other probabilistic safety assessments (PSAs). The identification of initiating events also draws on past operating experience, review of other PSAs, failure mode and effects analysis (FMEA), feedback from system modeling, and master logic diagrams (a special type of fault tree). By applying in detail the categorization of the traditional US PSAs, the important initiating events can be identified and categorized into LOCAs, transients and external events.
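The two-step selection described above (define candidates, then group them by the required safety-function response) amounts to a simple classification. The sketch below is schematic: the event names and the three categories are illustrative placeholders, not drawn from any actual PSA.

```python
# Step 1 output: candidate initiating events, each tagged (here, by hand)
# with the response category it demands. All entries are invented examples.
candidate_events = [
    ("small-break LOCA", "LOCA"),
    ("large-break LOCA", "LOCA"),
    ("loss of offsite power", "transient"),
    ("turbine trip", "transient"),
    ("seismic event", "external"),
    ("external flooding", "external"),
]

# Step 2: group the identified initiating events by category, so that one
# event tree can be developed per group rather than per individual event.
def group_by_category(events):
    groups = {}
    for name, category in events:
        groups.setdefault(category, []).append(name)
    return groups

groups = group_by_category(candidate_events)
print(groups["external"])  # ['seismic event', 'external flooding']
```

Grouping is what keeps the downstream event tree analysis tractable: events in one group share initial and boundary conditions, so they can be bounded by a single representative sequence.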
Duplicate Detection in Probabilistic Data
Panse, Fabian; Keulen, Maurice van; Keijzer, Ander de; Ritter, Norbert
2009-01-01
Collected data often contains uncertainties. Probabilistic databases have been proposed to manage uncertain data. To combine data from multiple autonomous probabilistic databases, an integration of probabilistic data has to be performed. Until now, however, data integration approaches have focused o
Energy Technology Data Exchange (ETDEWEB)
Loeffler, Horst; Kowalik, Michael; Mildenberger, Oliver; Hage, Michael
2016-06-15
The work documented here provides the methodological basis for improving the state of knowledge on accident sequences following plant-external initiating events and on accident sequences that begin in the shutdown state. The analyses have been performed for a PWR and for a BWR reference plant. The work has been supported by the German federal ministry BMUB under label 3612R01361. The top objectives of the work are: - Identify relevant event sequences in order to define characteristic initial and boundary conditions - Perform accident analyses of selected sequences - Evaluate the relevance of accident sequences in a qualitative way The accident analysis is performed with the code MELCOR 1.8.6. The applied input data set has been significantly improved compared with previous analyses. The event tree method established in PSA level 2 has been applied to create a structure for a unified summarization and evaluation of the results of the accident analyses. The computer code EVNTRE has been applied for this purpose. In contrast to a PSA level 2, the branching probabilities of the event tree have not been determined with the usual accuracy, but are given only approximately. For the PWR, the analyses show a considerable protective effect of the containment, also in the case of beyond-design events. For the BWR, there is a rather high probability of containment failure under core melt impact, but the release of radionuclides into the environment is nevertheless very limited because of plant-internal retention mechanisms. The report concludes with remarks on existing knowledge gaps with regard to core melt sequences and on possible improvements to plant safety.
Directory of Open Access Journals (Sweden)
Nicholas Joseph Matzke
2013-12-01
Full Text Available Historical biogeography has been characterized by a large diversity of methods and unresolved debates about which processes, such as dispersal or vicariance, are most important for explaining distributions. A new R package, BioGeoBEARS, implements many models in a common likelihood framework, so that standard statistical model selection procedures can be applied to let the data choose the best model. Available models include a likelihood version of DIVA (“DIVALIKE”), LAGRANGE’s DEC model, and BAYAREA, as well as “+J” versions of these models, which include founder-event speciation, an important process left out of most inference methods. I use BioGeoBEARS on a large sample of island and non-island clades (including two fossil clades) to show that founder-event speciation is a crucial process in almost every clade, and that most published datasets reject the non-J models currently in widespread use. BioGeoBEARS is open-source and freely available for installation at the Comprehensive R Archive Network at http://CRAN.R-project.org/package=BioGeoBEARS. A step-by-step tutorial is available at http://phylo.wikidot.com/biogeobears.
Probabilistic Dynamic Epistemic Logic
Kooi, B.P.
2003-01-01
In this paper I combine the dynamic epistemic logic of Gerbrandy (1999) with the probabilistic logic of Fagin and Halpern (1999). The result is a new probabilistic dynamic epistemic logic, a logic for reasoning about probability, information, and information change that takes higher order informatio
DEFF Research Database (Denmark)
Larsen, Kim Guldstrand; Mardare, Radu Iulian; Xue, Bingtian
2016-01-01
We introduce a version of the probabilistic µ-calculus (PMC) built on top of a probabilistic modal logic that allows encoding n-ary inequational conditions on transition probabilities. PMC extends previously studied calculi and we prove that, despite its expressiveness, it enjoys a series of good...
Refinement for Probabilistic Systems with Nondeterminism
Directory of Open Access Journals (Sweden)
David Streader
2011-06-01
Full Text Available Before we combine actions and probabilities, two very obvious questions should be asked. Firstly, what does "the probability of an action" mean? Secondly, how does probability interact with nondeterminism? Neither question has a single universally agreed upon answer, but by considering these questions at the outset we build a novel and hopefully intuitive probabilistic event-based formalism. In previous work we have characterised refinement via the notion of testing. Basically, if one system passes all the tests that another system passes (and maybe more), we say the first system is a refinement of the second. This is, in our view, an important way of characterising refinement, as it answers the question "what sort of refinement should I be using?" We use testing in this paper as the basis for our refinement. We develop tests for probabilistic systems by analogy with the tests developed for non-probabilistic systems. We make sure that our probabilistic tests, when performed on non-probabilistic automata, give us refinement relations which agree with those for non-probabilistic automata. We formalise this property as a vertical refinement.
Probabilistic Structural Analysis Program
Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.
2010-01-01
NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and lifing (life-prediction) methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include non-linear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available, such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.
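The core quantity such a tool computes, a probability of failure under uncertain inputs, can be illustrated with a deliberately naive Monte Carlo sketch. All distribution parameters below are invented, and NESSUS itself uses far more efficient algorithms (advanced mean value, adaptive importance sampling) than this brute-force sampling.

```python
import numpy as np

# Toy limit state: g = capacity - load; failure when g < 0.
# Capacity and load are uncertain (normal distributions with invented parameters).
rng = np.random.default_rng(42)
n = 100_000
capacity = rng.normal(10.0, 1.0, n)   # e.g., uncertain material strength
load = rng.normal(7.0, 1.5, n)        # e.g., uncertain applied load

# Crude Monte Carlo estimate of the probability of failure P(g < 0).
p_failure = float(np.mean(capacity - load < 0.0))
print(round(p_failure, 3))
```

Since capacity − load here is itself normal (mean 3, standard deviation √3.25 ≈ 1.80), the exact answer is Φ(−3/1.80) ≈ 0.048, which the sample estimate should approach; methods like importance sampling exist precisely because direct sampling of this kind becomes prohibitive when failures are rare.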
Exact and approximate probabilistic symbolic execution for nondeterministic programs
DEFF Research Database (Denmark)
Luckow, Kasper Søe; Păsăreanu, Corina S.; Dwyer, Matthew B.
2014-01-01
Probabilistic software analysis seeks to quantify the likelihood of reaching a target event under uncertain environments. Recent approaches compute probabilities of execution paths using symbolic execution, but do not support nondeterminism. Nondeterminism arises naturally when no suitable...
Probabilistic transmission system planning
Li, Wenyuan
2011-01-01
"The book is composed of 12 chapters and three appendices, and can be divided into four parts. The first part includes Chapters 2 to 7, which discuss the concepts, models, methods and data in probabilistic transmission planning. The second part, Chapters 8 to 11, addresses four essential issues in probabilistic transmission planning applications using actual utility systems as examples. Chapter 12, as the third part, focuses on a special issue, i.e. how to deal with uncertainty of data in probabilistic transmission planning. The fourth part consists of three appendices, which provide the basic knowledge in mathematics for probabilistic planning. Please refer to the attached table of contents which is given in a very detailed manner"--
Conditioning Probabilistic Databases
Koch, Christoph
2008-01-01
Past research on probabilistic databases has studied the problem of answering queries on a static database. Application scenarios of probabilistic databases however often involve the conditioning of a database using additional information in the form of new evidence. The conditioning problem is thus to transform a probabilistic database of priors into a posterior probabilistic database which is materialized for subsequent query processing or further refinement. It turns out that the conditioning problem is closely related to the problem of computing exact tuple confidence values. It is known that exact confidence computation is an NP-hard problem. This has led researchers to consider approximation techniques for confidence computation. However, neither conditioning nor exact confidence computation can be solved using such techniques. In this paper we present efficient techniques for both problems. We study several problem decomposition methods and heuristics that are based on the most successful search techn...
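The conditioning problem described above can be made concrete with a tiny possible-worlds model. This is a brute-force sketch over invented tuples, exactly the exponential enumeration whose cost motivates the paper's decomposition techniques: each uncertain tuple is present independently with its prior confidence, and conditioning on evidence removes the violating worlds and renormalizes.

```python
from itertools import product

# Prior tuple confidences of two independent uncertain tuples (invented).
priors = {"t1": 0.6, "t2": 0.5}

def worlds(priors):
    """Enumerate all possible worlds with their probabilities."""
    for presence in product([True, False], repeat=len(priors)):
        world = dict(zip(priors, presence))
        p = 1.0
        for t, present in world.items():
            p *= priors[t] if present else 1 - priors[t]
        yield world, p

def condition(priors, evidence):
    """Posterior tuple confidences given a predicate on worlds."""
    kept = [(w, p) for w, p in worlds(priors) if evidence(w)]
    z = sum(p for _, p in kept)  # probability mass of the evidence
    return {t: sum(p for w, p in kept if w[t]) / z for t in priors}

# New evidence: at least one of the two tuples is really in the database.
post = condition(priors, lambda w: w["t1"] or w["t2"])
print(post)  # t1's confidence rises from 0.6 to 0.75 once the empty world is ruled out
```

The posterior database materialized this way can then answer subsequent queries directly; the hard part, which this sketch sidesteps, is doing the same without enumerating all 2ⁿ worlds.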
Probabilistic Belief Logic and Its Probabilistic Aumann Semantics
Institute of Scientific and Technical Information of China (English)
CAO ZiNing(曹子宁); SHI ChunYi(石纯一)
2003-01-01
In this paper, we present a logic system for probabilistic belief named PBL, which expands the language of belief logic by introducing probabilistic belief. Furthermore, we give the probabilistic Aumann semantics of PBL. We also list some valid properties of belief and probabilistic belief, which form the deduction system of PBL. Finally, we prove the soundness and completeness of these properties with respect to probabilistic Aumann semantics.
Formalizing Probabilistic Safety Claims
Herencia-Zapana, Heber; Hagen, George E.; Narkawicz, Anthony J.
2011-01-01
A safety claim for a system is a statement that the system, which is subject to hazardous conditions, satisfies a given set of properties. Following work by John Rushby and Bev Littlewood, this paper presents a mathematical framework that can be used to state and formally prove probabilistic safety claims. It also enables hazardous conditions, their uncertainties, and their interactions to be integrated into the safety claim. This framework provides a formal description of the probabilistic composition of an arbitrary number of hazardous conditions and their effects on system behavior. An example is given of a probabilistic safety claim for a conflict detection algorithm for aircraft in a 2D airspace. The motivation for developing this mathematical framework is that it can be used in an automated theorem prover to formally verify safety claims.
Sari, Dwi Ivayana; Budayasa, I. Ketut; Juniati, Dwi
2017-08-01
Formulation of mathematical learning goals is now oriented not only toward cognitive products but also toward cognitive processes, one of which is probabilistic thinking. Probabilistic thinking is needed by students to make decisions. Elementary school students are required to develop probabilistic thinking as a foundation for learning probability at higher levels. A framework of students' probabilistic thinking had been developed using the SOLO taxonomy, consisting of prestructural, unistructural, multistructural and relational probabilistic thinking. This study aimed to analyze probability task completion based on this taxonomy of probabilistic thinking. The subjects were two fifth-grade students, a boy and a girl, selected by a test of mathematical ability on the basis of high math ability. The subjects were given probability tasks on sample space, probability of an event and probability comparison. The data analysis consisted of categorization, reduction, interpretation and conclusion; credibility of the data was established by time triangulation. The results showed that the boy's probabilistic thinking in completing the probability tasks was at the multistructural level, while the girl's was at the unistructural level, indicating that the boy's level of probabilistic thinking was higher than the girl's. The results could help curriculum developers formulate probability learning goals for elementary school students; indeed, teachers could teach probability with regard to gender differences.
Probabilistic approach to mechanisms
Sandler, BZ
1984-01-01
This book discusses the application of probabilistics to the investigation of mechanical systems. The book shows, for example, how random function theory can be applied directly to the investigation of random processes in the deflection of cam profiles, pitch or gear teeth, pressure in pipes, etc. The author also deals with some other technical applications of probabilistic theory, including, amongst others, those relating to pneumatic and hydraulic mechanisms and roller bearings. Many of the aspects are illustrated by examples of applications of the techniques under discussion.
Probabilistic conditional independence structures
Studeny, Milan
2005-01-01
Probabilistic Conditional Independence Structures provides the mathematical description of probabilistic conditional independence structures; the author uses non-graphical methods of their description, and takes an algebraic approach.The monograph presents the methods of structural imsets and supermodular functions, and deals with independence implication and equivalence of structural imsets.Motivation, mathematical foundations and areas of application are included, and a rough overview of graphical methods is also given.In particular, the author has been careful to use suitable terminology, and presents the work so that it will be understood by both statisticians, and by researchers in artificial intelligence.The necessary elementary mathematical notions are recalled in an appendix.
A Probabilistic Analysis of the Sacco and Vanzetti Evidence
Kadane, Joseph B
2011-01-01
A Probabilistic Analysis of the Sacco and Vanzetti Evidence is a Bayesian analysis of the trial and post-trial evidence in the Sacco and Vanzetti case, based on subjectively determined probabilities and assumed relationships among evidential events. It applies the ideas of charting evidence and probabilistic assessment to this case, which is perhaps the ranking cause celebre in all of American legal history. Modern computation methods applied to inference networks are used to show how the inferential force of evidence in a complicated case can be graded. The authors employ probabilistic assess
Anita, G.; Selva, J.; Laura, S.
2011-12-01
We develop a comprehensive and total probabilistic tsunami hazard assessment (TotPTHA), in which many different possible source types concur in defining the total tsunami hazard at given target sites. In a multi-hazard and multi-risk perspective, such an innovative approach makes it possible, in principle, to consider all potential tsunamigenic sources, from seismic events to slides, asteroid impacts, volcanic eruptions, etc. In this respect, we also formally introduce and discuss the treatment of interaction/cascade effects in the TotPTHA analysis. We demonstrate how external triggering events may induce significant temporary variations in the tsunami hazard. Because of this, such effects should always be considered, at least in short-term applications, to obtain unbiased analyses. Finally, we prove the feasibility of the TotPTHA and of the treatment of interaction/cascade effects by applying this methodology to an ideal region with realistic characteristics (Neverland).
A simulation model for probabilistic analysis of Space Shuttle abort modes
Hage, R. T.
1993-01-01
A simulation model which was developed to provide a probabilistic analysis tool to study the various space transportation system abort mode situations is presented. The simulation model is based on Monte Carlo simulation of an event-tree diagram which accounts for events during the space transportation system's ascent and its abort modes. The simulation model considers only the propulsion elements of the shuttle system (i.e., external tank, main engines, and solid boosters). The model was developed to provide a better understanding of the probability of occurrence and successful completion of abort modes during the vehicle's ascent. The results of the simulation runs discussed are for demonstration purposes only; they are not official NASA probability estimates.
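A Monte Carlo walk over an ascent event tree can be sketched in miniature. The branch structure and every probability below are invented placeholders (and, like the abstract's own results, emphatically not NASA estimates); the point is only the mechanic of sampling branch outcomes and tallying end states.

```python
import random

# Toy two-level event tree for one ascent: did a main engine fail, and if so,
# was an abort-to-orbit window still open? All probabilities are invented.
def one_ascent(rng):
    if rng.random() < 0.02:        # branch 1: main engine failure during ascent
        if rng.random() < 0.8:     # branch 2: abort-to-orbit window still open
            return "abort_to_orbit"
        return "abort_landing"
    return "nominal"

rng = random.Random(1)
n = 50_000
outcomes = [one_ascent(rng) for _ in range(n)]
p_nominal = outcomes.count("nominal") / n
print(round(p_nominal, 3))  # should settle near 1 - 0.02 = 0.98
```

A real model of this kind would carry many more branches (per engine, per abort mode, per ascent phase) and time-dependent failure rates, but the estimate is still just a relative frequency over sampled tree traversals.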
Brunswikian resources for event-perception research.
Kirlik, Alex
2009-01-01
Recent psychological research aimed at determining whether dynamic event perception is direct or mediated by cue-based inference convincingly demonstrates evidence of both modes of perception or apprehension. This work also shows that noise is involved in attaining any perceptual variable, whether it perfectly (invariantly) specifies or imperfectly (fallibly) indicates the value of a target or criterion variable. As such, event-perception researchers encounter both internal (sensory or inferential) and external ecological sources of noise or uncertainty, owing to the organism's possible use of imperfect or 'nonspecifying' variables (or cues) and cue-based inference. Because both sources play central roles in Egon Brunswik's theory of probabilistic functionalism and methodology of representative design, event-perception research will benefit by explicitly leveraging original Brunswikian and, more recent, neo-Brunswikian scientific resources. Doing so will result in a more coherent and powerful approach to perceptual and cognitive psychology than is currently displayed in the scientific literature.
Exact and Approximate Probabilistic Symbolic Execution
Luckow, Kasper; Pasareanu, Corina S.; Dwyer, Matthew B.; Filieri, Antonio; Visser, Willem
2014-01-01
Probabilistic software analysis seeks to quantify the likelihood of reaching a target event under uncertain environments. Recent approaches compute probabilities of execution paths using symbolic execution, but do not support nondeterminism. Nondeterminism arises naturally when no suitable probabilistic model can capture a program behavior, e.g., for multithreading or distributed systems. In this work, we propose a technique, based on symbolic execution, to synthesize schedulers that resolve nondeterminism to maximize the probability of reaching a target event. To scale to large systems, we also introduce approximate algorithms to search for good schedulers, speeding up established random sampling and reinforcement learning results through the quantification of path probabilities based on symbolic execution. We implemented the techniques in Symbolic PathFinder and evaluated them on nondeterministic Java programs. We show that our algorithms significantly improve upon a state-of-the-art statistical model checking algorithm, originally developed for Markov Decision Processes.
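The scheduler-synthesis idea reduces, at a single choice point, to picking the nondeterministic branch with the highest probability of reaching the target. The sketch below is a drastically simplified analogue: the action names and branch probabilities are invented, and a real synthesizer derives these probabilities from symbolic path conditions rather than taking them as given.

```python
# Each nondeterministic action leads to weighted outcomes: (probability, reached-target?).
# In the paper's setting these weights come from symbolic execution of path conditions.
choices = {
    "schedule_thread_A": [(0.6, True), (0.4, False)],
    "schedule_thread_B": [(0.3, True), (0.7, False)],
}

def reach_probability(action):
    """Probability of reaching the target event if this action is chosen."""
    return sum(p for p, hit in choices[action] if hit)

# The synthesized scheduler resolves the nondeterminism by maximizing reachability.
best = max(choices, key=reach_probability)
print(best, reach_probability(best))
```

Over a whole program this maximization is applied recursively at every choice point, which is exactly the optimal-policy computation for a Markov decision process; the approximate algorithms in the paper exist because doing it exactly does not scale.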
Energy Technology Data Exchange (ETDEWEB)
Kwag, Shinyoung [North Carolina State University, Raleigh, NC 27695 (United States); Korea Atomic Energy Research Institute, Daejeon 305-353 (Korea, Republic of); Gupta, Abhinav, E-mail: agupta1@ncsu.edu [North Carolina State University, Raleigh, NC 27695 (United States)
2017-04-15
Highlights: • This study presents the development of a Bayesian framework for probabilistic risk assessment (PRA) of structural systems under multiple hazards. • The concepts of Bayesian networks and Bayesian inference are combined by mapping the traditionally used fault trees into a Bayesian network. • The proposed mapping allows for consideration of dependencies as well as correlations between events. • Incorporation of Bayesian inference permits a novel way to explore scenarios that are likely to result in a system-level “vulnerability.” - Abstract: Conventional probabilistic risk assessment (PRA) methodologies (USNRC, 1983; IAEA, 1992; EPRI, 1994; Ellingwood, 2001) conduct risk assessment for different external hazards by considering each hazard separately and independently of the others. The risk metric for a specific hazard is evaluated by a convolution of the fragility and the hazard curves. The fragility curve for a basic event is obtained by using empirical, experimental, and/or numerical simulation data for a particular hazard. Treating each hazard independently can be inappropriate in some cases, as certain hazards are statistically correlated or dependent. Examples of such correlated events include, but are not limited to, flooding-induced fire, seismically induced internal or external flooding, or even seismically induced fire. In current practice, system-level risk and consequence sequences are typically calculated using logic trees to express the causative relationships between events. In this paper, we present the results from a study on multi-hazard risk assessment that is conducted using a Bayesian network (BN) with Bayesian inference. The framework can consider statistical dependencies among risks from multiple hazards, allows updating by considering newly available data/information at any level, and provides a novel way to explore alternative failure scenarios that may exist due to vulnerabilities.
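The fault-tree-to-BN mapping rests on the fact that, for independent basic events, gate probabilities follow directly from probability arithmetic (an AND gate multiplies probabilities; an OR gate combines them as one minus the product of complements); a Bayesian network then generalizes this to dependent events. A minimal sketch of the independent baseline, with hypothetical hazard names and probabilities:

```python
from math import prod  # Python 3.8+

def gate_prob(node, basic):
    """node: ('basic', name) | ('and', [children]) | ('or', [children]).
    basic: dict of independent basic-event probabilities."""
    kind = node[0]
    if kind == 'basic':
        return basic[node[1]]
    probs = [gate_prob(child, basic) for child in node[1]]
    if kind == 'and':
        return prod(probs)
    # 'or' gate: complement of every child failing to occur
    return 1.0 - prod(1.0 - p for p in probs)
```

A BN encoding replaces the independence assumption with conditional probability tables, which is what allows correlated hazards (e.g., seismically induced flooding) to be represented.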
Probabilistic Causation without Probability.
Holland, Paul W.
The failure of Hume's "constant conjunction" to describe apparently causal relations in science and everyday life has led to various "probabilistic" theories of causation of which the study by P. C. Suppes (1970) is an important example. A formal model that was developed for the analysis of comparative agricultural experiments…
Probabilistic simple sticker systems
Selvarajoo, Mathuri; Heng, Fong Wan; Sarmin, Nor Haniza; Turaev, Sherzod
2017-04-01
A model for DNA computing using the recombination behavior of DNA molecules, known as a sticker system, was introduced by L. Kari, G. Paun, G. Rozenberg, A. Salomaa, and S. Yu in the paper "DNA computing, sticker systems and universality" (Acta Informatica, vol. 35, pp. 401-420, 1998). A sticker system uses the Watson-Crick complementarity of DNA molecules: starting from incomplete double-stranded sequences, sticking operations are applied iteratively until a complete double-stranded sequence is obtained. It is known that sticker systems with finite sets of axioms and sticker rules generate only regular languages. Hence, different types of restrictions have been considered to increase the computational power of sticker systems. Recently, a variant of restricted sticker systems, called probabilistic sticker systems, has been introduced [4]. In this variant, probabilities are initially associated with the axioms, and the probability of a generated string is computed by multiplying the probabilities of all occurrences of the initial strings in the computation of the string. Strings are selected for the language according to probabilistic requirements. In this paper, we study fundamental properties of probabilistic simple sticker systems. We prove that the probabilistic enhancement increases the computational power of simple sticker systems.
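The probability rule quoted above (multiply the probabilities of all occurrences of the initial strings used in a computation) can be sketched directly; the axiom names, probabilities, and derivation below are hypothetical, not an example from the paper:

```python
from math import prod  # Python 3.8+

def string_probability(axiom_probs, derivation):
    """axiom_probs: axiom name -> initial probability.
    derivation: list of axiom names used in the computation, with repetition.
    Returns the product over all occurrences."""
    return prod(axiom_probs[a] for a in derivation)
```

A threshold or cut-point on this value is one way the "probabilistic requirements" can then select which generated strings belong to the language.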
Probabilistic parsing strategies
Nederhof, Mark-Jan; Satta, Giorgio
We present new results on the relation between purely symbolic context-free parsing strategies and their probabilistic counterparts. Such parsing strategies are seen as constructions of push-down devices from grammars. We show that preservation of probability distribution is possible under two
Bergstra, J.A.; Middelburg, C.A.
2015-01-01
We add probabilistic features to basic thread algebra and its extensions with thread-service interaction and strategic interleaving. Here, threads represent the behaviours produced by instruction sequences under execution and services represent the behaviours exhibited by the components of execution
DEFF Research Database (Denmark)
Chen, Peiyuan; Chen, Zhe; Bak-Jensen, Birgitte
2008-01-01
This paper reviews the development of the probabilistic load flow (PLF) techniques. Applications of the PLF techniques in different areas of power system steady-state analysis are also discussed. The purpose of the review is to identify different available PLF techniques and their corresponding...
Probabilistic dynamic belief revision
Baltag, A.; Smets, S.
2008-01-01
We investigate the discrete (finite) case of the Popper-Renyi theory of conditional probability, introducing discrete conditional probabilistic models for knowledge and conditional belief, and comparing them with the more standard plausibility models. We also consider a related notion, that of safe
Progress for the Industry Application External Hazard Analyses Early Demonstration
Energy Technology Data Exchange (ETDEWEB)
Smith, Curtis L. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Prescott, Steven [Idaho National Lab. (INL), Idaho Falls, ID (United States); Coleman, Justin [Idaho National Lab. (INL), Idaho Falls, ID (United States); Ryan, Emerald [Idaho State Univ., Pocatello, ID (United States); Bhandari, Bishwo [Idaho State Univ., Pocatello, ID (United States); Sludern, Daniel [Idaho State Univ., Pocatello, ID (United States); Pope, Chad [Idaho State Univ., Pocatello, ID (United States); Sampath, Ram [Centroid PIC, Idaho Falls, ID (United States)
2015-09-01
This report describes the current progress and status of Industry Application #2, focusing on External Hazards. For this industry application within the Light Water Reactor Sustainability (LWRS) Program Risk-Informed Safety Margin Characterization (RISMC) R&D Pathway, we will create the Risk-Informed Margin Management (RIMM) approach to represent meaningful (i.e., realistic facility representation) event scenarios and consequences by using an advanced 3D facility representation that will evaluate external hazards such as flooding and earthquakes in order to: identify, model, and analyze the appropriate physics that need to be included to determine plant vulnerabilities related to external events; manage the communication and interactions between different physics modeling and analysis technologies; and develop the computational infrastructure through tools related to plant representation, scenario depiction, and physics prediction. One of the unique aspects of the RISMC approach is how it couples probabilistic approaches (the scenario) with mechanistic phenomena representation (the physics) through simulation. This simulation-based modeling allows decision makers to focus on a variety of safety, performance, or economic metrics. In this report, we describe the evaluation of various physics toolkits related to flooding representation. Ultimately, we will couple the flooding representation with other events such as earthquakes in order to provide coupled physics analysis for scenarios where interactions exist.
Probabilistic authenticated quantum dialogue
Hwang, Tzonelih; Luo, Yi-Ping
2015-12-01
This work proposes a probabilistic authenticated quantum dialogue (PAQD) based on Bell states with the following notable features. (1) In our proposed scheme, the dialogue is encoded in a probabilistic way, i.e., the same messages can be encoded into different quantum states, whereas in the state-of-the-art authenticated quantum dialogue (AQD), the dialogue is encoded in a deterministic way; (2) the pre-shared secret key between the two communicants can be reused without any security loophole; (3) each dialogue in the proposed PAQD can be exchanged within only one step of quantum communication and one step of classical communication, whereas in the state-of-the-art AQD protocols, both communicants have to run a QKD protocol for each dialogue, and each dialogue requires multiple quantum as well as classical communication steps; (4) moreover, the proposed scheme can resist the man-in-the-middle attack, the modification attack, and other well-known attacks.
Probabilistic approaches to recommendations
Barbieri, Nicola; Ritacco, Ettore
2014-01-01
The importance of accurate recommender systems has been widely recognized by academia and industry, and recommendation is rapidly becoming one of the most successful applications of data mining and machine learning. Understanding and predicting the choices and preferences of users is a challenging task: real-world scenarios involve users behaving in complex situations, where prior beliefs, specific tendencies, and reciprocal influences jointly contribute to determining the preferences of users toward huge amounts of information, services, and products. Probabilistic modeling represents a robust…
Geothermal probabilistic cost study
Energy Technology Data Exchange (ETDEWEB)
Orren, L.H.; Ziman, G.M.; Jones, S.C.; Lee, T.K.; Noll, R.; Wilde, L.; Sadanand, V.
1981-08-01
A tool is presented to quantify the risks of geothermal projects, the Geothermal Probabilistic Cost Model (GPCM). The GPCM model is used to evaluate a geothermal reservoir for a binary-cycle electric plant at Heber, California. Three institutional aspects of the geothermal risk which can shift the risk among different agents are analyzed. The leasing of geothermal land, contracting between the producer and the user of the geothermal heat, and insurance against faulty performance are examined. (MHR)
On probabilistic Mandelbrot maps
Energy Technology Data Exchange (ETDEWEB)
Andreadis, Ioannis [International School of The Hague, Wijndaelerduin 1, 2554 BX The Hague (Netherlands)], E-mail: i.andreadis@ish-rijnlandslyceum.nl; Karakasidis, Theodoros E. [Department of Civil Engineering, University of Thessaly, GR-38334 Volos (Greece)], E-mail: thkarak@uth.gr
2009-11-15
In this work, we propose a definition for a probabilistic Mandelbrot map in order to extend and support the study initiated by Argyris et al. [Argyris J, Andreadis I, Karakasidis Th. On perturbations of the Mandelbrot map. Chaos, Solitons and Fractals 2000;11:1131-1136.] with regard to the numerical stability of the Mandelbrot and Julia set of the Mandelbrot map when subjected to noise.
Probabilistic thinking and death anxiety: a terror management based study.
Hayslip, Bert; Schuler, Eric R; Page, Kyle S; Carver, Kellye S
2014-01-01
Terror Management Theory has been utilized to understand how death can change behavioral outcomes and social dynamics. One area that is not well researched is why individuals willingly engage in risky behavior that could accelerate their mortality. One method of distancing a potential life threatening outcome when engaging in risky behaviors is through stacking probability in favor of the event not occurring, termed probabilistic thinking. The present study examines the creation and psychometric properties of the Probabilistic Thinking scale in a sample of young, middle aged, and older adults (n = 472). The scale demonstrated adequate internal consistency reliability for each of the four subscales, excellent overall internal consistency, and good construct validity regarding relationships with measures of death anxiety. Reliable age and gender effects in probabilistic thinking were also observed. The relationship of probabilistic thinking as part of a cultural buffer against death anxiety is discussed, as well as its implications for Terror Management research.
Probabilistic Seismic Hazard Analysis for Yemen
Directory of Open Access Journals (Sweden)
Rakesh Mohindra
2012-01-01
A stochastic-event probabilistic seismic hazard model, which can be used further for estimates of seismic loss and seismic risk analysis, has been developed for the territory of Yemen. An updated composite earthquake catalogue has been compiled using the databases from two basic sources and several research publications. The spatial distribution of earthquakes from the catalogue was used to define and characterize the regional earthquake source zones for Yemen. To capture all possible scenarios in the seismic hazard model, a stochastic event set has been created consisting of 15,986 events generated from 1,583 fault segments in the delineated seismic source zones. Distribution of horizontal peak ground acceleration (PGA) was calculated for all stochastic events considering epistemic uncertainty in ground-motion modeling using three suitable ground-motion prediction relationships, which were applied with equal weight. The probabilistic seismic hazard maps were created showing PGA and MSK seismic intensity at 10% and 50% probability of exceedance in 50 years, considering local soil site conditions. The resulting PGA for 10% probability of exceedance in 50 years (return period 475 years) ranges from 0.2 g to 0.3 g in western Yemen and generally is less than 0.05 g across central and eastern Yemen. The largest contributors to Yemen’s seismic hazard are the events from the West Arabian Shield seismic zone.
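Under the usual Poisson occurrence assumption, the exceedance probabilities quoted above convert to return periods as T = -t / ln(1 - p); a quick check (not taken from the paper itself) reproduces the conventional 475-year figure for 10% in 50 years.

```python
import math

def return_period(p_exceed, t_years):
    """Return period for exceedance probability p_exceed over t_years,
    assuming Poisson (memoryless) event occurrence."""
    return -t_years / math.log(1.0 - p_exceed)
```

The same function gives roughly a 72-year return period for the 50%-in-50-years maps.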
Probabilistic Modeling of Timber Structures
DEFF Research Database (Denmark)
Köhler, J.D.; Sørensen, John Dalsgaard; Faber, Michael Havbro
2005-01-01
The present paper contains a proposal for the probabilistic modeling of timber material properties. It is produced in the context of the Probabilistic Model Code (PMC) of the Joint Committee on Structural Safety (JCSS) and of the COST action E24 'Reliability of Timber Structures'. The proposed probabilistic model for these basic properties is presented, and possible refinements are given related to updating of the probabilistic model given new information, modeling of the spatial variation of strength properties, and the duration of load effects.
Review of the Diablo Canyon probabilistic risk assessment
Energy Technology Data Exchange (ETDEWEB)
Bozoki, G.E.; Fitzpatrick, R.G.; Bohn, M.P. [Sandia National Lab., Albuquerque, NM (United States); Sabek, M.G. [Atomic Energy Authority, Nuclear Regulatory and Safety Center, Cairo (Egypt); Ravindra, M.K.; Johnson, J.J. [EQE Engineering, San Francisco, CA (United States)
1994-08-01
This report details the review of the Diablo Canyon Probabilistic Risk Assessment (DCPRA). The study was performed by Brookhaven National Laboratory under contract from the Probabilistic Risk Analysis Branch, Office of Nuclear Reactor Research, USNRC. The DCPRA is a full-scope Level I effort, and although the review touched on all aspects of the PRA, the internal events and seismic events received the vast majority of the review effort. The report includes a number of independent systems analyses, sensitivity studies, and importance analyses, as well as conclusions on the adequacy of the DCPRA for use in the Diablo Canyon Long Term Seismic Program.
Probabilistic Abductive Logic Programming in Constraint Handling Rules
DEFF Research Database (Denmark)
Christiansen, Henning
A class of Probabilistic Abductive Logic Programs (PALPs) is introduced and an implementation is developed in CHR for solving abductive problems, providing minimal explanations with their probabilities. Both all-explanations and most-probable-explanations versions are given. Compared with other probabilistic versions of abductive logic programming, the approach is characterized by higher generality and a flexible and adaptable architecture which incorporates integrity constraints and interaction with external constraint solvers. A PALP is translated in a systematic way into a CHR program which…
Implementing Probabilistic Abductive Logic Programming with Constraint Handling Rules
DEFF Research Database (Denmark)
Christiansen, Henning
2008-01-01
A class of Probabilistic Abductive Logic Programs (PALPs) is introduced and an implementation is developed in CHR for solving abductive problems, providing minimal explanations with their probabilities. Both all-explanations and most-probable-explanations versions are given. Compared with other probabilistic versions of abductive logic programming, the approach is characterized by higher generality and a flexible and adaptable architecture which incorporates integrity constraints and interaction with external constraint solvers. A PALP is transformed in a systematic way into a CHR program which serves…
Storing and Querying Probabilistic XML Using a Probabilistic Relational DBMS
Hollander, E.S.; van Keulen, M.
2010-01-01
This work explores the feasibility of storing and querying probabilistic XML in a probabilistic relational database. Our approach is to adapt known techniques for mapping XML to relational data such that the possible worlds are preserved. We show that this approach can work for any XML-to-relational
Quantum probability for probabilists
Meyer, Paul-André
1993-01-01
In recent years, the classical theory of stochastic integration and stochastic differential equations has been extended to a non-commutative set-up to develop models for quantum noises. The author, a specialist in classical stochastic calculus and martingale theory, tries to provide an introduction to this rapidly expanding field in a way which should be accessible to probabilists familiar with the Ito integral. It can also, on the other hand, provide a means of access to the methods of stochastic calculus for physicists familiar with Fock space analysis.
Learning Probabilistic Decision Graphs
DEFF Research Database (Denmark)
Jaeger, Manfred; Dalgaard, Jens; Silander, Tomi
2004-01-01
Probabilistic decision graphs (PDGs) are a representation language for probability distributions based on binary decision diagrams. PDGs can encode (context-specific) independence relations that cannot be captured in a Bayesian network structure, and can sometimes provide computationally more efficient representations than Bayesian networks. In this paper we present an algorithm for learning PDGs from data. First experiments show that the algorithm is capable of learning optimal PDG representations in some cases, and that the computational efficiency of PDG models learned from real-life data…
Post-Fukushima Probabilistic Safety Enhancements of Industry
Energy Technology Data Exchange (ETDEWEB)
Na, Janghwan; Jeon, Ho-Jun; Lee, Hyun-Gyo [Korea Hydro and Nuclear Power Co. Ltd. Central Research Institute, Daejeon (Korea, Republic of)
2015-05-15
The nuclear-concerned public as well as the regulatory agency of Korea asked that several safety measures be added to the existing safety principles. These measures include the post-Fukushima near-term action items and several mid- to long-term obligations for severe accidents and rare external hazards that had been disregarded due to their low event probabilities. This paper illustrates some activities being done or planned from the viewpoint of probabilistic assessment boundaries: 1) items currently performed by industry, 2) regulatory measures which will impact the industry activities, and 3) activities planned on a mid- to long-term basis. After the Fukushima accident, the significance of severe accidents and PSA became apparent to the public as well as to the industry itself. Among fifty safety-related plans, in this paper, we show the implementation strategies and interim insights from LPSD PSA. The plans or activities now underway further enhance safety for operating plants by introducing PSR, and for plants under construction by including PSA insights in the SAR. The main focus for safety improvement is targeted not only at hardware improvement, but also at systematic structure and effective operational improvement. The results of the LPSD PSA implementation strategy will contribute to conformance with regulatory requirements and the legislation of PSA, which requests the application of an extended scope of analysis, new methodology, PSA quality, and living PSA through technically sound and application-specific PSA models.
A General Framework for Probabilistic Characterizing Formulae
DEFF Research Database (Denmark)
Sack, Joshua; Zhang, Lijun
2012-01-01
Recently, a general framework on characteristic formulae was proposed by Aceto et al. It offers a simple theory that allows one to easily obtain characteristic formulae of many non-probabilistic behavioral relations. Our paper studies their techniques in a probabilistic setting. We provide a general method for determining characteristic formulae of behavioral relations for probabilistic automata using fixed-point probability logics. We consider such behavioral relations as simulations and bisimulations, probabilistic bisimulations, probabilistic weak simulations, and probabilistic forward…
Probabilistic population aging
2017-01-01
We merge two methodologies, prospective measures of population aging and probabilistic population forecasts. We compare the speed of change and variability in forecasts of the old age dependency ratio and the prospective old age dependency ratio as well as the same comparison for the median age and the prospective median age. While conventional measures of population aging are computed on the basis of the number of years people have already lived, prospective measures are computed also taking account of the expected number of years they have left to live. Those remaining life expectancies change over time and differ from place to place. We compare the probabilistic distributions of the conventional and prospective measures using examples from China, Germany, Iran, and the United States. The changes over time and the variability of the prospective indicators are smaller than those that are observed in the conventional ones. A wide variety of new results emerge from the combination of methodologies. For example, for Germany, Iran, and the United States the likelihood that the prospective median age of the population in 2098 will be lower than it is today is close to 100 percent. PMID:28636675
Quantum probabilistic logic programming
Balu, Radhakrishnan
2015-05-01
We describe a quantum mechanics based logic programming language that supports Horn clauses, random variables, and covariance matrices to express and solve problems in probabilistic logic. The Horn clauses of the language wrap random variables, including infinite valued ones, to express probability distributions and statistical correlations, a powerful feature to capture relationships between distributions that are not independent. The expressive power of the language is based on a mechanism to implement statistical ensembles and to solve the underlying SAT instances using quantum mechanical machinery. We exploit the fact that classical random variables have quantum decompositions to build the Horn clauses. We establish the semantics of the language in a rigorous fashion by considering an existing probabilistic logic language called PRISM with classical probability measures defined on the Herbrand base and extending it to the quantum context. In the classical case, H-interpretations form the sample space and probability measures defined on them lead to a consistent definition of probabilities for well-formed formulae. In the quantum counterpart, we define probability amplitudes on H-interpretations, facilitating model generation and verification via quantum mechanical superpositions and entanglements. We cast the well-formed formulae of the language as quantum mechanical observables, thus providing an elegant interpretation for their probabilities. We discuss several examples combining statistical ensembles and predicates of first order logic to reason about situations involving uncertainty.
Passage Retrieval: A Probabilistic Technique.
Melucci, Massimo
1998-01-01
Presents a probabilistic technique to retrieve passages from texts having a large size or heterogeneous semantic content. Results of experiments comparing the probabilistic technique to one based on a text segmentation algorithm revealed that the passage size affects passage retrieval performance; text organization and query generality may have an…
Probabilistic modeling of timber structures
DEFF Research Database (Denmark)
Köhler, Jochen; Sørensen, John Dalsgaard; Faber, Michael Havbro
2007-01-01
The present paper contains a proposal for the probabilistic modeling of timber material properties. It is produced in the context of the Probabilistic Model Code (PMC) of the Joint Committee on Structural Safety (JCSS) [Joint Committee of Structural Safety. Probabilistic Model Code, Internet Publication] and comments from participants of the COST E24 action and the members of the JCSS. The paper contains a description of the basic reference properties for timber strength parameters and ultimate limit state equations for timber components. The recommended probabilistic model for these basic properties is presented and possible refinements are given related to updating of the probabilistic model given new information, modeling of the spatial variation of strength properties and the duration of load effects.
Asteroid Risk Assessment: A Probabilistic Approach.
Reinhardt, Jason C; Chen, Xi; Liu, Wenhao; Manchev, Petar; Paté-Cornell, M Elisabeth
2016-02-01
Following the 2013 Chelyabinsk event, the risks posed by asteroids attracted renewed interest from both the scientific and policy-making communities. It reminded the world that impacts from near-Earth objects (NEOs), while rare, have the potential to cause great damage to cities and populations. Point estimates of the risk (such as mean numbers of casualties) have been proposed, but because of the low-probability, high-consequence nature of asteroid impacts, these averages provide limited actionable information. While more work is needed to further refine its input distributions (e.g., NEO diameters), the probabilistic model presented in this article allows a more complete evaluation of the risk of NEO impacts because the results are distributions that cover the range of potential casualties. This model is based on a modularized simulation that uses probabilistic inputs to estimate probabilistic risk metrics, including those of rare asteroid impacts. Illustrative results of this analysis are presented for a period of 100 years. As part of this demonstration, we assess the effectiveness of civil defense measures in mitigating the risk of human casualties. We find that they are likely to be beneficial but not a panacea. We also compute the probability (but not the consequences) of an impact with global effects ("cataclysm"). We conclude that there is a continued need for NEO observation, and for analyses of the feasibility and risk-reduction effectiveness of space missions designed to deflect or destroy asteroids that threaten the Earth. © 2015 Society for Risk Analysis.
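The modularized simulation described above can be caricatured in a few lines: impacts arrive as a Poisson count over the exposure window, each impact draws a casualty count from an input distribution, and the output is a distribution of totals rather than a point estimate. All rates and distributions here are placeholders, not the article's calibrated inputs.

```python
import math
import random

def simulate_casualties(rate_per_century, casualty_sampler, n_runs, rng):
    """Monte Carlo distribution of total casualties over one century.
    casualty_sampler(rng) draws casualties for a single impact."""
    totals = []
    for _ in range(n_runs):
        # Poisson draw by inversion of the CDF (adequate for small rates)
        n_impacts, u = 0, rng.random()
        term = math.exp(-rate_per_century)
        cdf = term
        while u > cdf:
            n_impacts += 1
            term *= rate_per_century / n_impacts
            cdf += term
        totals.append(sum(casualty_sampler(rng) for _ in range(n_impacts)))
    return totals
```

Tail quantiles of the returned list (rather than its mean) are what make the low-probability, high-consequence character of the risk visible.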
Dynamical systems probabilistic risk assessment
Energy Technology Data Exchange (ETDEWEB)
Denman, Matthew R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ames, Arlo Leroy [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)
2014-03-01
Probabilistic Risk Assessment (PRA) is the primary tool used to risk-inform nuclear power regulatory and licensing activities. Risk-informed regulations are intended to reduce inherent conservatism in regulatory metrics (e.g., allowable operating conditions and technical specifications) which are built into the regulatory framework by quantifying both the total risk profile as well as the change in the risk profile caused by an event or action (e.g., in-service inspection procedures or power uprates). Dynamical Systems (DS) analysis has been used to understand unintended time-dependent feedbacks in both industrial and organizational settings. In dynamical systems analysis, feedback loops can be characterized and studied as a function of time to describe the changes to the reliability of plant Structures, Systems and Components (SSCs). While DS has been used in many subject areas, some even within the PRA community, it has not been applied toward creating long-time horizon, dynamic PRAs (with time scales ranging between days and decades depending upon the analysis). Understanding slowly developing dynamic effects, such as wear-out, on SSC reliabilities may be instrumental in ensuring a safely and reliably operating nuclear fleet. Improving the estimation of a plant's continuously changing risk profile will allow for more meaningful risk insights, greater stakeholder confidence in risk insights, and increased operational flexibility.
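The slowly developing wear-out effect mentioned above can be sketched with a Weibull reliability model: with shape beta > 1 the hazard rate grows with time, so a component's conditional annual failure probability drifts upward across the plant life, which is precisely the kind of continuously changing risk profile a static PRA misses. Parameters below are illustrative, not fleet data.

```python
import math

def weibull_reliability(t, beta, eta):
    """Probability of surviving to time t (years) for shape beta, scale eta."""
    return math.exp(-((t / eta) ** beta))

def annual_failure_prob(year, beta, eta):
    """Probability of failing during [year, year+1], given survival to year."""
    r0 = weibull_reliability(year, beta, eta)
    r1 = weibull_reliability(year + 1, beta, eta)
    return (r0 - r1) / r0
```

Feeding such time-dependent component probabilities back into the PRA's basic events is one simple way to turn a static risk profile into a long-time-horizon one.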
Treatment of Uncertainties in Probabilistic Tsunami Hazard
Thio, H. K.
2012-12-01
Over the last few years, we have developed a framework for developing probabilistic tsunami inundation maps, which includes comprehensive quantification of earthquake recurrence as well as uncertainties, and applied it to the development of a tsunami hazard map of California. The various uncertainties in tsunami source and propagation models are an integral part of a comprehensive probabilistic tsunami hazard analysis (PTHA), and often drive the hazard at low probability levels (i.e. long return periods). There is no unique manner in which uncertainties are included in the analysis, although in general we distinguish between "natural" or aleatory variability, such as slip distribution and event magnitude, and uncertainties due to an incomplete understanding of the behavior of the earth, called epistemic uncertainties, such as scaling relations and rupture segmentation. Aleatory uncertainties are typically included through integration over distribution functions based on regression analyses, whereas epistemic uncertainties are included using logic trees. We will discuss how the different uncertainties were included in our recent probabilistic tsunami inundation maps for California, and their relative importance on the final results. Including these uncertainties in offshore exceedance waveheights is straightforward, but the problem becomes more complicated once the non-linearity of near-shore propagation and inundation are encountered. By using the probabilistic off-shore waveheights as input level for the inundation models, the uncertainties up to that point can be included in the final maps. PTHA provides a consistent analysis of tsunami hazard and will become an important tool in diverse areas such as coastal engineering and land use planning. The inclusive nature of the analysis, where few assumptions are made a priori as to which sources are significant, means that a single analysis can provide a comprehensive view of the hazard and its dominant sources.
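The logic-tree treatment of epistemic uncertainty reduces, at a given offshore point, to weight-averaging the branch hazard curves: the mean exceedance rate of a wave height is the weighted sum of the rates from each branch. A minimal sketch, with hypothetical branch weights and rate functions:

```python
def mixed_exceedance_rate(branches, waveheight):
    """branches: list of (weight, rate_fn) pairs, weights summing to 1.
    rate_fn(waveheight) is that branch's annual exceedance rate."""
    total_weight = sum(w for w, _ in branches)
    assert abs(total_weight - 1.0) < 1e-9, "logic-tree weights must sum to 1"
    return sum(w * rate_fn(waveheight) for w, rate_fn in branches)
```

Aleatory variability, by contrast, is already integrated inside each branch's rate function rather than mixed across branches.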
Quantitative analysis of probabilistic BPMN workflows
DEFF Research Database (Denmark)
Herbert, Luke Thomas; Sharp, Robin
2012-01-01
We present a framework for modelling and analysis of real-world business workflows. We formalise a core subset of the Business Process Modelling and Notation (BPMN) and then proceed to extend this language with probabilistic nondeterministic branching and general-purpose reward annotations...... of events, reward-based properties and best- and worst-case scenarios. We develop a simple example of a medical workflow and demonstrate the utility of this analysis in accurate provisioning of drug stocks. Finally, we suggest a path to building upon these techniques to cover the entire BPMN language, allow...
Assessing performance and validating finite element simulations using probabilistic knowledge
Energy Technology Data Exchange (ETDEWEB)
Dolin, Ronald M.; Rodriguez, E. A. (Edward A.)
2002-01-01
Two probabilistic approaches for assessing performance are presented. The first approach assesses the probability of failure by simultaneously modeling all likely events. The probability that each event causes failure, along with the event's likelihood of occurrence, contributes to the overall probability of failure. The second assessment method is based on stochastic sampling using an influence diagram. Latin hypercube sampling is used to stochastically assess events. The overall probability of failure is taken as the maximum probability of failure over all the events. The Likelihood of Occurrence simulation suggests failure does not occur, while the Stochastic Sampling approach predicts failure. The Likelihood of Occurrence results are used to validate finite element predictions.
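The Latin hypercube step mentioned above can be illustrated with a small sketch: one sample is drawn per equal-probability stratum in each dimension, and a toy load-versus-capacity limit state (invented here, not the paper's model) is evaluated on the resulting design.

```python
import numpy as np

def latin_hypercube(n, d, rng):
    # One point per equal-probability stratum in each dimension, with
    # the stratum order permuted independently per dimension.
    u = (np.arange(n)[:, None] + rng.random((n, d))) / n
    for j in range(d):
        u[:, j] = u[rng.permutation(n), j]
    return u

rng = np.random.default_rng(0)
u = latin_hypercube(1000, 2, rng)

# Toy limit state (invented): failure occurs when load > capacity.
load = 50.0 + 30.0 * u[:, 0]      # load ~ U(50, 80)
capacity = 60.0 + 30.0 * u[:, 1]  # capacity ~ U(60, 90)
p_fail = float(np.mean(load > capacity))  # exact answer is 2/9
```

Compared with plain Monte Carlo, the stratification guarantees that every marginal range is covered, which typically reduces the variance of the failure-probability estimate for a fixed sample size.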
Multi-Hazard Advanced Seismic Probabilistic Risk Assessment Tools and Applications
Energy Technology Data Exchange (ETDEWEB)
Coleman, Justin L. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Bolisetti, Chandu [Idaho National Lab. (INL), Idaho Falls, ID (United States); Veeraraghavan, Swetha [Idaho National Lab. (INL), Idaho Falls, ID (United States); Parisi, Carlo [Idaho National Lab. (INL), Idaho Falls, ID (United States); Prescott, Steven R. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Gupta, Abhinav [Idaho National Lab. (INL), Idaho Falls, ID (United States)
2016-09-01
Design of nuclear power plant (NPP) facilities to resist natural hazards has been a part of the regulatory process from the beginning of the NPP industry in the United States (US), but has evolved substantially over time. The original set of approaches and methods was entirely deterministic in nature and focused on a traditional engineering margins-based approach. However, over time probabilistic and risk-informed approaches were also developed and implemented in US Nuclear Regulatory Commission (NRC) guidance and regulation. A defense-in-depth framework has also been incorporated into US regulatory guidance over time. As a result, today, the US regulatory framework incorporates deterministic and probabilistic approaches for a range of different applications and for a range of natural hazard considerations. This framework will continue to evolve as a result of improved knowledge and newly identified regulatory needs and objectives, most notably in response to the NRC activities developed in response to the 2011 Fukushima accident in Japan. Although the US regulatory framework has continued to evolve over time, the tools, methods and data available to the US nuclear industry to meet the changing requirements have not kept pace. Notably, there is significant room for improvement in the tools and methods available for external event probabilistic risk assessment (PRA), which is the principal assessment approach used in risk-informed regulations and risk-informed decision-making applied to natural hazard assessment and design. This is particularly true if PRA is applied to natural hazards other than seismic loading. Development of a new set of tools and methods that incorporate current knowledge, modern best practice, and state-of-the-art computational resources would lead to more reliable assessment of facility risk and risk insights (e.g., the SSCs and accident sequences that are most risk-significant), with less uncertainty and reduced conservatisms.
Jongen, P.J.H.; Sindic, C.; Sanders, E.; Hawkins, S.; Linssen, W.; Munster, E. van; Frequin, S.T.F.M.; Borm, G.F.
2011-01-01
BACKGROUND: Due to methodological shortcomings, the available post-registration data on the adverse events (AEs) occurring in interferon beta-1a (IFNb-1a)-treated patients fail to adequately validate phase III data and only partially inform on safety in daily practice. We assessed AEs in relapsing
Probabilistic quantum multimeters
Fiurasek, J; Fiurasek, Jaromir; Dusek, Miloslav
2004-01-01
We propose quantum devices that can probabilistically realize different projective measurements on a qubit. The desired measurement basis is selected by the quantum state of a program register. First we analyze phase-covariant multimeters for a large class of program states, then universal multimeters for a special choice of program. In both cases we start with deterministic but erroneous devices and then proceed to devices that never make a mistake but from time to time give an inconclusive result. These multimeters are optimized (for a given type of program) with respect to the minimum probability of an inconclusive result. This concept is further generalized to multimeters that minimize the error rate for a given probability of an inconclusive result (or vice versa). Finally, we propose a generalization for qudits.
Probabilistic retinal vessel segmentation
Wu, Chang-Hua; Agam, Gady
2007-03-01
Optic fundus assessment is widely used for diagnosing vascular and non-vascular pathology. Inspection of the retinal vasculature may reveal hypertension, diabetes, arteriosclerosis, cardiovascular disease and stroke. Due to various imaging conditions retinal images may be degraded. Consequently, the enhancement of such images and vessels in them is an important task with direct clinical applications. We propose a novel technique for vessel enhancement in retinal images that is capable of enhancing vessel junctions in addition to linear vessel segments. This is an extension of vessel filters we have previously developed for vessel enhancement in thoracic CT scans. The proposed approach is based on probabilistic models which can discern vessels and junctions. Evaluation shows the proposed filter is better than several known techniques and is comparable to the state of the art when evaluated on a standard dataset. A ridge-based vessel tracking process is applied on the enhanced image to demonstrate the effectiveness of the enhancement filter.
Probabilistic Decision Graphs - Combining Verification and AI Techniques for Probabilistic Inference
DEFF Research Database (Denmark)
Jaeger, Manfred
2004-01-01
We adopt probabilistic decision graphs developed in the field of automated verification as a tool for probabilistic model representation and inference. We show that probabilistic inference has linear time complexity in the size of the probabilistic decision graph, that the smallest probabilistic ...
14th International Probabilistic Workshop
Taerwe, Luc; Proske, Dirk
2017-01-01
This book presents the proceedings of the 14th International Probabilistic Workshop that was held in Ghent, Belgium in December 2016. Probabilistic methods are currently of crucial importance for research and developments in the field of engineering, which face challenges presented by new materials and technologies and rapidly changing societal needs and values. Contemporary needs related to, for example, performance-based design, service-life design, life-cycle analysis, product optimization, assessment of existing structures and structural robustness give rise to new developments as well as accurate and practically applicable probabilistic and statistical engineering methods to support these developments. These proceedings are a valuable resource for anyone interested in contemporary developments in the field of probabilistic engineering applications.
Probabilistic risk assessment of disassembly procedures
Energy Technology Data Exchange (ETDEWEB)
O`Brien, D.A.; Bement, T.R.; Letellier, B.C.
1993-11-01
The purpose of this report is to describe the use of Probabilistic Risk (Safety) Assessment (PRA or PSA) at a Department of Energy (DOE) facility. PRA is a methodology for (i) identifying combinations of events that, if they occur, lead to accidents, (ii) estimating the frequency of occurrence of each combination of events, and (iii) estimating the consequences of each accident. Specifically, the study focused on evaluating the risks associated with disassembling a hazardous assembly. The PRA for the operation included a detailed evaluation only for those potential accident sequences which could lead to significant off-site consequences and affect public health. The overall purpose of this study was to investigate the feasibility of establishing a risk-consequence goal for DOE operations.
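Step (ii) above, estimating the frequency of each combination of events, is commonly organized as an event tree: the frequency of an accident sequence is the initiating-event frequency multiplied by the branch probabilities along its path. A minimal sketch with invented numbers:

```python
# Hedged event-tree sketch: all frequencies and branch probabilities
# below are invented for illustration.

initiator_per_year = 1.0e-3   # initiating-event frequency (per year)
p_mitigation_fails = 0.01     # branch probability
p_containment_fails = 0.05    # branch probability

# Each accident sequence frequency = initiator frequency times the
# branch probabilities along its path through the tree.
sequences = {
    "mitigated": initiator_per_year * (1 - p_mitigation_fails),
    "small_release": initiator_per_year * p_mitigation_fails
                     * (1 - p_containment_fails),
    "large_release": initiator_per_year * p_mitigation_fails
                     * p_containment_fails,
}
```

The sequence frequencies partition the initiator frequency, which is a useful consistency check; each sequence would then be paired with a consequence estimate in step (iii).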
Ignorability in Statistical and Probabilistic Inference
Jaeger, M
2011-01-01
When dealing with incomplete data in statistical learning, or incomplete observations in probabilistic inference, one needs to distinguish the fact that a certain event is observed from the fact that the observed event has happened. Since the modeling and computational complexities entailed by maintaining this proper distinction are often prohibitive, one asks for conditions under which it can be safely ignored. Such conditions are given by the missing at random (mar) and coarsened at random (car) assumptions. In this paper we provide an in-depth analysis of several questions relating to mar/car assumptions. The main purpose of our study is to provide criteria by which one may evaluate whether a car assumption is reasonable for a particular data collecting or observational process. This question is complicated by the fact that several distinct versions of mar/car assumptions exist. We therefore first provide an overview of these different versions, in which we highlight the distinction between distributional an...
Performing Probabilistic Risk Assessment Through RAVEN
Energy Technology Data Exchange (ETDEWEB)
A. Alfonsi; C. Rabiti; D. Mandelli; J. Cogliati; R. Kinoshita
2013-06-01
The Reactor Analysis and Virtual control ENviroment (RAVEN) code is a software tool that acts as the control logic driver and post-processing engine for the newly developed thermal-hydraulic code RELAP-7. RAVEN is now a multi-purpose Probabilistic Risk Assessment (PRA) software framework that allows dispatching different functionalities: (i) derive and actuate the control logic required to simulate the plant control system and operator actions (guided procedures), allowing on-line monitoring/controlling in the phase space; (ii) perform both Monte Carlo sampling of randomly distributed events and Dynamic Event Tree based analysis; and (iii) facilitate input/output handling through a Graphical User Interface (GUI) and a post-processing data mining module.
Interval probabilistic neural network.
Kowalski, Piotr A; Kulczycki, Piotr
2017-01-01
Automated classification systems have allowed for the rapid development of exploratory data analysis. Such systems increase the independence from human intervention in obtaining analysis results, especially when inaccurate information is under consideration. The aim of this paper is to present a novel neural network approach for classifying interval information. The presented neural methodology is a generalization of the probabilistic neural network for interval data processing. The simple structure of this neural classification algorithm makes it applicable for research purposes. The procedure is based on the Bayes approach, ensuring minimal expected losses arising from classification errors. In this article, the topological structure of the network and the learning process are described in detail. Of note, the correctness of the proposed procedure has been verified by way of numerical tests. These tests include examples of both synthetic data as well as benchmark instances. The results of numerical verification, carried out for different shapes of data sets, as well as a comparative analysis with other methods of similar conditioning, have validated both the concept presented here and its positive features.
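The classical (non-interval) probabilistic neural network that the paper generalizes can be sketched compactly: one Gaussian kernel per training pattern, a Parzen density estimate per class, and a Bayes (maximum-score) decision. The data and smoothing parameter below are invented.

```python
import numpy as np

def pnn_classify(x, train_X, train_y, sigma=0.5):
    # Classical probabilistic neural network: one Gaussian kernel per
    # training pattern, class score = Parzen density estimate, and a
    # Bayes (maximum-score) decision.
    scores = {}
    for c in np.unique(train_y):
        d2 = np.sum((train_X[train_y == c] - x) ** 2, axis=1)
        scores[c] = float(np.mean(np.exp(-d2 / (2.0 * sigma ** 2))))
    return max(scores, key=scores.get)

# Invented toy data: two well-separated classes in the plane.
X = np.array([[0.0, 0.0], [0.2, 0.1], [1.0, 1.0], [0.9, 1.1]])
y = np.array([0, 0, 1, 1])
```

The interval generalization in the paper replaces the point kernel evaluation with an evaluation over interval-valued inputs; this sketch shows only the underlying point-data case.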
Probabilistic Resilience in Hidden Markov Models
Panerati, Jacopo; Beltrame, Giovanni; Schwind, Nicolas; Zeltner, Stefan; Inoue, Katsumi
2016-05-01
Originally defined in the context of ecological systems and environmental sciences, resilience has grown to be a property of major interest for the design and analysis of many other complex systems: resilient networks and robotic systems offer the desirable capability of absorbing disruption and transforming in response to external shocks, while still providing the services they were designed for. Starting from an existing formalization of resilience for constraint-based systems, we develop a probabilistic framework based on hidden Markov models. In doing so, we introduce two important new features: stochastic evolution and partial observability. Using our framework, we formalize a methodology for the evaluation of probabilities associated with generic properties, describe an efficient algorithm for the computation of its essential inference step, and show that its complexity is comparable to other state-of-the-art inference algorithms.
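The essential inference step in a hidden Markov model is typically the forward algorithm, which computes the probability of an observation sequence. A minimal sketch with an invented two-state "nominal/disrupted" model:

```python
import numpy as np

def forward(pi, A, B, obs):
    # HMM forward algorithm: alpha[i] = P(o_1..o_t, state_t = i),
    # updated left to right; the final sum is P(observation sequence).
    alpha = pi * B[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
    return float(alpha.sum())

# Invented two-state model: state 0 = "nominal", state 1 = "disrupted";
# observations: 0 = "ok", 1 = "alarm".
pi = np.array([0.9, 0.1])
A = np.array([[0.95, 0.05],
              [0.30, 0.70]])
B = np.array([[0.99, 0.01],
              [0.20, 0.80]])
p_seq = forward(pi, A, B, [0, 0, 1])
```

Partial observability is exactly what the emission matrix B captures: the underlying state is never seen directly, only the noisy observations it emits.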
Institute of Scientific and Technical Information of China (English)
梅牡丹; 邵世威; 何兆忠; 陈堃
2014-01-01
Initiating event analysis is the starting point of probabilistic safety assessment for a reactor. Taking the 10 MW solid-fuel thorium molten salt reactor (Thorium Molten Salt Reactor, TMSR-SF1) as the research object, and based on the latest conceptual design of TMSR-SF1, this paper applies the master logic diagram method to a preliminary exploration of TMSR-SF1 initiating event analysis, with reference to the existing initiating event lists and initiating event analysis theory for fluoride-salt-cooled high-temperature reactors, high-temperature gas-cooled reactors and sodium-cooled fast reactors. A preliminary initiating event list for TMSR-SF1 was determined, comprising 37 initiating events (under power operation), which were divided into 6 groups according to fault type. This lays an important foundation for the further in-depth study of initiating events and for the accident sequence analysis in the probabilistic safety assessment (PSA) of TMSR-SF1, and also supports the completeness of the safety analysis.
Ensemble postprocessing for probabilistic quantitative precipitation forecasts
Bentzien, S.; Friederichs, P.
2012-12-01
Precipitation is one of the most difficult weather variables to predict in hydrometeorological applications. In order to assess the uncertainty inherent in deterministic numerical weather prediction (NWP), meteorological services around the globe develop ensemble prediction systems (EPS) based on high-resolution NWP systems. With non-hydrostatic model dynamics and without parameterization of deep moist convection, high-resolution NWP models are able to describe convective processes in more detail and provide more realistic mesoscale structures. However, precipitation forecasts are still affected by displacement errors, systematic biases and fast error growth on small scales. Probabilistic guidance can be achieved from an ensemble setup which accounts for model error and uncertainty of initial and boundary conditions. The German Meteorological Service (Deutscher Wetterdienst, DWD) provides such an ensemble system based on the German-focused limited-area model COSMO-DE. With a horizontal grid-spacing of 2.8 km, COSMO-DE is the convection-permitting high-resolution part of the operational model chain at DWD. The COSMO-DE-EPS consists of 20 realizations of COSMO-DE, driven by initial and boundary conditions derived from 4 global models and 5 perturbations of model physics. Ensemble systems like COSMO-DE-EPS are often limited with respect to ensemble size due to the immense computational costs. As a consequence, they can be biased and exhibit insufficient ensemble spread, and probabilistic forecasts may not be well calibrated. In this study, probabilistic quantitative precipitation forecasts are derived from COSMO-DE-EPS and evaluated at more than 1000 rain gauges located all over Germany. COSMO-DE-EPS is a frequently updated ensemble system, initialized 8 times a day. We use the time-lagged approach to inexpensively increase ensemble spread, which results in more reliable forecasts especially for extreme precipitation events. Moreover, we will show that statistical
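The basic derivation of a probabilistic forecast from an ensemble, including the time-lagged pooling mentioned above, reduces to counting members above a threshold. The member values below are invented, not COSMO-DE-EPS output.

```python
import numpy as np

# Invented ensemble members (mm/h); not COSMO-DE-EPS data.
current = np.array([0.0, 1.2, 3.5, 0.4, 6.1, 2.2, 0.0, 4.8])  # latest run
lagged = np.array([0.5, 2.9, 0.1, 5.5])  # members from an earlier start

def prob_exceed(members, threshold):
    # Probabilistic forecast: fraction of members above the threshold.
    return float(np.mean(members > threshold))

# Time-lagged approach: pool members from successive initializations
# to cheaply enlarge the ensemble.
pooled = np.concatenate([current, lagged])
p_current = prob_exceed(current, 2.0)
p_pooled = prob_exceed(pooled, 2.0)
```

Statistical postprocessing (calibration) would then adjust these raw exceedance frequencies against observed gauge climatology; the pooling step simply enlarges the sample from which they are computed.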
MODELING PROBABILISTIC CONFLICT OF TECHNOLOGICAL SYSTEMS
Directory of Open Access Journals (Sweden)
D. B. Desyatov
2015-01-01
Recently, mathematical modeling has been used increasingly for the study of conflict. Its importance stems from the fact that experimental research on such conflicts is rather time-consuming and complex. However, existing approaches to the study of conflict do not take into account the stochastic nature of the systems and suffer from conceptual incompleteness. There is a need to develop models, algorithms and principles in order to assess a conflict and to choose a conflict resolution that avoids the worst-case conditions. For stochastic technological systems, as a utility function we consider the probability of achieving a given objective. We assume that some system S1 is in conflict with the system S2 (S2 K S1) if q(S1, S2) ... a probabilistic conflict of the second kind (A K2 B) if P(A/B ... events achieving some target states. When A and B are jointly dependent random events, the probability of conflict between the events A and B can be defined in two ways: Definition 1. Between A and B a probabilistic conflict of the first kind (A K1 B) is observed if P(A/B
Probabilistic aspects of Wigner function
Usenko, C V
2004-01-01
The Wigner function of quantum systems is an effective instrument for constructing an approximate classical description of systems for which a classical approximation is possible. More recently, the Wigner function formalism has also been applied to seek indications of specific quantum properties of quantum systems that make the construction of a classical approximation impossible. Most often, the indication used is the existence of negative values in the Wigner function for specific states of the quantum system being studied. The existence of such values itself prejudices the probabilistic interpretation of the Wigner function, though for an arbitrary observable depending jointly on the coordinate and the momentum of the quantum system it is just the Wigner function that gives an effective instrument to calculate the average value and other statistical characteristics. In this paper a probabilistic interpretation of the Wigner function based on coordination of the theoretical-probabilistic definition of the ...
Quantum probabilistically cloning and computation
Institute of Scientific and Technical Information of China (English)
2008-01-01
In this article we review the usefulness of probabilistic cloning and present examples of quantum computation tasks for which quantum cloning offers an advantage which cannot be matched by any approach that does not resort to it. In these quantum computations, one needs to distribute quantum information contained in states about which we have some partial information. To perform quantum computations, one uses a state-dependent probabilistic quantum cloning procedure to distribute quantum information in the middle of a quantum computation. We discuss the achievable efficiencies and the efficient quantum logic network for probabilistically cloning the quantum states used in implementing quantum computation tasks for which cloning provides enhancement in performance.
Probabilistic assessment of pressurised thermal shocks
Energy Technology Data Exchange (ETDEWEB)
Pištora, Vladislav, E-mail: pis@ujv.cz; Pošta, Miroslav; Lauerová, Dana
2014-04-01
The reactor pressure vessel (RPV) is a key component of all PWR and VVER nuclear power plants (NPPs). Assuring its integrity is therefore of high importance. Due to high neutron fluence the RPV material is embrittled during NPP operation. The embrittled RPV may undergo severe loading during potential events of the pressurised thermal shock (PTS) type possibly occurring in the NPP. The resistance of the RPV against fast fracture has to be proven by comprehensive analyses. In most countries (with the exception of the USA), proving RPV integrity is based on deterministic PTS assessment. In the USA, the "screening criteria" for maximum allowable embrittlement of RPV material, which form part of the US regulations, are based on probabilistic PTS assessment. In other countries, probabilistic PTS assessment is performed only at research level or as a supplement to the deterministic PTS assessment for individual RPVs. In this paper, a complete probabilistic PTS assessment for a VVER 1000 RPV is presented; both the methodology and the results are included. The methodology corresponds to the Unified Procedure for Lifetime Assessment of Components and Piping in WWER NPPs, "VERLIFE", Version 2008. The main parameters entering the analysis, which are treated as statistical distributions, are: the initial value of the material reference temperature T0; the reference temperature shift ΔT0 due to neutron fluence; the neutron fluence; the size, shape, position and density of cracks in the RPV wall; and the fracture toughness of the RPV material (the Master Curve concept is used). The first step of the analysis consists in the selection of sequences potentially leading to PTS, their grouping, establishing their frequencies, and selecting representative scenarios within all groups. A modified PSA model is used for this purpose. The second step consists in thermal hydraulic analyses of the representative scenarios, with the goal to prepare input data for the
HISTORY BASED PROBABILISTIC BACKOFF ALGORITHM
Directory of Open Access Journals (Sweden)
Narendran Rajagopalan
2012-01-01
Performance of a Wireless LAN can be improved at each layer of the protocol stack with respect to energy efficiency. The Media Access Control layer is responsible for key functions like access control and flow control. During contention, a backoff algorithm is used to gain access to the medium with minimum probability of collision. After studying different variations of backoff algorithms that have been proposed, a new variant called the History based Probabilistic Backoff Algorithm is proposed. Through mathematical analysis and simulation results using NS-2, it is seen that the proposed History based Probabilistic Backoff algorithm performs better than the Binary Exponential Backoff algorithm.
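As a hedged sketch (not the paper's exact algorithm), a history-informed backoff might start from binary exponential backoff and shrink the contention window probabilistically according to the recent success ratio:

```python
import random

def next_backoff(cw_min, cw_max, collisions, success_history, rng=random):
    # Sketch, not the paper's exact algorithm: binary exponential
    # backoff doubles the contention window per collision; here the
    # window is additionally halved with probability equal to the
    # recent success ratio, so a good history shortens waiting.
    cw = min(cw_max, cw_min * (2 ** collisions))
    if success_history:
        success_ratio = sum(success_history) / len(success_history)
        if rng.random() < success_ratio:
            cw = max(cw_min, cw // 2)
    return rng.randrange(cw)  # slot drawn uniformly from [0, cw)

slot = next_backoff(16, 1024, collisions=3,
                    success_history=[1, 1, 0, 1],
                    rng=random.Random(42))
```

The design intent is that a station with a mostly-successful recent history wastes fewer idle slots, while repeated collisions still push the window toward cw_max as in plain binary exponential backoff.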
Probabilistic Design of Wind Turbines
DEFF Research Database (Denmark)
Toft, Henrik Stensgaard
, new and more refined design methods must be developed. These methods can for instance be developed using probabilistic design where the uncertainties in all phases of the design life are taken into account. The main aim of the present thesis is to develop models for probabilistic design of wind....... The uncertainty related to the existing methods for estimating the loads during operation is assessed by applying these methods to a case where the load response is assumed to be Gaussian. In this case an approximate analytical solution exists for a statistical description of the extreme load response. In general...
Probabilistic methods in combinatorial analysis
Sachkov, Vladimir N
2014-01-01
This 1997 work explores the role of probabilistic methods for solving combinatorial problems. These methods not only provide the means of efficiently using such notions as characteristic and generating functions, the moment method and so on but also let us use the powerful technique of limit theorems. The basic objects under investigation are nonnegative matrices, partitions and mappings of finite sets, with special emphasis on permutations and graphs, and equivalence classes specified on sequences of finite length consisting of elements of partially ordered sets; these specify the probabilist
Probabilistic Approach to Rough Set Theory
Institute of Scientific and Technical Information of China (English)
Wojciech Ziarko
2006-01-01
The presentation introduces the basic ideas and investigates the probabilistic approach to rough set theory. The major aspects of the probabilistic approach to rough set theory to be explored during the presentation are: the probabilistic view of the approximation space, the probabilistic approximations of sets, as expressed via variable precision and Bayesian rough set models, and probabilistic dependencies between sets and multi-valued attributes, as expressed by the absolute certainty gain and expected certainty gain measures, respectively. The probabilistic dependency measures allow for representation of subtle stochastic associations between attributes. They also allow for more comprehensive evaluation of rules computed from data and for computation of attribute reduct, core and significance factors in probabilistic decision tables. It will be shown that the probabilistic dependency measure-based attribute reduction techniques are also extendible to hierarchies of decision tables. The presentation will include computational examples to illustrate presented concepts and to indicate possible practical applications.
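The variable precision rough set model mentioned above admits a compact sketch: an indiscernibility class enters the lower approximation of a target set when its conditional probability of membership meets a precision threshold. The data below are invented.

```python
# Sketch of the variable precision rough set lower approximation:
# an indiscernibility class E enters the lower approximation of a
# target set X when P(X | E) >= u. Data are invented.

def vprs_lower(classes, target, u):
    lower = set()
    for E in classes:
        p_x_given_e = len(E & target) / len(E)  # P(X | E)
        if p_x_given_e >= u:
            lower |= E
    return lower

classes = [frozenset({1, 2}), frozenset({3, 4, 5}), frozenset({6})]
target = {1, 2, 3, 4, 7}

classical = vprs_lower(classes, target, 1.0)  # u = 1 recovers the classical model
relaxed = vprs_lower(classes, target, 0.6)    # u < 1 admits "mostly inside" classes
```

Setting u = 1 recovers the classical rough set lower approximation, while lowering u trades certainty for coverage, which is exactly the variable-precision idea.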
Probabilistic forecasts based on radar rainfall uncertainty
Liguori, S.; Rico-Ramirez, M. A.
2012-04-01
gauges location, and then interpolated back onto the radar domain, in order to obtain probabilistic radar rainfall fields in real time. The deterministic nowcasting model integrated in the STEPS system [7-8] has been used for the purpose of propagating the uncertainty and assessing the benefit of implementing the radar ensemble generator for probabilistic rainfall forecasts and ultimately sewer flow predictions. For this purpose, events representative of different types of precipitation (i.e. stratiform/convective) and significant at the urban catchment scale (i.e. in terms of sewer overflow within the urban drainage system) have been selected. As high spatial/temporal resolution is required to the forecasts for their use in urban areas [9-11], the probabilistic nowcasts have been set up to be produced at 1 km resolution and 5 min intervals. The forecasting chain is completed by a hydrodynamic model of the urban drainage network. The aim of this work is to discuss the implementation of this probabilistic system, which takes into account the radar error to characterize the forecast uncertainty, with consequent potential benefits in the management of urban systems. It will also allow a comparison with previous findings related to the analysis of different approaches to uncertainty estimation and quantification in terms of rainfall [12] and flows at the urban scale [13]. Acknowledgements The authors would like to acknowledge the BADC, the UK Met Office and Dr. Alan Seed from the Australian Bureau of Meteorology for providing the radar data and the nowcasting model. The authors acknowledge the support from the Engineering and Physical Sciences Research Council (EPSRC) via grant EP/I012222/1.
Probabilistic Logic Programming under Answer Sets Semantics
Institute of Scientific and Technical Information of China (English)
王洁; 鞠实儿
2003-01-01
Although traditional logic programming languages provide powerful tools for knowledge representation, they cannot deal with uncertain information (e.g. probabilistic information). In this paper, we propose a probabilistic logic programming language by introducing probability into a general logic programming language. The work combines 4-valued logic with probability. Conditional probability can be easily represented in a probabilistic logic program. The semantics of such a probabilistic logic program i...
A Probabilistic Ontology Development Methodology
2014-06-01
Probabilistic aspects of ocean waves
Battjes, J.A.
1977-01-01
Background material for a special lecture on probabilistic aspects of ocean waves for a seminar in Trondheim. It describes long term statistics and short term statistics. Statistical distributions of waves, directional spectra and frequency spectra. Sea state parameters, response peaks, encounter
Sound Probabilistic #SAT with Projection
Directory of Open Access Journals (Sweden)
Vladimir Klebanov
2016-10-01
We present an improved method for sound probabilistic estimation of the model count of a Boolean formula under projection. The problem solved can be used to encode a variety of quantitative program analyses, such as those concerning security or resource consumption. We implement the technique and discuss its application to quantifying information flow in programs.
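For contrast with the sound estimation method above, the simplest purely statistical way to estimate a model count is plain Monte Carlo sampling of assignments; the sketch below is illustrative only and is not the paper's technique.

```python
import random

def mc_model_count(formula, n_vars, samples, rng):
    # Naive Monte Carlo estimate (for contrast only, without the
    # soundness guarantees of the paper's method): 2^n times the
    # fraction of uniformly random assignments satisfying the formula.
    hits = sum(bool(formula([rng.random() < 0.5 for _ in range(n_vars)]))
               for _ in range(samples))
    return (2 ** n_vars) * hits / samples

# Invented toy formula over 3 variables: (x0 or x1) and x2 -> 3 models.
f = lambda v: (v[0] or v[1]) and v[2]
estimate = mc_model_count(f, 3, 20000, random.Random(1))
```

Such an estimate carries only statistical error bars and degrades badly when satisfying assignments are rare, which is precisely why sound estimation methods are needed.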
Probabilistic localisation in repetitive environments
Vroegindeweij, Bastiaan A.; IJsselmuiden, Joris; Henten, van Eldert J.
2016-01-01
One of the problems in loose housing systems for laying hens is the laying of eggs on the floor, which need to be collected manually. In previous work, PoultryBot was presented to assist in this and other tasks. Here, probabilistic localisation with a particle filter is evaluated for use inside p
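A particle filter of the kind evaluated above can be sketched in one dimension (this is an illustrative toy, not PoultryBot's actual implementation): predict particles with noisy motion, weight them by a range measurement to a known landmark, and resample.

```python
import math
import random

def pf_step(particles, motion, measured_range, landmark, rng):
    # Predict: apply motion with noise; update: weight each particle by
    # the likelihood of the range measurement; resample by weight.
    moved = [p + motion + rng.gauss(0.0, 0.05) for p in particles]
    w = [math.exp(-((landmark - p) - measured_range) ** 2 / (2 * 0.2 ** 2))
         for p in moved]
    total = sum(w)
    return rng.choices(moved, weights=[x / total for x in w], k=len(moved))

rng = random.Random(7)
particles = [rng.uniform(0.0, 10.0) for _ in range(500)]
true_pos, landmark = 2.0, 10.0
for _ in range(10):
    true_pos += 0.5  # the robot moves half a unit per step
    particles = pf_step(particles, 0.5, landmark - true_pos, landmark, rng)

estimate = sum(particles) / len(particles)  # posterior mean position
```

In a repetitive environment such as a poultry house, the measurement likelihood is typically multimodal, which is exactly the situation where a particle set outperforms a single-hypothesis estimator.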
The Probabilistic Model of Keys Generation of QKD Systems
Golubchikov, Dmitry
2010-01-01
A probabilistic model of key generation in QKD systems is proposed. The model includes all phases of key generation, from photon generation to state detection, taking the characteristics of fiber-optic components into account. The paper describes the tree of events of QKD systems. Equations are found for estimating the effectiveness of the sifted-key generation process, as well as the bit-error probability and the private-key generation rate.
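The sifted-key accounting described above follows from simple rate bookkeeping: pulses that are detected and measured in matching bases survive sifting. All parameter values below are invented for illustration.

```python
# Invented link parameters for a BB84-style system; not measured values.
pulse_rate = 1.0e6    # pulses emitted per second
p_detect = 0.05       # probability a pulse yields a detection
basis_match = 0.5     # probability preparation and measurement bases agree

sifted_rate = pulse_rate * p_detect * basis_match  # sifted bits per second
qber = 0.02  # assumed quantum bit error rate of the sifted key
expected_errors_per_second = sifted_rate * qber
```

A full model of the kind the paper describes would derive p_detect and qber from the event tree over fiber loss, detector efficiency and dark counts rather than assume them.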
A probabilistic Hu-Washizu variational principle
Liu, W. K.; Belytschko, T.; Besterfield, G. H.
1987-01-01
A Probabilistic Hu-Washizu Variational Principle (PHWVP) for the Probabilistic Finite Element Method (PFEM) is presented. This formulation is developed for both linear and nonlinear elasticity. The PHWVP allows incorporation of the probabilistic distributions for the constitutive law, compatibility condition, equilibrium, domain and boundary conditions into the PFEM. Thus, a complete probabilistic analysis can be performed where all aspects of the problem are treated as random variables and/or fields. The Hu-Washizu variational formulation is available in many conventional finite element codes thereby enabling the straightforward inclusion of the probabilistic features into present codes.
Model Checking with Probabilistic Tabled Logic Programming
Gorlin, Andrey; Smolka, Scott A
2012-01-01
We present a formulation of the problem of probabilistic model checking as one of query evaluation over probabilistic logic programs. To the best of our knowledge, our formulation is the first of its kind, and it covers a rich class of probabilistic models and probabilistic temporal logics. The inference algorithms of existing probabilistic logic-programming systems are well defined only for queries with a finite number of explanations. This restriction prohibits the encoding of probabilistic model checkers, where explanations correspond to executions of the system being model checked. To overcome this restriction, we propose a more general inference algorithm that uses finite generative structures (similar to automata) to represent families of explanations. The inference algorithm computes the probability of a possibly infinite set of explanations directly from the finite generative structure. We have implemented our inference algorithm in XSB Prolog, and use this implementation to encode probabilistic model...
A probabilistic strategy for parametric catastrophe insurance
Figueiredo, Rui; Martina, Mario; Stephenson, David; Youngman, Benjamin
2017-04-01
Economic losses due to natural hazards have shown an upward trend since 1980, which is expected to continue. Recent years have seen a growing worldwide commitment towards the reduction of disaster losses. This requires effective management of disaster risk at all levels, a part of which involves reducing financial vulnerability to disasters ex-ante, ensuring that necessary resources will be available following such events. One way to achieve this is through risk transfer instruments. These can be based on different types of triggers, which determine the conditions under which payouts are made after an event. This study focuses on parametric triggers, where payouts are determined by the occurrence of an event exceeding specified physical parameters at a given location, or at multiple locations, or over a region. This type of product offers a number of important advantages, and its adoption is increasing. The main drawback of parametric triggers is their susceptibility to basis risk, which arises when there is a mismatch between triggered payouts and the occurrence of loss events. This is unavoidable in said programmes, as their calibration is based on models containing a number of different sources of uncertainty. Thus, a deterministic definition of the loss event triggering parameters appears flawed. However, often for simplicity, this is the way in which most parametric models tend to be developed. This study therefore presents an innovative probabilistic strategy for parametric catastrophe insurance. It is advantageous as it recognizes uncertainties and minimizes basis risk while maintaining a simple and transparent procedure. A logistic regression model is constructed here to represent the occurrence of loss events based on certain loss index variables, obtained through the transformation of input environmental variables. Flood-related losses due to rainfall are studied. The resulting model is able, for any given day, to issue probabilities of occurrence of loss
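The logistic-regression trigger described above can be sketched in a few lines. The coefficients below are illustrative placeholders, not the calibrated values from the study; the point is only that the payout decision becomes probabilistic rather than a hard threshold.

```python
import math

def loss_probability(rain_mm, beta0=-6.0, beta1=0.08):
    """Logistic model for P(loss event | daily rainfall index).
    Coefficients are illustrative, not calibrated values."""
    z = beta0 + beta1 * rain_mm
    return 1 / (1 + math.exp(-z))

def expected_payout(rain_mm, max_payout):
    """Probabilistic trigger: pay out in proportion to the modelled
    loss probability instead of a deterministic all-or-nothing rule."""
    return max_payout * loss_probability(rain_mm)

p = expected_payout(75, 1_000_000)  # payout for a 75 mm rainfall day
```

A graded payout of this kind reduces basis risk near the threshold, where a deterministic trigger would pay either everything or nothing.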
Use of Geologic and Paleoflood Information for INL Probabilistic Flood Hazard Decisions
Ostenaa, D.; O'Connell, D.; Creed, B.
2009-05-01
The Big Lost River is a western U.S. closed-basin stream which flows through and terminates on the Idaho National Laboratory. Historic flows are highly regulated, and peak flows decline downstream through natural and anthropogenic influences. Glaciated headwater regions were the source of Pleistocene outburst floods which traversed the site. A wide range of DOE facilities (including a nuclear research reactor) require flood stage estimates for flow exceedance probabilities over a range from 1/100/yr to 1/100,000/yr per DOE risk-based standards. These risk management objectives required the integration of geologic and geomorphic paleoflood data into Bayesian nonparametric flood frequency analyses that incorporated measurement uncertainties in gaged, historical, and paleoflood discharges and non-exceedance bounds to produce fully probabilistic flood frequency estimates for annual exceedance probabilities of specific discharges of interest. Two-dimensional hydraulic flow modeling with scenarios for varied hydraulic parameters, infiltration, and culvert blockages on the site was conducted for a range of discharges from 13–700 m³/s. High-resolution topographic grids and two-dimensional flow modeling allowed detailed evaluation of the potential impacts of numerous secondary channels and flow paths resulting from flooding in extreme events. These results were used to construct stage probability curves for 15 key locations on the site consistent with DOE standards. These probability curves resulted from the systematic inclusion of contributions of uncertainty from flood sources, hydraulic modeling, and flood-frequency analyses. These products also provided a basis to develop weights for logic tree branches associated with infiltration and culvert performance scenarios to produce probabilistic inundation maps. The flood evaluation process was structured using Senior Seismic Hazard Analysis Committee processes (NRC-NUREG/CR-6372) concepts, evaluating and integrating the
Reliability and Probabilistic Risk Assessment - How They Play Together
Safie, Fayssal M.; Stutts, Richard G.; Zhaofeng, Huang
2015-01-01
PRA methodology is one of the probabilistic analysis methods that NASA brought from the nuclear industry to assess the risk of LOM, LOV and LOC for launch vehicles. PRA is a system scenario based risk assessment that uses a combination of fault trees, event trees, event sequence diagrams, and probability and statistical data to analyze the risk of a system, a process, or an activity. It is a process designed to answer three basic questions: What can go wrong? How likely is it? What is the severity of the degradation? Since 1986, NASA, along with industry partners, has conducted a number of PRA studies to predict the overall launch vehicles risks. Planning Research Corporation conducted the first of these studies in 1988. In 1995, Science Applications International Corporation (SAIC) conducted a comprehensive PRA study. In July 1996, NASA conducted a two-year study (October 1996 - September 1998) to develop a model that provided the overall Space Shuttle risk and estimates of risk changes due to proposed Space Shuttle upgrades. After the Columbia accident, NASA conducted a PRA on the Shuttle External Tank (ET) foam. This study was the most focused and extensive risk assessment that NASA has conducted in recent years. It used a dynamic, physics-based, integrated system analysis approach to understand the integrated system risk due to ET foam loss in flight. Most recently, a PRA for Ares I launch vehicle has been performed in support of the Constellation program. Reliability, on the other hand, addresses the loss of functions. In a broader sense, reliability engineering is a discipline that involves the application of engineering principles to the design and processing of products, both hardware and software, for meeting product reliability requirements or goals. It is a very broad design-support discipline. It has important interfaces with many other engineering disciplines. Reliability as a figure of merit (i.e. the metric) is the probability that an item will
Probabilistic Design of Wind Turbines
DEFF Research Database (Denmark)
Toft, Henrik Stensgaard
…, new and more refined design methods must be developed. These methods can for instance be developed using probabilistic design, where the uncertainties in all phases of the design life are taken into account. The main aim of the present thesis is to develop models for probabilistic design of wind …, the uncertainty is dependent on the method used for load extrapolation, the number of simulations and the distribution fitted to the extracted peaks. Another approach for estimating the uncertainty on the estimated load effects during operation is to use field measurements. A new method for load extrapolation …, which is based on average conditional exceedance rates, is applied to wind turbine response. The advantage of this method is that it can handle dependence in the response and use exceedance rates instead of extracted peaks, which normally are more stable. The results show that the method estimates …
Probabilistic Design of Wind Turbines
DEFF Research Database (Denmark)
Sørensen, John Dalsgaard; Toft, H.S.
2010-01-01
Probabilistic design of wind turbines requires definition of the structural elements to be included in the probabilistic basis: e.g., blades, tower, foundation; identification of important failure modes; careful stochastic modeling of the uncertain parameters; recommendations for target reliability … It is described how uncertainties in wind turbine design related to computational models, statistical data from test specimens, results from a few full-scale tests and from prototype wind turbines can be accounted for using the Maximum Likelihood Method and a Bayesian approach. Assessment of the optimal … reliability level by cost-benefit optimization is illustrated by an offshore wind turbine example. Uncertainty modeling is illustrated by an example where physical, statistical and model uncertainties are estimated. …
Probabilistic Design of Wind Turbines
Directory of Open Access Journals (Sweden)
Henrik S. Toft
2010-02-01
Probabilistic design of wind turbines requires definition of the structural elements to be included in the probabilistic basis: e.g., blades, tower, foundation; identification of important failure modes; careful stochastic modeling of the uncertain parameters; recommendations for target reliability levels and recommendation for consideration of system aspects. The uncertainties are characterized as aleatoric (physical uncertainty) or epistemic (statistical, measurement and model uncertainties). Methods for uncertainty modeling consistent with methods for estimating the reliability are described. It is described how uncertainties in wind turbine design related to computational models, statistical data from test specimens, results from a few full-scale tests and from prototype wind turbines can be accounted for using the Maximum Likelihood Method and a Bayesian approach. Assessment of the optimal reliability level by cost-benefit optimization is illustrated by an offshore wind turbine example. Uncertainty modeling is illustrated by an example where physical, statistical and model uncertainties are estimated.
Modified Claus process probabilistic model
Energy Technology Data Exchange (ETDEWEB)
Larraz Mora, R. [Chemical Engineering Dept., Univ. of La Laguna (Spain)
2006-03-15
A model is proposed for the simulation of an industrial Claus unit with a straight-through configuration and two catalytic reactors. Process plant design evaluations based on deterministic calculations do not take into account the uncertainties that are associated with the different input variables. A probabilistic simulation method was applied in the Claus model to obtain an impression of how some of these inaccuracies influence plant performance. (orig.)
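A generic sketch of the probabilistic simulation idea described above: draw random samples of the uncertain inputs, propagate each through the plant model, and inspect the resulting distribution of performance. The surrogate recovery function and the input distributions below are invented for illustration; they are not the actual Claus unit model.

```python
import random
import statistics

def claus_recovery(inlet_h2s_frac, reactor_eff):
    """Toy surrogate for sulfur recovery -- NOT the actual Claus model.
    Recovery is capped at a physically plausible maximum."""
    return min(0.995, reactor_eff * (0.9 + 0.2 * inlet_h2s_frac))

random.seed(1)
samples = []
for _ in range(10_000):
    h2s = random.gauss(0.60, 0.05)      # uncertain feed composition
    eff = random.uniform(0.93, 0.99)    # uncertain catalytic efficiency
    samples.append(claus_recovery(h2s, eff))

mean = statistics.mean(samples)
p5 = sorted(samples)[500]   # ~5th percentile of plant performance
```

A deterministic design point would report only one recovery number; the Monte Carlo run yields a mean and a pessimistic percentile, which is the "impression of how inaccuracies influence plant performance" the abstract refers to.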
Probabilistic Cloning and Quantum Computation
Institute of Scientific and Technical Information of China (English)
GAO Ting; YAN Feng-Li; WANG Zhi-Xi
2004-01-01
We discuss the usefulness of quantum cloning and present examples of quantum computation tasks for which cloning offers an advantage that cannot be matched by any approach that does not resort to quantum cloning. In these quantum computations, we need to distribute quantum information contained in states about which we have some partial information. To perform quantum computations, we use a state-dependent probabilistic quantum cloning procedure to distribute quantum information in the middle of a quantum computation.
Probabilistic analysis and related topics
Bharucha-Reid, A T
1983-01-01
Probabilistic Analysis and Related Topics, Volume 3 focuses on the continuity, integrability, and differentiability of random functions, including operator theory, measure theory, and functional and numerical analysis. The selection first offers information on the qualitative theory of stochastic systems and Langevin equations with multiplicative noise. Discussions focus on phase-space evolution via direct integration, phase-space evolution, linear and nonlinear systems, linearization, and generalizations. The text then ponders on the stability theory of stochastic difference systems and Marko
Probabilistic methods for rotordynamics analysis
Wu, Y.-T.; Torng, T. Y.; Millwater, H. R.; Fossum, A. F.; Rheinfurth, M. H.
1991-01-01
This paper summarizes the development of the methods and a computer program to compute the probability of instability of dynamic systems that can be represented by a system of second-order ordinary linear differential equations. Two instability criteria based upon the eigenvalues or Routh-Hurwitz test functions are investigated. Computational methods based on a fast probability integration concept and an efficient adaptive importance sampling method are proposed to perform efficient probabilistic analysis. A numerical example is provided to demonstrate the methods.
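For a single-degree-of-freedom analogue of the systems described above, the probability of instability can be estimated by sampling the uncertain parameters and checking the eigenvalues of the characteristic equation. The sketch below assumes a Gaussian-distributed damping coefficient; the paper's fast probability integration and adaptive importance sampling methods are not reproduced.

```python
import cmath
import random

def unstable(m, c, k):
    """The system m*x'' + c*x' + k*x = 0 is unstable if any root of
    the characteristic equation m*s^2 + c*s + k = 0 has a positive
    real part (the eigenvalue criterion)."""
    disc = cmath.sqrt(c * c - 4 * m * k)
    roots = ((-c + disc) / (2 * m), (-c - disc) / (2 * m))
    return any(r.real > 0 for r in roots)

random.seed(0)
trials = 100_000
# Damping coefficient is uncertain: slightly positive on average,
# but with enough spread that negative (destabilizing) values occur.
fails = sum(unstable(1.0, random.gauss(0.1, 0.2), 4.0)
            for _ in range(trials))
p_instability = fails / trials
```

Here instability occurs exactly when the sampled damping is negative, so the estimate converges to the Gaussian tail probability; crude Monte Carlo like this is what the fast probability integration methods in the paper are designed to outperform.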
Probabilistic analysis and related topics
Bharucha-Reid, A T
1979-01-01
Probabilistic Analysis and Related Topics, Volume 2 focuses on the integrability, continuity, and differentiability of random functions, as well as functional analysis, measure theory, operator theory, and numerical analysis. The selection first offers information on the optimal control of stochastic systems and Gleason measures. Discussions focus on convergence of Gleason measures, random Gleason measures, orthogonally scattered Gleason measures, existence of optimal controls without feedback, random necessary conditions, and Gleason measures in tensor products. The text then elaborates on an
Multivariate postprocessing techniques for probabilistic hydrological forecasting
Hemri, Stephan; Lisniak, Dmytro; Klein, Bastian
2016-04-01
Hydrologic ensemble forecasts driven by atmospheric ensemble prediction systems need statistical postprocessing in order to account for systematic errors in terms of both mean and spread. Runoff is an inherently multivariate process with typical events lasting from hours in case of floods to weeks or even months in case of droughts. This calls for multivariate postprocessing techniques that yield well calibrated forecasts in univariate terms and ensure a realistic temporal dependence structure at the same time. To this end, the univariate ensemble model output statistics (EMOS; Gneiting et al., 2005) postprocessing method is combined with two different copula approaches that ensure multivariate calibration throughout the entire forecast horizon. These approaches comprise ensemble copula coupling (ECC; Schefzik et al., 2013), which preserves the dependence structure of the raw ensemble, and a Gaussian copula approach (GCA; Pinson and Girard, 2012), which estimates the temporal correlations from training observations. Both methods are tested in a case study covering three subcatchments of the river Rhine that represent different sizes and hydrological regimes: the Upper Rhine up to the gauge Maxau, the river Moselle up to the gauge Trier, and the river Lahn up to the gauge Kalkofen. The results indicate that both ECC and GCA are suitable for modelling the temporal dependences of probabilistic hydrologic forecasts (Hemri et al., 2015). References Gneiting, T., A. E. Raftery, A. H. Westveld, and T. Goldman (2005), Calibrated probabilistic forecasting using ensemble model output statistics and minimum CRPS estimation, Monthly Weather Review, 133(5), 1098-1118, DOI: 10.1175/MWR2904.1. Hemri, S., D. Lisniak, and B. Klein, Multivariate postprocessing techniques for probabilistic hydrological forecasting, Water Resources Research, 51(9), 7436-7451, DOI: 10.1002/2014WR016473. Pinson, P., and R. Girard (2012), Evaluating the quality of scenarios of short-term wind power
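The ECC step mentioned above can be sketched compactly: the calibrated quantiles are reordered so that each member inherits the rank it had in the raw ensemble at every lead time, preserving the raw ensemble's temporal dependence structure. A minimal version with toy numbers (two lead times, three members):

```python
import numpy as np

def ensemble_copula_coupling(raw_ensemble, calibrated_quantiles):
    """Reorder calibrated quantiles so each member keeps the rank it
    had in the raw ensemble at every lead time (the ECC idea of
    Schefzik et al., 2013).

    Both inputs have shape (lead_times, members)."""
    out = np.empty_like(calibrated_quantiles)
    for t in range(raw_ensemble.shape[0]):
        ranks = raw_ensemble[t].argsort().argsort()  # rank of each member
        sorted_q = np.sort(calibrated_quantiles[t])
        out[t] = sorted_q[ranks]
    return out

raw = np.array([[3.0, 1.0, 2.0],
                [0.5, 2.5, 1.5]])
cal = np.array([[10.0, 30.0, 20.0],
                [5.0, 15.0, 25.0]])
ecc = ensemble_copula_coupling(raw, cal)
```

After reordering, the largest calibrated value at each lead time is assigned to whichever member was largest in the raw ensemble, so member trajectories remain physically coherent over time.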
Probabilistic interpretation of resonant states
Indian Academy of Sciences (India)
Naomichi Hatano; Tatsuro Kawamoto; Joshua Feinberg
2009-09-01
We provide a probabilistic interpretation of resonant states. We do this by showing that the integral of the modulus square of resonance wave functions (i.e., the conventional norm) over a properly expanding spatial domain is independent of time, and therefore leads to probability conservation. This is in contrast with the conventional employment of a bi-orthogonal basis that precludes probabilistic interpretation, since wave functions of resonant states diverge exponentially in space. On the other hand, resonant states decay exponentially in time, because momentum leaks out of the central scattering area. This momentum leakage is also the reason for the spatial exponential divergence of resonant states. It is by combining the opposite temporal and spatial behaviours of resonant states that we arrive at our probabilistic interpretation of these states. The physical need to normalize resonant wave functions over an expanding spatial domain arises because particles leak out of the region which contains the potential range and escape to infinity, and one has to include them in the total count of particles.
Probabilistic Seismic Hazard Assessment of Babol, Iran
Directory of Open Access Journals (Sweden)
Gholamreza Abdollahzadeh
2011-01-01
This paper presents a probabilistic seismic hazard assessment of Babol, one of the big cities in the north of Iran. Many destructive earthquakes have happened in Iran in the last centuries. Historical references indicate that Babol has been destroyed by catastrophic earthquakes many times. In this paper, the peak horizontal ground acceleration over the bedrock (PGA) is calculated by a probabilistic seismic hazard assessment (PSHA). For this reason, at first, a catalogue containing both historical and instrumental events that occurred within a radius of 200 km of Babol city, covering the period from 874 to 2004, was gathered. Then, seismic sources are modeled and a recurrence relationship is established. After elimination of the aftershocks and foreshocks, the main earthquakes were taken into consideration to calculate the seismic parameters (SP) by the Kijko method. The calculations were performed using the logic tree method and four weighted attenuation relationships (Ghodrati: 0.35; Khademi: 0.25; Ambraseys and Simpson: 0.2; Sarma and Srbulov: 0.2). Seismic hazard assessment is then carried out for an 8 × 7 grid of points using SEISRISK III. Finally, two seismic hazard maps of the studied area, based on peak horizontal ground acceleration (PGA) over bedrock for 2% and 10% probabilities of exceedance in a life cycle of 50 years, are presented. These calculations were performed using the Poisson distribution for the two hazard levels. The results showed that the PGA ranges from 0.32 to 0.33 g for a return period of 475 years and from 0.507 to 0.527 g for a return period of 2475 years. Since the population of Babol is very dense and the vulnerability of its buildings is high, the risk from future earthquakes is very significant.
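The Poisson relationship used in the abstract above, between exceedance probability over a design life and return period (10% in 50 years corresponds to roughly 475 years; 2% in 50 years to roughly 2475 years), can be checked directly:

```python
import math

def return_period(p_exceed, life_years):
    """Return period T such that P(at least one exceedance in
    `life_years`) equals p_exceed, under the Poisson assumption
    standard in PSHA: p = 1 - exp(-life_years / T)."""
    rate = -math.log(1 - p_exceed) / life_years  # annual exceedance rate
    return 1 / rate

t10 = return_period(0.10, 50)   # ~475 years
t2 = return_period(0.02, 50)    # ~2475 years
```

This is why hazard maps quote the 475-year and 2475-year PGA values for the two probability levels mentioned in the abstract.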
Architecture for Integrated Medical Model Dynamic Probabilistic Risk Assessment
Jaworske, D. A.; Myers, J. G.; Goodenow, D.; Young, M.; Arellano, J. D.
2016-01-01
Probabilistic Risk Assessment (PRA) is a modeling tool used to predict potential outcomes of a complex system based on a statistical understanding of many initiating events. Utilizing a Monte Carlo method, thousands of instances of the model are considered and outcomes are collected. PRA is considered static, utilizing probabilities alone to calculate outcomes. Dynamic Probabilistic Risk Assessment (dPRA) is an advanced concept where modeling predicts the outcomes of a complex system based not only on the probabilities of many initiating events, but also on a progression of dependencies brought about by progressing down a time line. Events are placed in a single time line, adding each event to a queue, as managed by a planner. Progression down the time line is guided by rules, as managed by a scheduler. The recently developed Integrated Medical Model (IMM) summarizes astronaut health as governed by the probabilities of medical events and mitigation strategies. Managing the software architecture process provides a systematic means of creating, documenting, and communicating a software design early in the development process. The software architecture process begins with establishing requirements and the design is then derived from the requirements.
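The static Monte Carlo core of PRA described above can be sketched as follows. The initiating events and their probabilities are invented placeholders, not values from the IMM; each trial simply asks whether any initiating event occurs and its mitigation fails.

```python
import random

# Illustrative initiating events: (annual probability of occurrence,
# conditional probability that mitigation fails). Numbers are made up.
EVENTS = {
    "event_a": (0.02, 0.10),
    "event_b": (0.005, 0.50),
}

def simulate_year(rng):
    """One Monte Carlo instance of the model: True if any initiating
    event both occurs and goes unmitigated in the simulated year."""
    for p_event, p_mitigation_fails in EVENTS.values():
        if rng.random() < p_event and rng.random() < p_mitigation_fails:
            return True
    return False

rng = random.Random(42)
n = 200_000
p_bad_outcome = sum(simulate_year(rng) for _ in range(n)) / n
```

A dynamic PRA would replace the independent per-event draws with a time line of queued events whose probabilities depend on what has already happened, which is the extension the abstract describes.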
Probabilistic models of language processing and acquisition.
Chater, Nick; Manning, Christopher D
2006-07-01
Probabilistic methods are providing new explanatory approaches to fundamental cognitive science questions of how humans structure, process and acquire language. This review examines probabilistic models defined over traditional symbolic structures. Language comprehension and production involve probabilistic inference in such models; and acquisition involves choosing the best model, given innate constraints and linguistic and other input. Probabilistic models can account for the learning and processing of language, while maintaining the sophistication of symbolic models. A recent burgeoning of theoretical developments and online corpus creation has enabled large models to be tested, revealing probabilistic constraints in processing, undermining acquisition arguments based on a perceived poverty of the stimulus, and suggesting fruitful links with probabilistic theories of categorization and ambiguity resolution in perception.
Probabilistic Ensemble Forecast of Summertime Temperatures in Pakistan
Directory of Open Access Journals (Sweden)
Muhammad Hanif
2014-01-01
Snowmelt flooding triggered by intense heat is a major temperature-related weather hazard in northern Pakistan, and the frequency of such extreme flood events has increased during recent years. In this study, the probabilistic temperature forecasts at seasonal and subseasonal time scales, based on hindcast simulations from three state-of-the-art models within the DEMETER project, are assessed by the relative operating characteristic (ROC) verification method. Results based on direct model outputs reveal significant skill for hot summers in February 3–5 (ROC area = 0.707, with lower 95% confidence limit of 0.538) and February 4–5 (ROC area = 0.771, with lower 95% confidence limit of 0.623) forecasts when validated against observations. Results for ERA-40 reanalysis also show skill for hot summers. Skilful probabilistic ensemble forecasts of summertime temperatures may be valuable in providing foreknowledge of snowmelt flooding and for water management in Pakistan.
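The ROC area used for verification above can be computed as the probability that a randomly chosen event receives a higher forecast probability than a randomly chosen non-event (the Mann-Whitney formulation). A minimal sketch with toy data:

```python
def roc_area(forecast_probs, observed):
    """ROC area as the probability that an observed event outranks a
    non-event in forecast probability; ties count half. Equivalent to
    the Mann-Whitney U statistic divided by its maximum."""
    events = [p for p, o in zip(forecast_probs, observed) if o]
    non_events = [p for p, o in zip(forecast_probs, observed) if not o]
    wins = sum((pe > pn) + 0.5 * (pe == pn)
               for pe in events for pn in non_events)
    return wins / (len(events) * len(non_events))

# Toy verification sample: forecast probabilities and observed outcomes
probs = [0.9, 0.8, 0.4, 0.7, 0.2, 0.3]
obs   = [1,   1,   0,   0,   1,   0]
auc = roc_area(probs, obs)
```

A value of 0.5 means no skill, 1.0 perfect discrimination; the abstract's ROC areas of 0.707 and 0.771 sit between these extremes.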
A framework for probabilistic pluvial flood nowcasting for urban areas
DEFF Research Database (Denmark)
Ntegeka, Victor; Murla, Damian; Wang, Lipen;
2016-01-01
the potential of nowcasting products for urban inundation applications. One of the most advanced Quantitative Precipitation Forecasting (QPF) techniques is the Short-Term Ensemble Prediction System, which was originally co-developed by the UK Met Office and the Australian Bureau of Meteorology. The scheme … was further tuned to better estimate extreme and moderate events for the Belgian area (STEPS-BE). Against this backdrop, a probabilistic framework has been developed that consists of: (1) rainfall nowcasts; (2) a sewer hydraulic model; (3) flood damage estimation; and (4) urban inundation risk mapping. STEPS … (12.5–50 m²) and low flood hazard areas (75–300 m²). Functions describing urban flood damage and social consequences were empirically derived based on questionnaires to people in the region that were recently affected by sewer floods. Probabilistic urban flood risk maps were prepared based …
Staged decision making based on probabilistic forecasting
Booister, Nikéh; Verkade, Jan; Werner, Micha; Cranston, Michael; Cumiskey, Lydia; Zevenbergen, Chris
2016-04-01
Flood forecasting systems reduce, but cannot eliminate, uncertainty about the future. Probabilistic forecasts explicitly show that uncertainty remains. However, compared to deterministic forecasts, a dimension is added ('probability' or 'likelihood'), and this added dimension makes decision making slightly more complicated. One decision-support technique is the cost-loss approach, which defines whether or not to issue a warning or implement mitigation measures (a risk-based method). With the cost-loss method, a warning is issued when the ratio of the response costs to the damage reduction is less than or equal to the probability of the possible flood event. The cost-loss method is not widely used, because it is motivated by economic values alone and is relatively static (a yes/no decision without further reasoning). Nevertheless, it has high potential to improve risk-based decision making based on probabilistic flood forecasting, because no other methods are known that deal with probabilities in decision making. The main aim of this research was to explore ways of making decision making based on probabilities with the cost-loss method better applicable in practice. The exploration began by identifying other situations in which decisions are taken based on uncertain forecasts or predictions. These cases spanned a range of degrees of uncertainty: from known uncertainty to deep uncertainty. Based on the types of uncertainty, concepts for dealing with situations and responses were analysed, and possibly applicable concepts were chosen. This analysis suggested that the concepts of flexibility and robustness fit the existing method. Instead of taking one large decision with large consequences at once, the idea is that actions and decisions are cut up into smaller stages, and the final decision to implement is made based on the economic costs of the decisions and measures and the reduced effect of flooding. The more lead-time there is in
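The cost-loss rule described above reduces to a one-line decision criterion; a minimal sketch with hypothetical cost and loss figures:

```python
def issue_warning(cost, loss, p_flood):
    """Cost-loss rule: act (issue a warning / implement mitigation)
    when the cost/loss ratio is at most the forecast probability."""
    return cost / loss <= p_flood

# Hypothetical case: mitigation costs 50k and prevents 400k of damage,
# so the break-even probability is 0.125.
a = issue_warning(50_000, 400_000, 0.30)   # act
b = issue_warning(50_000, 400_000, 0.10)   # do not act
```

The staged approach proposed in the abstract would apply this comparison repeatedly, at each stage with updated probabilities and the (smaller) incremental cost of the next action.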
Probabilistic Planning with Imperfect Sensing Actions Using Hybrid Probabilistic Logic Programs
Saad, Emad
Effective planning in uncertain environments is important to agents and multi-agent systems. In this paper, we introduce a new logic-based approach to probabilistic contingent planning (probabilistic planning with imperfect sensing actions), by relating probabilistic contingent planning to normal hybrid probabilistic logic programs with probabilistic answer set semantics [24]. We show that any probabilistic contingent planning problem can be encoded as a normal hybrid probabilistic logic program. We formally prove the correctness of our approach. Moreover, we show that the complexity of finding a probabilistic contingent plan in our approach is NP-complete. In addition, we show that any probabilistic contingent planning problem PP can be encoded as a classical normal logic program with answer set semantics, whose answer sets correspond to valid trajectories in PP. We show that probabilistic contingent planning problems can be encoded as SAT problems. We present a new high-level probabilistic action description language that allows the representation of sensing actions with probabilistic outcomes.
Advanced Seismic Probabilistic Risk Assessment Demonstration Project Plan
Energy Technology Data Exchange (ETDEWEB)
Coleman, Justin [Idaho National Lab. (INL), Idaho Falls, ID (United States)
2014-09-01
Idaho National Laboratory (INL) has an ongoing research and development (R&D) project to remove excess conservatism from seismic probabilistic risk assessment (SPRA) calculations. These risk calculations should focus on providing best-estimate results, and associated insights, for evaluation and decision-making. This report presents a plan for improving our current traditional SPRA process using a seismic event recorded at a nuclear power plant site, with known outcomes, to improve the decision-making process. SPRAs are intended to provide best estimates of the various combinations of structural and equipment failures that can lead to a seismically induced core damage event. However, in general this approach has been conservative, and potentially masks other important events (for instance, it was not the seismic motions that caused the Fukushima core melt events, but the tsunami ingress into the facility).
Dey, Mahua; Stadnik, Agnieszka; Riad, Fady; Zhang, Lingjiao; McBee, Nichol; Kase, Carlos; Carhuapoma, J Ricardo; Ram, Malathi; Lane, Karen; Ostapkovich, Noeleen; Aldrich, Francois; Aldrich, Charlene; Jallo, Jack; Butcher, Ken; Snider, Ryan; Hanley, Daniel; Ziai, Wendy; Awad, Issam A
2015-03-01
Retrospective series report varied rates of bleeding and infection with external ventricular drainage (EVD). There have been no prospective studies of these risks with systematic surveillance, threshold definitions, or independent adjudication. To analyze the rate of complications in the ongoing Clot Lysis: Evaluating Accelerated Resolution of Intraventricular Hemorrhage Phase III (CLEAR III) trial, providing a comparison with a systematic review of complications of EVD in the literature. Patients were prospectively enrolled in the CLEAR III trial after placement of an EVD for obstructive intraventricular hemorrhage and randomized to receive recombinant tissue-type plasminogen activator or placebo. We counted any detected new hemorrhage (catheter tract hemorrhage or any other distant hemorrhage) on computed tomography scan within 30 days from the randomization. Meta-analysis of published series of EVD placement was compiled with STATA software. Growing or unstable hemorrhage was reported as a cause of exclusion from the trial in 74 of 5707 cases (1.3%) screened for CLEAR III. The first 250 patients enrolled have completed adjudication of adverse events. Forty-two subjects (16.8%) experienced ≥1 new bleeds or expansions, and 6 of 250 subjects (2.4%) suffered symptomatic hemorrhages. Eleven cases (4.4%) had culture-proven bacterial meningitis or ventriculitis. Risks of bleeding and infection in the ongoing CLEAR III trial are comparable to those previously reported in EVD case series. In the present study, rates of new bleeds and bacterial meningitis/ventriculitis are very low despite multiple daily injections, blood in the ventricles, the use of thrombolysis in half the cases, and generalization to >60 trial sites.
A Level 1+ Probabilistic Safety Assessment of the High Flux Australian Reactor. Vol 3: Appendices
Energy Technology Data Exchange (ETDEWEB)
NONE
1998-01-01
The third volume of the Probabilistic Safety Assessment contains supporting information for the PSA as follows: Appendix C (continued), with details of the system analysis and reports for the system/top event models; Appendix D, with results of the specific engineering analyses of internal initiating events; Appendix E, containing supporting data for the human performance assessment; Appendix F, with details of the estimation of the frequency of leaks at HIFAR; and Appendix G, containing the event sequence model and quantification results.
Probabilistic Flood Defence Assessment Tools
Directory of Open Access Journals (Sweden)
Slomp Robert
2016-01-01
institutions managing the flood defences, and not by just a small number of experts in probabilistic assessment. Therefore, data management and use of software are main issues that have been covered in courses and training in 2016 and 2017. All in all, this is the largest change in the assessment of Dutch flood defences since 1996. In 1996, probabilistic techniques were first introduced to determine hydraulic boundary conditions (water levels and waves: wave height, wave period and direction) for different return periods. To simplify the process, the assessment continues to consist of a three-step approach, moving from simple decision rules, to methods for semi-probabilistic assessment, and finally to a fully probabilistic analysis to compare the strength of flood defences with the hydraulic loads. The formal assessment results are thus mainly based on the fully probabilistic analysis and the ultimate limit state of the strength of a flood defence. For complex flood defences, additional models and software were developed. The current Hydra software suite (for policy analysis, formal flood defence assessment and design) will be replaced by the model Ringtoets. New stand-alone software has been developed for revetments, geotechnical analysis and slope stability of the foreshore. Design software and policy analysis software, including the Delta model, will be updated in 2018. A fully probabilistic method results in more precise assessments and more transparency in the process of assessment and reconstruction of flood defences. This is of increasing importance, as large-scale infrastructural projects in a highly urbanized environment are increasingly subject to political and societal pressure to add additional features. For this reason, it is of increasing importance to be able to determine which new feature really adds to flood protection, to quantify how much it adds to the level of flood protection, and to evaluate whether it is really worthwhile.
Probabilistic modeling of financial exposure to flood in France
Moncoulon, David; Quantin, Antoine; Leblois, Etienne
2014-05-01
CCR is a French reinsurance company which offers natural catastrophe covers with the State guarantee. Within this framework, CCR develops its own models to assess its financial exposure to floods, droughts, earthquakes and other perils, and thus the exposure of insurers and the French State. A probabilistic flood model has been developed in order to estimate the financial exposure of the Nat Cat insurance market to flood events, depending on their annual occurrence probability. This presentation is organized in two parts. The first part is dedicated to the development of a flood hazard and damage model (ARTEMIS). The model calibration and validation on historical events are then described. In the second part, the coupling of ARTEMIS with two generators of probabilistic events is achieved: a stochastic flow generator and a stochastic spatialized precipitation generator, adapted from the SAMPO model developed by IRSTEA. The analysis of the complementary nature of these two generators is proposed: the first one allows generating floods on the French hydrological station network; the second allows simulating surface water runoff and Small River floods, even on ungauged rivers. Thus, the simulation of thousands of non-occured, but possible events allows us to provide for the first time an estimate of the financial exposure to flooding in France at different scales (commune, department, country) and from different points of view (hazard, vulnerability and damages).
-Boundedness and -Compactness in Finite Dimensional Probabilistic Normed Spaces
Indian Academy of Sciences (India)
Reza Saadati; Massoud Amini
2005-11-01
In this paper, we prove that in a finite dimensional probabilistic normed space, every two probabilistic norms are equivalent and we study the notion of -compactness and -boundedness in probabilistic normed spaces.
Benaloh's Dense Probabilistic Encryption Revisited
Fousse, Laurent; Alnuaimi, Mohamed
2010-01-01
In 1994, Josh Benaloh proposed a probabilistic homomorphic encryption scheme, enhancing the poor expansion factor provided by Goldwasser and Micali's scheme. Since then, numerous papers have taken advantage of Benaloh's homomorphic encryption function, including voting schemes, non-interactive verifiable secret sharing, online poker... In this paper we show that the original description of the scheme is incorrect, possibly resulting in ambiguous decryption of ciphertexts. We give a corrected description of the scheme, provide a complete proof of correctness and an analysis of the probability of failure in the initial description.
Probabilistic Analysis of Crack Width
Directory of Open Access Journals (Sweden)
J. Marková
2000-01-01
Probabilistic analysis of the crack width of a reinforced concrete element is based on the formulas accepted in Eurocode 2 and European Model Code 90. The obtained values of the reliability index β seem to be satisfactory for a reinforced concrete slab that fulfils the requirements for crack width specified in Eurocode 2. However, the reliability of the slab seems to be insufficient when the European Model Code 90 is considered; the reliability index is less than the recommended value of 1.5 for serviceability limit states indicated in Eurocode 1. Analysis of the sensitivity factors of the basic variables makes it possible to identify the variables that significantly affect the total crack width.
Probabilistic safety assessment for Hanford high-level waste tank 241-SY-101
Energy Technology Data Exchange (ETDEWEB)
MacFarlane, D.R.; Bott, T.F.; Brown, L.F.; Stack, D.W. [Los Alamos National Lab., NM (United States); Kindinger, J.; Deremer, R.K.; Medhekar, S.R.; Mikschl, T.J. [PLG, Inc., Newport Beach, CA (United States)
1994-05-01
Los Alamos National Laboratory (Los Alamos) is performing a comprehensive probabilistic safety assessment (PSA), which will include consideration of external events for the 18 tank farms at the Hanford Site. This effort is sponsored by the Department of Energy (DOE/EM, EM-36). Even though the methodology described herein will be applied to the entire tank farm, this report focuses only on the risk from the weapons-production wastes stored in tank number 241-SY-101, commonly known as Tank 101-SY, as configured in December 1992. This tank, which periodically releases ("burps") a gaseous mixture of hydrogen, nitrous oxide, ammonia, and nitrogen, was analyzed first because of public safety concerns associated with the potential for release of radioactive tank contents should this gas mixture be ignited during one of the burps. In an effort to mitigate the burping phenomenon, an experiment is being conducted in which a large pump has been inserted into the tank to determine if pump-induced circulation of the tank contents will promote a slow, controlled release of the gases. At the Hanford Site there are 177 underground tanks in 18 separate tank farms containing accumulated liquid/sludge/salt cake radioactive wastes from 50 yr of weapons materials production activities. The total waste volume is about 60 million gal., which contains approximately 120 million Ci of radioactivity.
Dynamic Positioning System (DPS) Risk Analysis Using Probabilistic Risk Assessment (PRA)
Thigpen, Eric B.; Boyer, Roger L.; Stewart, Michael A.; Fougere, Pete
2017-01-01
The National Aeronautics and Space Administration (NASA) Safety & Mission Assurance (S&MA) directorate at the Johnson Space Center (JSC) has applied its knowledge and experience with Probabilistic Risk Assessment (PRA) to projects in industries ranging from spacecraft to nuclear power plants. PRA is a comprehensive and structured process for analyzing risk in complex engineered systems and/or processes. The PRA process enables the user to identify potential risk contributors such as hardware and software failure, human error, and external events. Recent developments in the oil and gas industry have presented opportunities for NASA to lend its PRA expertise to both ongoing and developmental projects within the industry. This paper provides an overview of the PRA process and demonstrates how this process was applied in estimating the probability that a Mobile Offshore Drilling Unit (MODU) operating in the Gulf of Mexico and equipped with a generically configured Dynamic Positioning System (DPS) loses location and needs to initiate an emergency disconnect. The PRA described in this paper is intended to be generic such that the vessel meets the general requirements of an International Maritime Organization (IMO) Maritime Safety Committee (MSC)/Circ. 645 Class 3 dynamically positioned vessel. The results of this analysis are not intended to be applied to any specific drilling vessel, although provisions were made to allow the analysis to be configured to a specific vessel if required.
Modeling Events with Cascades of Poisson Processes
Simma, Aleksandr
2012-01-01
We present a probabilistic model of events in continuous time in which each event triggers a Poisson process of successor events. The ensemble of observed events is thereby modeled as a superposition of Poisson processes. Efficient inference is feasible under this model with an EM algorithm. Moreover, the EM algorithm can be implemented as a distributed algorithm, permitting the model to be applied to very large datasets. We apply these techniques to the modeling of Twitter messages and the revision history of Wikipedia.
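The triggering mechanism summarized in this abstract can be sketched as a small simulation: background events arrive as a homogeneous Poisson process, and each event spawns a Poisson number of successors at exponentially distributed delays. The rates, branching factor, and horizon below are illustrative assumptions, not values from the paper.

```python
import math
import random

def poisson_sample(rng, lam):
    """Knuth's algorithm for drawing a Poisson-distributed integer."""
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def simulate_cascade(base_rate=1.0, branching=0.5, decay=2.0,
                     horizon=10.0, seed=42):
    """Events in continuous time where each event triggers a Poisson
    process of successors (on average `branching` children per event,
    with exponential delays of rate `decay`)."""
    rng = random.Random(seed)
    queue = []
    # Background (immigrant) events: homogeneous Poisson process on [0, horizon).
    t = rng.expovariate(base_rate)
    while t < horizon:
        queue.append(t)
        t += rng.expovariate(base_rate)
    events = []
    while queue:
        s = queue.pop()
        events.append(s)
        # Each event spawns a Poisson number of children after the event time.
        for _ in range(poisson_sample(rng, branching)):
            child = s + rng.expovariate(decay)
            if child < horizon:
                queue.append(child)
    return sorted(events)
```

With `branching < 1` the cascade is subcritical, so the simulation terminates; the superposition of all triggered processes is what the EM algorithm in the paper fits to data.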
Probabilistic analysis of tsunami hazards
Geist, E.L.; Parsons, T.
2006-01-01
Determining the likelihood of a disaster is a key component of any comprehensive hazard assessment. This is particularly true for tsunamis, even though most tsunami hazard assessments have in the past relied on scenario or deterministic type models. We discuss probabilistic tsunami hazard analysis (PTHA) from the standpoint of integrating computational methods with empirical analysis of past tsunami runup. PTHA is derived from probabilistic seismic hazard analysis (PSHA), with the main difference being that PTHA must account for far-field sources. The computational methods rely on numerical tsunami propagation models rather than empirical attenuation relationships as in PSHA in determining ground motions. Because a number of source parameters affect local tsunami runup height, PTHA can become complex and computationally intensive. Empirical analysis can function in one of two ways, depending on the length and completeness of the tsunami catalog. For site-specific studies where there is sufficient tsunami runup data available, hazard curves can primarily be derived from empirical analysis, with computational methods used to highlight deficiencies in the tsunami catalog. For region-wide analyses and sites where there are little to no tsunami data, a computationally based method such as Monte Carlo simulation is the primary method to establish tsunami hazards. Two case studies that describe how computational and empirical methods can be integrated are presented for Acapulco, Mexico (site-specific) and the U.S. Pacific Northwest coastline (region-wide analysis).
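The core quantity behind any such hazard curve, whether derived empirically or via Monte Carlo, is the Poissonian probability of at least one exceedance of a runup threshold during an exposure window. A minimal sketch, using an illustrative 500-year mean recurrence interval rather than a value from the study:

```python
import math

def exceedance_probability(annual_rate, years):
    """Poissonian probability of at least one exceedance of a given
    runup threshold during an exposure window of `years` years."""
    return 1.0 - math.exp(-annual_rate * years)

# A runup threshold exceeded on average once per 500 years, over 50 years:
p = exceedance_probability(1.0 / 500.0, 50.0)   # about 0.095
```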
Why do probabilistic finite element analysis ?
Thacker, B H
2008-01-01
The intention of this book is to provide an introduction to performing probabilistic finite element analysis. As a short guideline, the objective is to inform the reader of the use, benefits and issues associated with performing probabilistic finite element analysis without excessive theory or mathematical detail.
Function Approximation Using Probabilistic Fuzzy Systems
J.H. van den Berg (Jan); U. Kaymak (Uzay); R.J. Almeida e Santos Nogueira (Rui Jorge)
2011-01-01
textabstractWe consider function approximation by fuzzy systems. Fuzzy systems are typically used for approximating deterministic functions, in which the stochastic uncertainty is ignored. We propose probabilistic fuzzy systems in which the probabilistic nature of uncertainty is taken into account.
Probabilistic Remaining Useful Life Prediction of Composite Aircraft Components Project
National Aeronautics and Space Administration — A Probabilistic Fatigue Damage Assessment Network (PFDAN) toolkit for Abaqus will be developed for probabilistic life management of a laminated composite structure...
Semantics of sub-probabilistic programs
Institute of Scientific and Technical Information of China (English)
Yixing CHEN; Hengyang WU
2008-01-01
The aim of this paper is to extend the probabilistic choice in probabilistic programs to sub-probabilistic choice, i.e., of the form (p)P (q)Q where p + q ≤ 1. It means that program P is executed with probability p and program Q is executed with probability q. Then, starting from an initial state, the execution of a sub-probabilistic program results in a sub-probability distribution. This paper presents two equivalent semantics for a sub-probabilistic while-programming language. One of these interprets programs as sub-probability distributions on state spaces via denotational semantics. The other interprets programs as bounded expectation transformers via wp-semantics. This paper proposes an axiomatic system for total logic, and proves its soundness and completeness in a classical pattern on the structure of programs.
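The sub-probabilistic choice can be illustrated with a small sketch that combines two sub-distributions over final states; the state labels and weights below are hypothetical examples, not drawn from the paper.

```python
from fractions import Fraction

def sub_choice(p, dist_p, q, dist_q):
    """Sub-probabilistic choice (p)P (q)Q with p + q <= 1: run P with
    probability p and Q with probability q; the missing mass 1 - p - q
    is the probability of non-termination."""
    assert p + q <= 1
    out = {}
    for weight, dist in ((p, dist_p), (q, dist_q)):
        for state, prob in dist.items():
            out[state] = out.get(state, Fraction(0)) + weight * prob
    return out

# Hypothetical terminating programs as sub-distributions over final states:
P = {"x=1": Fraction(1)}
Q = {"x=2": Fraction(1)}
result = sub_choice(Fraction(1, 2), P, Fraction(1, 3), Q)
# total mass 5/6 < 1, so the composed program diverges with probability 1/6
```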
Probabilistic sequence alignment of stratigraphic records
Lin, Luan; Khider, Deborah; Lisiecki, Lorraine E.; Lawrence, Charles E.
2014-10-01
The assessment of age uncertainty in stratigraphically aligned records is a pressing need in paleoceanographic research. The alignment of ocean sediment cores is used to develop mutually consistent age models for climate proxies and is often based on the δ18O of calcite from benthic foraminifera, which records a global ice volume and deep water temperature signal. To date, δ18O alignment has been performed by manual, qualitative comparison or by deterministic algorithms. Here we present a hidden Markov model (HMM) probabilistic algorithm to find 95% confidence bands for δ18O alignment. This model considers the probability of every possible alignment based on its fit to the δ18O data and transition probabilities for sedimentation rate changes obtained from radiocarbon-based estimates for 37 cores. Uncertainty is assessed using a stochastic back trace recursion to sample alignments in exact proportion to their probability. We applied the algorithm to align 35 late Pleistocene records to a global benthic δ18O stack and found that the mean width of 95% confidence intervals varies between 3 and 23 kyr depending on the resolution and noisiness of the record's δ18O signal. Confidence bands within individual cores also vary greatly, ranging from ~0 to >40 kyr. These alignment uncertainty estimates will allow researchers to examine the robustness of their conclusions, including the statistical evaluation of lead-lag relationships between events observed in different cores.
A Probabilistic Asteroid Impact Risk Model
Mathias, Donovan L.; Wheeler, Lorien F.; Dotson, Jessie L.
2016-01-01
Asteroid threat assessment requires the quantification of both the impact likelihood and resulting consequence across the range of possible events. This paper presents a probabilistic asteroid impact risk (PAIR) assessment model developed for this purpose. The model incorporates published impact frequency rates with state-of-the-art consequence assessment tools, applied within a Monte Carlo framework that generates sets of impact scenarios from uncertain parameter distributions. Explicit treatment of atmospheric entry is included to produce energy deposition rates that account for the effects of thermal ablation and object fragmentation. These energy deposition rates are used to model the resulting ground damage, and affected populations are computed for the sampled impact locations. The results for each scenario are aggregated into a distribution of potential outcomes that reflect the range of uncertain impact parameters, population densities, and strike probabilities. As an illustration of the utility of the PAIR model, the results are used to address the question of what minimum size asteroid constitutes a threat to the population. To answer this question, complete distributions of results are combined with a hypothetical risk tolerance posture to provide the minimum size, given sets of initial assumptions. Model outputs demonstrate how such questions can be answered and provide a means for interpreting the effect that input assumptions and uncertainty can have on final risk-based decisions. Model results can be used to prioritize investments to gain knowledge in critical areas or, conversely, to identify areas where additional data has little effect on the metrics of interest.
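The Monte Carlo structure described, sampling uncertain impact parameters and aggregating consequences into an outcome distribution, can be sketched as a toy; every distribution and scaling constant here is an illustrative stand-in, not a published PAIR input.

```python
import math
import random

def sample_affected_population(rng):
    """One impact scenario; all distributions and constants are
    illustrative assumptions, not the published PAIR inputs."""
    diameter = 20.0 * rng.paretovariate(2.0)              # m, heavy-tailed sizes
    velocity = rng.uniform(11e3, 30e3)                    # m/s, entry speed
    density = 3000.0                                      # kg/m^3, stony body
    mass = density * math.pi / 6.0 * diameter ** 3        # kg
    energy_mt = 0.5 * mass * velocity ** 2 / 4.184e15     # megatons TNT
    damage_radius_km = 2.1 * energy_mt ** (1.0 / 3.0)     # crude blast scaling
    pop_density = rng.choice([0.0, 10.0, 300.0, 3000.0])  # persons / km^2
    return math.pi * damage_radius_km ** 2 * pop_density

def outcome_distribution(n=10000, seed=1):
    """Aggregate sampled scenarios; return mean and median affected."""
    rng = random.Random(seed)
    samples = sorted(sample_affected_population(rng) for _ in range(n))
    return sum(samples) / n, samples[n // 2]

mean_affected, median_affected = outcome_distribution()
```

The gap between mean and median in such a run is the point the abstract makes about risk tolerance: rare large impactors dominate the expected consequence even though the typical scenario is small.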
Augmenting Probabilistic Risk Assessment with Malevolent Initiators
Energy Technology Data Exchange (ETDEWEB)
Curtis Smith; David Schwieder
2011-11-01
As commonly practiced, the use of probabilistic risk assessment (PRA) in nuclear power plants only considers accident initiators such as natural hazards, equipment failures, and human error. Malevolent initiators are ignored in PRA, but are considered the domain of physical security, which uses vulnerability assessment based on an officially specified threat (design basis threat). This paper explores the implications of augmenting and extending existing PRA models by considering new and modified scenarios resulting from malevolent initiators. Teaming the augmented PRA models with conventional vulnerability assessments can cost-effectively enhance security of a nuclear power plant. This methodology is useful for operating plants, as well as in the design of new plants. For the methodology, we have proposed an approach that builds on and extends the practice of PRA for nuclear power plants for security-related issues. Rather than only considering 'random' failures, we demonstrated a framework that is able to represent and model malevolent initiating events and associated plant impacts.
Probabilistic Seismic Hazard Assessment for Taiwan
Directory of Open Access Journals (Sweden)
Yu-Ju Wang
2016-06-01
The Taiwan Earthquake Model (TEM) was established to assess the seismic hazard and risk for Taiwan by considering the social and economic impacts of various components from geology, seismology, and engineering. This paper gives the first version of the TEM probabilistic seismic hazard analysis for Taiwan, named TEM PSHA2015. The model adopts the source parameters of 38 seismogenic structures identified by TEM geologists. In addition to specific fault source-based categorization, seismic activities are categorized as shallow, subduction intraplate, and subduction interplate events. To evaluate the potential ground shaking resulting from each seismic source, the corresponding ground-motion prediction equations for crustal and subduction earthquakes are adopted. The highest hazard probability is evaluated to be in Southwestern Taiwan and the Longitudinal Valley of Eastern Taiwan. Among the special municipalities in the highly populated Western Taiwan region, Taichung, Tainan, and New Taipei City are evaluated to have the highest hazard. Tainan has the highest seismic hazard for peak ground acceleration in the model based on TEM fault parameters. In terms of pseudo-spectral acceleration, Tainan has higher hazard over short spectral periods, whereas Taichung has higher hazard over long spectral periods. The analysis indicates the importance of earthquake-resistant designs for low-rise buildings in Tainan and high-rise buildings in Taichung.
Global Infrasound Association Based on Probabilistic Clutter Categorization
Arora, Nimar; Mialle, Pierrick
2016-04-01
The IDC advances its methods and continuously improves its automatic system for infrasound technology. The IDC focuses on enhancing the automatic system for the identification of valid signals and the optimization of the network detection threshold by identifying ways to refine signal characterization methodology and association criteria. An objective of this study is to reduce the number of associated infrasound arrivals that are rejected from the automatic bulletins when generating the reviewed event bulletins. Indeed, a considerable number of signal detections are due to local clutter sources such as microbaroms, waterfalls, dams, gas flares, surf (ocean breaking waves), etc. These sources are either too diffuse or too local to form events. Worse still, the repetitive nature of this clutter leads to a large number of false event hypotheses due to the random matching of clutter at multiple stations. Previous studies, for example [1], have worked on categorization of clutter using long-term trends on detection azimuth, frequency, and amplitude at each station. In this work we continue the same line of reasoning to build a probabilistic model of clutter that is used as part of NETVISA [2], a Bayesian approach to network processing. The resulting model is a fusion of seismic, hydroacoustic and infrasound processing built on a unified probabilistic framework. References: [1] Infrasound categorization: Towards a statistics-based approach. J. Vergoz, P. Gaillard, A. Le Pichon, N. Brachet, and L. Ceranna. ITW 2011. [2] NETVISA: Network Processing Vertically Integrated Seismic Analysis. N. S. Arora, S. Russell, and E. Sudderth. BSSA 2013.
Energy Technology Data Exchange (ETDEWEB)
Mata, Jonatas F.C. da; Vasconcelos, Vanderley de; Mesquita, Amir Z., E-mail: jonatasfmata@yahoo.com.br, E-mail: vasconv@cdtn.br, E-mail: amir@cdtn.br [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)
2015-07-01
The nuclear accident at Fukushima Daiichi, which occurred in Japan in 2011, brought reflections, worldwide, on the management of nuclear and environmental licensing processes of existing nuclear reactors. One of the key lessons learned in this matter is that studies of Probabilistic Safety Assessment and Severe Accidents are becoming essential, even in the early stage of a nuclear development project. In Brazil, the Brazilian Nuclear Energy Commission (CNEN) conducts nuclear licensing. The organization responsible for environmental licensing is the Brazilian Institute of Environment and Renewable Natural Resources (IBAMA). In the scope of the licensing processes of these two institutions, the safety analysis is essentially deterministic, complemented by probabilistic studies. The Probabilistic Safety Assessment (PSA) is the study performed to evaluate the behavior of the nuclear reactor in a sequence of events that may lead to the melting of its core. It includes both probability and consequence estimation of these events, which are called Severe Accidents, allowing the risk assessment of the plant to be obtained. Thus, possible shortcomings in the design of systems are identified, providing a basis for safety assessment and improving safety. During the environmental licensing, a Quantitative Risk Analysis (QRA), including probabilistic evaluations, is required in order to support the development of the Risk Analysis Study, the Risk Management Program and the Emergency Plan. This article aims to provide an overview of probabilistic risk assessment methodologies and their applications in nuclear and environmental licensing processes of nuclear reactors in Brazil. (author)
Raimond, Emmanuel; Decker, Kurt; Guigueno, Yves; Klug, Joakim; Loeffler, Horst
2015-04-01
The Fukushima nuclear accident in Japan resulted from the combination of two correlated extreme external events (earthquake and tsunami). The consequences, in particular flooding, went beyond what was considered in the initial engineering design of nuclear power plants (NPPs). Such situations can in theory be identified using probabilistic safety assessment (PSA) methodology. PSA results may then lead industry (system suppliers and utilities) or Safety Authorities to take appropriate decisions to reinforce the defence-in-depth of the NPP for low-probability events with high-amplitude consequences. In reality, the development of such PSA remains a challenging task. Definitions of the design basis of NPPs, for example, require data on events with occurrence probabilities not higher than 10⁻⁴ per year. Today, even lower probabilities, down to 10⁻⁸, are expected and typically used for probabilistic safety analyses (PSA) of NPPs and the examination of so-called design extension conditions. Modelling the combinations of natural or man-made hazards that can affect an NPP, and assigning them some meaningful probability of occurrence, seems to be difficult. The European project ASAMPSAE (www.asampsa.eu) gathers more than 30 organizations (industry, research, safety control) from Europe, the US and Japan and aims at identifying some meaningful practices to extend the scope and the quality of the existing probabilistic safety analysis developed for nuclear power plants. It offers a framework to discuss, at a technical level, how "extended PSA" can be developed efficiently and be used to verify whether the robustness of Nuclear Power Plants (NPPs) in their environment is sufficient. The paper will present the objectives of this project and some first lessons, and introduce which type of guidance is being developed. It will explain the need of expertise from geosciences to support the nuclear safety assessment in the different areas (seismotectonic, hydrological, meteorological and biological).
Probabilistic Flood Mapping using Volunteered Geographical Information
Rivera, S. J.; Girons Lopez, M.; Seibert, J.; Minsker, B. S.
2016-12-01
Flood extent maps are widely used by decision makers and first responders to provide critical information that prevents economic impacts and the loss of human lives. These maps are usually obtained from sensory data and/or hydrologic models, which often have limited coverage in space and time. Recent developments in social media and communication technology have created a wealth of near-real-time, user-generated content during flood events in many urban areas, such as flooded locations, pictures of flooding extent and height, etc. These data could improve decision-making and response operations as events unfold. However, the integration of these data sources has been limited due to the need for methods that can extract and translate the data into useful information for decision-making. This study presents an approach that uses volunteer geographic information (VGI) and non-traditional data sources (i.e., Twitter, Flickr, YouTube, and 911 and 311 calls) to generate/update the flood extent maps in areas where no models and/or gauge data are operational. The approach combines Web-crawling and computer vision techniques to gather information about the location, extent, and water height of the flood from unstructured textual data, images, and videos. These estimates are then used to provide an updated flood extent map for areas surrounding the geo-coordinate of the VGI through the application of a Hydro Growing Region Algorithm (HGRA). HGRA combines hydrologic and image segmentation concepts to estimate a probabilistic flooding extent along the corresponding creeks. Results obtained for a case study in Austin, TX (i.e., the 2015 Memorial Day flood) were comparable to those obtained by a calibrated hydrologic model and had good spatial correlation with flooding extents estimated by the Federal Emergency Management Agency (FEMA).
Probabilistic seismic hazard assessment for Central Asia
Directory of Open Access Journals (Sweden)
Shahid Ullah
2015-04-01
Central Asia is one of the seismically most active regions in the world. Its complex seismicity, due to the collision of the Eurasian and Indian plates, has resulted in some of the world's largest intra-plate events over history. The region is dominated by reverse faulting over strike-slip and normal faulting events. The GSHAP project (1999), aiming at a hazard assessment on a global scale, indicated that the region of Central Asia is characterized by peak ground accelerations for 10% probability of exceedance in 50 years as high as 9 m/s². In this study, carried out within the framework of the EMCA project (Earthquake Model Central Asia), the area source model and different kernel approaches are used for a probabilistic seismic hazard assessment (PSHA) for Central Asia. The seismic hazard is assessed considering shallow (depth < 50 km) seismicity only and employs an earthquake catalog for the region updated with respect to previous projects. The seismic hazard is calculated in terms of macroseismic intensity (MSK-64), intended to be used for the seismic risk maps of the region. The hazard maps, shown in terms of 10% probability of exceedance in 50 years, are derived by using the OpenQuake software [Pagani et al. 2014], an open-source software tool developed by the GEM (Global Earthquake Model) foundation. The maximum hazard observed in the region reaches an intensity of around 8 in southern Tien Shan for a 475-year mean return period. The maximum hazard estimated for some of the cities in the region, Bishkek, Dushanbe, Tashkent and Almaty, is between 7 and 8 (7-8, 8.0, 7.0 and 8.0 macroseismic intensity, respectively) for a 475-year mean return period, using different approaches. The results of the different methods for assessing the level of seismic hazard are compared and their underlying methodologies are discussed.
DEFF Research Database (Denmark)
Papakonstantinou, Athanasios; Rogers, Alex; Gerding, Enrico H.
2011-01-01
This paper reports on the design of a novel two-stage mechanism, based on strictly proper scoring rules, that allows a centre to acquire a costly forecast of a future event (such as a meteorological phenomenon) or a probabilistic estimate of a specific parameter (such as the quality of an expecte...
Probabilistic Fatigue Damage Program (FATIG)
Michalopoulos, Constantine
2012-01-01
FATIG computes fatigue damage/fatigue life using the stress rms (root mean square) value, the total number of cycles, and S-N curve parameters. The damage is computed by the following methods: (a) the traditional method using Miner's rule with stress cycles determined from a Rayleigh distribution up to 3*sigma; and (b) the classical fatigue damage formula involving the Gamma function, which is derived from the integral version of Miner's rule. The integration is carried out over all stress amplitudes. This software solves the problem of probabilistic fatigue damage using the integral form of the Palmgren-Miner rule. The software computes fatigue life using an approach involving all stress amplitudes, up to N*sigma, as specified by the user. It can be used in the design of structural components subjected to random dynamic loading, or by any stress analyst with minimal training for fatigue life estimates of structural components.
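The two computations described (the closed-form result via the Gamma function, and direct integration of Miner's rule over stress amplitudes up to a user-specified multiple of sigma) can be sketched as follows. This assumes a narrow-band Gaussian process, so that stress amplitudes follow a Rayleigh distribution, and an S-N curve of the form N·S^m = A; the specific parameter values in the test are illustrative.

```python
import math

def fatigue_damage_closed_form(sigma_rms, n_cycles, m, A):
    """Damage from the integral form of the Palmgren-Miner rule for a
    narrow-band process: Rayleigh-distributed amplitudes with scale
    sigma_rms and an S-N curve N * S**m = A give
    D = (n/A) * (sqrt(2)*sigma)**m * Gamma(1 + m/2)."""
    return (n_cycles / A) * (math.sqrt(2.0) * sigma_rms) ** m \
        * math.gamma(1.0 + m / 2.0)

def fatigue_damage_numeric(sigma_rms, n_cycles, m, A, k=8.0, steps=20000):
    """The same damage by direct numerical integration of Miner's rule
    over all amplitudes up to k*sigma."""
    total, ds = 0.0, k * sigma_rms / steps
    for i in range(1, steps + 1):
        s = i * ds
        # Rayleigh probability density of the stress amplitude s:
        rayleigh = (s / sigma_rms ** 2) * math.exp(-s * s / (2.0 * sigma_rms ** 2))
        total += (s ** m / A) * rayleigh * ds
    return n_cycles * total
```

Truncating at 3*sigma (method (a) above) systematically underestimates the damage relative to the full integral, which is why the integral form matters for heavy-loading tails.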
The Complexity of Probabilistic Lobbying
Erdélyi, Gábor; Goldsmith, Judy; Mattei, Nicholas; Raible, Daniel; Rothe, Jörg
2009-01-01
We propose various models for lobbying in a probabilistic environment, in which an actor (called "The Lobby") seeks to influence the voters' preferences of voting for or against multiple issues when the voters' preferences are represented in terms of probabilities. In particular, we provide two evaluation criteria and three bribery methods to formally describe these models, and we consider the resulting forms of lobbying with and without issue weighting. We provide a formal analysis for these problems of lobbying in a stochastic environment, and determine their classical and parameterized complexity depending on the given bribery/evaluation criteria. Specifically, we show that some of these problems can be solved in polynomial time, some are NP-complete but fixed-parameter tractable, and some are W[2]-complete. Finally, we provide (in)approximability results.
Machine learning a probabilistic perspective
Murphy, Kevin P
2012-01-01
Today's Web-enabled deluge of electronic data calls for automated methods of data analysis. Machine learning provides these, developing methods that can automatically detect patterns in data and then use the uncovered patterns to predict future data. This textbook offers a comprehensive and self-contained introduction to the field of machine learning, based on a unified, probabilistic approach. The coverage combines breadth and depth, offering necessary background material on such topics as probability, optimization, and linear algebra as well as discussion of recent developments in the field, including conditional random fields, L1 regularization, and deep learning. The book is written in an informal, accessible style, complete with pseudo-code for the most important algorithms. All topics are copiously illustrated with color images and worked examples drawn from such application domains as biology, text processing, computer vision, and robotics. Rather than providing a cookbook of different heuristic method...
Probabilistic simulation of fire scenarios
Energy Technology Data Exchange (ETDEWEB)
Hostikka, Simo E-mail: simo.hostikka@vtt.fi; Keski-Rahkonen, Olavi
2003-10-01
A risk analysis tool is developed for computation of the distributions of fire model output variables. The tool, called Probabilistic Fire Simulator (PFS), combines Monte Carlo simulation and CFAST, a two-zone fire model. In this work, the tool is used to estimate the failure probability of redundant cables in a cable tunnel fire, and the failure and smoke filling probabilities in an electronics room during an electronics cabinet fire. Sensitivity of the output variables to the input variables is calculated in terms of the rank order correlations. The use of the rank order correlations allows the user to identify both modelling parameters and actual facility properties that have the most influence on the results. Various steps of the simulation process, i.e. data collection, generation of the input distributions, modelling assumptions, definition of the output variables and the actual simulation, are described.
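The rank order correlation sensitivity measure used by PFS can be illustrated on a toy Monte Carlo sample; the two-input "fire model" below is a hypothetical stand-in for CFAST, not its actual behavior.

```python
import random

def ranks(xs):
    """1-based ranks; ties are ignored (fine for continuous samples)."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0] * len(xs)
    for position, i in enumerate(order, start=1):
        r[i] = position
    return r

def spearman(x, y):
    """Rank order correlation: Pearson correlation of the rank vectors."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mean = (n + 1) / 2.0
    cov = sum((a - mean) * (b - mean) for a, b in zip(rx, ry))
    var = sum((a - mean) ** 2 for a in rx)   # same for ry when there are no ties
    return cov / var

# Hypothetical two-input "fire model": gas temperature rises with heat
# release rate, falls with wall thickness; a third input is irrelevant.
rng = random.Random(0)
n = 500
hrr = [rng.uniform(0.5, 2.0) for _ in range(n)]    # heat release rate
wall = [rng.uniform(0.1, 0.3) for _ in range(n)]   # wall thickness
noise = [rng.random() for _ in range(n)]           # irrelevant input
temp = [h / w for h, w in zip(hrr, wall)]          # model output
```

Ranking the inputs by the magnitude of their correlation with the output is exactly how the tool separates influential modelling parameters and facility properties from inert ones.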
Probabilistic direct counterfactual quantum communication
Zhang, Sheng
2017-02-01
It is striking that the quantum Zeno effect can be used to launch direct counterfactual communication between two spatially separated parties, Alice and Bob. So far, existing protocols of this type provide only a deterministic counterfactual communication service. However, this counterfactuality comes at a price. First, the transmission takes much longer than a classical transmission. Second, the chained-cycle structure makes these protocols more sensitive to channel noise. Here, we extend the idea of counterfactual communication and present a probabilistic-counterfactual quantum communication protocol, which is proved to have advantages over the deterministic ones. Moreover, the presented protocol can evolve into a deterministic one solely by adjusting the parameters of the beam splitters. Project supported by the National Natural Science Foundation of China (Grant No. 61300203).
Probabilistic cloning with supplementary information
Azuma, Koji; Shimamura, Junichi; Koashi, Masato; Imoto, Nobuyuki
2005-01-01
We consider probabilistic cloning of a state chosen from a mutually nonorthogonal set of pure states, with the help of a party holding supplementary information in the form of pure states. When the number of states is two, we show that the best efficiency of producing m copies is always achieved by a two-step protocol in which the helping party first attempts to produce m-1 copies from the supplementary state, and if it fails, then the original state is used to produce m copies. On the other hand, when the number of states exceeds two, the best efficiency is not always achieved by such a protocol. We give examples in which the best efficiency is not achieved even if we allow any amount of one-way classical communication from the helping party.
Probabilistic machine learning and artificial intelligence.
Ghahramani, Zoubin
2015-05-28
How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.
Probabilistic machine learning and artificial intelligence
Ghahramani, Zoubin
2015-05-01
How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.
A History of Probabilistic Inductive Logic Programming
Directory of Open Access Journals (Sweden)
Fabrizio eRiguzzi
2014-09-01
Full Text Available The field of Probabilistic Logic Programming (PLP) has seen significant advances in the last 20 years, with many proposals for languages that combine probability with logic programming. Since the start, the problem of learning probabilistic logic programs has been the focus of much attention. Learning these programs represents a whole subfield of Inductive Logic Programming (ILP). In Probabilistic ILP (PILP), two problems are considered: learning the parameters of a program given the structure (the rules), and learning both the structure and the parameters. Usually, structure learning systems use parameter learning as a subroutine. In this article we present an overview of PILP and discuss the main results.
Probabilistic Modeling of Graded Timber Material Properties
DEFF Research Database (Denmark)
Faber, M. H.; Köhler, J.; Sørensen, John Dalsgaard
2004-01-01
The probabilistic modeling of timber material characteristics is considered with special emphasis to the modeling of the effect of different quality control and selection procedures used as means for quality grading in the production line. It is shown how statistical models may be established...... an important role in the overall probabilistic modeling. Therefore a scheme for estimating the parameters of probability distribution parameters focusing on the tail behavior has been established using a censored Maximum Likelihood estimation technique. The proposed probabilistic models have been formulated...
Probabilistic UML statecharts for specification and verification: a case study
Jansen, D.N.; Jürjens, J.; Cengarle, M.V.; Fernandez, E.B.; Rumpe, B.; Sander, R.
2002-01-01
This paper introduces a probabilistic extension of UML statecharts. A requirements-level semantics of statecharts is extended to include probabilistic elements. Desired properties for probabilistic statecharts are expressed in the probabilistic logic PCTL, and verified using the model checker Prism.
Probabilistic Tsunami Hazard Assessment - Application to the Mediterranean Sea
Sorensen, M. B.; Spada, M.; Babeyko, A.; Wiemer, S.; Grünthal, G.
2009-12-01
Following several large tsunami events around the world in recent years, tsunami hazard is becoming an increasing concern. The traditional way of assessing tsunami hazard has been through deterministic scenario calculations which provide the expected wave heights due to a given tsunami source, usually a worst-case scenario. For quantitative hazard and risk assessment, however, it is necessary to move towards a probabilistic framework. In this study we focus on earthquake-generated tsunamis and present a scheme for probabilistic tsunami hazard assessment (PTHA). Our PTHA methodology is based on the use of Monte-Carlo simulations and follows probabilistic seismic hazard assessment methodologies closely. The PTHA is performed in four steps. First, earthquake and tsunami catalogues are analyzed in order to define a number of potential tsunami sources in the study area. For each of these sources, activity rates, maximum earthquake magnitude and uncertainties are assigned. Next, a synthetic earthquake catalogue is established, based on the information about the sources. The third step is to calculate multiple synthetic tsunami scenarios for all potentially tsunamigenic earthquakes in the synthetic catalogue. The tsunami scenarios are then combined in the fourth step to generate hazard curves and maps. We implement the PTHA methodology in the Mediterranean Sea, where numerous tsunami events have been reported in history. We derive a 100,000-year-long catalogue of potentially tsunamigenic earthquakes and calculate tsunami propagation scenarios for ca. 85,000 M6.5+ earthquakes from the synthetic catalogue. Results show that the highest tsunami hazard is attributed to the Eastern Mediterranean region, but that also the Western Mediterranean can experience significant tsunami waves for long return periods. Hazard maps will be presented for a range of probability levels together with hazard curves for selected critical locations.
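The four PTHA steps can be sketched in miniature. Every number below (the source rate, the Gutenberg-Richter parameters, the wave-height scaling) is invented for illustration and is not taken from the study:

```python
import random
import math

random.seed(7)

# Step 1: one hypothetical tsunami source with a Gutenberg-Richter-like
# magnitude distribution (all parameter values are illustrative).
m_min, m_max, b = 6.5, 8.0, 1.0
rate_per_year = 0.05  # annual rate of M >= m_min events at this source

def sample_magnitude():
    # Inverse-transform sampling of a truncated exponential (G-R) distribution.
    u = random.random()
    c = 1.0 - 10 ** (-b * (m_max - m_min))
    return m_min - math.log10(1.0 - u * c) / b

# Steps 2-3: synthetic catalogue plus a toy scenario model: coastal wave
# height grows with magnitude, with lognormal scatter.
years = 100000
n_events = sum(1 for _ in range(years) if random.random() < rate_per_year)
heights = []
for _ in range(n_events):
    m = sample_magnitude()
    median_h = 0.05 * 10 ** (0.5 * (m - m_min))  # metres, illustrative scaling
    heights.append(median_h * random.lognormvariate(0.0, 0.5))

# Step 4: hazard curve = annual exceedance rate at a set of height thresholds.
hazard = {}
for h in (0.1, 0.5, 1.0):
    hazard[h] = sum(1 for x in heights if x > h) / years
    print(f"annual rate of height > {h} m ~ {hazard[h]:.5f}")
```

The resulting exceedance rates, plotted against the thresholds, form the hazard curve; repeating the calculation per coastal location yields hazard maps for chosen probability levels.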
Weighing costs and losses: A decision making game using probabilistic forecasts
Werner, Micha; Ramos, Maria-Helena; Wetterhall, Frederik; Cranston, Michael; van Andel, Schalk-Jan; Pappenberger, Florian; Verkade, Jan
2017-04-01
Probabilistic forecasts are increasingly recognised as an effective and reliable tool to communicate uncertainties. The economic value of probabilistic forecasts has been demonstrated by several authors, showing the benefit of using probabilistic forecasts over deterministic forecasts in several sectors, including flood and drought warning, hydropower, and agriculture. Probabilistic forecasting is also central to the emerging concept of risk-based decision making, and underlies emerging paradigms such as impact-based forecasting. Although the economic value of probabilistic forecasts is easily demonstrated in academic works, its evaluation in practice is more complex. The practical use of probabilistic forecasts requires decision makers to weigh the cost of an appropriate response to a probabilistic warning against the projected loss that would occur if the event forecast becomes reality. In this paper, we present the results of a simple game that aims to explore how decision makers are influenced by the costs required for taking a response and the potential losses they face in case the forecast flood event occurs. Participants play the role of one of three different shop owners. Each type of shop has losses of quite different magnitude, should a flood event occur. The shop owners are presented with several forecasts, each with a probability of a flood event occurring, which would inundate their shop and lead to those losses. In response, they have to decide if they want to do nothing, raise temporary defences, or relocate their inventory. Each action comes at a cost, and the different shop owners therefore have quite different cost/loss ratios. The game was played on four occasions. Players were attendees of the ensemble hydro-meteorological forecasting session of the 2016 EGU Assembly, professionals participating at two other conferences related to hydrometeorology, and a group of students. All audiences were familiar with the principles of forecasting
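The cost/loss trade-off at the heart of the game reduces to a one-line expected-loss comparison. The shop types, costs, and losses below are invented stand-ins for the game's actual settings:

```python
# Expected-loss comparison for the three actions available in the game.
# All numbers are invented for illustration.
def best_action(p_flood, loss, actions):
    # actions: name -> (cost, fraction of the loss still suffered if flooded)
    expected = {
        name: cost + p_flood * residual * loss
        for name, (cost, residual) in actions.items()
    }
    return min(expected, key=expected.get), expected

actions = {
    "do nothing":        (0.0,    1.0),  # no cost, full loss if flooded
    "temporary defence": (500.0,  0.2),  # cheap, stops most of the damage
    "relocate stock":    (3000.0, 0.0),  # expensive, avoids the loss entirely
}

# A low-value and a high-value shop facing the same 30% flood forecast.
for loss in (1000.0, 50000.0):
    choice, _ = best_action(0.3, loss, actions)
    print(f"loss={loss:>8}: best action = {choice}")
```

With identical forecasts, the rational choice flips with the cost/loss ratio: the low-value shop should ignore the warning, while the high-value shop should relocate, which is precisely the behaviour the game probes.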
From quantum feedback to probabilistic error correction: manipulation of quantum beats in cavity QED
Energy Technology Data Exchange (ETDEWEB)
Barberis-Blostein, P [Instituto de Investigaciones en Matematicas Aplicadas y en Sistemas, Universidad Nacional Autonoma de Mexico, Ciudad Universitaria, 04510, Mexico, DF (Mexico); Norris, D G; Orozco, L A; Carmichael, H J [Joint Quantum Institute, Department of Physics, University of Maryland and National Institute of Standards and Technology, College Park, MD 20742 (United States)], E-mail: lorozco@umd.edu
2010-02-15
It is shown how one can implement quantum feedback and probabilistic error correction in an open quantum system consisting of a single atom, with ground- and excited-state Zeeman structure, in a driven two-mode optical cavity. The ground-state superposition is manipulated and controlled through conditional measurements and external fields, which shield the coherence and correct quantum errors. Modeling an experimentally realistic situation demonstrates the robustness of the proposal for realization in the laboratory.
Probabilistic analysis of linear elastic cracked structures
Institute of Scientific and Technical Information of China (English)
[No author listed]
2007-01-01
This paper presents a probabilistic methodology for linear fracture mechanics analysis of cracked structures. The main focus is on probabilistic aspects related to the nature of cracks in the material. The methodology involves finite element analysis; statistical models for uncertainty in material properties, crack size, fracture toughness and loads; and standard reliability methods for evaluating probabilistic characteristics of the linear elastic fracture parameter. The uncertainty in the crack size can have a significant effect on the probability of failure, particularly when the crack size has a large coefficient of variation. A numerical example is presented to show that a probabilistic methodology based on Monte Carlo simulation provides accurate estimates of failure probability for use in linear elastic fracture mechanics.
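A minimal Monte Carlo sketch of such a failure-probability estimate, assuming the standard stress-intensity relation K_I = Y·σ·√(πa); all distributions and parameter values are illustrative, not taken from the paper:

```python
import random
import math

random.seed(42)

# Monte Carlo estimate of P(K_I > K_Ic) for a cracked part,
# with K_I = Y * sigma * sqrt(pi * a). All distributions are illustrative.
def failure_probability(n, a_scatter):
    failures = 0
    for _ in range(n):
        sigma = random.gauss(100.0, 10.0)                     # stress, MPa
        a = random.lognormvariate(math.log(0.02), a_scatter)  # crack size, m
        k_ic = random.gauss(40.0, 5.0)                        # toughness, MPa*sqrt(m)
        if 1.0 * sigma * math.sqrt(math.pi * a) > k_ic:       # geometry factor Y = 1
            failures += 1
    return failures / n

# A larger scatter (coefficient of variation) of the crack size gives a
# markedly higher failure probability, as the abstract notes.
results = {s: failure_probability(20000, s) for s in (0.1, 0.5)}
for s, pf in results.items():
    print(f"crack-size scatter {s}: Pf ~ {pf:.4f}")
```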
Structural reliability codes for probabilistic design
DEFF Research Database (Denmark)
Ditlevsen, Ove Dalager
1997-01-01
difficulties of ambiguity and definition show up when attempting to make the transition from a given authorized partial safety factor code to a superior probabilistic code. For any chosen probabilistic code format there is a considerable variation of the reliability level over the set of structures defined...... considerable variation of the reliability measure as defined by a specific probabilistic code format. Decision theoretical principles are applied to get guidance about which of these different reliability levels of existing practice to choose as target reliability level. Moreover, it is shown that the chosen...... probabilistic code format has not only strong influence on the formal reliability measure, but also on the formal cost of failure to be associated if a design made to the target reliability level is considered to be optimal. In fact, the formal cost of failure can be different by several orders of size for two...
Revising incompletely specified convex probabilistic belief bases
CSIR Research Space (South Africa)
Rens, G
2016-04-01
International Workshop on Non-Monotonic Reasoning (NMR), 22-24 April 2016, Cape Town, South Africa. Revising Incompletely Specified Convex Probabilistic Belief Bases. Gavin Rens, CAIR, University of KwaZulu-Natal, School of Mathematics, Statistics...
Non-unitary probabilistic quantum computing
Gingrich, Robert M.; Williams, Colin P.
2004-01-01
We present a method for designing quantum circuits that perform non-unitary quantum computations on n-qubit states probabilistically, and give analytic expressions for the success probability and fidelity.
Do probabilistic forecasts lead to better decisions?
Ramos, M. H.; van Andel, S. J.; Pappenberger, F.
2013-06-01
The last decade has seen growing research in producing probabilistic hydro-meteorological forecasts and increasing their reliability. This followed the promise that, supplied with information about uncertainty, people would take better risk-based decisions. In recent years, therefore, research and operational developments have also started focusing attention on ways of communicating the probabilistic forecasts to decision-makers. Communicating probabilistic forecasts includes preparing tools and products for visualisation, but also requires understanding how decision-makers perceive and use uncertainty information in real time. At the EGU General Assembly 2012, we conducted a laboratory-style experiment in which several cases of flood forecasts and a choice of actions to take were presented as part of a game to participants, who acted as decision-makers. Answers were collected and analysed. In this paper, we present the results of this exercise and discuss if we indeed make better decisions on the basis of probabilistic forecasts.
Safety Verification for Probabilistic Hybrid Systems
DEFF Research Database (Denmark)
Zhang, Lijun; She, Zhikun; Ratschan, Stefan
2010-01-01
The interplay of random phenomena and continuous real-time control deserves increased attention for instance in wireless sensing and control applications. Safety verification for such systems thus needs to consider probabilistic variations of systems with hybrid dynamics. In safety verification...... hybrid systems and develop a general abstraction technique for verifying probabilistic safety problems. This gives rise to the first mechanisable technique that can, in practice, formally verify safety properties of non-trivial continuous-time stochastic hybrid systems-without resorting to point...... of classical hybrid systems we are interested in whether a certain set of unsafe system states can be reached from a set of initial states. In the probabilistic setting, we may ask instead whether the probability of reaching unsafe states is below some given threshold. In this paper, we consider probabilistic...
Safety Verification for Probabilistic Hybrid Systems
DEFF Research Database (Denmark)
Zhang, Lijun; She, Zhikun; Ratschan, Stefan
2012-01-01
The interplay of random phenomena and continuous dynamics deserves increased attention, especially in the context of wireless sensing and control applications. Safety verification for such systems thus needs to consider probabilistic variants of systems with hybrid dynamics. In safety verification...... hybrid systems and develop a general abstraction technique for verifying probabilistic safety problems. This gives rise to the first mechanisable technique that can, in practice, formally verify safety properties of non-trivial continuous-time stochastic hybrid systems. Moreover, being based...... of classical hybrid systems, we are interested in whether a certain set of unsafe system states can be reached from a set of initial states. In the probabilistic setting, we may ask instead whether the probability of reaching unsafe states is below some given threshold. In this paper, we consider probabilistic...
Improved transformer protection using probabilistic neural network ...
African Journals Online (AJOL)
This article presents a novel technique to distinguish between magnetizing inrush ... Protective relaying, Probabilistic neural network, Active power relays, Power ... Forward Neural Network (MFFNN) with back-propagation learning technique.
Probabilistic composition of preferences, theory and applications
Parracho Sant'Anna, Annibal
2015-01-01
Putting forward a unified presentation of the features and possible applications of probabilistic preferences composition, and serving as a methodology for decisions employing multiple criteria, this book maximizes reader insights into evaluation in probabilistic terms and the development of composition approaches that do not depend on assigning weights to the criteria. With key applications in important areas of management such as failure modes and effects analysis and productivity analysis – together with explanations about the application of the concepts involved – this book makes available numerical examples of probabilistic transformation development and probabilistic composition. Useful not only as a reference source for researchers, but also in teaching graduate courses in Production Engineering and Management Science, the key themes of the book will be of special interest to researchers in the field of Operational Research.
Strategic Team AI Path Plans: Probabilistic Pathfinding
Directory of Open Access Journals (Sweden)
Tng C. H. John
2008-01-01
This paper proposes a novel method to generate strategic team AI pathfinding plans for computer games and simulations using probabilistic pathfinding. The method is inspired by genetic algorithms (Russell and Norvig, 2002) in that a fitness function is used to test the quality of the path plans. The method generates high-quality path plans by eliminating the low-quality ones: the path plans are generated by probabilistic pathfinding, and the elimination is done by a fitness test of the path plans. This generation method can produce varied high-quality paths, which is desirable in games to increase replay value. This work is an extension of our earlier work on team AI: probabilistic pathfinding (John et al., 2006). We explore ways to combine probabilistic pathfinding and genetic algorithms to create a new method to generate strategic team AI pathfinding plans.
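The generate-then-eliminate scheme can be sketched as follows. The grid, the biased random walk standing in for the probabilistic pathfinder, and the fitness weights are all invented for illustration:

```python
import random

random.seed(3)

# Generate candidate paths on a grid by probabilistic (random-walk) pathfinding,
# then keep only the high-fitness ones, as in a genetic-algorithm selection step.

def random_path(start, goal, steps=40):
    # Biased random walk: each step is random but prefers the goal direction.
    x, y = start
    path = [(x, y)]
    for _ in range(steps):
        if (x, y) == goal:
            break
        moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]
        d = abs(goal[0] - x) + abs(goal[1] - y)
        # Weight moves that reduce the Manhattan distance to the goal.
        weights = [
            3.0 if abs(goal[0] - (x + dx)) + abs(goal[1] - (y + dy)) < d else 1.0
            for dx, dy in moves
        ]
        dx, dy = random.choices(moves, weights=weights)[0]
        x, y = x + dx, y + dy
        path.append((x, y))
    return path

def fitness(path, goal):
    # Shorter paths that actually reach the goal score higher.
    return (1000.0 if path[-1] == goal else 0.0) - len(path)

start, goal = (0, 0), (5, 5)
population = [random_path(start, goal) for _ in range(200)]
# Elimination step: discard the low-quality half, keep varied high-quality plans.
population.sort(key=lambda p: fitness(p, goal), reverse=True)
survivors = population[:100]
best = survivors[0]
print("best path length:", len(best), "reaches goal:", best[-1] == goal)
```

Because many distinct paths survive the fitness cut, the team AI can pick a different high-quality plan on each replay, which is the variation property the paper emphasises.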
Application of probabilistic precipitation forecasts from a ...
African Journals Online (AJOL)
Application of probabilistic precipitation forecasts from a deterministic model towards increasing the lead-time of flash flood forecasts in South Africa. ... An ensemble set of 30 adjacent basins is then identified as ensemble members for each ...
Probabilistic Analysis Methods for Hybrid Ventilation
DEFF Research Database (Denmark)
Brohus, Henrik; Frier, Christian; Heiselberg, Per
This paper discusses a general approach for the application of probabilistic analysis methods in the design of ventilation systems. The aims and scope of probabilistic versus deterministic methods are addressed with special emphasis on hybrid ventilation systems. A preliminary application of stochastic differential equations is presented, comprising a general heat balance for an arbitrary number of loads and zones in a building to determine the thermal behaviour under random conditions.
PROBABILISTIC METHODOLOGY OF LOW CYCLE FATIGUE ANALYSIS
Institute of Scientific and Technical Information of China (English)
Jin Hui; Wang Jinnuo; Wang Libin
2003-01-01
The cyclic stress-strain responses (CSSR), Neuber's rule (NR) and the cyclic strain-life relation (CSLR) are treated as probabilistic curves in the local stress and strain method of low cycle fatigue analysis. The randomness of loading and the theory of fatigue damage accumulation (TOFDA) are considered. The probabilistic analyses of local stress, local strain and fatigue life are constructed based on first-order Taylor series expansions. Through the proposed method, fatigue reliability analysis can be accomplished.
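The first-order Taylor propagation step can be illustrated numerically. The strain-life (Coffin-Manson/Basquin) coefficients below are generic textbook-style values, not those of the paper, and the bisection solver is a convenience for the sketch:

```python
import math

# Strain-life relation: eps_a = sigma_f'/E * (2N)^b + eps_f' * (2N)^c.
# Coefficients are illustrative, not taken from the paper.
def life_from_strain(eps_a, sigma_f=900.0, b=-0.09, eps_f=0.6, c=-0.6, E=200000.0):
    # Strain amplitude decreases monotonically with life N; solve for N
    # by bisection in log space.
    lo, hi = 1.0, 1e9
    for _ in range(200):
        mid = math.sqrt(lo * hi)
        eps = sigma_f / E * (2 * mid) ** b + eps_f * (2 * mid) ** c
        if eps > eps_a:
            lo = mid  # strain too high: life must be longer
        else:
            hi = mid
    return math.sqrt(lo * hi)

# Mean and standard deviation of the local strain amplitude (as would come
# from the probabilistic local stress-strain analysis), propagated to life
# via a first-order Taylor expansion with a central-difference derivative.
eps_mean, eps_sd = 0.004, 0.0004
h = 1e-5
dN_deps = (life_from_strain(eps_mean + h) - life_from_strain(eps_mean - h)) / (2 * h)
N_mean = life_from_strain(eps_mean)
N_sd = abs(dN_deps) * eps_sd  # first-order (FOSM) approximation
print(f"mean life ~ {N_mean:.0f} cycles, first-order sd ~ {N_sd:.0f} cycles")
```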
DEMPSTER-SHAFER THEORY BY PROBABILISTIC REASONING
Directory of Open Access Journals (Sweden)
Chiranjib Mukherjee
2015-10-01
Probabilistic reasoning is used when outcomes are unpredictable. We examine methods that use probabilistic representations for all knowledge and that reason by propagating the uncertainties arising from evidence and assertions to conclusions. The uncertainties can arise from an inability to predict outcomes due to unreliable, vague, incomplete or inconsistent knowledge. We survey approaches taken in Artificial Intelligence systems to deal with reasoning under these types of uncertain conditions.
Probabilistic nature in L/H transition
Energy Technology Data Exchange (ETDEWEB)
Toda, Shinichiro; Itoh, Sanae-I.; Yagi, Masatoshi [Kyushu Univ., Fukuoka (Japan). Research Inst. for Applied Mechanics; Itoh, Kimitaka; Fukuyama, Atsushi
1999-11-01
A statistical picture of the excitation of a plasma transition, which occurs in a strongly turbulent state, is examined. The physical picture of transition phenomena is extended to include statistical variances. The dynamics of the plasma density and turbulence-driven flux is studied, with hysteresis in the flux-density relation. Probabilistic excitation is predicted, and the critical conditions are described by the probabilistic distribution function. The stability of the model equations is also discussed. (author)
Semantics of probabilistic processes: an operational approach
Deng, Yuxin
2015-01-01
This book discusses the semantic foundations of concurrent systems with nondeterministic and probabilistic behaviour. Particular attention is given to clarifying the relationship between testing and simulation semantics and characterising bisimulations from metric, logical, and algorithmic perspectives. Besides presenting recent research outcomes in probabilistic concurrency theory, the book exemplifies the use of many mathematical techniques to solve problems in computer science, which is intended to be accessible to postgraduate students in Computer Science and Mathematics. It can also be us
Houle, Cyril O.
This book examines the external degree in relation to the extremes of attitudes, myths, and data. Emphasis is placed on the emergence of the American external degree, foreign external-degree programs, the purpose of the external degree, the current scene, institutional issues, and problems of general policy. (MJM)
Probabilistic Prediction of Lifetimes of Ceramic Parts
Nemeth, Noel N.; Gyekenyesi, John P.; Jadaan, Osama M.; Palfi, Tamas; Powers, Lynn; Reh, Stefan; Baker, Eric H.
2006-01-01
ANSYS/CARES/PDS is a software system that combines the ANSYS Probabilistic Design System (PDS) software with a modified version of the Ceramics Analysis and Reliability Evaluation of Structures Life (CARES/Life) Version 6.0 software. [A prior version of CARES/Life was reported in Program for Evaluation of Reliability of Ceramic Parts (LEW-16018), NASA Tech Briefs, Vol. 20, No. 3 (March 1996), page 28.] CARES/Life models effects of stochastic strength, slow crack growth, and stress distribution on the overall reliability of a ceramic component. The essence of the enhancement in CARES/Life 6.0 is the capability to predict the probability of failure using results from transient finite-element analysis. ANSYS PDS models the effects of uncertainty in material properties, dimensions, and loading on the stress distribution and deformation. ANSYS/CARES/PDS accounts for the effects of probabilistic strength, probabilistic loads, probabilistic material properties, and probabilistic tolerances on the lifetime and reliability of the component. Even failure probability becomes a stochastic quantity that can be tracked as a response variable. ANSYS/CARES/PDS enables tracking of all stochastic quantities in the design space, thereby enabling more precise probabilistic prediction of lifetimes of ceramic components.
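The probabilistic-strength core of CARES/Life rests on Weibull statistics. A minimal sketch for a uniformly stressed part, with an assumed characteristic strength and Weibull modulus, looks like this:

```python
import math

def weibull_pf(stress, sigma0, m):
    # Two-parameter Weibull probability of failure for a uniformly stressed
    # part: Pf = 1 - exp(-(sigma/sigma0)^m). Values below are illustrative.
    return 1.0 - math.exp(-((stress / sigma0) ** m))

sigma0, m = 400.0, 10.0  # characteristic strength (MPa), Weibull modulus (assumed)
for s in (200.0, 300.0, 400.0):
    print(f"Pf at {s} MPa: {weibull_pf(s, sigma0, m):.4f}")
```

At the characteristic strength the failure probability is 1 − e⁻¹ ≈ 63.2% by definition; a full CARES/Life-style analysis integrates this element-by-element over the finite-element stress field and adds slow-crack-growth and transient-load effects.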
Probabilistic Choice, Reversibility, Loops, and Miracles
Stoddart, Bill; Bell, Pete
We consider an addition of probabilistic choice to Abrial's Generalised Substitution Language (GSL) in a form that accommodates the backtracking interpretation of non-deterministic choice. Our formulation is introduced as an extension of the Prospective Values formalism we have developed to describe the results from a backtracking search. Significant features are that probabilistic choice is governed by feasibility, and non-termination is strict. The former property allows us to use probabilistic choice to generate search heuristics. In this paper we are particularly interested in iteration. By demonstrating sub-conjunctivity and monotonicity properties of expectations we give the basis for a fixed point semantics of iterative constructs, and we consider the practical proof treatment of probabilistic loops. We discuss loop invariants, loops with probabilistic behaviour, and probabilistic termination in the context of a formalism in which a small probability of non-termination can dominate our calculations, proposing a method of limits to avoid this problem. The formal programming constructs described have been implemented in a reversible virtual machine (RVM).
Computing Distances between Probabilistic Automata
Directory of Open Access Journals (Sweden)
Mathieu Tracol
2011-07-01
We present relaxed notions of simulation and bisimulation on Probabilistic Automata (PA) that allow some error epsilon. When epsilon is zero we retrieve the usual notions of bisimulation and simulation on PAs. We give logical characterisations of these notions by choosing suitable logics which differ from the elementary ones, L with negation and L without negation, by the modal operator. Using flow networks, we show how to compute the relations in PTIME. This allows the definition of an efficiently computable non-discounted distance between the states of a PA. A natural modification of this distance is introduced to obtain a discounted distance, which weakens the influence of long-term transitions. We compare our notions of distance to others previously defined and illustrate our approach on various examples. We also show that our distance is not expansive with respect to process algebra operators. Although L without negation is a suitable logic to characterise epsilon-(bi)simulation on deterministic PAs, it is not for general PAs; interestingly, we prove that it does characterise weaker notions, called a priori epsilon-(bi)simulation, which we prove to be NP-hard to decide.
Pearl A Probabilistic Chart Parser
Magerman, David M.; Marcus, Mitchell P.
1994-01-01
This paper describes a natural language parsing algorithm for unrestricted text which uses a probability-based scoring function to select the "best" parse of a sentence. The parser, Pearl, is a time-asynchronous bottom-up chart parser with Earley-type top-down prediction which pursues the highest-scoring theory in the chart, where the score of a theory represents the extent to which the context of the sentence predicts that interpretation. This parser differs from previous attempts at stochastic parsers in that it uses a richer form of conditional probabilities based on context to predict likelihood. Pearl also provides a framework for incorporating the results of previous work in part-of-speech assignment, unknown word models, and other probabilistic models of linguistic features into one parsing tool, interleaving these techniques instead of using the traditional pipeline architecture. In preliminary tests, Pearl has been successful at resolving part-of-speech and word (in speech processing) ambiguity, dete...
Optimal probabilistic dense coding schemes
Kögler, Roger A.; Neves, Leonardo
2017-04-01
Dense coding with non-maximally entangled states has been investigated in many different scenarios. We revisit this problem for protocols adopting the standard encoding scheme. In this case, the set of possible classical messages cannot be perfectly distinguished due to the non-orthogonality of the quantum states carrying them. So far, the decoding process has been approached in two ways: (i) The message is always inferred, but with an associated (minimum) error; (ii) the message is inferred without error, but only sometimes; in case of failure, nothing else is done. Here, we generalize on these approaches and propose novel optimal probabilistic decoding schemes. The first uses quantum-state separation to increase the distinguishability of the messages with an optimal success probability. This scheme is shown to include (i) and (ii) as special cases and continuously interpolate between them, which enables the decoder to trade-off between the level of confidence desired to identify the received messages and the success probability for doing so. The second scheme, called multistage decoding, applies only for qudits (d-level quantum systems with d>2) and consists of further attempts in the state identification process in case of failure in the first one. We show that this scheme is advantageous over (ii) as it increases the mutual information between the sender and receiver.
Probabilistic description of traffic flow
Mahnke, R.; Kaupužs, J.; Lubashevsky, I.
2005-03-01
A stochastic description of traffic flow, called probabilistic traffic flow theory, is developed. The general master equation is applied to relatively simple models to describe the formation and dissolution of traffic congestions. Our approach is mainly based on spatially homogeneous systems like periodically closed circular rings without on- and off-ramps. We consider a stochastic one-step process of growth or shrinkage of a car cluster (jam). As generalization we discuss the coexistence of several car clusters of different sizes. The basic problem is to find a physically motivated ansatz for the transition rates of the attachment and detachment of individual cars to a car cluster consistent with the empirical observations in real traffic. The emphasis is put on the analogy with first-order phase transitions and nucleation phenomena in physical systems like supersaturated vapour. The results are summarized in the flux-density relation, the so-called fundamental diagram of traffic flow, and compared with empirical data. Different regimes of traffic flow are discussed: free flow, congested mode as stop-and-go regime, and heavy viscous traffic. The traffic breakdown is studied based on the master equation as well as the Fokker-Planck approximation to calculate mean first passage times or escape rates. Generalizations are developed to allow for on-ramp effects. The calculated flux-density relation and characteristic breakdown times coincide with empirical data measured on highways. Finally, a brief summary of the stochastic cellular automata approach is given.
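The one-step master equation for cluster growth and shrinkage can be simulated directly. The attachment/detachment rate forms below are illustrative stand-ins for the physically motivated ansatz the abstract refers to:

```python
import random

random.seed(5)

# One-step (birth-death) master equation for the size n of a single car
# cluster: attachment rate w_plus(n), detachment rate w_minus(n).
N_CARS = 60  # total cars on the closed circular road

def w_plus(n):
    # Attachment grows with the number of freely moving cars (illustrative form).
    return 0.8 * (N_CARS - n)

def w_minus(n):
    # Detachment: roughly constant escape rate once a cluster exists.
    return 10.0 if n > 0 else 0.0

def simulate(t_end):
    # Gillespie-style stochastic simulation of the one-step process.
    t, n, acc = 0.0, 0, 0.0
    while t < t_end:
        up, down = w_plus(n), w_minus(n)
        total = up + down
        dt = random.expovariate(total)
        acc += n * dt
        t += dt
        n += 1 if random.random() < up / total else -1
    return acc / t  # time-averaged cluster size

avg = simulate(200.0)
print("mean cluster size ~", round(avg, 1))
```

With these rates the cluster equilibrates near the size where attachment and detachment balance (here around n ≈ 47), the stochastic analogue of a condensed "jam" phase coexisting with free flow.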
Probabilistic description of traffic breakdowns.
Kühne, Reinhart; Mahnke, Reinhard; Lubashevsky, Ihor; Kaupuzs, Jevgenijs
2002-06-01
We analyze the characteristic features of traffic breakdown. To describe this phenomenon we apply a probabilistic model regarding the jam emergence as the formation of a large car cluster on a highway. In these terms, the breakdown occurs through the formation of a certain critical nucleus in the metastable vehicle flow, which enables us to confine ourselves to a one-cluster model. We assume that, first, the growth of the car cluster is governed by the attachment of cars to the cluster, whose rate is mainly determined by the mean headway distance between cars in the vehicle flow and, possibly, also by the headway distance inside the cluster. Second, the cluster dissolution is determined by the escape of cars from the cluster, whose rate depends directly on the cluster size. The latter is justified using the available experimental data for the correlation properties of the synchronized mode. We write the appropriate master equation, converted then into the Fokker-Planck equation for the cluster distribution function, and analyze the formation of the critical car cluster due to the climb over a certain potential barrier. Further cluster growth irreversibly causes jam formation. Numerical estimates of the obtained characteristics and the experimental data of the traffic breakdown are compared. In particular, we conclude that the characteristic intrinsic time scale of the breakdown phenomenon should be about 1 min, and we explain why the traffic volume interval inside which traffic breakdown is observed is sufficiently wide.
An optimal control approach to probabilistic Boolean networks
Liu, Qiuli
2012-12-01
External control of some genes in a genetic regulatory network is useful for avoiding undesirable states associated with some diseases. For this purpose, a number of stochastic optimal control approaches have been proposed. Probabilistic Boolean networks (PBNs) as powerful tools for modeling gene regulatory systems have attracted considerable attention in systems biology. In this paper, we deal with a problem of optimal intervention in a PBN with the help of the theory of discrete time Markov decision process. Specifically, we first formulate a control model for a PBN as a first passage model for discrete time Markov decision processes and then find, using a value iteration algorithm, optimal effective treatments with the minimal expected first passage time over the space of all possible treatments. In order to demonstrate the feasibility of our approach, an example is also displayed.
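The first-passage formulation mentioned above can be illustrated with a short value-iteration sketch. The dictionary-of-matrices interface and the toy two-state chain in the test are hypothetical, not the paper's PBN encoding:

```python
def first_passage_value_iteration(P, target, actions, tol=1e-9):
    """Compute the minimal expected first-passage time to a target state
    set for a controlled Markov chain via value iteration.
    P[a][i][j]: transition probability from state i to j under action a.
    Bellman update: V(i) = min_a [ 1 + sum_j P(j|i,a) V(j) ], V = 0 on target.
    """
    n = len(next(iter(P.values())))
    V = [0.0] * n
    while True:
        V_new = []
        for i in range(n):
            if i in target:
                V_new.append(0.0)
                continue
            best = min(
                1.0 + sum(P[a][i][j] * V[j] for j in range(n) if j not in target)
                for a in actions
            )
            V_new.append(best)
        if max(abs(x - y) for x, y in zip(V, V_new)) < tol:
            return V_new
        V = V_new
```

The action achieving the minimum in each state is the optimal intervention; in the PBN setting the target set would be the desirable gene-activity profiles.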
Orhan, A Emin; Ma, Wei Ji
2017-07-26
Animals perform near-optimal probabilistic inference in a wide range of psychophysical tasks. Probabilistic inference requires trial-to-trial representation of the uncertainties associated with task variables and subsequent use of this representation. Previous work has implemented such computations using neural networks with hand-crafted and task-dependent operations. We show that generic neural networks trained with a simple error-based learning rule perform near-optimal probabilistic inference in nine common psychophysical tasks. In a probabilistic categorization task, error-based learning in a generic network simultaneously explains a monkey's learning curve and the evolution of qualitative aspects of its choice behavior. In all tasks, the number of neurons required for a given level of performance grows sublinearly with the input population size, a substantial improvement on previous implementations of probabilistic inference. The trained networks develop a novel sparsity-based probabilistic population code. Our results suggest that probabilistic inference emerges naturally in generic neural networks trained with error-based learning rules. Behavioural tasks often require probability distributions to be inferred about task-specific variables. Here, the authors demonstrate that generic neural networks can be trained using a simple error-based learning rule to perform such probabilistic computations efficiently without any need for task-specific operations.
Exner, Cornelia; Zetsche, Ulrike; Lincoln, Tania M; Rief, Winfried
2014-03-01
A tendency to overestimate threat has been shown in individuals with OCD. We tested the hypothesis that this bias in judgment is related to difficulties in learning probabilistic associations between events. Thirty participants with OCD and 30 matched healthy controls completed a learning experiment involving two variants of a probabilistic classification learning task. In the neutral weather-prediction task, rainy and sunny weather had to be predicted. In the emotional task, the danger of an epidemic from virus infection had to be predicted (epidemic-prediction task). Participants with OCD were as able as controls to improve their prediction of neutral events across learning trials but scored significantly below healthy controls on the epidemic-prediction task. Lower performance on the emotional task variant was significantly related to a heightened tendency to overestimate threat. Biased information processing in OCD might thus hamper corrective experiences regarding the probability of threatening events.
Stimuli, Reinforcers, and Private Events
Nevin, John A.
2008-01-01
Radical behaviorism considers private events to be a part of ongoing observable behavior and to share the properties of public events. Although private events cannot be measured directly, their roles in overt action can be inferred from mathematical models that relate private responses to external stimuli and reinforcers according to the same…
Probabilistic Tsunami Hazard Assessment: the Seaside, Oregon Pilot Study
Gonzalez, F. I.; Geist, E. L.; Synolakis, C.; Titov, V. V.
2004-12-01
A pilot study of Seaside, Oregon is underway, to develop methodologies for probabilistic tsunami hazard assessments that can be incorporated into Flood Insurance Rate Maps (FIRMs) developed by FEMA's National Flood Insurance Program (NFIP). Current NFIP guidelines for tsunami hazard assessment rely on the science, technology and methodologies developed in the 1970s; although generally regarded as groundbreaking and state-of-the-art for its time, this approach is now superseded by modern methods that reflect substantial advances in tsunami research achieved in the last two decades. In particular, post-1990 technical advances include: improvements in tsunami source specification; improved tsunami inundation models; better computational grids by virtue of improved bathymetric and topographic databases; a larger database of long-term paleoseismic and paleotsunami records and short-term, historical earthquake and tsunami records that can be exploited to develop improved probabilistic methodologies; better understanding of earthquake recurrence and probability models. The NOAA-led U.S. National Tsunami Hazard Mitigation Program (NTHMP), in partnership with FEMA, USGS, NSF and Emergency Management and Geotechnical agencies of the five Pacific States, incorporates these advances into site-specific tsunami hazard assessments for coastal communities in Alaska, California, Hawaii, Oregon and Washington. NTHMP hazard assessment efforts currently focus on developing deterministic, "credible worst-case" scenarios that provide valuable guidance for hazard mitigation and emergency management. The NFIP focus, on the other hand, is on actuarial needs that require probabilistic hazard assessments such as those that characterize 100- and 500-year flooding events. There are clearly overlaps in NFIP and NTHMP objectives. NTHMP worst-case scenario assessments that include an estimated probability of occurrence could benefit the NFIP; NFIP probabilistic assessments of 100- and 500-yr
Regulating multiple externalities
DEFF Research Database (Denmark)
Waldo, Staffan; Jensen, Frank; Nielsen, Max
2016-01-01
Open access is a well-known externality problem in fisheries causing excess capacity and overfishing. Due to global warming, externality problems from CO2 emissions have gained increased interest. With two externality problems, a first-best optimum can be achieved by using two regulatory instruments...
ExternE National Implementation Finland
Energy Technology Data Exchange (ETDEWEB)
Pingoud, K.; Maelkki, H.; Wihersaari, M.; Pirilae, P. [VTT Energy, Espoo (Finland); Hongisto, M. [Imatran Voima Oy, Vantaa (Finland); Siitonen, S. [Ekono Energy Ltd, Espoo (Finland); Johansson, M. [Finnish Environment Institute, Helsinki (Finland)
1999-07-01
ExternE National Implementation is a continuation of the ExternE Project, funded in part by the European Commission's Joule III Programme. This study is the result of the ExternE National Implementation Project for Finland. Three fuel cycles were selected for the Finnish study: coal, peat and wood-derived biomass, which together are responsible for about 40% of total electricity generation in Finland and about 75% of the non-nuclear fuel based generation. The estimated external costs or damages were dominated by the global warming (GW) impacts in the coal and peat fuel cycles, but knowledge of the true GW impacts is still uncertain. From among the other impacts that were valued in monetary terms, human health damages due to airborne emissions dominated in all three fuel cycles. Monetary valuation of ecosystem impacts is not possible using the ExternE methodology at present. The Meri-Pori power station representing the coal fuel cycle is one of the world's cleanest and most efficient coal-fired power plants with a condensing turbine. The coal is imported mainly from Poland. The estimated health damages were about 4 mECU/kWh, crop damages an order of magnitude lower, and damages caused to building materials two orders of magnitude lower. The power stations of the peat and biomass fuel cycles are of CHP type, generating electricity and heat for the district heating systems of two cities. Their fuels are of domestic origin. The estimated health damages allocated to electricity generation were about 5 and 6 mECU/kWh, respectively. The estimates were case-specific, and thus a generalisation of the results to the whole electricity generation in Finland is unrealistic. Despite the uncertainties and limitations of the methodology, it is a promising tool for comparing similar kinds of fuel cycles, new power plants, pollution abatement technologies, and different plant locations with each other. (orig.)
Energy Technology Data Exchange (ETDEWEB)
Galan, S.F. [Dpto. de Inteligencia Artificial, E.T.S.I. Informatica (UNED), Juan del Rosal, 16, 28040 Madrid (Spain)]. E-mail: seve@dia.uned.es; Mosleh, A. [2100A Marie Mount Hall, Materials and Nuclear Engineering Department, University of Maryland, College Park, MD 20742 (United States)]. E-mail: mosleh@umd.edu; Izquierdo, J.M. [Area de Modelado y Simulacion, Consejo de Seguridad Nuclear, Justo Dorado, 11, 28040 Madrid (Spain)]. E-mail: jmir@csn.es
2007-08-15
The ω-factor approach is a method that explicitly incorporates organizational factors into probabilistic safety assessment of nuclear power plants. Bayesian networks (BNs) are the underlying formalism used in this approach. They have a structural part formed by a graph whose nodes represent organizational variables, and a parametric part that consists of conditional probabilities, each of them quantifying organizational influences between one variable and its parents in the graph. The aim of this paper is twofold. First, we discuss some important limitations of current procedures in the ω-factor approach for either assessing conditional probabilities from experts or estimating them from data. We illustrate the discussion with an example that uses data from Licensee Event Reports of nuclear power plants for the estimation task. Second, we introduce significant improvements in the way BNs for the ω-factor approach can be constructed, so that parameter acquisition becomes easier and more intuitive. The improvements are based on the use of noisy-OR gates as a model of multicausal interaction between each BN node and its parents.
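The noisy-OR gate mentioned above has a compact closed form, which is why it eases parameter acquisition: each parent needs only one "link" probability instead of a full conditional table. A minimal sketch, with illustrative link values rather than the paper's assessed ones:

```python
def noisy_or(p_leak, links, parents_on):
    """P(child = 1 | parent states) under a leaky noisy-OR gate.
    links[k]: probability that parent k alone activates the child.
    p_leak:   probability the child is active with no modeled cause.
    P(child=0) is the product of the independent failure probabilities.
    """
    q = 1.0 - p_leak
    for k, on in enumerate(parents_on):
        if on:
            q *= 1.0 - links[k]
    return 1.0 - q
```

With n binary parents, the full conditional table has 2**n entries, while the noisy-OR model needs only n link probabilities plus the leak term.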
Probabilistic numerics and uncertainty in computations.
Hennig, Philipp; Osborne, Michael A; Girolami, Mark
2015-07-08
We deliver a call to arms for probabilistic numerical methods: algorithms for numerical tasks, including linear algebra, integration, optimization and solving differential equations, that return uncertainties in their calculations. Such uncertainties, arising from the loss of precision induced by numerical calculation with limited time or hardware, are important for much contemporary science and industry. Within applications such as climate science and astrophysics, the need to make decisions on the basis of computations with large and complex data have led to a renewed focus on the management of numerical uncertainty. We describe how several seminal classic numerical methods can be interpreted naturally as probabilistic inference. We then show that the probabilistic view suggests new algorithms that can flexibly be adapted to suit application specifics, while delivering improved empirical performance. We provide concrete illustrations of the benefits of probabilistic numeric algorithms on real scientific problems from astrometry and astronomical imaging, while highlighting open problems with these new algorithms. Finally, we describe how probabilistic numerical methods provide a coherent framework for identifying the uncertainty in calculations performed with a combination of numerical algorithms (e.g. both numerical optimizers and differential equation solvers), potentially allowing the diagnosis (and control) of error sources in computations.
ASH External Web Portal (External Portal) -
Department of Transportation — The ASH External Web Portal is a web-based portal that provides single sign-on functionality, making the web portal a single location from which to be authenticated...
A Level 1+ Probabilistic Safety Assessment of the High Flux Australian Reactor. Vol 1
Energy Technology Data Exchange (ETDEWEB)
NONE
1998-01-01
The Department of Industry, Science and Tourism selected PLG, an EQE International Company, to systematically and independently evaluate the safety of the High Flux Australian Reactor (HIFAR), located at Lucas Heights, New South Wales. PLG performed a comprehensive probabilistic safety assessment (PSA) to quantify the risks posed by operation of HIFAR. The PSA identified possible accident scenarios, estimated their likelihood of occurrence, and assigned each scenario to a consequence category; i.e., end state. The accident scenarios developed included the possible release of radioactive material from irradiated nuclear fuel and the release of tritium from reactor coolant. The study team developed a recommended set of safety criteria against which the results of the PSA may be judged. HIFAR was found to exceed one of the two primary safety objectives and two of the five secondary safety objectives. Reactor coolant leaks, earthquakes, and coolant pump trips were the accident initiators that contributed most to scenarios that could result in fuel overheating. Scenarios initiated by earthquakes were the reason the frequency criterion for the one primary safety objective was exceeded. Overall, the plant safety status has been shown to be generally good with no evidence of major safety-related problems from its operation. One design deficiency associated with the emergency core cooling system was identified that should be corrected as soon as possible. Additionally, several analytical issues have been identified that should be investigated further. The results from these additional investigations should be used to determine whether additional plant and procedural changes are required, or if further evaluations of postulated severe accidents are warranted. Supporting information can be found in Appendix A for the seismic analysis and in Appendix B for selected other external events. refs., 139 tabs., 85 figs. Prepared for the Department of Industry, Science and Tourism
Probabilistic Fuzzy Approach to Evaluation of Logistics Service Effectiveness
Directory of Open Access Journals (Sweden)
Rudnik Katarzyna
2014-12-01
Logistics service providers offer a whole or partial logistics business service over a certain time period. Between such companies, the effectiveness of specific logistics services can vary. Logistics service providers seek the effective performance of logistics service. The purpose of this paper is to present a new approach for the evaluation of logistics service effectiveness, along with a specific computer system implementing the proposed approach – a sophisticated inference system, an extension of the Mamdani probabilistic fuzzy system. The paper presents specific knowledge concerning the relationships between effectiveness indicators in the form of fuzzy rules which contain marginal and conditional probabilities of fuzzy events. An inference diagram is also shown. A family of Yager's parameterized t-norms is proposed as inference operators. It facilitates the optimization of system parameters and enables flexible adjustment of the system to empirical data. A case study was used to illustrate the new approach for the evaluation of logistics service effectiveness. The approach is demonstrated on logistics services in a logistics company. We deem the analysis of a probabilistic fuzzy knowledge base to be useful for the evaluation of effectiveness of logistics services in a logistics company over a given time period.
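The parameterized family of inference operators mentioned above can be made concrete: Yager's t-norm has a single tuning parameter, which is what makes it convenient for fitting the system to empirical data. A minimal sketch (the parameter values in the test are arbitrary):

```python
def yager_tnorm(a, b, w):
    """Yager's parameterized t-norm T_w(a, b).
    w = 1 gives the Lukasiewicz t-norm max(0, a + b - 1);
    as w -> infinity it approaches min(a, b).
    """
    return max(0.0, 1.0 - ((1.0 - a) ** w + (1.0 - b) ** w) ** (1.0 / w))
```

Sweeping `w` during calibration lets an optimizer choose how strictly the conjunction of fuzzy conditions is evaluated, without changing the rule base itself.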
Probabilistic assessment of agricultural droughts using graphical models
Ramadas, Meenu; Govindaraju, Rao S.
2015-07-01
Agricultural droughts are often characterized by soil moisture in the root zone of the soil, but crop needs are rarely factored into the analysis. Since water needs vary with crops, agricultural drought incidences in a region can be characterized better if crop responses to soil water deficits are also accounted for in the drought index. This study investigates agricultural droughts driven by plant stress due to soil moisture deficits using crop stress functions available in the literature. Crop water stress is assumed to begin at the soil moisture level corresponding to incipient stomatal closure, and reaches its maximum at the crop's wilting point. Using available location-specific crop acreage data, a weighted crop water stress function is computed. A new probabilistic agricultural drought index is then developed within a hidden Markov model (HMM) framework that provides model uncertainty in drought classification and accounts for time dependence between drought states. The proposed index allows probabilistic classification of the drought states and takes due cognizance of the stress experienced by the crop due to soil moisture deficit. The capabilities of HMM model formulations for assessing agricultural droughts are compared to those of current drought indices such as standardized precipitation evapotranspiration index (SPEI) and self-calibrating Palmer drought severity index (SC-PDSI). The HMM model identified critical drought events and several drought occurrences that are not detected by either SPEI or SC-PDSI, and shows promise as a tool for agricultural drought studies.
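The HMM-based probabilistic classification described above can be illustrated with a minimal forward filter that returns a probability distribution over hidden states at each time step. The two drought states, transition matrix, and discretized observations in the test are invented for illustration, not the index's fitted parameters:

```python
def forward_posterior(pi, A, B, obs):
    """Filtered state probabilities P(state_t | obs_1..t) for a discrete HMM.
    pi: initial state distribution; A[i][j]: transition probability i -> j;
    B[i][o]: probability of emitting observation symbol o in state i.
    Each step propagates the previous filtered distribution through A,
    weights by the likelihood of the new observation, and renormalizes.
    """
    alpha = [pi[i] * B[i][obs[0]] for i in range(len(pi))]
    s = sum(alpha)
    alpha = [a / s for a in alpha]
    out = [alpha]
    for o in obs[1:]:
        alpha = [sum(alpha[i] * A[i][j] for i in range(len(pi))) * B[j][o]
                 for j in range(len(pi))]
        s = sum(alpha)
        alpha = [a / s for a in alpha]
        out.append(alpha)
    return out
```

The per-step distributions are exactly the "model uncertainty in drought classification" the abstract refers to: instead of a hard drought/no-drought label, each time step carries a probability for each state.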
Probabilistic spill occurrence simulation for chemical spills management.
Cao, Weihua; Li, James; Joksimovic, Darko; Yuan, Arnold; Banting, Doug
2013-11-15
Inland chemical spills pose a great threat to water quality worldwide. A sophisticated probabilistic spill-event model that characterizes temporal and spatial randomness and quantifies statistical uncertainty due to limited spill data is a major component in spill management and associated decision making. This paper presents a MATLAB-based Monte Carlo simulation (MMCS) model for simulating the probabilistic quantifiable occurrences of inland chemical spills by time, magnitude, and location based on North America Industry Classification System codes. The model's aleatory and epistemic uncertainties were quantified through an integrated bootstrap resampling technique. Benzene spills in the St. Clair River area of concern were used as a case to demonstrate the model by simulating spill occurrences, occurrence time, and mass expected for a 10-year period. Uncertainty analysis indicates that simulated spill characteristics can be described by lognormal distributions with positive skewness. The simulated spill time series will enable a quantitative risk analysis for water quality impairments due to the spills. The MMCS model can also help governments to evaluate their priority list of spilled chemicals.
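The core of such a Monte Carlo spill-occurrence model can be sketched with a homogeneous Poisson process for event times and lognormal spill masses. The rate and mass parameters below are placeholders, not the St. Clair River benzene estimates, and the sketch omits the spatial component and the bootstrap layer:

```python
import random

def simulate_spills(rate_per_year=3.2, mu_log=1.0, sigma_log=0.8,
                    years=10, n_runs=2000, seed=7):
    """Monte Carlo sketch of spill occurrences over a planning horizon.
    Inter-arrival times are exponential (Poisson process at rate_per_year);
    each spill's mass is drawn from a lognormal distribution.
    Returns the mean spill count and the per-run (count, total mass) pairs.
    """
    rng = random.Random(seed)
    totals = []
    for _ in range(n_runs):
        t, mass, count = 0.0, 0.0, 0
        while True:
            t += rng.expovariate(rate_per_year)  # next spill time
            if t > years:
                break
            count += 1
            mass += rng.lognormvariate(mu_log, sigma_log)
        totals.append((count, mass))
    mean_count = sum(c for c, _ in totals) / n_runs
    return mean_count, totals
```

Resampling the historical spill record before re-estimating `rate_per_year`, `mu_log` and `sigma_log` on each bootstrap replicate would add the epistemic-uncertainty layer the abstract describes.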
Machine learning, computer vision, and probabilistic models in jet physics
CERN. Geneva; NACHMAN, Ben
2015-01-01
In this talk we present recent developments in the application of machine learning, computer vision, and probabilistic models to the analysis and interpretation of LHC events. First, we will introduce the concept of jet-images and computer vision techniques for jet tagging. Jet images enabled the connection between jet substructure and tagging with the fields of computer vision and image processing for the first time, improving the performance to identify highly boosted W bosons with respect to state-of-the-art methods, and providing a new way to visualize the discriminant features of different classes of jets, adding a new capability to understand the physics within jets and to design more powerful jet tagging methods. Second, we will present Fuzzy jets: a new paradigm for jet clustering using machine learning methods. Fuzzy jets view jet clustering as an unsupervised learning task and incorporate a probabilistic assignment of particles to jets to learn new features of the jet structure. In particular, we wi...
Probabilistic Multi-Hazard Assessment of Dry Cask Structures
Energy Technology Data Exchange (ETDEWEB)
Bencturk, Bora [Univ. of Houston, TX (United States); Padgett, Jamie [Rice Univ., Houston, TX (United States); Uddin, Rizwan [Univ. of Illinois, Urbana-Champaign, IL (United States).
2017-01-10
systems, the concrete shall not only provide shielding but also ensure stability of the upright canister, facilitate anchoring, allow ventilation, and provide physical protection against theft, severe weather, and natural (seismic) as well as man-made events (blast incidents). Given the need to remain functional for 40 years or even longer in the case of interim storage, the concrete overpack and the internal canister components need to be evaluated with regard to their long-term ability to perform their intended design functions. Just as evidenced by deteriorating concrete bridges, there are reported visible degradation mechanisms of dry storage systems, especially in the highly corrosive environments of maritime locations. The degradation of reinforced concrete is caused by multiple physical and chemical mechanisms, which may be summarized under the heading of environmental aging. The underlying hygro-thermal transport processes are accelerated by irradiation effects; hence creep and shrinkage models need to include the effects of chloride penetration, alkali-aggregate reaction, and corrosion of the reinforcing steel. In light of the above, the two main objectives of this project are to (1) develop a probabilistic multi-hazard assessment framework, and (2) through experimental and numerical research, perform a comprehensive assessment under combined earthquake loads and aging-induced deterioration, which will also provide data for the development and validation of the probabilistic framework.
Flood forecasting using medium-range probabilistic weather prediction
Directory of Open Access Journals (Sweden)
B. T. Gouweleeuw
2005-01-01
Following the developments in short- and medium-range weather forecasting over the last decade, operational flood forecasting also appears to show a shift from a so-called single-solution or 'best guess' deterministic approach towards a probabilistic approach based on ensemble techniques. While this probabilistic approach is now more or less common practice and well established in the meteorological community, operational flood forecasters have only started to look for ways to interpret, and to communicate to end-users, the prediction products obtained by combining so-called Ensemble Prediction Systems (EPS) of Numerical Weather Prediction (NWP) models with rainfall-runoff models. This paper presents initial results obtained by combining deterministic and EPS hindcasts of the global NWP model of the European Centre for Medium-Range Weather Forecasts (ECMWF) with the large-scale hydrological model LISFLOOD for two historic flood events: the river Meuse flood in January 1995 and the river Odra flood in July 1997. In addition, a possible way to interpret the obtained ensemble-based streamflow prediction is proposed.
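One common way to turn an EPS-driven streamflow ensemble into a probabilistic flood statement is an exceedance probability: the fraction of ensemble members whose peak discharge crosses a critical threshold. This is a generic illustration, assuming each member is a discharge time series, not a description of the LISFLOOD output format:

```python
def exceedance_probability(ensemble_flows, threshold):
    """Fraction of ensemble members whose peak discharge exceeds a
    critical threshold. ensemble_flows: list of discharge time series,
    one per ensemble member; threshold: critical discharge level.
    """
    peaks = [max(member) for member in ensemble_flows]
    return sum(p > threshold for p in peaks) / len(peaks)
```

A forecaster can then issue statements such as "two thirds of the ensemble exceed the warning level" instead of a single best-guess hydrograph.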
Probabilistic Aspects in Spoken Document Retrieval
Directory of Open Access Journals (Sweden)
Macherey Wolfgang
2003-01-01
Accessing information in multimedia databases encompasses a wide range of applications in which spoken document retrieval (SDR) plays an important role. In SDR, a set of automatically transcribed speech documents constitutes the collection of files for retrieval, to which a user may address a request in natural language. This paper deals with two probabilistic aspects of SDR. The first part investigates the effect of recognition errors on retrieval performance and examines why recognition errors have only a small effect on retrieval performance. In the second part, we present a new probabilistic approach to SDR that is based on interpolations between document representations. Experiments performed on the TREC-7 and TREC-8 SDR tasks show comparable or even better results for the newly proposed method than for other advanced heuristic and probabilistic retrieval metrics.
A Model-Driven Probabilistic Parser Generator
Quesada, Luis; Cortijo, Francisco J
2012-01-01
Existing probabilistic scanners and parsers impose hard constraints on the way lexical and syntactic ambiguities can be resolved. Furthermore, traditional grammar-based parsing tools are limited in the mechanisms they allow for taking context into account. In this paper, we propose a model-driven tool that allows for statistical language models with arbitrary probability estimators. Our work on model-driven probabilistic parsing is built on top of ModelCC, a model-based parser generator, and enables the probabilistic interpretation and resolution of anaphoric, cataphoric, and recursive references in the disambiguation of abstract syntax graphs. In order to prove the expressive power of ModelCC, we describe the design of a general-purpose natural language parser.
Modal Specifications for Probabilistic Timed Systems
Directory of Open Access Journals (Sweden)
Tingting Han
2013-06-01
Modal automata are a classic formal model for component-based systems that comes equipped with a rich specification theory supporting abstraction, refinement and compositional reasoning. In recent years, quantitative variants of modal automata were introduced for specifying and reasoning about component-based designs for embedded and mobile systems. These respectively generalize modal specification theories for timed and probabilistic systems. In this paper, we define a modal specification language for combined probabilistic timed systems, called abstract probabilistic timed automata, which generalizes existing formalisms. We introduce appropriate syntactic and semantic refinement notions and discuss consistency of our specification language, also with respect to time-divergence. We identify a subclass of our models for which we define the fundamental operations for abstraction, conjunction and parallel composition, and show several compositionality results.
Probabilistic Modeling and Visualization for Bankruptcy Prediction
DEFF Research Database (Denmark)
Antunes, Francisco; Ribeiro, Bernardete; Pereira, Francisco Camara
2017-01-01
In accounting and finance domains, bankruptcy prediction is of great utility for all of the economic stakeholders. The challenge of accurate assessment of business failure prediction, especially under scenarios of financial crisis, is known to be complicated. Although there have been many successful studies on bankruptcy detection, probabilistic approaches have seldom been carried out. In this paper we assume a probabilistic point of view by applying Gaussian Processes (GP) in the context of bankruptcy prediction, comparing it against the Support Vector Machines (SVM) and the Logistic Regression (LR). Using real-world bankruptcy data, an in-depth analysis is conducted showing that, in addition to a probabilistic interpretation, the GP can effectively improve the bankruptcy prediction performance with high accuracy when compared to the other approaches. We additionally generate a complete graphical...
Probabilistic inversion for chicken processing lines
Energy Technology Data Exchange (ETDEWEB)
Cooke, Roger M. [Department of Mathematics, Delft University of Technology, Delft (Netherlands)]. E-mail: r.m.cooke@ewi.tudelft.nl; Nauta, Maarten [Microbiological Laboratory for Health Protection RIVM, Bilthoven (Netherlands); Havelaar, Arie H. [Microbiological Laboratory for Health Protection RIVM, Bilthoven (Netherlands); Fels, Ine van der [Microbiological Laboratory for Health Protection RIVM, Bilthoven (Netherlands)
2006-10-15
We discuss an application of probabilistic inversion techniques to a model of campylobacter transmission in chicken processing lines. Such techniques are indicated when we wish to quantify a model which is new and perhaps unfamiliar to the expert community. In this case there are no measurements for estimating model parameters, and experts are typically unable to give a considered judgment. In such cases, experts are asked to quantify their uncertainty regarding variables which can be predicted by the model. The experts' distributions (after combination) are then pulled back onto the parameter space of the model, a process termed 'probabilistic inversion'. This study illustrates two such techniques, iterative proportional fitting (IPF) and PARameter Fitting for Uncertain Models (PARFUM). In addition, we illustrate how expert judgement on predicted observable quantities in combination with probabilistic inversion may be used for model validation and/or model criticism.
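Of the two techniques named above, iterative proportional fitting is the simpler to sketch: a joint table is rescaled alternately until its marginals match externally given targets (here standing in for combined expert distributions). The 2x2 table and target marginals in the test are toy values, not quantities from the campylobacter model:

```python
def ipf(table, row_targets, col_targets, iters=100):
    """Iterative proportional fitting on a 2-D table.
    Alternately rescales rows and columns so the table's marginals
    converge to row_targets and col_targets while preserving the
    table's interaction structure as far as possible.
    """
    for _ in range(iters):
        # rescale each row to its target sum
        for i, row in enumerate(table):
            s = sum(row)
            table[i] = [x * row_targets[i] / s for x in row]
        # rescale each column to its target sum
        for j in range(len(table[0])):
            s = sum(row[j] for row in table)
            for row in table:
                row[j] *= col_targets[j] / s
    return table
```

In the probabilistic-inversion setting, the table would be a sample over model parameters and predicted observables, and the targets the expert-assessed distributions over those observables.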
Scalable group level probabilistic sparse factor analysis
DEFF Research Database (Denmark)
Hinrich, Jesper Løve; Nielsen, Søren Føns Vind; Riis, Nicolai Andre Brogaard
2017-01-01
Many data-driven approaches exist to extract neural representations of functional magnetic resonance imaging (fMRI) data, but most of them lack a proper probabilistic formulation. We propose a scalable group level probabilistic sparse factor analysis (psFA) allowing spatially sparse maps, component pruning using automatic relevance determination (ARD) and subject specific heteroscedastic spatial noise modeling. For task-based and resting state fMRI, we show that the sparsity constraint gives rise to components similar to those obtained by group independent component analysis. The noise modeling shows that noise is reduced in areas typically associated with activation by the experimental design. The psFA model identifies sparse components and the probabilistic setting provides a natural way to handle parameter uncertainties. The variational Bayesian framework easily extends to more complex...
Probabilistic Grammar: The view from Cognitive Sociolinguistics
Directory of Open Access Journals (Sweden)
Jeroen Claes
2017-06-01
Full Text Available In this paper, I propose that Probabilistic Grammar may benefit from incorporating theoretical insights from Cognitive (Socio)Linguistics. I begin by introducing Cognitive Linguistics. Then, I propose a model of the domain-general cognitive constraints (markedness of coding, statistical preemption, and structural priming) that condition language (variation). Subsequently, three case studies are presented that test the predictions of this model on three distinct alternations in English and Spanish (variable agreement with existential 'haber', variable agreement with existential 'there be', and Spanish subject pronoun expression). For each case study, the model generates empirically correct predictions. I conclude that, with the support of Cognitive Sociolinguistics, Probabilistic Grammar may move beyond description towards explanation. This article is part of the special collection: Probabilistic grammars: Syntactic variation in a comparative perspective
bayesPop: Probabilistic Population Projections
Directory of Open Access Journals (Sweden)
Hana Ševčíková
2016-12-01
Full Text Available We describe bayesPop, an R package for producing probabilistic population projections for all countries. This uses probabilistic projections of total fertility and life expectancy generated by Bayesian hierarchical models. It produces a sample from the joint posterior predictive distribution of future age- and sex-specific population counts, fertility rates and mortality rates, as well as future numbers of births and deaths. It provides graphical ways of summarizing this information, including trajectory plots and various kinds of probabilistic population pyramids. An expression language is introduced which allows the user to produce the predictive distribution of a wide variety of derived population quantities, such as the median age or the old age dependency ratio. The package produces aggregated projections for sets of countries, such as UN regions or trading blocs. The methodology has been used by the United Nations to produce their most recent official population projections for all countries, published in the World Population Prospects.
Probabilistic Transcriptome Assembly and Variant Graph Genotyping
DEFF Research Database (Denmark)
Sibbesen, Jonas Andreas
…the resulting sequencing data should be interpreted. This has over the years spurred the development of many probabilistic methods that are capable of modelling different aspects of the sequencing process. Here, I present two such methods that were developed to each tackle a different problem in bioinformatics, together with an application of the latter method to a large Danish sequencing project. The first is a probabilistic method for transcriptome assembly that is based on a novel generative model of the RNA sequencing process and provides confidence estimates on the assembled transcripts. We show that this approach outperforms existing state-of-the-art methods measured using sensitivity and precision on both simulated and real data. The second is a novel probabilistic method that uses exact alignment of k-mers to a set of variant graphs to provide unbiased estimates of genotypes in a population…
Probabilistic Forecasting of the Wave Energy Flux
DEFF Research Database (Denmark)
Pinson, Pierre; Reikard, G.; Bidlot, J.-R.
2012-01-01
Wave energy will certainly have a significant role to play in the deployment of renewable energy generation capacities. As with wind and solar, probabilistic forecasts of wave power over horizons of a few hours to a few days are required for power system operation as well as trading in electricity markets. A methodology for the probabilistic forecasting of the wave energy flux is introduced, based on a log-Normal assumption for the shape of predictive densities. It uses meteorological forecasts (from the European Centre for Medium-range Weather Forecasts – ECMWF) and local wave measurements as input. The parameters of the models involved are adaptively and recursively estimated. The methodology is evaluated for 13 locations around North America over a period of 15 months. The issued probabilistic forecasts substantially outperform the various benchmarks considered, with improvements between 6…
A probabilistic view on the August 2005 floods in the upper Rhine catchment
Directory of Open Access Journals (Sweden)
S. Jaun
2008-04-01
Full Text Available Appropriate precautions in the case of flood occurrence often require long lead times (several days) in hydrological forecasting. This in turn implies large uncertainties that are mainly inherited from the meteorological precipitation forecast. Here we present a case study of the extreme flood event of August 2005 in the Swiss part of the Rhine catchment (total area 34,550 km²). This event caused tremendous damage and was associated with precipitation amounts and flood peaks with return periods beyond 10 to 100 years. To deal with the underlying intrinsic predictability limitations, a probabilistic forecasting system is tested, which is based on a hydrological-meteorological ensemble prediction system. The meteorological component of the system is the operational limited-area COSMO-LEPS that downscales the ECMWF ensemble prediction system to a horizontal resolution of 10 km, while the hydrological component is based on the semi-distributed hydrological model PREVAH with a spatial resolution of 500 m. We document the setup of the coupled system and assess its performance for the flood event under consideration.
We show that the probabilistic meteorological-hydrological ensemble prediction chain is quite effective and provides additional guidance for extreme event forecasting, in comparison to a purely deterministic forecasting system. For the case studied, it is also shown that most of the benefits of the probabilistic approach may be realized with a comparatively small ensemble size of 10 members.
Constraint Processing in Lifted Probabilistic Inference
Kisynski, Jacek
2012-01-01
First-order probabilistic models combine representational power of first-order logic with graphical models. There is an ongoing effort to design lifted inference algorithms for first-order probabilistic models. We analyze lifted inference from the perspective of constraint processing and, through this viewpoint, we analyze and compare existing approaches and expose their advantages and limitations. Our theoretical results show that the wrong choice of constraint processing method can lead to exponential increase in computational complexity. Our empirical tests confirm the importance of constraint processing in lifted inference. This is the first theoretical and empirical study of constraint processing in lifted inference.
The probabilistic approach to human reasoning.
Oaksford, M; Chater, N
2001-08-01
A recent development in the cognitive science of reasoning has been the emergence of a probabilistic approach to the behaviour observed on ostensibly logical tasks. According to this approach the errors and biases documented on these tasks occur because people import their everyday uncertain reasoning strategies into the laboratory. Consequently participants' apparently irrational behaviour is the result of comparing it with an inappropriate logical standard. In this article, we contrast the probabilistic approach with other approaches to explaining rationality, and then show how it has been applied to three main areas of logical reasoning: conditional inference, Wason's selection task and syllogistic reasoning.
Probabilistic Design of Wave Energy Devices
DEFF Research Database (Denmark)
Sørensen, John Dalsgaard; Kofoed, Jens Peter; Ferreira, C.B.
2011-01-01
Wave energy has a large potential for contributing significantly to production of renewable energy. However, the wave energy sector is still not able to deliver cost competitive and reliable solutions. But the sector has already demonstrated several proofs of concept. The design of wave energy… and advocate for a probabilistic design approach, as it is assumed (in other areas this has been demonstrated) that this leads to more economical designs compared to designs based on deterministic methods. In the present paper a general framework for probabilistic design and reliability analysis of wave energy…
Probabilistic Durability Analysis in Advanced Engineering Design
Directory of Open Access Journals (Sweden)
A. Kudzys
2000-01-01
Full Text Available Expedience of probabilistic durability concepts and approaches in advanced engineering design of building materials, structural members and systems is considered. Target margin values of structural safety and serviceability indices are analyzed and their draft values are presented. Analytical methods of the cumulative coefficient of correlation and the limit transient action effect for calculation of reliability indices are given. The analysis can be used for probabilistic durability assessment of carrying and enclosure metal, reinforced concrete, wood, plastic, and masonry structures, both homogeneous and sandwich or composite, and some kinds of equipment. The analysis models can be applied in other engineering fields.
Probabilistic assessment of uncertain adaptive hybrid composites
Shiao, Michael C.; Singhal, Surendra N.; Chamis, Christos C.
1994-01-01
Adaptive composite structures using actuation materials, such as piezoelectric fibers, were assessed probabilistically utilizing intraply hybrid composite mechanics in conjunction with probabilistic composite structural analysis. Uncertainties associated with the actuation material as well as the uncertainties in the regular (traditional) composite material properties were quantified and considered in the assessment. Static and buckling analyses were performed for rectangular panels with various boundary conditions and different control arrangements. The probability density functions of the structural behavior, such as maximum displacement and critical buckling load, were computationally simulated. The results of the assessment indicate that improved design and reliability can be achieved with actuation material.
Quantum logic networks for probabilistic teleportation
Institute of Scientific and Technical Information of China (English)
刘金明; 张永生; 郭光灿
2003-01-01
By means of the primitive operations consisting of single-qubit gates, two-qubit controlled-not gates, von Neumann measurement and classically controlled operations, we construct efficient quantum logic networks for implementing probabilistic teleportation of a single qubit, a two-particle entangled state, and an N-particle entanglement. Based on the quantum networks, we show that after the partially entangled states are concentrated into maximal entanglement, the above three kinds of probabilistic teleportation are the same as the standard teleportation using the corresponding maximally entangled states as the quantum channels.
Why are probabilistic laws governing quantum mechanics and neurobiology?
Kröger, H
2004-01-01
We address the question: Why are the dynamical laws governing quantum mechanics and neuroscience probabilistic in nature rather than deterministic? We discuss some ideas showing that the probabilistic option offers advantages over the deterministic one.
Integration of Advanced Probabilistic Analysis Techniques with Multi-Physics Models
Energy Technology Data Exchange (ETDEWEB)
Cetiner, Mustafa Sacit; none,; Flanagan, George F. [ORNL; Poore III, Willis P. [ORNL; Muhlheim, Michael David [ORNL
2014-07-30
An integrated simulation platform that couples probabilistic analysis-based tools with model-based simulation tools can provide valuable insights for reactive and proactive responses to plant operating conditions. The objective of this work is to demonstrate the benefits of a partial implementation of the Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Detailed Framework Specification through the coupling of advanced PRA capabilities and accurate multi-physics plant models. Coupling a probabilistic model with a multi-physics model will aid in design, operations, and safety by providing a more accurate understanding of plant behavior. This represents the first attempt at actually integrating these two types of analyses for a control system used for operations, on a faster than real-time basis. This report documents the development of the basic communication capability to exchange data with the probabilistic model using Reliability Workbench (RWB) and the multi-physics model using Dymola. The communication pathways from injecting a fault (i.e., failing a component) to the probabilistic and multi-physics models were successfully completed. This first version was tested with prototypic models represented in both RWB and Modelica. First, a simple event tree/fault tree (ET/FT) model was created to develop the software code to implement the communication capabilities between the dynamic-link library (dll) and RWB. A program, written in C#, successfully communicates faults to the probabilistic model through the dll. A systems model of the Advanced Liquid-Metal Reactor–Power Reactor Inherently Safe Module (ALMR-PRISM) design developed under another DOE project was upgraded using Dymola to include proper interfaces to allow data exchange with the control application (ConApp). A program, written in C#, successfully communicates faults to the multi-physics model. The results of the example simulation were successfully plotted.
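The event tree/fault tree side of such a coupling reduces, for independent basic events, to simple gate algebra. A minimal sketch; the component names and probabilities are hypothetical, not from the ALMR-PRISM model:

```python
# Gate algebra for a tiny fault tree with independent basic events. Component
# names and failure probabilities are hypothetical.
def and_gate(*probs):
    out = 1.0
    for p in probs:
        out *= p                      # top fails only if all inputs fail
    return out

def or_gate(*probs):
    out = 1.0
    for p in probs:
        out *= 1.0 - p                # top fails unless every input survives
    return 1.0 - out

pump, valve, power = 1e-3, 2e-3, 5e-4
top_event = or_gate(and_gate(pump, valve), power)
```

The OR gate uses the complement product rather than summing probabilities, so it stays exact even when basic-event probabilities are not rare.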
Framework for probabilistic flood risk assessment in an Alpine region
Schneeberger, Klaus; Huttenlau, Matthias; Steinberger, Thomas; Achleitner, Stefan; Stötter, Johann
2014-05-01
Flooding is among the natural hazards that regularly cause significant losses to property and human lives. The assessment of flood risk delivers crucial information for all participants involved in flood risk management, especially local authorities and insurance companies, in order to estimate possible flood losses. We have therefore developed a framework for assessing flood risk, which is introduced in this contribution. Flood risk is defined as the combination of the probability of flood events and the potential flood damages. The probability of occurrence is described through the spatial and temporal characterisation of the flood. The potential flood damages are determined in the course of a vulnerability assessment, wherein the exposure and the vulnerability of the elements at risk are considered. Direct costs caused by flooding, with a focus on residential buildings, are analysed. The innovative part of this contribution lies in the development of a framework which takes the probability of flood events and their spatio-temporal characteristics into account. Usually the probability of flooding is determined by means of recurrence intervals for an entire catchment without any spatial variation. This may lead to a misinterpretation of the flood risk. Within the presented framework the probabilistic flood risk assessment is based on the analysis of a large number of spatially correlated flood events. Since the number of historic flood events is relatively small, additional events have to be generated synthetically. This temporal extrapolation is realised by means of the method proposed by Heffernan and Tawn (2004), which is used to generate a large number of possible spatially correlated flood events within a larger catchment. The approach is based on the modelling of multivariate extremes considering the spatial dependence structure of flood events. The inputs for this approach are time series derived from river gauging stations. In a next step the…
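The recurrence-interval view still fixes the baseline arithmetic: under the usual assumption of independent years, the chance of seeing at least one T-year flood within an n-year horizon is 1 - (1 - 1/T)^n. A quick sketch of this standard calculation:

```python
# Probability of observing at least one T-year event within an n-year
# horizon, under the standard simplification of independent years.
def exceedance_prob(T, n):
    return 1.0 - (1.0 - 1.0 / T) ** n

p30 = exceedance_prob(T=100, n=30)    # a 100-year flood over a 30-year horizon
```

For T = 100 and n = 30 this gives roughly 0.26, which is why a "100-year" event is far from negligible over the lifetime of a building.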
A common fixed point for operators in probabilistic normed spaces
Energy Technology Data Exchange (ETDEWEB)
Ghaemi, M.B. [Faculty of Mathematics, Iran University of Science and Technology, Narmak, Tehran (Iran, Islamic Republic of)], E-mail: mghaemi@iust.ac.ir; Lafuerza-Guillen, Bernardo [Department of Applied Mathematics, University of Almeria, Almeria (Spain)], E-mail: blafuerz@ual.es; Razani, A. [Department of Mathematics, Faculty of Science, I. Kh. International University, P.O. Box 34194-288, Qazvin (Iran, Islamic Republic of)], E-mail: razani@ikiu.ac.ir
2009-05-15
Probabilistic metric spaces were introduced by Karl Menger. Alsina, Schweizer and Sklar gave a general definition of a probabilistic normed space based on the definition of Menger [Alsina C, Schweizer B, Sklar A. On the definition of a probabilistic normed space. Aequationes Math 1993;46:91-8]. Here, we consider the equicontinuity of a class of linear operators in probabilistic normed spaces and finally, a common fixed point theorem is proved. An application to quantum mechanics is considered.
Methodology for rainwater reservoir dimensioning: a probabilistic approach
Directory of Open Access Journals (Sweden)
Wagner Wolff
2017-05-01
Full Text Available The aim of this study was to propose a new methodology for reservoir rainwater dimensioning based on probabilistic modeling. Eucalyptus seedlings grown in a greenhouse were used to obtain a hypothetical water demand. Meteorological data were used to estimate the demand (evapotranspiration) and offer (rainfall) over the greenhouse coverage. The Wakeby probability distribution presented the best fit for the rainfall data; therefore, a Wakeby distribution was used to model the flow-duration curve of the greenhouse coverage. For a payback period (T) of 10 years of surplus water demand and water supply deficit, a reservoir of 13.60 m³ was obtained. The proposed methodology combined the simultaneous occurrence of the events to enable the dimensioning of a reservoir with high safety to supply the required demand (T = 100 years), and therefore enables a lower cost of deployment compared to each approach separately (T = 10 years).
The Yucca Mountain probabilistic volcanic hazard analysis project
Energy Technology Data Exchange (ETDEWEB)
Coppersmith, K.J.; Perman, R.C.; Youngs, R.R. [Geomatrix Consultants, Inc., San Francisco, CA (United States)] [and others
1996-12-01
The Probabilistic Volcanic Hazard Analysis (PVHA) project, sponsored by the U.S. Department of Energy (DOE), was conducted to assess the probability of a future volcanic event disrupting the potential repository at Yucca Mountain. The PVHA project is one of the first major expert judgment studies that DOE has authorized for technical assessments related to the Yucca Mountain project. The judgments of members of a ten-person expert panel were elicited to ensure that a wide range of approaches were considered for the hazard analysis. The results of the individual elicitations were then combined to develop an integrated assessment of the volcanic hazard that reflects the diversity of alternative scientific interpretations. This assessment, which focused on the volcanic hazard at the site, expressed as the probability of disruption of the potential repository, will provide input to an assessment of volcanic risk, which expresses the probability of radionuclide release due to volcanic disruption.
Results of the probabilistic volcanic hazard analysis project
Energy Technology Data Exchange (ETDEWEB)
Youngs, R.; Coppersmith, K.J.; Perman, R.C. [Geomatrix Consultants, Inc., San Francisco, CA (United States)
1996-12-01
The Probabilistic Volcanic Hazard Analysis (PVHA) project, sponsored by the U.S. Department of Energy (DOE), has been conducted to assess the probability of a future volcanic event disrupting the potential repository at Yucca Mountain. The methodology for the PVHA project is summarized in Coppersmith and others (this volume). The judgments of ten earth scientists who were members of an expert panel were elicited to ensure that a wide range of approaches were considered. Each expert identified one or more approaches for assessing the hazard and they quantified their uncertainties in models and parameter values. Aggregated results are expressed as a probability distribution on the annual frequency of intersecting the proposed repository block. This paper presents some of the key results of the PVHA assessments. These results are preliminary; the final report for the study is planned to be submitted to DOE in April 1996.
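Aggregating the panel's elicited distributions is, in its simplest form, an equal-weight linear opinion pool. A minimal sketch of that aggregation step; the three expert distributions over frequency bins are invented for illustration:

```python
import numpy as np

# Equal-weight linear opinion pool over discrete expert distributions.
# Each row is one expert's probability over three frequency bins (invented).
expert_pdfs = np.array([
    [0.6, 0.3, 0.1],
    [0.4, 0.4, 0.2],
    [0.5, 0.3, 0.2],
])
weights = np.full(len(expert_pdfs), 1.0 / len(expert_pdfs))
pooled = weights @ expert_pdfs        # aggregated distribution over the bins
```

A convex combination of probability distributions is again a probability distribution, so the pooled result needs no renormalisation; unequal weights reflecting expert calibration drop in the same way.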
Verifying Automata Specification of Distributed Probabilistic Real-Time Systems
Institute of Scientific and Technical Information of China (English)
罗铁庚; 陈火旺; et al.
1998-01-01
In this paper, a qualitative model checking algorithm for verification of distributed probabilistic real-time systems (DPRS) is presented. The model of DPRS, called the real-time probabilistic process model (RPPM), is over a continuous time domain. The properties of DPRS are described by using deterministic timed automata (DTA). The key part of the algorithm is to map continuous time to finite time intervals with flag variables. Compared with the existing algorithms, this algorithm uses more general delay time equivalence classes instead of the unit delay time equivalence classes restricted by event sequence, and avoids generating equivalence classes of states due only to the passage of time. The result shows that this algorithm is cheaper.
The New Algorithm for Fast Probabilistic Hypocenter Locations
Dębski, Wojciech; Klejment, Piotr
2016-12-01
The spatial location of sources of seismic waves is one of the first tasks when transient waves from natural (uncontrolled) sources are analysed in many branches of physics, including seismology and oceanology, to name a few. It is well recognised that there is no single universal location algorithm which performs equally well in all situations. Source activity and its spatial variability in time, the geometry of the recording network, and the complexity and heterogeneity of the wave velocity distribution are all factors influencing the performance of location algorithms. In this paper we propose a new location algorithm which exploits the reciprocity and time-inverse invariance properties of the wave equation. Building on these symmetries and using a modern finite-difference-type eikonal solver, we have developed a new, very fast algorithm performing full probabilistic (Bayesian) source location. We illustrate the efficiency of the algorithm by performing an advanced error analysis for 1647 seismic events from the Rudna copper mine operating in southwestern Poland.
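A grid-search version of Bayesian location conveys the idea: compute travel times from every candidate source to each station and weight candidates by the Gaussian misfit of the observed arrival times. The homogeneous-velocity setup below stands in for the eikonal solver, with invented stations, velocity, and noise level, and assumes a known origin time:

```python
import numpy as np

# Bayesian grid-search source location sketch. Straight-ray travel times in a
# homogeneous medium replace the eikonal solver; all numbers are invented.
rng = np.random.default_rng(0)
v = 4.0                                              # km/s, assumed velocity
stations = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
true_src = np.array([3.0, 7.0])
sigma = 0.05                                         # s, arrival-picking noise
obs = np.linalg.norm(stations - true_src, axis=1) / v
obs += rng.normal(0.0, sigma, size=obs.shape)        # synthetic noisy picks

# Candidate sources on a regular grid over the 10 km x 10 km region.
xs = np.linspace(0.0, 10.0, 101)
X, Y = np.meshgrid(xs, xs)
grid = np.stack([X.ravel(), Y.ravel()], axis=1)
pred = np.linalg.norm(grid[:, None, :] - stations[None, :, :], axis=2) / v
misfit = ((pred - obs) ** 2).sum(axis=1)
post = np.exp(-misfit / (2.0 * sigma**2))            # Gaussian likelihood
post /= post.sum()                                   # normalised posterior
best = grid[np.argmax(post)]                         # MAP location estimate
```

Unlike a point estimate alone, the normalised `post` array carries the full location uncertainty, which is what enables the error analysis mentioned above.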
Probabilistic safety assessment in the chemical and nuclear industries
Fullwood, Ralph R
2000-01-01
Probabilistic Safety Analysis (PSA) determines the probability and consequences of accidents, hence, the risk. This subject concerns policy makers, regulators, designers, educators and engineers working to achieve maximum safety with operational efficiency. Risk is analyzed using methods for achieving reliability in the space program. The first major application was to the nuclear power industry, followed by applications to the chemical industry. It has also been applied to space, aviation, defense, ground, and water transportation. This book is unique in its treatment of chemical and nuclear risk. Problems are included at the end of many chapters, and answers are in the back of the book. Computer files are provided (via the internet), containing reliability data, a calculator that determines failure rate and uncertainty based on field experience, pipe break calculator, event tree calculator, FTAP and associated programs for fault tree analysis, and a units conversion code. It contains 540 references and many...
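The failure-rate-from-field-experience idea mentioned above can be sketched in a few lines: with k failures observed over T cumulative operating hours, the maximum-likelihood rate is k/T, and under a flat prior the rate has a Gamma(k + 1, T) posterior. The counts below are invented:

```python
# Failure-rate estimation from field experience: k failures over T cumulative
# component-hours. Counts are invented for illustration.
k, T = 3, 2.0e5                  # observed failures, cumulative hours
mle = k / T                      # maximum-likelihood failure rate (per hour)
post_mean = (k + 1) / T          # flat-prior posterior mean, mildly conservative
```

The posterior-mean form behaves sensibly even at k = 0, where the bare ratio k/T would claim a zero failure rate from limited exposure.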
Energy Technology Data Exchange (ETDEWEB)
Marchal, C. [ASN, 75 - Paris (France); Garandel, S.; Godet, J.L. [Angers Univ., 49 (France)
2009-10-15
The declaration of events, with a view to experience feedback towards professionals, contributes to improving treatment safety. The stakes, in terms of treatment safety, lie in capitalizing on the experience feedback from these declarations. Indeed, the publication of regulatory obligations in matters of quality assurance, and their appropriation by all medical teams, should improve treatment safety and increase quality. To this end, and without ignoring the current difficulties encountered by some centers owing to the shortage of radiophysicists, the approach of declaring significant events and analysing risks should allow French radiotherapy to become an international reference. (N.C.)
Probabilistic programming: a true verification challenge
Katoen, Joost-Pieter; Finkbeiner, Bernd; Pu, Geguang; Zhang, Lijun
2015-01-01
Probabilistic programs [6] are sequential programs, written in languages like C, Java, Scala, or ML, with two added constructs: (1) the ability to draw values at random from probability distributions, and (2) the ability to condition values of variables in a program through observations. For a compr…
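The two constructs can be demonstrated in ordinary Python, with conditioning implemented by rejection sampling; the coin-bias model below is our own illustration, not an example from the paper:

```python
import random

# A probabilistic program in miniature: random draws plus conditioning via an
# observation, with inference by rejection sampling. The three-valued
# coin-bias prior is invented.
random.seed(1)

def model():
    bias = random.choice([0.3, 0.5, 0.9])            # random draw (construct 1)
    heads = sum(random.random() < bias for _ in range(5))
    return bias, heads

accepted = []
while len(accepted) < 2000:
    bias, heads = model()
    if heads == 5:                                   # observation (construct 2)
        accepted.append(bias)

p_biased = accepted.count(0.9) / len(accepted)       # posterior P(bias = 0.9)
```

Rejection sampling is the naive semantics of conditioning; the verification challenge the title refers to is reasoning about such programs without running them.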
Probabilistic methods for service life predictions
Siemes, A.J.M.
1999-01-01
Nowadays it is commonly accepted that the safety of structures should be expressed in terms of reliability, that is, as the probability of failure. In the literature [1, 2, 3, and 4] the bases have been given for the calculation of the failure probability. Making probabilistic calculations can be don…
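The failure probability referred to here can be estimated by Monte Carlo and cross-checked against the closed form for normal load and resistance; the parameter values below are illustrative only:

```python
import math
import random

# Failure probability P(R < S) for normal resistance R and load S, estimated
# by Monte Carlo and checked against the closed form via the reliability
# index. Parameter values are illustrative only.
random.seed(0)
muR, sR, muS, sS = 10.0, 1.0, 7.0, 1.5
n = 200_000
fails = sum(random.gauss(muR, sR) < random.gauss(muS, sS) for _ in range(n))
pf_mc = fails / n

beta = (muR - muS) / math.sqrt(sR**2 + sS**2)        # reliability index
pf_exact = 0.5 * math.erfc(beta / math.sqrt(2.0))    # Phi(-beta)
```

For non-normal or correlated variables the closed form disappears, which is exactly when the Monte Carlo (or FORM/SORM) machinery of probabilistic design earns its keep.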
Probabilistic Meteorological Characterization for Turbine Loads
DEFF Research Database (Denmark)
Kelly, Mark C.; Larsen, Gunner Chr.; Dimitrov, Nikolay Krasimirov;
2014-01-01
Beyond the existing, limited IEC prescription to describe fatigue loads on wind turbines, we look towards probabilistic characterization of the loads via analogous characterization of the atmospheric flow, particularly for today's "taller" turbines with rotors well above the atmospheric surface… These are used as input to loads calculation, and with a statistical loads output description, they allow for improved design and loads calculations.
On Probabilistic Automata in Continuous Time
DEFF Research Database (Denmark)
Eisentraut, Christian; Hermanns, Holger; Zhang, Lijun
2010-01-01
…their compositionality properties. Weak bisimulation is partly oblivious to the probabilistic branching structure, in order to reflect some natural equalities in this spectrum of models. As a result, the standard way to associate a stochastic process to a generalised stochastic Petri net can be proven sound with respect…
Pigeons' Discounting of Probabilistic and Delayed Reinforcers
Green, Leonard; Myerson, Joel; Calvert, Amanda L.
2010-01-01
Pigeons' discounting of probabilistic and delayed food reinforcers was studied using adjusting-amount procedures. In the probability discounting conditions, pigeons chose between an adjusting number of food pellets contingent on a single key peck and a larger, fixed number of pellets contingent on completion of a variable-ratio schedule. In the…
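Probability discounting in this literature is typically described by a hyperbolic function of the odds against reinforcement, V = A / (1 + h·θ) with θ = (1 − p)/p. A sketch of that standard form; the discounting-rate parameter h is a placeholder, not a value fitted to the pigeon data:

```python
# Hyperbolic discounting of a probabilistic reinforcer: V = A / (1 + h*theta),
# where theta = (1 - p) / p is the odds against reinforcement. The rate h is
# a placeholder, not a fitted value.
def discounted_value(amount, p, h=1.0):
    theta = (1.0 - p) / p
    return amount / (1.0 + h * theta)

v_certain = discounted_value(32, p=1.0)    # certain reward keeps full value
v_risky = discounted_value(32, p=0.25)     # odds against = 3, value shrinks
```

The same functional form, with delay D in place of the odds θ, describes delay discounting, which is what makes the two kinds of discounting directly comparable in adjusting-amount studies.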
Enhancing Automated Test Selection in Probabilistic Networks
Sent, D.; van der Gaag, L.C.; Bellazzi, R; Abu-Hanna, A; Hunter, J
2007-01-01
Most test-selection algorithms currently in use with probabilistic networks select variables myopically, that is, test variables are selected sequentially, on a one-by-one basis, based upon expected information gain. While myopic test selection is not realistic for many medical applications, non-myo…
Relevance feedback in probabilistic multimedia retrieval
Boldareva, L.; Hiemstra, D.; Jonker, W.
2003-01-01
In this paper we explore a new view on data organisation and retrieval in a (multimedia) collection. We use a probabilistic framework for indexing and interactive retrieval of the data, which enables filling the semantic gap. Semi-automated experiments with the TREC-2002 video collection showed that our ap…
Sampling Techniques for Probabilistic Roadmap Planners
Geraerts, R.J.; Overmars, M.H.
2004-01-01
The probabilistic roadmap approach is a commonly used motion planning technique. A crucial ingredient of the approach is a sampling algorithm that samples the configuration space of the moving object for free configurations. Over the past decade many sampling techniques have been proposed. It is…
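The baseline uniform sampler that such techniques improve upon fits in a few lines: sample configurations, keep the collision-free ones, and connect nearby pairs. The disc obstacle, connection radius, and node count below are arbitrary choices for illustration:

```python
import math
import random

# Uniform random sampling for a probabilistic roadmap (PRM): sample the unit
# square, keep collision-free configurations, connect near neighbours.
# The disc obstacle and radii are invented.
random.seed(2)
obstacle = ((0.5, 0.5), 0.2)                 # centre, radius

def free(q):
    (cx, cy), r = obstacle
    return math.hypot(q[0] - cx, q[1] - cy) > r

nodes = []
while len(nodes) < 100:
    q = (random.random(), random.random())
    if free(q):                              # keep only free configurations
        nodes.append(q)

edges = [(i, j) for i in range(len(nodes)) for j in range(i + 1, len(nodes))
         if math.dist(nodes[i], nodes[j]) < 0.15]
```

Uniform sampling wastes effort in wide-open space and rarely lands in narrow passages, which is precisely the weakness the specialised sampling techniques surveyed here target.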
Probabilistic Damage Stability Calculations for Ships
DEFF Research Database (Denmark)
Jensen, Jørgen Juncher
1996-01-01
The aim of these notes is to provide background material for the present probabilistic damage stability rules for dry cargo ships. The formulas for the damage statistics are derived, and shortcomings as well as possible improvements are discussed. The advantage of the definition of fictitious…
Strong Ideal Convergence in Probabilistic Metric Spaces
Indian Academy of Sciences (India)
Celaleddin Şençimen; Serpil Pehlivan
2009-06-01
In the present paper we introduce the concepts of strongly ideal convergent sequence and strong ideal Cauchy sequence in a probabilistic metric (PM) space endowed with the strong topology, and establish some basic facts. Next, we define the strong ideal limit points and the strong ideal cluster points of a sequence in this space and investigate some properties of these concepts.
Financial Markets Analysis by Probabilistic Fuzzy Modelling
J.H. van den Berg (Jan); W.-M. van den Bergh (Willem-Max); U. Kaymak (Uzay)
2003-01-01
For successful trading in financial markets, it is important to develop financial models where one can identify different states of the market for modifying one's actions. In this paper, we propose to use probabilistic fuzzy systems for this purpose. We concentrate on Takagi–Sugeno (…
Probabilistic decision graphs for optimization under uncertainty
DEFF Research Database (Denmark)
Jensen, Finn V.; Nielsen, Thomas Dyhre
2011-01-01
This paper provides a survey on probabilistic decision graphs for modeling and solving decision problems under uncertainty. We give an introduction to influence diagrams, which is a popular framework for representing and solving sequential decision problems with a single decision maker. As the me...
Probabilistic safety goals. Phase 3 - Status report
Energy Technology Data Exchange (ETDEWEB)
Holmberg, J.-E. (VTT (Finland)); Knochenhauer, M. (Relcon Scandpower AB, Sundbyberg (Sweden))
2009-07-15
The first phase of the project (2006) described the status, concepts and history of probabilistic safety goals for nuclear power plants. The second and third phases (2007-2008) have provided guidance related to the resolution of some of the problems identified, and resulted in a common understanding regarding the definition of safety goals. The basic aim of phase 3 (2009) has been to increase the scope and level of detail of the project, and to start preparations of a guidance document. Based on the conclusions from the previous project phases, the following issues have been covered: 1) Extension of international overview. Analysis of results from the questionnaire performed within the ongoing OECD/NEA WGRISK activity on probabilistic safety criteria, including participation in the preparation of the working report for OECD/NEA/WGRISK (to be finalised in phase 4). 2) Use of subsidiary criteria and relations between these (to be finalised in phase 4). 3) Numerical criteria when using probabilistic analyses in support of deterministic safety analysis (to be finalised in phase 4). 4) Guidance for the formulation, application and interpretation of probabilistic safety criteria (to be finalised in phase 4). (LN)
Probabilistic Resource Analysis by Program Transformation
DEFF Research Database (Denmark)
Kirkeby, Maja Hanne; Rosendahl, Mads
2016-01-01
The aim of a probabilistic resource analysis is to derive a probability distribution of possible resource usage for a program from a probability distribution of its input. We present an automated multi-phase rewriting-based method to analyze programs written in a subset of C. It generates...
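The core idea of pushing an input distribution through a program to obtain a resource-usage distribution can be illustrated with a small sketch. This is our own toy construction, not the paper's rewriting-based analysis: the cost function and inputs below are hypothetical.

```python
# Toy sketch of probabilistic resource analysis: given a finite
# distribution over inputs, derive the distribution over a program's
# resource usage (here: loop iterations). Illustrative only.

def cost(n):
    # Hypothetical resource function: iterations of a loop that
    # halves n until it reaches zero (roughly log2(n) + 1 steps).
    steps = 0
    while n > 0:
        n //= 2
        steps += 1
    return steps

def resource_distribution(input_dist):
    """Push a finite input distribution {input: prob} through cost()."""
    out = {}
    for n, p in input_dist.items():
        c = cost(n)
        out[c] = out.get(c, 0.0) + p
    return out

input_dist = {1: 0.25, 4: 0.25, 5: 0.25, 16: 0.25}
print(resource_distribution(input_dist))  # → {1: 0.25, 3: 0.5, 5: 0.25}
```

Inputs 4 and 5 take the same number of iterations, so their probability mass merges in the output distribution; a real analysis performs this propagation symbolically over program transformations rather than by enumeration.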
A Probabilistic Framework for Curve Evolution
DEFF Research Database (Denmark)
Dahl, Vedrana Andersen
2017-01-01
approach include ability to handle textured images, simple generalization to multiple regions, and efficiency in computation. We test our probabilistic framework in combination with parametric (snakes) and geometric (level-sets) curves. The experimental results on composed and natural images demonstrate...
A Comparative Study of Probabilistic Roadmap Planners
Geraerts, R.J.; Overmars, M.H.
2004-01-01
The probabilistic roadmap approach is one of the leading motion planning techniques. Over the past eight years the technique has been studied by many different researchers. This has led to a large number of variants of the approach, each with its own merits. It is difficult to compare the different
Dialectical Multivalued Logic and Probabilistic Theory
Directory of Open Access Journals (Sweden)
José Luis Usó Doménech
2017-02-01
There are two probabilistic algebras: one for classical probability and the other for quantum mechanics. Naturally, it is the relation to the object that decides, as in the case of logic, which algebra is to be used. From a paraconsistent multivalued logic, therefore, one can derive a probability theory, adding the correspondence between truth value and fortuity.
Mastering probabilistic graphical models using Python
Ankan, Ankur
2015-01-01
If you are a researcher or a machine learning enthusiast, or are working in the data science field and have a basic idea of Bayesian learning or probabilistic graphical models, this book will help you to understand the details of graphical models and use them in your data science problems.
Balkanization and Unification of Probabilistic Inferences
Yu, Chong-Ho
2005-01-01
Many research-related classes in social sciences present probability as a unified approach based upon mathematical axioms, but neglect the diversity of various probability theories and their associated philosophical assumptions. Although currently the dominant statistical and probabilistic approach is the Fisherian tradition, the use of Fisherian…
Probabilistic, Seismically-Induced Landslide Hazard Mapping of Western Oregon
Olsen, M. J.; Sharifi Mood, M.; Gillins, D. T.; Mahalingam, R.
2015-12-01
Earthquake-induced landslides can generate significant damage within urban communities by damaging structures, obstructing lifeline connection routes and utilities, generating various environmental impacts, and possibly resulting in loss of life. Reliable hazard and risk maps are important to assist agencies in efficiently allocating and managing limited resources to prepare for such events. This research presents a new methodology for communicating site-specific landslide hazard assessments in a large-scale, regional map. Implementation of the proposed methodology results in seismically-induced landslide hazard maps that depict the probabilities of exceeding landslide displacement thresholds (e.g., 0.1, 0.3, 1.0 and 10 meters). These maps integrate a variety of data sources, including recent landslide inventories, LIDAR and photogrammetric topographic data, geologic maps, mapped NEHRP site classifications based on available shear wave velocity data in each geologic unit, and USGS probabilistic seismic hazard curves. Soil strength estimates were obtained by evaluating slopes present along landslide scarps and deposits for major geologic units. Code was then developed to integrate these layers to perform a rigid sliding-block analysis to determine the amount and associated probabilities of displacement based on each bin of peak ground acceleration in the seismic hazard curve at each pixel. The methodology was applied to western Oregon, which contains weak, weathered, and often wet soils at steep slopes. Such conditions have a high landslide hazard even without seismic events. A series of landslide hazard maps highlighting the probabilities of exceeding the aforementioned thresholds was generated for the study area. These output maps were then utilized in a performance-based design framework, enabling them to be analyzed in conjunction with other hazards for fully probabilistic hazard evaluation and risk assessment.
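The per-pixel computation described above can be sketched as follows. This is our own minimal illustration of combining a binned hazard curve with a rigid sliding-block displacement model: the regression coefficients, critical acceleration, and hazard-bin numbers are hypothetical placeholders, not the study's calibrated values, and the median displacement is treated deterministically within each bin for simplicity.

```python
import math

def newmark_displacement_m(pga_g, ac_g, a=2.341, b=-1.438, c=0.215):
    """Median sliding-block displacement (m) from an illustrative
    Jibson-style regression in the critical-acceleration ratio ac/pga.
    Coefficients here are placeholders, not calibrated values."""
    if pga_g <= ac_g:
        return 0.0  # ground motion never exceeds yield: block does not slide
    r = ac_g / pga_g
    log_d_cm = c + a * math.log10(1.0 - r) + b * math.log10(r)
    return 10 ** log_d_cm / 100.0  # cm -> m

def p_exceed(threshold_m, hazard_bins, ac_g):
    """P(D > threshold) = sum over PGA bins of P(bin) * 1{D(bin) > threshold}."""
    return sum(p for pga, p in hazard_bins
               if newmark_displacement_m(pga, ac_g) > threshold_m)

# Hypothetical hazard bins for one pixel: (PGA in g, probability of bin)
hazard = [(0.1, 0.05), (0.2, 0.02), (0.4, 0.005), (0.8, 0.001)]
for t in (0.1, 0.3, 1.0):
    print(t, p_exceed(t, hazard, ac_g=0.15))
```

Mapping the resulting exceedance probabilities per pixel, one map per threshold, yields the kind of product the abstract describes.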
From Cyclone Tracks to the Costs of European Winter Storms: A Probabilistic Loss Assessment Model
Orwig, K.; Renggli, D.; Corti, T.; Reese, S.; Wueest, M.; Viktor, E.; Zimmerli, P.
2014-12-01
European winter storms cause billions of dollars of insured losses every year. Therefore, it is essential to understand the potential impacts of future events, and the role reinsurance can play in mitigating the losses. The authors will present an overview of natural catastrophe risk assessment modeling in the reinsurance industry, and the development of a new innovative approach for modeling the risk associated with European winter storms. The new approach includes the development of physically meaningful probabilistic (i.e., simulated) events for European winter storm loss assessment. The meteorological hazard component of the new model is based on cyclone and windstorm tracks identified in the 20th Century Reanalysis data. The knowledge of the evolution of winter storms both in time and space allows the physically meaningful perturbation of historical event properties (e.g., track, intensity, etc.). The perturbation includes a random element but also takes the local climatology and the evolution of the historical event into account. The low-resolution wind footprints taken from the 20th Century Reanalysis are processed by a statistical-dynamical downscaling to generate high-resolution footprints for both the simulated and historical events. Downscaling transfer functions are generated using ENSEMBLES regional climate model data. The result is a set of reliable probabilistic events representing thousands of years. The event set is then combined with country- and site-specific vulnerability functions and detailed market- or client-specific information to compute annual expected losses.
Probabilistic Risk Model for Organ Doses and Acute Health Effects of Astronauts on Lunar Missions
Kim, Myung-Hee Y.; Hu, Shaowen; Nounu, Hatem N.; Cucinotta, Francis A.
2009-01-01
Exposure to large solar particle events (SPEs) is a major concern during EVAs on the lunar surface and in Earth-to-Lunar transit. Up to 15% of crew time may be spent on EVA with minimal radiation shielding. Therefore, an accurate assessment of SPE occurrence probability is required for mission planning by NASA. We apply probabilistic risk assessment (PRA) for radiation protection of crews and optimization of lunar mission planning.
External radiation surveillance
Energy Technology Data Exchange (ETDEWEB)
Antonio, E.J.
1995-06-01
This section of the 1994 Hanford Site Environmental Report describes how external radiation was measured, how surveys were performed, and the results of these measurements and surveys. External radiation exposure rates were measured at locations on and off the Hanford Site using thermoluminescent dosimeters (TLD). External radiation and contamination surveys were also performed with portable radiation survey instruments at locations on and around the Hanford Site.
Directory of Open Access Journals (Sweden)
Manuel Schölling
Our current view of how protein complexes assemble and disassemble at promoter sites is based on experimental data, provided for instance by biochemical methods (e.g., ChIP-on-chip assays) or by GFP-based assays. These two approaches suggest very different characteristics for the protein recruitment processes that regulate and initiate gene transcription. Biochemical methods suggest a strictly ordered and consecutive protein recruitment, while GFP-based assays draw a picture much closer to chaotic or stochastic recruitment. To understand the reason for these conflicting results, we design a generalized recruitment model (GRM) that allows us to simulate all possible scenarios between strictly sequential recruitment and completely probabilistic recruitment. With this model we show that probabilistic, transient binding events that are visible in GFP experiments cannot be detected by ChIP experiments. We demonstrate that sequential recruitment processes and probabilistic recruitment processes that contain "shortcuts" exhibit periodic dynamics and are hard to distinguish with standard ChIP measurements. Therefore we propose a simple experimental method that can be used to discriminate sequential from probabilistic recruitment processes. We discuss the limitations of this method.
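The contrast between the two readouts can be sketched with a toy simulation. This is our own construction, not the authors' GRM code: recruitment is modeled as a Markov chain over assembly stages with an optional probabilistic "shortcut", and the ChIP-like signal is a population average over many promoters, which smooths out the transient events a single-molecule (GFP-like) trajectory would show. All rates are hypothetical.

```python
import random

N_STAGES = 4  # number of assembly stages before the complex is complete

def step(stage, p_advance=0.5, p_shortcut=0.0):
    """Advance one stage with prob p_advance; occasionally jump ahead."""
    if stage >= N_STAGES:
        return 0  # complex disassembles, the cycle restarts
    if random.random() < p_shortcut:
        return min(stage + 2, N_STAGES)  # shortcut: skip an intermediate factor
    return stage + 1 if random.random() < p_advance else stage

def population_occupancy(n_promoters=1000, n_steps=50, **kw):
    """ChIP-like signal: fraction of promoters at the final stage over time."""
    stages = [0] * n_promoters
    signal = []
    for _ in range(n_steps):
        stages = [step(s, **kw) for s in stages]
        signal.append(sum(s == N_STAGES for s in stages) / n_promoters)
    return signal

random.seed(0)
seq = population_occupancy(p_shortcut=0.0)   # strictly sequential regime
cut = population_occupancy(p_shortcut=0.2)   # regime with shortcuts
print(max(seq), max(cut))
```

Both regimes produce a smooth ensemble-averaged occupancy curve, which is why population-level ChIP measurements struggle to distinguish them.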
Sari, Dwi Ivayana; Budayasa, I. Ketut; Juniati, Dwi
2017-08-01
Probabilistic thinking is very important in human life, especially in responding to situations that may occur or that contain elements of uncertainty. It is necessary to develop students' probabilistic thinking from elementary school onward by teaching probability. In the Indonesian mathematics curriculum, probability is first introduced in the ninth grade. However, some research has shown that lower-grade students, even preschoolers, can succeed in solving probability tasks. This study aims to explore the probabilistic thinking of elementary school students of high and low math ability in solving probability tasks. A qualitative approach was chosen to describe students' probabilistic thinking in depth. The results showed that high and low math ability students differed in responding to 1- and 2-dimensional sample space tasks, and to probability comparison tasks involving drawing a marker and contextual settings. The representations used by high and low math ability students also differed in responding to a contextual probability-of-an-event task and a probability comparison task involving a rotating spinner. This study serves as a reference for elementary school mathematics curriculum developers in Indonesia, in this case for introducing probability material and teaching probability through spinners as media in learning.
Energy Technology Data Exchange (ETDEWEB)
Bernreuter, D.L.; Boissonnade, A.C.; Short, C.M.
1998-04-01
The Nuclear Regulatory Commission asked Lawrence Livermore National Laboratory to form a group of experts to assist them in revising the seismic and geologic siting criteria for nuclear power plants, Appendix A to 10 CFR Part 100. This document describes a deterministic approach for determining a Safe Shutdown Earthquake (SSE) Ground Motion for a nuclear power plant site. One disadvantage of this approach is the difficulty of integrating differences of opinions and differing interpretations into seismic hazard characterization. In answer to this, probabilistic seismic hazard assessment methodologies incorporate differences of opinion and interpretations among earth science experts. For this reason, probabilistic hazard methods were selected for determining SSEs for the revised regulation, 10 CFR Part 100.23. However, because these methodologies provide a composite analysis of all possible earthquakes that may occur, they do not provide the familiar link between seismic design loading requirements and engineering design practice. Therefore, approaches used to characterize seismic events (magnitude and distance) which best represent the ground motion level determined with the probabilistic hazard analysis were investigated. This report summarizes investigations conducted at 69 nuclear reactor sites in the central and eastern U.S. for determining SSEs using probabilistic analyses. Alternative techniques are presented along with justification for key choices. 16 refs., 32 figs., 60 tabs.
Deterministic and Probabilistic Analysis against Anticipated Transient Without Scram
Energy Technology Data Exchange (ETDEWEB)
Choi, Sun Mi; Kim, Ji Hwan [KHNP Central Research Institute, Daejeon (Korea, Republic of); Seok, Ho [KEPCO Engineering and Construction, Daejeon (Korea, Republic of)
2016-10-15
An Anticipated Transient Without Scram (ATWS) is an Anticipated Operational Occurrence (AOO) accompanied by a failure of the reactor trip when required. Through a suitable combination of inherent characteristics and diverse systems, the reactor design needs to reduce the probability of an ATWS and, should one occur, to limit any core damage and prevent loss of integrity of the reactor coolant pressure boundary. This study focuses on the deterministic analysis of ATWS events with respect to Reactor Coolant System (RCS) over-pressure and fuel integrity for the EU-APR. Additionally, this report presents the Probabilistic Safety Assessment (PSA) reflecting those diverse systems. The analysis performed for the ATWS event indicates that the NSSS reaches a controlled, safe state owing to the addition of boron into the core via the EBS pump flow upon the EBAS signal from the DPS. Decay heat is removed through the MSADVs and the auxiliary feedwater. During the ATWS event, the RCS pressure boundary is maintained by the operation of the primary and secondary safety valves. Consequently, the acceptance criteria are satisfied by installing the DPS and EBS in addition to the inherent safety characteristics.
Probabilistic risk assessment of the modular HTGR plant. Revision 1
Energy Technology Data Exchange (ETDEWEB)
Everline, C.J.; Bellis, E.A.; Vasquez, J.
1986-06-01
A preliminary probabilistic risk assessment (PRA) has been performed for the modular HTGR (MHTGR). This PRA is preliminary in the sense that, although it updates the earlier PRA to include a wider spectrum of events for Licensing Basis Event (LBE) selection, the final version will not be issued until later. The primary function of the assessment was to assure compliance with the NRC interim safety goals imposed by the top-level regulatory criteria, and with utility/user requirements regarding public evacuation or sheltering. In addition, the assessment provides a basis for designer feedback regarding reliability allocations and barrier retention requirements, as well as a basis for the selection of licensing basis events (LBEs) and the safety classification of structures, systems, and components. The assessment demonstrates that both the NRC interim safety goals and the utility/user-imposed sheltering/evacuation requirements are satisfied. Moreover, it is not anticipated that design changes introduced will jeopardize compliance with the interim safety goals or utility/user requirements. 61 refs., 48 figs., 24 tabs.
Probabilistic approach to earthquake prediction.
Directory of Open Access Journals (Sweden)
G. D'Addezio
2002-06-01
The evaluation of any earthquake forecast hypothesis requires the application of rigorous statistical methods. It implies a univocal definition of the model characterising the anomaly or precursor concerned, so that it can be objectively recognised in any circumstance and by any observer. A valid forecast hypothesis is expected to maximise successes and minimise false alarms. The probability gain associated with a precursor is also a popular way to estimate the quality of the predictions based on that precursor. Some scientists make use of a statistical approach based on the computation of the likelihood of an observed realisation of seismic events, and on the comparison of the likelihoods obtained under different hypotheses. This method can be extended to algorithms that allow the computation of the density distribution of the conditional probability of earthquake occurrence in space, time and magnitude. Whatever method is chosen for building up a new hypothesis, the final assessment of its validity should be carried out by a test on a new and independent set of observations. The implementation of this test could, however, be problematic for seismicity characterised by long-term recurrence intervals. Even using the historical record, which may span time windows varying widely between a few centuries and a few millennia, we have a low probability of catching more than one or two events on the same fault. By extending the record of past earthquakes back in time up to several millennia, paleoseismology represents a great opportunity to study how earthquakes recur through time and thus provides innovative contributions to time-dependent seismic hazard assessment. Sets of paleoseismologically dated earthquakes have been established for some faults in the Mediterranean area: the Irpinia fault in Southern Italy, the Fucino fault in Central Italy, the El Asnam fault in Algeria and the Skinos fault in Central Greece. By using the age of the
Probabilistic evaluation of decadal prediction skill regarding Northern Hemisphere winter storms
Directory of Open Access Journals (Sweden)
Tim Kruschke
2016-12-01
Winter wind storms related to intense extra-tropical cyclones are meteorological extreme events, often with major impacts on the economy and human life, especially for Europe and the mid-latitudes. Hence, skillful decadal predictions of the frequency of their occurrence would be of great socio-economic value. The present paper extends the study of Kruschke et al. (2014) in several aspects. First, this study is situated in a more impact-oriented context by analyzing the frequency of potentially damaging wind storm events instead of targeting cyclones as general meteorological features, as was done by Kruschke et al. (2014). Second, this study incorporates more data sets by analyzing five decadal hindcast experiments – 41 annual (1961–2001) initializations integrated for ten years each – set up with different initialization strategies. However, all experiments are based on the Max-Planck-Institute Earth System Model in a low-resolution configuration (MPI-ESM-LR). Differing combinations of these five experiments allow for more robust estimates of predictive skill (due to a considerably larger ensemble size) and systematic comparisons of the underlying initialization strategies. Third, the hindcast experiments are corrected for model bias and potential drifts over lead time by means of a novel parametric approach accounting for non-stationary model drifts. We analyze whether skillful probabilistic three-category forecasts (enhanced, normal or decreased) can be provided regarding winter (ONDJFM) wind storm frequencies over the Northern Hemisphere (NH). Skill is assessed by using climatological probabilities and uninitialized transient simulations as reference forecasts. It is shown that forecasts of average winter wind storm frequencies for winters 2–5 and winters 2–9 are skillful over large parts of the NH. However, most of this skill is associated with external forcing from transient greenhouse gas and aerosol concentrations.
PICS: probabilistic inference for ChIP-seq.
Zhang, Xuekui; Robertson, Gordon; Krzywinski, Martin; Ning, Kaida; Droit, Arnaud; Jones, Steven; Gottardo, Raphael
2011-03-01
ChIP-seq combines chromatin immunoprecipitation with massively parallel short-read sequencing. While it can profile genome-wide in vivo transcription factor-DNA association with higher sensitivity, specificity, and spatial resolution than ChIP-chip, it poses new challenges for statistical analysis that derive from the complexity of the biological systems characterized and from variability and biases in its sequence data. We propose a method called PICS (Probabilistic Inference for ChIP-seq) for identifying regions bound by transcription factors from aligned reads. PICS identifies binding event locations by modeling local concentrations of directional reads, and uses DNA fragment length prior information to discriminate closely adjacent binding events via a Bayesian hierarchical t-mixture model. It uses precalculated, whole-genome read mappability profiles and a truncated t-distribution to adjust binding event models for reads that are missing due to local genome repetitiveness. It estimates uncertainties in model parameters that can be used to define confidence regions on binding event locations and to filter estimates. Finally, PICS calculates a per-event enrichment score relative to a control sample, and can use a control sample to estimate a false discovery rate. Using published GABP and FOXA1 data from human cell lines, we show that PICS' predicted binding sites were more consistent with computationally predicted binding motifs than the alternative methods MACS, QuEST, CisGenome, and USeq. We then use a simulation study to confirm that PICS compares favorably to these methods and is robust to model misspecification.
Havlenová, Tereza
2016-01-01
An event is an experience that is perceived by all the senses. Event management is a process involving the various activities that are assigned to staff. Organizing special events has become a field in its own right. If the manager understands events as a communication platform, he gains a modern, multifunctional and very impressive tool. The procedure for implementing a successful event in a particular area is part of this work. The first part explains the issues of event management on th...
Bayesian analysis of rare events
Energy Technology Data Exchange (ETDEWEB)
Straub, Daniel, E-mail: straub@tum.de; Papaioannou, Iason; Betz, Wolfgang
2016-06-01
In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
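The rejection-sampling reinterpretation behind BUS can be illustrated with a toy one-dimensional example. This is our own simplified sketch, not the authors' implementation, and all numbers are hypothetical: prior X ~ N(0, 1), one noisy observation y = 2.0 with Gaussian noise sigma = 0.5, and a "rare event" {X > 3}. For genuinely rare events one would replace the plain Monte Carlo loop with FORM, importance sampling, or Subset Simulation, as the abstract notes.

```python
import math
import random

random.seed(1)
y, sigma = 2.0, 0.5  # hypothetical observation and noise level

def likelihood(x):
    # Un-normalized Gaussian likelihood of the observation given x.
    return math.exp(-0.5 * ((y - x) / sigma) ** 2)

c = 1.0  # upper bound on the likelihood (attained at x = y)

# Classical rejection sampling: draw from the prior, accept with
# probability L(x)/c; accepted samples follow the posterior.
posterior = []
while len(posterior) < 20000:
    x = random.gauss(0.0, 1.0)
    if random.random() < likelihood(x) / c:
        posterior.append(x)

# Updated (posterior) probability of the rare event {X > 3}.
p_rare = sum(x > 3.0 for x in posterior) / len(posterior)
print(p_rare)
```

For this conjugate Gaussian setup the exact posterior is N(1.6, 0.2), so the Monte Carlo estimate can be checked analytically; the data pull the rare-event probability well above its prior value.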
Agent-Oriented Probabilistic Logic Programming
Institute of Scientific and Technical Information of China (English)
Jie Wang; Shi-Er Ju; Chun-Nian Liu
2006-01-01
Currently, agent-based computing is an active research area, and great efforts have been made towards agent-oriented programming from both a theoretical and a practical point of view. However, most of these efforts assume that there is no uncertainty in agents' mental states and their environment. In other words, under this assumption agent developers are only allowed to specify how an agent acts when it is 100% sure about what is true/false. In this paper, this unrealistic assumption is removed and a new agent-oriented probabilistic logic programming language is proposed, which can deal with uncertain information about the world. The programming language is based on a combination of features of probabilistic logic programming and imperative programming.
Probabilistic forecasting and Bayesian data assimilation
Reich, Sebastian
2015-01-01
In this book the authors describe the principles and methods behind probabilistic forecasting and Bayesian data assimilation. Instead of focusing on particular application areas, the authors adopt a general dynamical systems approach, with a profusion of low-dimensional, discrete-time numerical examples designed to build intuition about the subject. Part I explains the mathematical framework of ensemble-based probabilistic forecasting and uncertainty quantification. Part II is devoted to Bayesian filtering algorithms, from classical data assimilation algorithms such as the Kalman filter, variational techniques, and sequential Monte Carlo methods, through to more recent developments such as the ensemble Kalman filter and ensemble transform filters. The McKean approach to sequential filtering in combination with coupling of measures serves as a unifying mathematical framework throughout Part II. Assuming only some basic familiarity with probability, this book is an ideal introduction for graduate students in ap...
Probabilistic Parsing Using Left Corner Language Models
Manning, C D; Manning, Christopher D.; Carpenter, Bob
1997-01-01
We introduce a novel parser based on a probabilistic version of a left-corner parser. The left-corner strategy is attractive because rule probabilities can be conditioned on both top-down goals and bottom-up derivations. We develop the underlying theory and explain how a grammar can be induced from analyzed data. We show that the left-corner approach provides an advantage over simple top-down probabilistic context-free grammars in parsing the Wall Street Journal using a grammar induced from the Penn Treebank. We also conclude that the Penn Treebank provides a fairly weak testbed due to the flatness of its bracketings and to the obvious overgeneration and undergeneration of its induced grammar.
Probabilistic Universality in two-dimensional Dynamics
Lyubich, Mikhail
2011-01-01
In this paper we continue to explore infinitely renormalizable Hénon maps with small Jacobian. It was shown in [CLM] that, contrary to the one-dimensional intuition, the Cantor attractor of such a map is non-rigid and the conjugacy with the one-dimensional Cantor attractor is at most 1/2-Hölder. Another formulation of this phenomenon is that the scaling structure of the Hénon Cantor attractor differs from its one-dimensional counterpart. However, in this paper we prove that the weight assigned by the canonical invariant measure to these bad spots tends to zero on microscopic scales. This phenomenon is called Probabilistic Universality. It implies, in particular, that the Hausdorff dimension of the canonical measure is universal. In this way, universality and rigidity phenomena of one-dimensional dynamics assume a probabilistic nature in the two-dimensional world.
Lipschitz Parametrization of Probabilistic Graphical Models
Honorio, Jean
2012-01-01
We show that the log-likelihood of several probabilistic graphical models is Lipschitz continuous with respect to the lp-norm of the parameters. We discuss several implications of Lipschitz parametrization. We present an upper bound of the Kullback-Leibler divergence that allows understanding methods that penalize the lp-norm of differences of parameters as the minimization of that upper bound. The expected log-likelihood is lower bounded by the negative lp-norm, which allows understanding the generalization ability of probabilistic models. The exponential of the negative lp-norm is involved in the lower bound of the Bayes error rate, which shows that it is reasonable to use parameters as features in algorithms that rely on metric spaces (e.g. classification, dimensionality reduction, clustering). Our results do not rely on specific algorithms for learning the structure or parameters. We show preliminary results for activity recognition and temporal segmentation.
Efficient Probabilistic Inference with Partial Ranking Queries
Huang, Jonathan; Guestrin, Carlos E
2012-01-01
Distributions over rankings are used to model data in various settings such as preference analysis and political elections. The factorial size of the space of rankings, however, typically forces one to make structural assumptions, such as smoothness, sparsity, or probabilistic independence about these underlying distributions. We approach the modeling problem from the computational principle that one should make structural assumptions which allow for efficient calculation of typical probabilistic queries. For ranking models, "typical" queries predominantly take the form of partial ranking queries (e.g., given a user's top-k favorite movies, what are his preferences over remaining movies?). In this paper, we argue that riffled independence factorizations proposed in recent literature [7, 8] are a natural structural assumption for ranking distributions, allowing for particularly efficient processing of partial ranking queries.
Probabilistic Dynamic Logic of Phenomena and Cognition
Vityaev, Evgenii; Perlovsky, Leonid; Smerdov, Stanislav
2011-01-01
The purpose of this paper is to further develop the main concepts of Phenomena Dynamic Logic (P-DL) and Cognitive Dynamic Logic (C-DL), presented in the previous paper. The specific character of these logics is in matching the vagueness or fuzziness of similarity measures to the uncertainty of models. These logics are based on the following fundamental notions: generality relation, uncertainty relation, simplicity relation, similarity maximization problem with empirical content, and enhancement (learning) operator. We develop these notions in terms of logic and probability and obtain a Probabilistic Dynamic Logic of Phenomena and Cognition (P-DL-PC) that relates to the scope of probabilistic models of the brain. In our research the effectiveness of the suggested formalization is demonstrated by approximation of the expert model of breast cancer diagnostic decisions. The P-DL-PC logic has previously been successfully applied to solving many practical tasks and also to modelling some cognitive processes.
A probabilistic model of RNA conformational space
DEFF Research Database (Denmark)
Frellsen, Jes; Moltke, Ida; Thiim, Martin;
2009-01-01
The increasing importance of non-coding RNA in biology and medicine has led to a growing interest in the problem of RNA 3-D structure prediction. As is the case for proteins, RNA 3-D structure prediction methods require two key ingredients: an accurate energy function and a conformational sampling......, the discrete nature of the fragments necessitates the use of carefully tuned, unphysical energy functions, and their non-probabilistic nature impairs unbiased sampling. We offer a solution to the sampling problem that removes these important limitations: a probabilistic model of RNA structure that allows...... conformations for 9 out of 10 test structures, solely using coarse-grained base-pairing information. In conclusion, the method provides a theoretical and practical solution for a major bottleneck on the way to routine prediction and simulation of RNA structure and dynamics in atomic detail....
Significance testing as perverse probabilistic reasoning
Directory of Open Access Journals (Sweden)
Westover Kenneth D
2011-02-01
Truth claims in the medical literature rely heavily on statistical significance testing. Unfortunately, most physicians misunderstand the underlying probabilistic logic of significance tests and consequently often misinterpret their results. This near-universal misunderstanding is highlighted by means of a simple quiz which we administered to 246 physicians at two major academic hospitals, on which the proportion of incorrect responses exceeded 90%. A solid understanding of the fundamental concepts of probability theory is becoming essential to the rational interpretation of medical information. This essay provides a technically sound review of these concepts that is accessible to a medical audience. We also briefly review the debate in the cognitive sciences regarding physicians' aptitude for probabilistic inference.
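A standard worked example of the probabilistic logic such quizzes target: a "significant" result (p < alpha) does not mean a 95% chance the hypothesis is true. By Bayes' rule, the post-study probability also depends on the prior plausibility of the hypothesis and the statistical power of the study. The numbers below are illustrative, not drawn from the cited quiz.

```python
# P(hypothesis true | significant result), by Bayes' rule:
# true positives vs. false positives among all "significant" findings.

def prob_true_given_significant(prior, power, alpha):
    true_pos = power * prior          # P(significant AND true)
    false_pos = alpha * (1.0 - prior) # P(significant AND false)
    return true_pos / (true_pos + false_pos)

# If 10% of tested hypotheses are true, with 80% power and alpha = 0.05,
# a significant result implies only a 64% chance the hypothesis is true:
ppv = prob_true_given_significant(prior=0.10, power=0.80, alpha=0.05)
print(round(ppv, 3))  # → 0.64
```

The common misreading "p < 0.05, so there is a 95% chance the effect is real" conflates P(data | hypothesis) with P(hypothesis | data); the example makes the gap concrete.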
Development Of Dynamic Probabilistic Safety Assessment: The Accident Dynamic Simulator (ADS) Tool
Energy Technology Data Exchange (ETDEWEB)
Chang, Y.H.; Mosleh, A.; Dang, V.N.
2003-03-01
The development of a dynamic methodology for Probabilistic Safety Assessment (PSA) addresses the complex interactions between the behaviour of technical systems and personnel response in the evolution of accident scenarios. This paper introduces the discrete dynamic event tree, a framework for dynamic PSA, and its implementation in the Accident Dynamic Simulator (ADS) tool. Dynamic event tree tools generate and quantify accident scenarios through coupled simulation models of the plant physical processes, its automatic systems, the equipment reliability, and the human response. The current research on the framework, the ADS tool, and on Human Reliability Analysis issues within dynamic PSA, is discussed. (author)
An application of probabilistic safety assessment methods to model aircraft systems and accidents
Energy Technology Data Exchange (ETDEWEB)
Martinez-Guridi, G.; Hall, R.E.; Fullwood, R.R.
1998-08-01
A case study modeling the thrust reverser system (TRS) in the context of the fatal accident of a Boeing 767 is presented to illustrate the application of Probabilistic Safety Assessment methods. A simplified risk model consisting of an event tree with supporting fault trees was developed to represent the progression of the accident, taking into account the interaction between the TRS and the operating crew during the accident, and the findings of the accident investigation. A feasible sequence of events leading to the fatal accident was identified. Several insights about the TRS and the accident were obtained by applying PSA methods. Changes proposed for the TRS also are discussed.
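An event-tree sequence of the kind described is quantified by multiplying the initiating-event frequency by the conditional branch probabilities along the path. The sketch below uses purely illustrative numbers and branch labels, not the paper's actual TRS model:

```python
# Event-tree quantification sketch (illustrative numbers, not the Boeing 767
# TRS analysis): a sequence frequency is the initiator frequency times the
# conditional failure probabilities of each branch along the path.
def sequence_frequency(initiator_freq, branch_failure_probs):
    freq = initiator_freq
    for p in branch_failure_probs:
        freq *= p
    return freq

# Hypothetical sequence: uncommanded reverser deployment, crew fails to
# detect, crew fails to compensate.
p_accident = sequence_frequency(1e-6, [0.1, 0.5])
```

Supporting fault trees would supply the branch probabilities themselves from component-level failure data.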
Probabilistic, meso-scale flood loss modelling
Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno
2016-04-01
Flood risk analyses are an important basis for decisions on flood risk management and adaptation. However, such analyses are associated with significant uncertainty, even more if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention during the last years, they are still not standard practice for flood risk assessments and even more for flood loss modelling. State of the art in flood loss modelling is still the use of simple, deterministic approaches like stage-damage functions. Novel probabilistic, multi-variate flood loss models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we demonstrate and evaluate the upscaling of the approach to the meso-scale, namely on the basis of land-use units. The model is applied in 19 municipalities which were affected during the 2002 flood by the River Mulde in Saxony, Germany (Botto et al. submitted). The application of bagging decision tree based loss models provide a probability distribution of estimated loss per municipality. Validation is undertaken on the one hand via a comparison with eight deterministic loss models including stage-damage functions as well as multi-variate models. On the other hand the results are compared with official loss data provided by the Saxon Relief Bank (SAB). The results show, that uncertainties of loss estimation remain high. Thus, the significant advantage of this probabilistic flood loss estimation approach is that it inherently provides quantitative information about the uncertainty of the prediction. References: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64. Botto A, Kreibich H, Merz B, Schröter K (submitted) Probabilistic, multi-variable flood loss modelling on the meso-scale with BT-FLEMO. Risk Analysis.
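The key property of the bagging approach, that an ensemble of bootstrap replicates yields a predictive distribution rather than a point estimate, can be sketched in a few lines. The loss data and the per-replicate "predictor" (a plain mean) are stand-ins; BT-FLEMO itself bags multi-variable decision trees:

```python
import random

# Bootstrap-aggregation sketch of probabilistic loss estimation.
random.seed(0)
observed_losses = [1.2, 0.8, 2.5, 1.9, 3.1, 0.5, 2.2, 1.4]  # hypothetical values

def bagged_predictions(data, n_replicates=200):
    preds = []
    for _ in range(n_replicates):
        boot = [random.choice(data) for _ in data]  # resample with replacement
        preds.append(sum(boot) / len(boot))         # one replicate's prediction
    return preds                                    # a predictive distribution

dist = sorted(bagged_predictions(observed_losses))
interval_90 = (dist[10], dist[189])  # empirical 5th-95th percentile bounds
```

The spread of `dist` is exactly the "quantitative information about the uncertainty of the prediction" the abstract highlights.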
Incorporating psychological influences in probabilistic cost analysis
Energy Technology Data Exchange (ETDEWEB)
Kujawski, Edouard; Alvaro, Mariana; Edwards, William
2004-01-08
Today's typical probabilistic cost analysis assumes an "ideal" project that is devoid of the human and organizational considerations that heavily influence the success and cost of real-world projects. In the real world, "Money Allocated Is Money Spent" (the MAIMS principle): cost underruns are rarely available to protect against cost overruns, while task overruns are passed on to the total project cost. Realistic cost estimates therefore require a modified probabilistic cost analysis that simultaneously models the cost management strategy, including budget allocation. Psychological influences such as overconfidence in assessing uncertainties, and dependencies among cost elements and risks, are other important considerations that are generally not addressed. It should then be no surprise that actual project costs often exceed the initial estimates and that projects are delivered late and/or with a reduced scope. This paper presents a practical probabilistic cost analysis model that incorporates recent findings in human behavior and judgment under uncertainty, dependencies among cost elements, the MAIMS principle, and project management practices. Uncertain cost elements are elicited from experts using the direct fractile assessment method and fitted with three-parameter Weibull distributions. The full correlation matrix is specified in terms of two parameters that characterize correlations among cost elements in the same and in different subsystems. The analysis is readily implemented using standard Monte Carlo simulation tools such as @Risk and Crystal Ball®. The analysis of a representative design and engineering project substantiates that today's typical probabilistic cost analysis is likely to severely underestimate project cost for probability-of-success values of importance to contractors and procuring activities. The proposed approach provides a framework for developing a viable cost management strategy for [...]
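The MAIMS principle has a simple Monte Carlo reading: because an underrun is not returned, a task's realized cost is at least its allocated budget. The sketch below illustrates that asymmetry with entirely made-up budgets and a simplified Weibull cost model (the paper fits three-parameter Weibulls from expert fractiles and adds correlations, which are omitted here):

```python
import random

# Monte Carlo sketch of "Money Allocated Is Money Spent": realized task
# cost is max(budget, sampled cost). All numbers are illustrative.
random.seed(1)

def total_cost_maims(budgets, sample_cost, n_runs=10_000):
    totals = []
    for _ in range(n_runs):
        totals.append(sum(max(b, sample_cost(b)) for b in budgets))
    return totals

def sample_cost(budget):
    # Simplified stand-in: Weibull-distributed cost scaled to the budget.
    return random.weibullvariate(budget, 2.0)

budgets = [10.0, 20.0, 15.0]
totals = total_cost_maims(budgets, sample_cost)
mean_total = sum(totals) / len(totals)  # always above the 45.0 allocated
```

Under MAIMS the expected total can only exceed the allocation, which is why ignoring it underestimates project cost.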
Probabilistic Output Analysis by Program Manipulation
Rosendahl, Mads; Kirkeby, Maja H.
2015-01-01
The aim of a probabilistic output analysis is to derive a probability distribution of possible output values for a program from a probability distribution of its input. We present a method for performing static output analysis, based on program transformation techniques. It generates a probability function as a possibly uncomputable expression in an intermediate language. This program is then analyzed, transformed, and approximated. The result is a closed-form expression that computes an over-approximation [...]
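For a small finite input distribution, the object being approximated can be computed exactly by enumeration, which makes the goal of the analysis concrete. This toy stand-in pushes a uniform input distribution through a program; the paper does the analogous derivation symbolically via program transformation:

```python
from collections import defaultdict
from fractions import Fraction

# Direct enumeration of a program's output distribution from its input
# distribution (toy example; the program and distribution are illustrative).
def program(x):
    return x % 3  # the program under analysis

input_dist = {n: Fraction(1, 4) for n in range(4)}  # uniform on {0, 1, 2, 3}

output_dist = defaultdict(Fraction)
for value, prob in input_dist.items():
    output_dist[program(value)] += prob
# output_dist now maps each possible output to its exact probability
```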
Learning Probabilistic Models of Word Sense Disambiguation
Pedersen, Ted
1998-01-01
This dissertation presents several new methods of supervised and unsupervised learning of word sense disambiguation models. The supervised methods focus on performing model searches through a space of probabilistic models, and the unsupervised methods rely on the use of Gibbs Sampling and the Expectation Maximization (EM) algorithm. In both the supervised and unsupervised case, the Naive Bayesian model is found to perform well. An explanation for this success is presented in terms of learning rates and bias-variance decompositions.
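The Naive Bayes model that performs well in these experiments picks the sense maximizing the prior times the product of per-word likelihoods. A minimal sketch on toy data (the dissertation evaluates real sense-tagged corpora; the words, senses and vocabulary size here are invented):

```python
import math
from collections import Counter

# Tiny Naive Bayes word-sense disambiguation sketch for the word "bank".
train = [("money", ["loan", "cash"]),
         ("money", ["deposit", "loan"]),
         ("river", ["water", "shore"])]  # (sense, context words)

sense_counts = Counter(sense for sense, _ in train)
word_counts = {s: Counter() for s in sense_counts}
for sense, context in train:
    word_counts[sense].update(context)

def classify(context, vocab_size=10):
    """Pick the sense maximizing P(sense) * prod P(word | sense)."""
    best_sense, best_logp = None, -math.inf
    for sense, n in sense_counts.items():
        logp = math.log(n / len(train))
        total = sum(word_counts[sense].values())
        for word in context:
            # add-one smoothing over an assumed vocabulary size
            logp += math.log((word_counts[sense][word] + 1) / (total + vocab_size))
        if logp > best_logp:
            best_sense, best_logp = sense, logp
    return best_sense
```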
Bayesian Probabilistic Projection of International Migration.
Azose, Jonathan J; Raftery, Adrian E
2015-10-01
We propose a method for obtaining joint probabilistic projections of migration for all countries, broken down by age and sex. Joint trajectories for all countries are constrained to satisfy the requirement of zero global net migration. We evaluate our model using out-of-sample validation and compare point projections to the projected migration rates from a persistence model similar to the method used in the United Nations' World Population Prospects, and also to a state-of-the-art gravity model.
A Probabilistic Approach to Knowledge Translation
Jiang, Shangpu; Lowd, Daniel; Dou, Dejing
2015-01-01
In this paper, we focus on a novel knowledge reuse scenario where the knowledge in the source schema needs to be translated to a semantically heterogeneous target schema. We refer to this task as "knowledge translation" (KT). Unlike data translation and transfer learning, KT does not require any data from the source or target schema. We adopt a probabilistic approach to KT by representing the knowledge in the source schema, the mapping between the source and target schemas, and the resulting ...
PRISMATIC: Unified Hierarchical Probabilistic Verification Tool
2011-09-01
Probabilistic and quantum finite automata with postselection
Yakaryilmaz, Abuzer
2011-01-01
We prove that endowing a real-time probabilistic or quantum computer with the ability of postselection increases its computational power. For this purpose, we provide a new model of finite automata with postselection, and compare it with the model of Lāce et al. We examine the related language classes, and also establish separations between the classical and quantum versions, and between the zero-error vs. bounded-error modes of recognition in this model.
Utilizing Probabilistic Linear Equations in Cube Attacks
Institute of Scientific and Technical Information of China (English)
Yuan Yao; Bin Zhang; Wen-Ling Wu
2016-01-01
Cube attacks, proposed by Dinur and Shamir at EUROCRYPT 2009, have shown huge power against stream ciphers. In the original cube attacks, a linear system of secret key bits is exploited for key recovery attacks. However, we find that a number of equations claimed to be linear in the previous literature are actually nonlinear and do not fit into the theoretical framework of cube attacks. Moreover, cube attacks are hard to apply if linear equations are rare. Therefore, it is of significance to make use of probabilistic linear equations, namely nonlinear superpolys that can be effectively approximated by linear expressions. In this paper, we suggest a way to test for and utilize these probabilistic linear equations, thus extending cube attacks to a wider scope. Concretely, we employ the standard parameter estimation approach and the sequential probability ratio test (SPRT) for linearity testing in the preprocessing phase, and use maximum likelihood decoding (MLD) for solving the probabilistic linear equations in the online phase. As an application, we exhibit our new attack against 672 rounds of Trivium and reduce the number of key bits to search by 7.
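The linearity testing the abstract refers to can be sketched with a fixed-trial BLR-style check (the paper's actual procedure uses an SPRT, which adapts the number of trials): a Boolean function f is linear over GF(2) iff f(x) ^ f(y) ^ f(x ^ y) ^ f(0) vanishes for all x, y.

```python
import random

# Probabilistic linearity check for a Boolean function (fixed-trial
# sketch; the paper uses a sequential probability ratio test instead).
random.seed(2)

def looks_linear(f, nbits, trials=200):
    for _ in range(trials):
        x = random.getrandbits(nbits)
        y = random.getrandbits(nbits)
        if f(x) ^ f(y) ^ f(x ^ y) ^ f(0):
            return False  # found a witness of nonlinearity
    return True  # probably linear

def parity(x):       # XOR of all bits: linear over GF(2)
    return bin(x).count("1") & 1

def and_bits(x):     # AND of the two low bits: nonlinear
    return (x & 1) & ((x >> 1) & 1)
```

A superpoly that passes only most trials is exactly a "probabilistic linear equation" in the paper's sense.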
Symbolic Computing in Probabilistic and Stochastic Analysis
Directory of Open Access Journals (Sweden)
Kamiński Marcin
2015-12-01
The main aim is to present recent developments in applications of symbolic computing in probabilistic and stochastic analysis, and this is done using the example of the well-known MAPLE system. The key theoretical methods discussed are (i) analytical derivations, (ii) the classical Monte Carlo simulation approach, (iii) the stochastic perturbation technique, as well as (iv) some semi-analytical approaches. It is demonstrated in particular how to engage the basic symbolic tools implemented in any such system to derive the basic equations for the stochastic perturbation technique, and how to make an efficient implementation of the semi-analytical methods using the automatic differentiation and integration provided by the computer algebra program itself. The second important illustration is a probabilistic extension of the finite element and finite difference methods coded in MAPLE, showing how to solve boundary value problems with random parameters in the environment of symbolic computing. The response function method belongs to the third group, where interference of classical deterministic software with the non-linear fitting numerical techniques available in various symbolic environments is displayed. We recover in this context the probabilistic structural response in engineering systems and show how to solve partial differential equations including Gaussian randomness in their coefficients.
Probabilistic Graph Layout for Uncertain Network Visualization.
Schulz, Christoph; Nocaj, Arlind; Goertler, Jochen; Deussen, Oliver; Brandes, Ulrik; Weiskopf, Daniel
2017-01-01
We present a novel uncertain network visualization technique based on node-link diagrams. Nodes expand spatially in our probabilistic graph layout, depending on the underlying probability distributions of edges. The visualization is created by computing a two-dimensional graph embedding that combines samples from the probabilistic graph. A Monte Carlo process is used to decompose a probabilistic graph into its possible instances and to continue with our graph layout technique. Splatting and edge bundling are used to visualize point clouds and network topology. The results provide insights into probability distributions for the entire network-not only for individual nodes and edges. We validate our approach using three data sets that represent a wide range of network types: synthetic data, protein-protein interactions from the STRING database, and travel times extracted from Google Maps. Our approach reveals general limitations of the force-directed layout and allows the user to recognize that some nodes of the graph are at a specific position just by chance.
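The Monte Carlo decomposition step, sampling concrete graph instances from per-edge existence probabilities, is easy to sketch in isolation (edge probabilities are illustrative; the embedding, splatting and edge-bundling stages are omitted):

```python
import random

# Sample possible concrete instances of a probabilistic graph.
random.seed(3)
prob_graph = {("A", "B"): 0.9, ("B", "C"): 0.5, ("A", "C"): 0.1}

def sample_instance(pgraph):
    """One possible instance: keep each edge with its own probability."""
    return {edge for edge, p in pgraph.items() if random.random() < p}

instances = [sample_instance(prob_graph) for _ in range(1000)]
freq_ab = sum(("A", "B") in g for g in instances) / len(instances)
# freq_ab converges to the 0.9 existence probability of edge (A, B)
```

Each sampled instance would then be laid out and the resulting point clouds splatted to visualize positional uncertainty.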
Designing and rehabilitating concrete structures - probabilistic approach
Energy Technology Data Exchange (ETDEWEB)
Edvardsen, C.; Mohr, L. [COWI Consulting Engineers and Planners AS, Lyngby (Denmark)
2000-07-01
Four examples dealing with corrosion of steel reinforcement in concrete due to chloride ingress are described, using a probabilistic approach which was developed in the recently published DuraCrete Report. The first example illustrates the difference in the required concrete cover dictated by environmental considerations. The second example concerns the update of the service life of the Great Belt Link in Denmark on the basis of measurements made five years after construction. The third example provides some design details of a tunnel in the Netherlands, while the fourth one concerns design of a column taking into account the initiation of corrosion both by means of a partial safety factor and by a probabilistic analysis. Differences in using the probabilistic approach in designing a new structure where the service life and reliability are pre-determined, and rehabilitating an existing structure where an analysis may give the answer to an estimate of the remaining service life and reliability level, are demonstrated. 9 refs., 8 tabs., 6 figs.
Sentiment Detection of Web Users Using Probabilistic Latent Semantic Analysis
Directory of Open Access Journals (Sweden)
Weijian Ren
2014-10-01
With the wide application of the Internet in almost all fields, it has become the most important medium for information publication, providing a large number of channels for spreading public opinion. Public opinions, as the responses of Internet users to information such as social events and government policies, reflect the status of both society and the economy, which is highly valuable for the decision-making and public relations of enterprises. At present, analysis methods for Internet public opinion are mainly based on discriminative approaches, such as Support Vector Machines (SVM) and neural networks. However, when these approaches analyze the sentiment of Internet public opinion, they fail to exploit information hidden in the text, e.g. topic. Motivated by this observation, this paper proposes a detection method for public sentiment based on the Probabilistic Latent Semantic Analysis (PLSA) model. PLSA inherits the advantages of LSA, exploiting the semantic topics hidden in the data. The procedure for detecting public sentiment using this algorithm is composed of three main steps: (1) Chinese word segmentation and word refinement, with which each document is represented by a bag of words; (2) modeling the probabilistic distribution of documents using PLSA; and (3) using the Z-vector of PLSA as the document features and delivering them to an SVM for sentiment detection. We collect a set of text data from Weibo, blogs, BBS, etc. to evaluate our proposed approach. The experimental results show that the proposed method can detect public sentiment with high accuracy, outperforming the state-of-the-art word-histogram-based approach. The results also suggest that text semantic analysis using PLSA can significantly boost sentiment detection performance.
Probabilistic models for assessment of extreme temperatures and relative humidity in Lithuania
Alzbutas, Robertas; Šeputytė, Ilona
2015-04-01
Extreme temperatures are a fairly common natural phenomenon in Lithuania. They have mainly negative effects on both the environment and humans. It is therefore important to perform probabilistic and statistical analyses of possibly extreme temperature values and their time-dependent changes. This is especially important in areas where technical objects (sensitive to extreme temperatures) are foreseen to be constructed. In order to estimate the frequencies and consequences of possible extreme temperatures, a probabilistic analysis of the event occurrence and its uncertainty has been performed: statistical data have been collected and analyzed. The probabilistic analysis of extreme temperatures in Lithuanian territory is based on historical data taken from the Lithuanian Hydrometeorology Service, Dūkštas Meteorological Station, Lithuanian Energy Institute and the Ignalina NPP Environmental Protection Department of Environmental Monitoring Service. The main objective of the performed work was the probabilistic assessment of the occurrence and impact of extreme temperature and relative humidity in the whole of Lithuania and specifically in the Dūkštas region, where the Ignalina Nuclear Power Plant is closed for decommissioning. In addition, the other purpose of this work was to analyze the changes of extreme temperatures. The probabilistic analysis of extreme temperature increases in Lithuanian territory was based on more than 50 years of historical data. The probabilistic assessment focused on the application and comparison of the Gumbel, Weibull and Generalized Extreme Value (GEV) distributions, enabling selection of the distribution with the best fit for the extreme temperature data. In order to assess the likelihood of extreme temperatures, different probabilistic models were applied to evaluate the probability of exceedance of different extreme temperatures. According to the statistics and the relationship between return period and temperature probabilities, the return period for 30 [...]
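The link between exceedance probability and return period used in such assessments can be illustrated with the Gumbel distribution, one of the three candidates compared. The location and scale parameters below are hypothetical, not the fitted Lithuanian values:

```python
import math

# Gumbel sketch for annual temperature maxima: F(x) = exp(-exp(-(x-mu)/beta)),
# and the T-year return level x solves F(x) = 1 - 1/T.
def gumbel_cdf(x, mu, beta):
    return math.exp(-math.exp(-(x - mu) / beta))

def return_level(T, mu, beta):
    return mu - beta * math.log(-math.log(1.0 - 1.0 / T))

mu, beta = 32.0, 2.5                # illustrative location/scale, deg C
x30 = return_level(30, mu, beta)    # level exceeded on average once per 30 years
```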
Directory of Open Access Journals (Sweden)
Paola Bianucci
2015-03-01
A useful tool is proposed in this paper to assist dam managers in comparing and selecting suitable operating rules. The procedure is based on well-known multiobjective and probabilistic methodologies, which were jointly applied here to assess and compare flood control strategies in hydropower reservoirs. The procedure consists of evaluating the operating rules' performance using a simulation fed by a representative and sufficiently large flood event series. These flood events were obtained from a synthetic rainfall series stochastically generated using the RainSimV3 model coupled with a deterministic hydrological model. The performance of the assessed strategies was characterized using probabilistic variables. Finally, evaluation and comparison were conducted by analyzing objective functions which synthesize different aspects of the rules' performance. These objectives were probabilistically defined in terms of risk and expected values. To assess the applicability and flexibility of the tool, it was implemented for a hydropower dam located in Galicia (Northern Spain). This procedure allowed alternative operating rules to be derived which provide a reasonable trade-off between dam safety, flood control, operability and energy production.
BET-VH: A Probabilistic Tool for Long- and Short-Term Volcanic Hazard Assessment
Marzocchi, W.; Selva, J.; Sandri, L.
2005-12-01
The purpose of this work is to present the probabilistic code BET-VH (Bayesian Event Tree for Volcanic Hazard) for long- and short-term volcanic hazard assessment. BET-VH follows the probabilistic scheme recently published by Marzocchi et al. (2004; Quantifying probabilities of volcanic events: the example of volcanic hazard at Mt. Vesuvius, J. Geophys. Res., vol. 109, B11201, doi:10.1029/2004JB003155), and it includes fuzzy logic to minimize the effects of the choice of some particular thresholds of the model. In brief, BET-VH is based on a Bayesian approach applied to an Event Tree scheme that produces the probability estimate of any possible event in which we are interested, using all available information including theoretical models, historical and geological data, and monitoring observations. The general sequence is to estimate an a priori probability distribution based upon theoretical knowledge, and then to modify it using the data. The procedure deals with epistemic and aleatory uncertainties in a formal way, through the estimation of probability distributions at each node of the Event Tree. In order to illustrate the potential of BET-VH in managing emergencies and in land use planning, we present applications of the code to some explosive volcanoes.
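The prior-then-data sequence at a single event-tree node can be sketched with a Beta prior on a branch probability, updated by observed outcomes. The node, prior and counts below are invented for illustration, not BET-VH's actual parameterization:

```python
# One Bayesian event-tree node in the BET spirit: a Beta prior on the
# branch probability is updated with observed outcomes (toy numbers).
def beta_mean(a, b):
    return a / (a + b)

# Prior: P(eruption | unrest) ~ Beta(1, 9), mean 0.1 (hypothetical).
a_prior, b_prior = 1.0, 9.0
# Hypothetical record: 3 unrest episodes, 1 ending in eruption.
a_post, b_post = a_prior + 1, b_prior + 2

prior_mean = beta_mean(a_prior, b_prior)
post_mean = beta_mean(a_post, b_post)  # the data nudge the estimate upward
```

Keeping full Beta distributions rather than point values is what lets the method carry epistemic uncertainty through each node.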
Zhou, Ji; Fu, Bojie; Gao, Guangyao; Lü, Yihe; Wang, Shuai
2017-03-01
The stochasticity of soil erosion reflects the variability of the soil hydrological response to precipitation in a complex environment. Assessing this stochasticity is important for the conservation of soil and water resources; however, the stochasticity of erosion events in restoration vegetation types in water-limited environments has been little investigated. In this study, we constructed an event-driven framework to quantify the stochasticity of runoff and sediment generation in three typical restoration vegetation types (Armeniaca sibirica (T1), Spiraea pubescens (T2) and Artemisia copria (T3)) in closed runoff plots over five rainy seasons in the Loess Plateau of China. The results indicate that, under the same rainfall conditions, the average probabilities of runoff and sediment generation in T1 (3.8 and 1.6 %) and T3 (5.6 and 4.4 %) were lowest and highest, respectively. The binomial and Poisson probabilistic models are two effective ways to simulate the frequency distributions of the number of erosion events occurring in all restoration vegetation types. The Bayes model indicated that relatively longer-duration and stronger-intensity rainfall events respectively become the main probabilistic contributors to the stochasticity of an erosion event occurring in T1 and T3. Logistic regression modelling highlighted that higher-grade rainfall intensity and canopy structure were the two most important factors that respectively improve and restrain the probability of stochastic erosion generation in all restoration vegetation types. The Bayes, binomial, Poisson and logistic regression models constituted an integrated probabilistic assessment to systematically simulate and evaluate soil erosion stochasticity. This should prove to be an innovative and important complement to understanding soil erosion from the stochasticity viewpoint, and also provide an alternative way to assess the efficacy of ecological restoration in conserving soil and water resources in a semi-arid environment.
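The binomial/Poisson pairing used for the event-count distributions can be sketched directly; with a small per-event probability the two nearly coincide. The rainfall-event count and erosion probability below are illustrative, not the Loess Plateau estimates:

```python
import math

# Binomial vs. Poisson models for the number of erosion events per season.
def binom_pmf(k, n, p):
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    return lam**k * math.exp(-lam) / math.factorial(k)

n_rain, p_erosion = 40, 0.05      # hypothetical rainfall events, per-event probability
lam = n_rain * p_erosion          # matched Poisson rate
binom = [binom_pmf(k, n_rain, p_erosion) for k in range(6)]
poisson = [poisson_pmf(k, lam) for k in range(6)]
# With small p the two frequency distributions are nearly identical
```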
Externally Verifiable Oblivious RAM
Directory of Open Access Journals (Sweden)
Gancher Joshua
2017-04-01
We present the idea of externally verifiable oblivious RAM (ORAM). Our goal is to allow a client and server carrying out an ORAM protocol to have disputes adjudicated by a third party, allowing for the enforcement of penalties against an unreliable or malicious server. We give a security definition that guarantees protection not only against a malicious server but also against a client making false accusations. We then give modifications of the Path ORAM [15] and Ring ORAM [9] protocols that meet this security definition. These protocols both have the same asymptotic runtimes as the semi-honest original versions and require the external verifier to be involved only when the client or server deviates from the protocol. Finally, we implement externally verified ORAM, along with an automated cryptocurrency contract to use as the external verifier.
Massoud Moghaddam
1993-01-01
Two case reports of malignant external otitis in elderly diabetics, and their complications and management, are discussed here with regard to our experience at the Department of ENT, Amir Alam Hospital.
Checklists for external validity
DEFF Research Database (Denmark)
Dyrvig, Anne-Kirstine; Kidholm, Kristian; Gerke, Oke;
2014-01-01
RATIONALE, AIMS AND OBJECTIVES: The quality of the current literature on external validity varies considerably. An improved checklist with validated items on external validity would aid decision-makers in judging similarities among circumstances when transferring evidence from a study setting [...] to an implementation setting. In this paper, currently available checklists on external validity are identified, assessed and used as a basis for proposing a new improved instrument. METHOD: A systematic literature review was carried out in Pubmed, Embase and Cinahl on English-language papers without time restrictions. [...] The retrieved checklist items were assessed for (i) the methodology used in primary literature, justifying inclusion of each item; and (ii) the number of times each item appeared in checklists. RESULTS: Fifteen papers were identified, presenting a total of 21 checklists for external validity, yielding a total [...]
Novotná, Michaela
2008-01-01
This study aims to analyze the event-marketing activities of a small firm and to propose new events. The theoretical part first describes the marketing and communication mix, and then especially the planning and development of an event marketing campaign. Research data for proposing the new events were collected by survey: randomly selected customers were asked to fill in a questionnaire, and its results were integrated into the proposal for the new events. An interview was also conducted with the owner of...
Migration with fiscal externalities.
Hercowitz, Z; Pines, D
1991-11-01
"This paper analyses the distribution of a country's population among regions when migration involves fiscal externalities. The main question addressed is whether a decentralized decision making [by] regional governments can produce an optimal population distribution...or a centralized intervention is indispensable, as argued before in the literature.... It turns out that, while with costless mobility the fiscal externality is fully internalized by voluntary interregional transfers, with costly mobility, centrally coordinated transfers still remain indispensable for achieving the socially optimal allocation."
Piggins, Ashley; Salerno, Gillian
2016-01-01
It has long been understood that externalities of some kind are responsible for Sen’s (1970) theorem on the impossibility of a Paretian liberal. However, Saari and Petron (2006) show that for any social preference cycle generated by combining the weak Pareto principle and individual decisiveness, every decisive individual must suffer at least one strong negative externality. We show that this fundamental result only holds when individual preferences are strict. Building on their contribution,...
Predicting coastal cliff erosion using a Bayesian probabilistic model
Hapke, C.; Plant, N.
2010-01-01
Regional coastal cliff retreat is difficult to model due to the episodic nature of failures and the along-shore variability of retreat events. There is a growing demand, however, for predictive models that can be used to forecast areas vulnerable to coastal erosion hazards. Increasingly, probabilistic models are being employed that require data sets of high temporal density to define the joint probability density function that relates forcing variables (e.g. wave conditions) and initial conditions (e.g. cliff geometry) to erosion events. In this study we use a multi-parameter Bayesian network to investigate correlations between key variables that control and influence variations in cliff retreat processes. The network uses Bayesian statistical methods to estimate event probabilities using existing observations. Within this framework, we forecast the spatial distribution of cliff retreat along two stretches of cliffed coast in Southern California. The input parameters are the height and slope of the cliff, a descriptor of material strength based on the dominant cliff-forming lithology, and the long-term cliff erosion rate that represents prior behavior. The model is forced using predicted wave impact hours. Results demonstrate that the Bayesian approach is well-suited to the forward modeling of coastal cliff retreat, with the correct outcomes forecast in 70-90% of the modeled transects. The model also performs well in identifying specific locations of high cliff erosion, thus providing a foundation for hazard mapping. This approach can be employed to predict cliff erosion at time-scales ranging from storm events to the impacts of sea-level rise at the century-scale. © 2010.
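The core of such a discrete Bayesian network is estimating conditional probability tables from binned observations. The records below are toy data, not the Southern California transects, and the two-variable conditioning is a simplification of the multi-parameter network:

```python
from collections import Counter

# Estimate P(retreat | wave forcing, lithology) by counting binned cases.
records = [  # (wave_impact, lithology, retreated)
    ("high", "weak", True), ("high", "weak", True), ("high", "weak", False),
    ("high", "strong", False), ("low", "weak", False), ("low", "strong", False),
]
totals = Counter((w, lith) for w, lith, _ in records)
retreats = Counter((w, lith) for w, lith, r in records if r)

def p_retreat(wave, lith):
    """Conditional probability from raw counts (no smoothing)."""
    return retreats[(wave, lith)] / totals[(wave, lith)]
```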
Risk Management Technologies With Logic and Probabilistic Models
Solozhentsev, E D
2012-01-01
This book presents intellectual, innovative, information technologies (I3-technologies) based on logical and probabilistic (LP) risk models. The technologies presented here consider such models for structurally complex systems and processes with logical links and with random events in economics and technology. The volume describes the following components of risk management technologies: LP-calculus; classes of LP-models of risk and efficiency; procedures for different classes; special software for different classes; examples of applications; methods for the estimation of probabilities of events based on expert information. Also described are a variety of training courses in these topics. The classes of risk models treated here are: LP-modeling, LP-classification, LP-efficiency, and LP-forecasting. Particular attention is paid to LP-models of risk of failure to resolve difficult economic and technical problems. Amongst the discussed procedures of I3-technologies are the construction of LP-models,...
Advanced neutron source reactor probabilistic flow blockage assessment
Energy Technology Data Exchange (ETDEWEB)
Ramsey, C.T.
1995-08-01
The Phase I Level I Probabilistic Risk Assessment (PRA) of the conceptual design of the Advanced Neutron Source (ANS) Reactor identified core flow blockage as the most likely internal event leading to fuel damage. The flow blockage event frequency used in the original ANS PRA was based primarily on the flow blockage work done for the High Flux Isotope Reactor (HFIR) PRA. This report examines potential flow blockage scenarios and calculates an estimate of the likelihood of debris-induced fuel damage. The bulk of the report is based specifically on the conceptual design of ANS with a 93%-enriched, two-element core; insights to the impact of the proposed three-element core are examined in Sect. 5. In addition to providing a probability (uncertainty) distribution for the likelihood of core flow blockage, this ongoing effort will serve to indicate potential areas of concern to be focused on in the preliminary design for elimination or mitigation. It will also serve as a loose-parts management tool.
Probabilistic methods for identification of significant accident sequences in loop-type LMFBRs
Energy Technology Data Exchange (ETDEWEB)
Jamali, K.M.A.
1983-06-01
The aim of the Probabilistic Accident Progression Analysis (PAPA) described herein is to establish a framework for better use of the probability measure; first, as a basis for deterministic calculations, and second, as a part of a comprehensive method for risk assessment in its own right. The achievement of this goal rests on: (1) improvements in the existing approaches for acquisition and analysis of accident sequences; (2) defining a new measure of probabilistic importance that aids in the ranking of sequences of highly uncertain events; and (3) implementation of new techniques for quantification of dependent failures of similar components. The existing techniques related to the above three topics are discussed and the state of the art is reviewed. The PAPA approach is described. The techniques of PAPA are applied to a class of Protected Transients (transients in which the reactor is successfully shut down) in the Clinch River Breeder Reactor Plant (CRBRP). The results of the application of these techniques are described.
Directory of Open Access Journals (Sweden)
Liran Carmel
2010-01-01
Evolutionary binary characters are features of species or genes, indicating the absence (value zero) or presence (value one) of some property. Examples include eukaryotic gene architecture (the presence or absence of an intron in a particular locus), gene content, and morphological characters. In many studies, the acquisition of such binary characters is assumed to represent a rare evolutionary event, and consequently, their evolution is analyzed using various flavors of parsimony. However, when gain and loss of the character are not rare enough, a probabilistic analysis becomes essential. Here, we present a comprehensive probabilistic model to describe the evolution of binary characters on a bifurcating phylogenetic tree. A fast software tool, EREM, is provided, using maximum likelihood to estimate the parameters of the model and to reconstruct ancestral states (presence and absence) in internal nodes and events (gain and loss events) along branches.
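The gain/loss model underlying tools such as EREM can be illustrated with a minimal two-state continuous-time Markov chain evaluated by Felsenstein's pruning algorithm. The rates, branch lengths, and two-leaf tree below are invented for illustration and are not EREM's actual implementation:

```python
import math

def transition_matrix(gain, loss, t):
    """P[i][j] = probability of state j after time t, starting in state i,
    for a two-state (0 = absence, 1 = presence) continuous-time chain."""
    r = gain + loss
    pi0, pi1 = loss / r, gain / r
    e = math.exp(-r * t)
    return [[pi0 + pi1 * e, pi1 * (1 - e)],
            [pi0 * (1 - e), pi1 + pi0 * e]]

def likelihood(gain, loss, t_left, t_right, obs_left, obs_right):
    """Likelihood of two leaf observations under a root drawn from the
    stationary distribution (Felsenstein pruning on a two-leaf tree)."""
    r = gain + loss
    prior = [loss / r, gain / r]
    Pl = transition_matrix(gain, loss, t_left)
    Pr = transition_matrix(gain, loss, t_right)
    return sum(prior[s] * Pl[s][obs_left] * Pr[s][obs_right] for s in (0, 1))

# Example: an intron present at both leaves; rare gain, moderate loss.
L = likelihood(gain=0.1, loss=0.5, t_left=1.0, t_right=1.0,
               obs_left=1, obs_right=1)
```

Maximum-likelihood estimation then amounts to maximizing this quantity, summed over loci, with respect to the gain and loss rates.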
The Decidability Frontier for Probabilistic Automata on Infinite Words
Chatterjee, Krishnendu; Tracol, Mathieu
2011-01-01
We consider probabilistic automata on infinite words with acceptance defined by safety, reachability, Büchi, coBüchi, and limit-average conditions. We consider quantitative and qualitative decision problems. We present extensions and adaptations of proofs for probabilistic finite automata and present a complete characterization of the decidability and undecidability frontier of the quantitative and qualitative decision problems for probabilistic automata on infinite words.
Probabilistic Fracture Mechanics of Reactor Pressure Vessels with Populations of Flaws
Energy Technology Data Exchange (ETDEWEB)
Spencer, Benjamin [Idaho National Lab. (INL), Idaho Falls, ID (United States); Backman, Marie [Univ. of Tennessee, Knoxville, TN (United States); Williams, Paul [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hoffman, William [Idaho National Lab. (INL), Idaho Falls, ID (United States); Alfonsi, Andrea [Idaho National Lab. (INL), Idaho Falls, ID (United States); Dickson, Terry [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bass, B. Richard [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Klasky, Hilda [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)
2016-09-01
This report documents recent progress in developing a tool that uses the Grizzly and RAVEN codes to perform probabilistic fracture mechanics analyses of reactor pressure vessels in light water reactor nuclear power plants. The Grizzly code is being developed with the goal of creating a general tool that can be applied to study a variety of degradation mechanisms in nuclear power plant components. Because of the central role of the reactor pressure vessel (RPV) in a nuclear power plant, particular emphasis is being placed on developing capabilities to model fracture in embrittled RPVs to aid decision making related to life extension of existing plants. A typical RPV contains a large population of pre-existing flaws introduced during the manufacturing process. The use of probabilistic techniques is necessary to assess the likelihood of crack initiation at one or more of these flaws during a transient event. This report documents development and initial testing of a capability to perform probabilistic fracture mechanics of large populations of flaws in RPVs using reduced order models to compute fracture parameters. The work documented here builds on prior efforts to perform probabilistic analyses of a single flaw with uncertain parameters, as well as earlier work to develop deterministic capabilities to model the thermo-mechanical response of the RPV under transient events, and compute fracture mechanics parameters at locations of pre-defined flaws. The capabilities developed as part of this work provide a foundation for future work, which will develop a platform that provides the flexibility needed to consider scenarios that cannot be addressed with the tools used in current practice.
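The Monte Carlo loop such a probabilistic fracture mechanics tool performs can be sketched as follows. All distributions and values here (flaw density, stress, toughness) are invented placeholders, not Grizzly or FAVOR data: each vessel trial draws a flaw population, compares an applied stress intensity factor against a sampled fracture toughness, and the fraction of trials with at least one exceedance estimates the crack-initiation probability.

```python
import math, random

random.seed(42)

def poisson(lam):
    """Knuth's method: Poisson-distributed flaw count for one vessel."""
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while p > limit:
        k += 1
        p *= random.random()
    return k - 1

def vessel_initiates(mean_flaws=50, stress=300.0):
    """One trial: does any sampled flaw initiate a crack?
    (Illustrative surface-flaw K_I; stress in MPa, depth in m.)"""
    for _ in range(poisson(mean_flaws)):
        a = random.expovariate(1.0 / 0.002)                 # depth, mean 2 mm
        k_applied = 1.12 * stress * math.sqrt(math.pi * a)  # MPa*sqrt(m)
        k_ic = random.lognormvariate(math.log(60.0), 0.25)  # sampled toughness
        if k_applied > k_ic:
            return True
    return False

trials = 2000
p_init = sum(vessel_initiates() for _ in range(trials)) / trials
```

In a production code the applied stress intensity factor comes from the transient thermo-mechanical solution (or a reduced order model of it) rather than a closed-form expression.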
Initial Probabilistic Evaluation of Reactor Pressure Vessel Fracture with Grizzly and Raven
Energy Technology Data Exchange (ETDEWEB)
Spencer, Benjamin [Idaho National Lab. (INL), Idaho Falls, ID (United States); Hoffman, William [Univ. of Idaho, Moscow, ID (United States); Sen, Sonat [Idaho National Lab. (INL), Idaho Falls, ID (United States); Rabiti, Cristian [Idaho National Lab. (INL), Idaho Falls, ID (United States); Dickson, Terry [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bass, Richard [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)
2015-10-01
The Grizzly code is being developed with the goal of creating a general tool that can be applied to study a variety of degradation mechanisms in nuclear power plant components. The first application of Grizzly has been to study fracture in embrittled reactor pressure vessels (RPVs). Grizzly can be used to model the thermal/mechanical response of an RPV under transient conditions that would be observed in a pressurized thermal shock (PTS) scenario. The global response of the vessel provides boundary conditions for local models of the material in the vicinity of a flaw. Fracture domain integrals are computed to obtain stress intensity factors, which can in turn be used to assess whether a fracture would initiate at a pre-existing flaw. These capabilities have been demonstrated previously. A typical RPV is likely to contain a large population of pre-existing flaws introduced during the manufacturing process. This flaw population is characterized statistically through probability density functions of the flaw distributions. The use of probabilistic techniques is necessary to assess the likelihood of crack initiation during a transient event. This report documents initial work to perform probabilistic analysis of RPV fracture during a PTS event using a combination of the RAVEN risk analysis code and Grizzly. This work is limited in scope, considering only a single flaw with deterministic geometry, but with uncertainty introduced in the parameters that influence fracture toughness. These results are benchmarked against equivalent models run in the FAVOR code. When fully developed, the RAVEN/Grizzly methodology for modeling probabilistic fracture in RPVs will provide a general capability that can be used to consider a wider variety of vessel and flaw conditions that are difficult to consider with current tools. In addition, this will provide access to advanced probabilistic techniques provided by RAVEN, including adaptive sampling and parallelism, which can dramatically
Energy Technology Data Exchange (ETDEWEB)
Sperbeck, Silvio; Strack, Christian; Thuma, Gernot
2013-11-15
The aim of the analyses on natural hazards described in this report was to evaluate the advantages of innovative hazard assessment methods available today over the hazard assessment methods commonly applied for German nuclear power plant sites in the past. For each hazard under consideration (earthquake, flooding, and wind loads) it has been assessed whether the new methods provide additional insights that could call for their mandatory application in future site specific hazard assessments. If no additional insights are gained, the hitherto applied methods can be considered adequate according to today's standards. In the context of this work, no areas could be identified where the hazard assessment methods stipulated in German (nuclear) regulations are generally inadequate. These methods that are commonly applied in practice do not seem to be prone to significantly underestimate the site specific hazard. Nevertheless, some newer methods allow for more precise (reduction of uncertainties) and more comprehensive (consideration of additional hazard characteristics) hazard assessments. Therefore, depending on the hazard under consideration, it could be advisable to supplement future site specific hazard assessments by some additional analyses. As the methods for some of these additional analyses are not yet fully developed, further research will be necessary to enable these amendments.
Development of a Probabilistic Flood Hazard Assessment (PFHA) for the nuclear safety
Ben Daoued, Amine; Guimier, Laurent; Hamdi, Yasser; Duluc, Claire-Marie; Rebour, Vincent
2016-04-01
The purpose of this study is to lay the basis for a probabilistic flood hazard assessment (PFHA). Probabilistic assessment of external floods has become a current topic of interest to the nuclear scientific community. Probabilistic approaches complement deterministic approaches by exploring a set of scenarios and associating a probability with each of them. These approaches aim to identify all possible failure scenarios, combining their probabilities, in order to cover all possible sources of risk. They are based on the distributions of initiators and/or the variables characterizing these initiators. The PFHA can characterize, for example, the water level at a defined point of interest in the nuclear site. This probabilistic flood hazard characterization takes into account all the phenomena that can contribute to the flooding of the site. The main steps of the PFHA are: i) identification of flooding phenomena (rains, sea water level, etc.) and screening of the phenomena relevant to the nuclear site; ii) identification and probabilization of the parameters associated with the selected flooding phenomena; iii) propagation of the probabilized parameters from the source to the point of interest in the site; iv) obtaining hazard curves and aggregation of the flooding phenomena contributions at the point of interest, taking into account the uncertainties. Within this framework, the methodology of the PFHA has been developed for several flooding phenomena (rain and/or sea water level, etc.) and then implemented and tested with a simplified case study. Following the same logic, our study is still in progress to take into account other flooding phenomena and to carry out more case studies.
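The final aggregation step can be sketched for the simplified case of independent phenomena: if each flooding phenomenon contributes its own annual exceedance probability at the point of interest, the combined hazard curve follows from the complement of the joint non-exceedance. The hazard values below are purely illustrative, not taken from the study:

```python
def aggregate_exceedance(per_phenomenon_probs):
    """P(level exceeded by at least one phenomenon in a year),
    assuming the phenomena are independent."""
    p_none = 1.0
    for p in per_phenomenon_probs:
        p_none *= 1.0 - p
    return 1.0 - p_none

# Hazard curves: annual exceedance probability vs. water level [m]
# at the point of interest (illustrative values).
rain  = {1.0: 1e-2, 2.0: 1e-3, 3.0: 1e-4}
surge = {1.0: 5e-3, 2.0: 5e-4, 3.0: 5e-5}

combined = {z: aggregate_exceedance([rain[z], surge[z]]) for z in rain}
```

Dependent phenomena (e.g. rain concurrent with storm surge) require joint distributions rather than this product rule, which is part of what makes the aggregation step nontrivial.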
Probabilistic logic networks a comprehensive framework for uncertain inference
Goertzel, Ben; Goertzel, Izabela Freire; Heljakka, Ari
2008-01-01
This comprehensive book describes Probabilistic Logic Networks (PLN), a novel conceptual, mathematical and computational approach to uncertain inference. A broad scope of reasoning types are considered.
Probabilistic structural analysis algorithm development for computational efficiency
Wu, Y.-T.
1991-01-01
The PSAM (Probabilistic Structural Analysis Methods) program is developing a probabilistic structural risk assessment capability for the SSME components. An advanced probabilistic structural analysis software system, NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), is being developed as part of the PSAM effort to accurately simulate stochastic structures operating under severe random loading conditions. One of the challenges in developing the NESSUS system is the development of probabilistic algorithms that provide both efficiency and accuracy. The main probability algorithms developed and implemented in the NESSUS system are efficient but approximate in nature. Over the last six years, the algorithms have been improved significantly.
Directory of Open Access Journals (Sweden)
Friess Helmut
2009-05-01
Background: The 5-year survival of patients with resected pancreatic adenocarcinoma is still unsatisfactory. The ESPAC-1 and CONKO-001 trials proved that adjuvant chemotherapy improves 5-year survival significantly, from approximately 14% to 21%. In parallel, investigators from the Virginia Mason Clinic reported a 5-year survival rate of 55% in a phase II trial evaluating a combination of adjuvant chemotherapy, immunotherapy and external beam radiation (the CapRI scheme). Two other groups confirmed these results in phase II trials to a certain extent. However, these groups reported severe gastrointestinal toxicity (up to 93% grade 3 or 4 toxicity). In a randomized controlled phase III trial, called CapRI, 110 patients were enrolled from 2004 to 2007 in Germany and Italy to check for reproducibility. Interestingly, much less gastrointestinal toxicity was observed. However, dose reduction due to haematological side effects had to be performed in nearly all patients. First clinical results are expected by the end of 2009. Methods/Design: CapRI-2 is an open, controlled, prospective, randomized, multicentre phase II trial with three parallel arms. A de-escalation of the CapRI scheme will be tested in two different modifications. Patients in study arm A will be treated as outpatients with the complete CapRI scheme consisting of cisplatin, interferon alpha-2b and external beam radiation and three cycles of 5-fluorouracil continuous infusion. In study arm B the first de-escalation will be realised by omitting cisplatin. Next, patients in study arm C will additionally not receive external beam radiation. A total of 135 patients with pathologically confirmed R0 or R1 resected pancreatic adenocarcinoma are planned to be enrolled. The primary endpoint is the comparison of the treatment groups with respect to six-month event-free survival. An event is defined as grade 3 or grade 4 toxicity, objective tumour recurrence, or death. Discussion: The aim of this
A framework for probabilistic pluvial flood nowcasting for urban areas
Ntegeka, Victor; Murla, Damian; Wang, Lipen; Foresti, Loris; Reyniers, Maarten; Delobbe, Laurent; Van Herk, Kristine; Van Ootegem, Luc; Willems, Patrick
2016-04-01
Pluvial flood nowcasting is gaining ground not least because of the advancements in rainfall forecasting schemes. Short-term forecasts and applications have benefited from the availability of such forecasts with high resolution in space (~1km) and time (~5min). In this regard, it is vital to evaluate the potential of nowcasting products for urban inundation applications. One of the most advanced Quantitative Precipitation Forecasting (QPF) techniques is the Short-Term Ensemble Prediction System, which was originally co-developed by the UK Met Office and Australian Bureau of Meteorology. The scheme was further tuned to better estimate extreme and moderate events for the Belgian area (STEPS-BE). Against this backdrop, a probabilistic framework has been developed that consists of: (1) rainfall nowcasts; (2) sewer hydraulic model; (3) flood damage estimation; and (4) urban inundation risk mapping. STEPS-BE forecasts are provided at high resolution (1km/5min) with 20 ensemble members with a lead time of up to 2 hours using a 4 C-band radar composite as input. Forecasts' verification was performed over the cities of Leuven and Ghent and biases were found to be small. The hydraulic model consists of the 1D sewer network and an innovative 'nested' 2D surface model to model 2D urban surface inundations at high resolution. The surface components are categorized into three groups and each group is modelled using triangular meshes at different resolutions; these include streets (3.75 - 15 m2), high flood hazard areas (12.5 - 50 m2) and low flood hazard areas (75 - 300 m2). Functions describing urban flood damage and social consequences were empirically derived based on questionnaires to people in the region that were recently affected by sewer floods. Probabilistic urban flood risk maps were prepared based on spatial interpolation techniques of flood inundation. The method has been implemented and tested for the villages Oostakker and Sint-Amandsberg, which are part of the
Hoechner, Andreas; Babeyko, Andrey Y.; Zamora, Natalia
2016-06-01
Despite having been rather seismically quiescent for the last decades, the Makran subduction zone is capable of hosting destructive earthquakes and tsunami. In particular, the well-known thrust event in 1945 (Balochistan earthquake) led to about 4000 casualties. Nowadays, the coastal regions are more densely populated and vulnerable to similar events. Furthermore, some recent publications discuss rare but significantly larger events at the Makran subduction zone as possible scenarios. We analyze the instrumental and historical seismicity at the subduction plate interface and generate various synthetic earthquake catalogs spanning 300 000 years with varying magnitude-frequency relations. For every event in the catalogs we compute estimated tsunami heights and present the resulting tsunami hazard along the coasts of Pakistan, Iran and Oman in the form of probabilistic tsunami hazard curves. We show how the hazard results depend on variation of the Gutenberg-Richter parameters and especially maximum magnitude assumption.
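Generating a synthetic catalog from a magnitude-frequency relation can be sketched with inverse-CDF draws from a doubly truncated Gutenberg-Richter law. The b-value, magnitude bounds, and event rate below are placeholders, not the study's calibrated Makran parameters, and the tsunami-height computation for each event is omitted:

```python
import math, random

def sample_magnitude(b, m_min, m_max, u):
    """Inverse-CDF draw from a doubly truncated Gutenberg-Richter law,
    given a uniform variate u in [0, 1)."""
    c = 1.0 - 10.0 ** (-b * (m_max - m_min))
    return m_min - math.log10(1.0 - u * c) / b

def synthetic_catalog(years, rate_per_year, b=1.0, m_min=5.0, m_max=9.0):
    """Catalog of independent events; count = expected number
    (Poisson jitter omitted for brevity), magnitudes ~ truncated G-R."""
    n = int(rate_per_year * years)
    return [sample_magnitude(b, m_min, m_max, random.random()) for _ in range(n)]

random.seed(0)
cat = synthetic_catalog(years=300_000, rate_per_year=0.05)
```

Hazard curves then follow by computing a tsunami height for each catalog event and counting, per coastal point, how often each height threshold is exceeded per unit time.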
77 FR 37879 - Cooperative Patent Classification External User Day
2012-06-25
AGENCY: United States Patent and Trademark Office, Commerce. ACTION: Notice. SUMMARY: The United States Patent and Trademark Office (USPTO) is hosting a Cooperative Patent Classification (CPC) External User Day event at...
Environmental external effects from wind power based on the EU ExternE methodology
DEFF Research Database (Denmark)
Ibsen, Liselotte Schleisner; Nielsen, Per Sieverts
1998-01-01
The European Commission has launched a major study project, ExternE, to develop a methodology to quantify externalities. A “National Implementation Phase”, was started under the Joule II programme with the purpose of implementing the ExternE methodology in all member states. The main objective...
Probabilistic risk assessment of aircraft impact on a spent nuclear fuel dry storage
Energy Technology Data Exchange (ETDEWEB)
Almomani, Belal, E-mail: balmomani@kaist.ac.kr [Department of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, 291 Daehak-ro, Yuseong-gu, Daejeon 305-701 (Korea, Republic of); Lee, Sanghoon, E-mail: shlee1222@kmu.ac.kr [Department of Mechanical and Automotive Engineering, Keimyung University, Dalgubeol-daero 1095, Dalseo-gu, Daegu (Korea, Republic of); Jang, Dongchan, E-mail: dongchan.jang@kaist.ac.kr [Department of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, 291 Daehak-ro, Yuseong-gu, Daejeon 305-701 (Korea, Republic of); Kang, Hyun Gook, E-mail: kangh6@rpi.edu [Department of Mechanical, Aerospace and Nuclear Engineering, Rensselaer Polytechnic Institute, Troy, NY 12180 (United States)
2017-01-15
Highlights: • A new risk assessment frame is proposed for aircraft impact into an interim dry storage. • It uses event tree analysis, response-structural analysis, consequence analysis, and Monte Carlo simulation. • A case study of the proposed procedure is presented to illustrate the methodology’s application. - Abstract: This paper proposes a systematic risk evaluation framework for one of the most significant impact events on an interim dry storage facility, an aircraft crash, using a probabilistic approach. A realistic case study that includes a specific cask model and selected impact conditions is performed to demonstrate the practical applicability of the proposed framework. An event tree is constructed that defines the set of impact conditions and storage cask responses following an aircraft crash. Monte Carlo simulation is employed for the probabilistic treatment of the sources of uncertainty associated with the impact loads on the internal storage casks. The parameters whose uncertainties are treated probabilistically include the aircraft impact velocity, the compressive strength of the reinforced concrete wall, the missile shape factor, and the facility wall thickness. Failure probabilities of the impacted wall and of a single storage cask under the direct mechanical impact load caused by the aircraft crash are estimated. A finite element analysis is applied to simulate the postulated direct engine impact load onto the cask body, and a source term analysis for the associated releases of radioactive materials as well as an off-site consequence analysis are performed. Finally, conditional risk contribution calculations are represented by an event tree model. Case study results indicate that no severe risk is presented, as the radiological consequences do not exceed regulatory exposure limits to the public. This risk model can be used with any other representative detailed parameters and reference design concepts for
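The Monte Carlo treatment of the uncertain impact parameters can be sketched as below. The perforation criterion and every distribution here are hypothetical stand-ins chosen for illustration; they are not the paper's structural model or design data:

```python
import random

random.seed(1)

def wall_perforates(v, fc, thickness):
    """Hypothetical perforation criterion: required wall thickness [m]
    grows with impact velocity and shrinks with concrete strength.
    This scaling is invented for illustration, not a design formula."""
    required = 0.004 * v ** 1.5 / fc ** 0.5
    return thickness < required

def failure_probability(n=100_000, thickness=1.2):
    """Monte Carlo estimate of wall perforation probability given
    uncertain impact velocity and concrete compressive strength."""
    fails = 0
    for _ in range(n):
        v = max(0.0, random.gauss(100.0, 20.0))  # impact velocity [m/s]
        fc = random.gauss(40.0, 5.0)             # concrete strength [MPa]
        if wall_perforates(v, fc, thickness):
            fails += 1
    return fails / n

p_fail = failure_probability()
```

In the paper's framework each such conditional failure probability feeds an event tree branch, and the source term and off-site consequence analyses follow only on the failure branches.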
A probabilistic tsunami hazard assessment for Indonesia
Horspool, N.; Pranantyo, I.; Griffin, J.; Latief, H.; Natawidjaja, D. H.; Kongko, W.; Cipta, A.; Bustaman, B.; Anugrah, S. D.; Thio, H. K.
2014-11-01
Probabilistic hazard assessments are a fundamental tool for assessing the threats posed by hazards to communities and are important for underpinning evidence-based decision-making regarding risk mitigation activities. Indonesia has been the focus of intense tsunami risk mitigation efforts following the 2004 Indian Ocean tsunami, but this has been largely concentrated on the Sunda Arc with little attention to other tsunami prone areas of the country such as eastern Indonesia. We present the first nationally consistent probabilistic tsunami hazard assessment (PTHA) for Indonesia. This assessment produces time-independent forecasts of tsunami hazards at the coast using data from tsunami generated by local, regional and distant earthquake sources. The methodology is based on the established Monte Carlo approach to probabilistic seismic hazard assessment (PSHA) and has been adapted to tsunami. We account for sources of epistemic and aleatory uncertainty in the analysis through the use of logic trees and sampling probability density functions. For short return periods (100 years) the tsunami hazard is highest along the west coast of Sumatra, the south coast of Java and the north coast of Papua. For longer return periods (500-2500 years), the tsunami hazard is highest along the Sunda Arc, reflecting the larger maximum magnitudes. The annual probability of experiencing a tsunami with a height of > 0.5 m at the coast is greater than 10% for Sumatra, Java, the Sunda islands (Bali, Lombok, Flores, Sumba) and north Papua. The annual probability of experiencing a tsunami with a height of > 3.0 m, which would cause significant inundation and fatalities, is 1-10% in Sumatra, Java, Bali, Lombok and north Papua, and 0.1-1% for north Sulawesi, Seram and Flores. The results of this national-scale hazard assessment provide evidence for disaster managers to prioritise regions for risk mitigation activities and/or more detailed hazard or risk assessment.
A comparison of confluence and ample sets in probabilistic and non-probabilistic branching time
Hansen, Henri; Timmer, Mark; Massink, M.; Norman, G.; Wiklicky, H.
2014-01-01
Confluence reduction and partial order reduction by means of ample sets are two different techniques for state space reduction in both traditional and probabilistic model checking. This paper provides an extensive comparison between these two methods, and answers the question how they relate in term
Hansen, Henri; Timmer, Mark
2012-01-01
Confluence reduction and partial order reduction by means of ample sets are two different techniques for state space reduction in both traditional and probabilistic model checking. This presentation provides an extensive comparison between these two methods, answering the long-standing question of h
Maximum confidence measurements via probabilistic quantum cloning
Institute of Scientific and Technical Information of China (English)
Zhang Wen-Hai; Yu Long-Bao; Cao Zhuo-Liang; Ye Liu
2013-01-01
Probabilistic quantum cloning (PQC) cannot copy a set of linearly dependent quantum states. In this paper, we show that if incorrect copies are allowed to be produced, linearly dependent quantum states may also be cloned by the PQC. By exploiting this kind of PQC to clone a special set of three linearly dependent quantum states, we derive the upper bound of the maximum confidence measure of a set. An explicit transformation of the maximum confidence measure is presented.
Probabilistic analysis of a thermosetting pultrusion process
DEFF Research Database (Denmark)
Baran, Ismet; Tutum, Cem C.; Hattel, Jesper Henri
2016-01-01
In the present study, the effects of uncertainties in the material properties of the processing composite material and the resin kinetic parameters, as well as process parameters such as pulling speed and inlet temperature, on product quality (exit degree of cure) are investigated for a pultrusion process. A new application for the probabilistic analysis of the pultrusion process is introduced using the response surface method (RSM). The results obtained from the RSM are validated by employing Monte Carlo simulation (MCS) with the Latin hypercube sampling technique. According to the results...
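Latin hypercube sampling, used here to validate the response surface, stratifies each input range into equal-probability bins and samples every bin exactly once per variable before shuffling. A minimal sketch with invented pultrusion input ranges (the actual process parameters and distributions are the paper's, not reproduced here):

```python
import random

def latin_hypercube(n_samples, bounds, rng=random.Random(7)):
    """One stratified column per variable: each of the n_samples
    equal-width strata of [lo, hi] is sampled exactly once, then the
    column is shuffled so variables pair up randomly."""
    cols = []
    for lo, hi in bounds:
        pts = [lo + (hi - lo) * (i + rng.random()) / n_samples
               for i in range(n_samples)]
        rng.shuffle(pts)
        cols.append(pts)
    return list(zip(*cols))  # one tuple of inputs per sample

# Illustrative uncertain inputs: pulling speed [m/min], inlet temp [deg C].
design = latin_hypercube(10, [(0.2, 0.6), (120.0, 160.0)])
```

Compared with plain Monte Carlo, the stratification guarantees coverage of each input's full range even at small sample sizes, which is why it is the standard companion to response-surface validation.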
Quantum correlations support probabilistic pure state cloning
Energy Technology Data Exchange (ETDEWEB)
Roa, Luis, E-mail: lroa@udec.cl [Departamento de Física, Universidad de Concepción, Casilla 160-C, Concepción (Chile); Alid-Vaccarezza, M.; Jara-Figueroa, C. [Departamento de Física, Universidad de Concepción, Casilla 160-C, Concepción (Chile); Klimov, A.B. [Departamento de Física, Universidad de Guadalajara, Avenida Revolución 1500, 44420 Guadalajara, Jalisco (Mexico)
2014-02-01
The probabilistic scheme for making two copies of two nonorthogonal pure states requires two auxiliary systems, one for copying and one for attempting to project onto the suitable subspace. The process is performed by means of a unitary-reduction scheme which allows a nonzero success probability of cloning. The scheme becomes optimal when the probability of success is maximized. In this case, a bipartite state remains as a free degree of freedom which does not affect the probability. We find bipartite states for which the unitarity does not introduce entanglement, but does introduce quantum discord between some involved subsystems.
Probabilistic Analysis of the Quality Calculus
DEFF Research Database (Denmark)
Nielson, Hanne Riis; Nielson, Flemming
2013-01-01
We consider a fragment of the Quality Calculus, previously introduced for defensive programming of software components such that it becomes natural to plan for default behaviour in case the ideal behaviour fails due to unreliable communication. This paper develops a probabilistically based trust...... analysis supporting the Quality Calculus. It uses information about the probabilities that expected input will be absent in order to determine the trustworthiness of the data used for controlling the distributed system; the main challenge is to take account of the stochastic dependency between some
Signature recognition using neural network probabilistic
Directory of Open Access Journals (Sweden)
Heri Nurdiyanto
2016-03-01
The signature of each person is different and has unique characteristics. Thus, this paper discusses the development of a personal identification system based on a person's unique digital signature. Preprocessing uses a grayscale method, while Shannon entropy and a Probabilistic Neural Network are used for feature extraction and identification, respectively. This study uses five signature types with five signatures of each type. When the test results were compared with real data, the proposed system's accuracy was only 40%.
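A Probabilistic Neural Network classifier is essentially a Parzen-window density estimate per class: each class is scored by the average Gaussian kernel between the query and that class's training vectors, and the highest-scoring class wins. A toy sketch with invented two-dimensional feature vectors (the paper's actual Shannon-entropy features are not reproduced here):

```python
import math

def pnn_classify(x, classes, sigma=0.5):
    """Probabilistic Neural Network: score each class by the mean
    Gaussian kernel between x and that class's training vectors."""
    def kernel(a, b):
        d2 = sum((ai - bi) ** 2 for ai, bi in zip(a, b))
        return math.exp(-d2 / (2.0 * sigma ** 2))
    scores = {label: sum(kernel(x, t) for t in train) / len(train)
              for label, train in classes.items()}
    return max(scores, key=scores.get)

# Toy feature vectors for two signers (illustrative values only).
classes = {"alice": [(0.20, 0.30), (0.25, 0.35), (0.22, 0.28)],
           "bob":   [(0.80, 0.70), (0.85, 0.75), (0.78, 0.72)]}
print(pnn_classify((0.24, 0.31), classes))  # -> alice
```

The kernel width sigma is the PNN's single smoothing parameter; too small a value memorizes the training set, too large a value blurs the classes together.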
Probabilistic results for a mobile service scenario
DEFF Research Database (Denmark)
Møller, Jesper; Yiu, Man Lung
We consider the following stochastic model for a mobile service scenario. Consider a stationary Poisson process in Rd, with its points radially ordered with respect to the origin (the anchor); if d = 2, the points may correspond to locations of e.g. restaurants. A user, with a location different...... the inferred privacy region is a random set obtained by an adversary who only knows the anchor and the points received from the server, where the adversary ‘does the best' to infer the possible locations of the user. Probabilistic results related to the communication cost and the inferred privacy region...
Domain Knowledge Uncertainty and Probabilistic Parameter Constraints
Mao, Yi
2012-01-01
Incorporating domain knowledge into the modeling process is an effective way to improve learning accuracy. However, as it is provided by humans, domain knowledge can only be specified with some degree of uncertainty. We propose to explicitly model such uncertainty through probabilistic constraints over the parameter space. In contrast to hard parameter constraints, our approach is effective also when the domain knowledge is inaccurate and generally results in superior modeling accuracy. We focus on generative and conditional modeling where the parameters are assigned a Dirichlet or Gaussian prior and demonstrate the framework with experiments on both synthetic and real-world data.
Probabilistic Recovery Guarantees for Sparsely Corrupted Signals
Pope, Graeme; Studer, Christoph
2012-01-01
We consider the recovery of sparse signals subject to sparse interference, as introduced in Studer et al., IEEE Trans. IT, 2012. We present novel probabilistic recovery guarantees for this framework, covering varying degrees of knowledge of the signal and interference support, which are relevant for a large number of practical applications. Our results assume that the sparsifying dictionaries are solely characterized by coherence parameters and we require randomness only in the signal and/or interference. The obtained recovery guarantees show that one can recover sparsely corrupted signals with overwhelming probability, even if the sparsity of both the signal and interference scale (near) linearly with the number of measurements.
Probabilistic double guarantee kidnapping detection in SLAM.
Tian, Yang; Ma, Shugen
2016-01-01
For determining whether kidnapping has happened and which type of kidnapping it is while a robot performs autonomous tasks in an unknown environment, a double guarantee kidnapping detection (DGKD) method has been proposed. The good performance of DGKD in a relatively small environment has been shown. However, a limitation of DGKD was found in a large-scale environment by our recent work. In order to increase the adaptability of DGKD in a large-scale environment, an improved method called probabilistic double guarantee kidnapping detection is proposed in this paper, which combines the probabilities of the features' positions and the robot's posture. Simulation results demonstrate the validity and accuracy of the proposed method.
Probabilistic Design of Offshore Structural Systems
DEFF Research Database (Denmark)
Sørensen, John Dalsgaard
1988-01-01
Probabilistic design of structural systems is considered in this paper. The reliability is estimated using first-order reliability methods (FORM). The design problem is formulated as the optimization problem to minimize a given cost function such that the reliability of the single elements...... satisfies given requirements or such that the systems reliability satisfies a given requirement. Based on a sensitivity analysis optimization procedures to solve the optimization problems are presented. Two of these procedures solve the system reliability-based optimization problem sequentially using quasi...
Probabilistic modeling of solar power systems
Safie, Fayssal M.
1989-01-01
The author presents a probabilistic approach based on Markov chain theory to model stand-alone photovoltaic power systems and predict their long-term service performance. The major advantage of this approach is that it allows designers and developers of these systems to analyze the system performance as well as the battery subsystem performance in the long run and determine the system design requirements that meet a specified service performance level. The methodology presented is illustrated by using data for a radio repeater system for the Boston, Massachusetts, location.
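The long-run service performance described above can be sketched with a toy two-state Markov chain. The states and transition probabilities below are hypothetical stand-ins, not values from the paper:

```python
# Illustrative two-state Markov chain for long-run availability.
# States: 0 = system meets the load, 1 = battery-depleted outage.
# The transition probabilities are hypothetical, not from the paper.

def stationary(p, steps=10_000):
    """Long-run state distribution, found by repeatedly applying the
    transition matrix p (rows sum to 1) to an initial distribution."""
    dist = [1.0, 0.0]
    for _ in range(steps):
        dist = [dist[0] * p[0][0] + dist[1] * p[1][0],
                dist[0] * p[0][1] + dist[1] * p[1][1]]
    return dist

P = [[0.95, 0.05],   # P(ok -> ok),     P(ok -> outage)
     [0.60, 0.40]]   # P(outage -> ok), P(outage -> outage)

pi = stationary(P)
availability = pi[0]   # long-run service availability = P(state 0)
```

For this hypothetical matrix the stationary distribution satisfies 0.05·π₀ = 0.60·π₁, giving an availability of 12/13 ≈ 0.923; a designer would tune the transition probabilities until the long-run availability meets the specified service level.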
Probabilistic remote state preparation by W states
Institute of Scientific and Technical Information of China (English)
Liu Jin-Ming; Wang Yu-Zhu
2004-01-01
In this paper we consider a scheme for probabilistic remote state preparation of a general qubit by using W states. The scheme consists of the sender, Alice, and two remote receivers Bob and Carol. Alice performs a projective measurement on her qubit in the basis spanned by the state she wants to prepare and its orthocomplement. This allows either Bob or Carol to reconstruct the state with finite success probability. It is shown that for some special ensembles of qubits, the remote state preparation scheme requires only two classical bits, unlike the case in the scheme of quantum teleportation where three classical bits are needed.
Externality or sustainability economics?
Energy Technology Data Exchange (ETDEWEB)
Bergh, Jeroen C.J.M. van den [ICREA, Barcelona (Spain); Department of Economics and Economic History and Institute for Environmental Science and Technology, Universitat Autonoma de Barcelona (Spain)
2010-09-15
In an effort to develop 'sustainability economics', Baumgaertner and Quaas (2010) neglect the central concept of environmental economics: the 'environmental externality'. This note proposes a possible connection between the concepts of environmental externality and sustainability. In addition, attention is drawn to other aspects of 'sustainability economics', namely the distinction between weak and strong sustainability, spatial sustainability and sustainable trade, distinctive sustainability policy, and the ideas of early 'sustainability economists'. I argue that both sustainability and externalities reflect a systems perspective and propose that effective sustainability solutions require that more attention is given to system feedbacks, notably other-regarding preferences and social interactions, and energy and environmental rebound. The case of climate change and policy is used to illustrate particular statements. As a conclusion, a list of 20 insights and suggestions for research is offered. (author)
Metasurface external cavity laser
Energy Technology Data Exchange (ETDEWEB)
Xu, Luyao, E-mail: luyaoxu.ee@ucla.edu; Curwen, Christopher A.; Williams, Benjamin S. [Department of Electrical Engineering, University of California, Los Angeles, California 90095 (United States); California NanoSystems Institute, University of California, Los Angeles, California 90095 (United States); Hon, Philip W. C.; Itoh, Tatsuo [Department of Electrical Engineering, University of California, Los Angeles, California 90095 (United States); Chen, Qi-Sheng [Northrop Grumman Aerospace Systems, Redondo Beach, California 90278 (United States)
2015-11-30
A vertical-external-cavity surface-emitting-laser is demonstrated in the terahertz range, which is based upon an amplifying metasurface reflector composed of a sub-wavelength array of antenna-coupled quantum-cascade sub-cavities. Lasing is possible when the metasurface reflector is placed into a low-loss external cavity such that the external cavity—not the sub-cavities—determines the beam properties. A near-Gaussian beam of 4.3° × 5.1° divergence is observed and an output power level >5 mW is achieved. The polarized response of the metasurface allows the use of a wire-grid polarizer as an output coupler that is continuously tunable.
Why is Probabilistic Seismic Hazard Analysis (PSHA) still used?
Mulargia, Francesco; Stark, Philip B.; Geller, Robert J.
2017-03-01
Even though it has never been validated by objective testing, Probabilistic Seismic Hazard Analysis (PSHA) has been widely used for almost 50 years by governments and industry in applications with lives and property hanging in the balance, such as deciding safety criteria for nuclear power plants, making official national hazard maps, developing building code requirements, and determining earthquake insurance rates. PSHA rests on assumptions now known to conflict with earthquake physics; many damaging earthquakes, including the 1988 Spitak, Armenia, event and the 2011 Tohoku, Japan, event, have occurred in regions rated relatively low-risk by PSHA hazard maps. No extant method, including PSHA, produces reliable estimates of seismic hazard. Earthquake hazard mitigation should be recognized to be inherently political, involving a tradeoff between uncertain costs and uncertain risks. Earthquake scientists, engineers, and risk managers can make important contributions to the hard problem of allocating limited resources wisely, but government officials and stakeholders must take responsibility for the risks of accidents due to natural events that exceed the adopted safety criteria. "Without an analysis of the physical causes of recorded floods, and of the whole geophysical, biophysical and anthropogenic context which circumscribes the potential for flood formation, results of flood frequency analysis as [now practiced], rather than providing information useful for coping with the flood hazard, themselves represent an additional hazard that can contribute to damages caused by floods. This danger is very real since decisions made on the basis of wrong numbers presented as good estimates of flood probabilities will generally be worse than decisions made with an awareness of an impossibility to make a good estimate and with the aid of merely qualitative information on the general flooding potential."
Blind RRT: A probabilistically complete distributed RRT
Rodriguez, Cesar
2013-11-01
Rapidly-Exploring Random Trees (RRTs) have been successful at finding feasible solutions for many types of problems. With motion planning becoming more computationally demanding, we turn to parallel motion planning for efficient solutions. Existing work on distributed RRTs has been limited by the overhead that global communication requires. A recent approach, Radial RRT, demonstrated a scalable algorithm that subdivides the space into regions to increase the computation locality. However, if an obstacle completely blocks RRT growth in a region, the planning space is not covered and is thus not probabilistically complete. We present a new algorithm, Blind RRT, which ignores obstacles during initial growth to efficiently explore the entire space. Because obstacles are ignored, free components of the tree become disconnected and fragmented. Blind RRT merges parts of the tree that have become disconnected from the root. We show how this algorithm can be applied to the Radial RRT framework allowing both scalability and effectiveness in motion planning. This method is a probabilistically complete approach to parallel RRTs. We show that our method not only scales but also overcomes the motion planning limitations that Radial RRT has in a series of difficult motion planning tasks. © 2013 IEEE.
Probabilistic Route Selection Algorithm for IP Traceback
Yim, Hong-Bin; Jung, Jae-Il
DoS (Denial of Service) and DDoS (Distributed DoS) attacks are a major threat and among the most difficult problems to solve. Moreover, it is very difficult to find the real origin of an attack, because DoS/DDoS attackers use spoofed IP addresses. To solve this problem, we propose a probabilistic route selection traceback algorithm, namely PRST, to trace the attacker's real origin. The algorithm uses two types of packets: an agent packet and a reply agent packet. The agent packet is used to find the attacker's real origin, and the reply agent packet is used to notify the victim that the agent packet has reached the edge router of the attacker. After an attack occurs, the victim generates an agent packet and sends it to its edge router. The attacker's edge router, on receiving the agent packet, generates a reply agent packet and sends it to the victim. Both packets are forwarded by routers according to a probabilistic packet forwarding table (PPFT). The PRST algorithm runs on the distributed routers, and the PPFT is stored and managed by the routers. We validate the PRST algorithm using a mathematical approach based on the Poisson distribution.
Operational General Relativity: Possibilistic, Probabilistic, and Quantum
Hardy, Lucien
2016-01-01
In this paper we develop an operational formulation of General Relativity similar in spirit to existing operational formulations of Quantum Theory. To do this we introduce an operational space (or op-space) built out of scalar fields. A point in op-space corresponds to some nominated set of scalar fields taking some given values in coincidence. We assert that op-space is the space in which we observe the world. We introduce also a notion of agency (this corresponds to the ability to set knob settings just like in Operational Quantum Theory). The effects of agents' actions should only be felt to the future so we introduce also a time direction field. Agency and time direction can be understood as effective notions. We show how to formulate General Relativity as a possibilistic theory and as a probabilistic theory. In the possibilistic case we provide a compositional framework for calculating whether some operationally described situation is possible or not. In the probabilistic version we introduce probabiliti...
Probabilistic Transcriptome Assembly and Variant Graph Genotyping
DEFF Research Database (Denmark)
Sibbesen, Jonas Andreas
The introduction of second-generation sequencing has in recent years allowed the biological community to determine the genomes and transcriptomes of organisms and individuals at an unprecedented rate. However, almost every step in the sequencing protocol introduces uncertainties in how the resulting sequencing data should be interpreted. This has over the years spurred the development of many probabilistic methods that are capable of modelling different aspects of the sequencing process. Here, I present two such methods, each developed to tackle a different problem in bioinformatics… that this approach outperforms existing state-of-the-art methods measured using sensitivity and precision on both simulated and real data. The second is a novel probabilistic method that uses exact alignment of k-mers to a set of variant graphs to provide unbiased estimates of genotypes in a population…
A probabilistic bridge safety evaluation against floods.
Liao, Kuo-Wei; Muto, Yasunori; Chen, Wei-Lun; Wu, Bang-Ho
2016-01-01
To further capture the influences of uncertain factors on river bridge safety evaluation, a probabilistic approach is adopted. Because this is a systematic and nonlinear problem, MPP-based reliability analyses are not suitable. A sampling approach such as a Monte Carlo simulation (MCS) or importance sampling is often adopted. To enhance the efficiency of the sampling approach, this study utilizes Bayesian least squares support vector machines to construct a response surface followed by an MCS, providing a more precise safety index. Although there are several factors impacting the flood-resistant reliability of a bridge, previous experiences and studies show that the reliability of the bridge itself plays a key role. Thus, the goal of this study is to analyze the system reliability of a selected bridge that includes five limit states. The random variables considered here include the water surface elevation, water velocity, local scour depth, soil property and wind load. Because the first three variables are deeply affected by river hydraulics, a probabilistic HEC-RAS-based simulation is performed to capture the uncertainties in those random variables. The accuracy and variation of our solutions are confirmed by a direct MCS to ensure the applicability of the proposed approach. The results of a numerical example indicate that the proposed approach can efficiently provide an accurate bridge safety evaluation and maintain satisfactory variation.
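The direct Monte Carlo check used above to confirm the surrogate-based results can be sketched for a single limit state. The limit state and distributions below are hypothetical stand-ins for the paper's HEC-RAS-driven variables:

```python
import random

def mc_failure_prob(n=200_000, seed=1):
    """Direct Monte Carlo estimate of the failure probability P(g < 0)
    for a hypothetical limit state g = R - S, with resistance
    R ~ N(5, 1) and load effect S ~ N(3, 1). These stand in for the
    bridge's scour/flow variables, which the paper samples via a
    probabilistic HEC-RAS simulation."""
    rng = random.Random(seed)
    failures = sum(1 for _ in range(n)
                   if rng.gauss(5, 1) - rng.gauss(3, 1) < 0)
    return failures / n

pf = mc_failure_prob()
```

For these stand-in distributions the exact answer is Φ(−2/√2) ≈ 0.0786, so the estimate should land close to that; in the paper's setting the expensive limit-state evaluation is replaced by a Bayesian LS-SVM response surface before the same sampling step.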
Probabilistic Load Flow Considering Wind Generation Uncertainty
Directory of Open Access Journals (Sweden)
R. Ramezani
2011-10-01
Renewable energy sources, such as wind, solar and hydro, are increasingly incorporated into power grids, as a direct consequence of energy and environmental issues. These types of energy are variable and intermittent by nature, and their exploitation introduces uncertainties into the power grid. Therefore, probabilistic analysis of system performance is of significant interest. This paper describes a new approach to Probabilistic Load Flow (PLF) that modifies the Two Point Estimation Method (2PEM) to overcome some drawbacks of other currently used methods. The proposed method is examined using two case studies, the IEEE 9-bus and the IEEE 57-bus test systems. To demonstrate the effectiveness of the method, a numerical comparison with the Monte Carlo Simulation (MCS) method is presented. Simulation results indicate that the proposed method significantly reduces the computational burden while maintaining a high level of accuracy. Moreover, the unsymmetrical 2PEM achieves higher accuracy than the symmetrical 2PEM at equal computing cost when the Probability Density Function (PDF) of the uncertain variables is asymmetric.
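The classic symmetric 2PEM that the paper modifies can be sketched in a few lines: the distribution of each uncertain input is collapsed to two points, μ ± σ, each with weight 1/2, and the output statistics are built from just those two function evaluations. The example function is illustrative, not from the paper:

```python
import math

def two_point_estimate(g, mu, sigma):
    """Symmetric two-point estimate (2PEM) of the mean and variance of
    Y = g(X): evaluate g at mu + sigma and mu - sigma, weight 1/2 each.
    This is a sketch of the classic scheme; the paper's unsymmetrical
    variant moves the points and weights according to the skewness of X."""
    y1, y2 = g(mu + sigma), g(mu - sigma)
    mean = 0.5 * y1 + 0.5 * y2
    var = 0.5 * y1 ** 2 + 0.5 * y2 ** 2 - mean ** 2
    return mean, var

# Illustrative: Y = exp(X) with X having mu = 0, sigma = 1.
# The 2PEM mean is cosh(1) ~ 1.543 (exact lognormal mean: e^0.5 ~ 1.649),
# obtained from only two evaluations of g instead of thousands of MCS runs.
m, v = two_point_estimate(math.exp, 0.0, 1.0)
```

In a PLF setting, g would be the power-flow solver mapping an uncertain injection to a bus voltage or line flow, which is exactly where replacing an MCS by two deterministic load-flow runs per variable pays off.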
Probabilistic adaptation in changing microbial environments
Directory of Open Access Journals (Sweden)
Yarden Katz
2016-12-01
Microbes growing in animal host environments face fluctuations that have elements of both randomness and predictability. In the mammalian gut, fluctuations in nutrient levels and other physiological parameters are structured by the host's behavior, diet, health and microbiota composition. Microbial cells that can anticipate environmental fluctuations by exploiting this structure would likely gain a fitness advantage (by adapting their internal state in advance). We propose that the problem of adaptive growth in structured changing environments, such as the gut, can be viewed as probabilistic inference. We analyze environments that are "meta-changing": where there are changes in the way the environment fluctuates, governed by a mechanism unobservable to cells. We develop a dynamic Bayesian model of these environments and show that a real-time inference algorithm (particle filtering) for this model can be used as a microbial growth strategy implementable in molecular circuits. The growth strategy suggested by our model outperforms heuristic strategies, and points to a class of algorithms that could support real-time probabilistic inference in natural or synthetic cellular circuits.
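The particle-filtering idea can be sketched with a minimal bootstrap filter tracking a hidden two-regime environment from noisy observations. The model below (switch rate, noise level, observation sequence) is illustrative, not the paper's gut model:

```python
import math
import random

def particle_filter(obs, n=500, switch=0.05, noise=0.5, seed=0):
    """Minimal bootstrap particle filter for a two-regime environment:
    a hidden state s in {0, 1} flips with probability `switch` per step,
    and each observation is s plus Gaussian noise. Returns the filtered
    estimate of P(s = 1) after each observation. All parameters are
    illustrative stand-ins for the paper's meta-changing environment."""
    rng = random.Random(seed)
    particles = [rng.randrange(2) for _ in range(n)]
    estimates = []
    for y in obs:
        # propagate: each particle may switch regime
        particles = [1 - s if rng.random() < switch else s
                     for s in particles]
        # weight each particle by the Gaussian observation likelihood
        weights = [math.exp(-((y - s) / noise) ** 2 / 2)
                   for s in particles]
        # resample proportionally to the weights (bootstrap step)
        particles = rng.choices(particles, weights=weights, k=n)
        estimates.append(sum(particles) / n)
    return estimates

# Observations first hover near regime 1, then drop toward regime 0;
# the filtered P(s = 1) should track that change.
est = particle_filter([1.1, 0.9, 1.0, 0.1, 0.0, -0.1])
```

The paper's point is that this propagate/weight/resample loop is simple enough to be approximated by molecular circuitry, letting a cell maintain a belief over hidden environmental regimes in real time.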
Probabilistic Sequence Learning in Mild Cognitive Impairment
Directory of Open Access Journals (Sweden)
Dezso eNemeth
2013-07-01
Mild Cognitive Impairment (MCI) causes slight but noticeable disruption in cognitive systems, primarily executive and memory functions. However, it is not clear if the development of sequence learning is affected by an impaired cognitive system and, if so, how. The goal of our study was to investigate the development of probabilistic sequence learning, from initial acquisition to consolidation, in MCI and healthy elderly control groups. We used the Alternating Serial Reaction Time task (ASRT) to measure probabilistic sequence learning. Individuals with MCI showed weaker learning performance than the healthy elderly group. However, using the reaction times only from the second half of each learning block, after the reactivation phase, we found intact learning in MCI. Based on the assumption that the first part of each learning block is related to reactivation/recall processes, we suggest that these processes are affected in MCI. The 24-hour offline period showed no effect on sequence-specific learning in either group but did on general skill learning: the healthy elderly group showed offline improvement in general reaction times while individuals with MCI did not. Our findings deepen our understanding of the underlying mechanisms and time course of sequence acquisition and consolidation.
Recent advances in probabilistic species pool delineations
Directory of Open Access Journals (Sweden)
Dirk Nikolaus Karger
2016-07-01
A species pool is the set of species that could potentially colonize and establish within a community. It has been a commonly used concept in biogeography since the early days of MacArthur and Wilson's work on island biogeography. Despite its simple and appealing definition, an operational application of species pools is bundled with a multitude of problems, which have often resulted in arbitrary decisions and workarounds when defining species pools. Two recently published papers address the operational problems of species pool delineations, and show ways of delineating them in a probabilistic fashion. In both papers, species pools were delineated using a process-based, mechanistic approach, which opens the door for a multitude of new applications in biogeography. Such applications include detecting the hidden signature of biotic interactions, disentangling the geographical structure of community assembly processes, and incorporating a temporal extent into species pools. Although similar in their conclusions, both 'probabilistic approaches' differ in their implementation and definitions. Here I give a brief overview of the differences and similarities of both approaches, and identify the challenges and advantages in their application.
Modelling structured data with Probabilistic Graphical Models
Forbes, F.
2016-05-01
Most clustering and classification methods are based on the assumption that the objects to be clustered are independent. However, in more and more modern applications, data are structured in a way that makes this assumption not realistic and potentially misleading. A typical example that can be viewed as a clustering task is image segmentation where the objects are the pixels on a regular grid and depend on neighbouring pixels on this grid. Also, when data are geographically located, it is of interest to cluster data with an underlying dependence structure accounting for some spatial localisation. These spatial interactions can be naturally encoded via a graph not necessarily regular as a grid. Data sets can then be modelled via Markov random fields and mixture models (e.g. the so-called MRF and Hidden MRF). More generally, probabilistic graphical models are tools that can be used to represent and manipulate data in a structured way while modeling uncertainty. This chapter introduces the basic concepts. The two main classes of probabilistic graphical models are considered: Bayesian networks and Markov networks. The key concept of conditional independence and its link to Markov properties is presented. The main problems that can be solved with such tools are described. Some illustrations are given associated with some practical work.
Damage identification with probabilistic neural networks
Energy Technology Data Exchange (ETDEWEB)
Klenke, S.E.; Paez, T.L.
1995-12-01
This paper investigates the use of artificial neural networks (ANNs) to identify damage in mechanical systems. Two probabilistic neural networks (PNNs) are developed and used to judge whether or not damage has occurred in a specific mechanical system, based on experimental measurements. The first PNN is a classical type that casts Bayesian decision analysis into an ANN framework; it uses exemplars measured from the undamaged and damaged system to establish whether system response measurements of unknown origin come from the former class (undamaged) or the latter class (damaged). The second PNN establishes the character of the undamaged system in terms of a kernel density estimator of measures of system response; when presented with system response measures of unknown origin, it makes a probabilistic judgment whether or not the data come from the undamaged population. The physical system used to carry out the experiments is an aerospace system component, and the environment used to excite the system is a stationary random vibration. The results of damage identification experiments are presented along with conclusions rating the effectiveness of the approaches.
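The core PNN decision rule, estimating each class-conditional density with a Gaussian Parzen window over that class's exemplars and picking the class with the larger density, can be sketched as follows. The exemplar values and kernel width are illustrative, not the paper's vibration measurements:

```python
import math

def pnn_classify(x, exemplars_by_class, sigma=0.5):
    """Sketch of a probabilistic neural network decision rule: estimate
    each class-conditional density with a Gaussian Parzen window over
    that class's exemplars, then return the class whose estimated
    density at x is largest. Exemplars and sigma are hypothetical."""
    def density(exemplars):
        return sum(math.exp(-((x - e) ** 2) / (2 * sigma ** 2))
                   for e in exemplars) / len(exemplars)
    return max(exemplars_by_class,
               key=lambda c: density(exemplars_by_class[c]))

# Hypothetical 1-D response features measured from each system state.
classes = {"undamaged": [0.10, 0.20, 0.15, 0.05],
           "damaged":   [1.00, 1.20, 0.90, 1.10]}
label = pnn_classify(0.12, classes)   # falls near the undamaged exemplars
```

The second PNN in the paper corresponds to keeping only the "undamaged" density and thresholding it, which turns the two-class decision into a novelty test.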
Probabilistic Principal Component Analysis for Metabolomic Data.
LENUS (Irish Health Repository)
Nyamundanda, Gift
2010-11-23
Background: Data from metabolomic studies are typically complex and high-dimensional. Principal component analysis (PCA) is currently the most widely used statistical technique for analyzing metabolomic data. However, PCA is limited by the fact that it is not based on a statistical model. Results: Here, probabilistic principal component analysis (PPCA), which addresses some of the limitations of PCA, is reviewed and extended. A novel extension of PPCA, called probabilistic principal component and covariates analysis (PPCCA), is introduced, which provides a flexible approach to jointly model metabolomic data and additional covariate information. The use of a mixture of PPCA models for discovering the number of inherent groups in metabolomic data is demonstrated. The jackknife technique is employed to construct confidence intervals for estimated model parameters throughout. The optimal number of principal components is determined through the use of the Bayesian Information Criterion model selection tool, which is modified to address the high dimensionality of the data. Conclusions: The methods presented are illustrated through an application to metabolomic data sets. Jointly modeling metabolomic data and covariates was successfully achieved and has the potential to provide deeper insight into the underlying data structure. Examination of confidence intervals for the model parameters, such as loadings, allows for principled and clear interpretation of the underlying data structure. A software package called MetabolAnalyze, freely available through the R statistical software, has been developed to facilitate implementation of the presented methods in the metabolomics field.
Probabilistic Model for Fatigue Crack Growth in Welded Bridge Details
DEFF Research Database (Denmark)
Toft, Henrik Stensgaard; Sørensen, John Dalsgaard; Yalamas, Thierry
2013-01-01
In the present paper a probabilistic model for fatigue crack growth in welded steel details in road bridges is presented. The probabilistic model takes the influence of bending stresses in the joints into account. The bending stresses can either be introduced by e.g. misalignment or redistributio...
An ellipsoid algorithm for probabilistic robust controller design
Kanev, S.K.; de Schutter, B.; Verhaegen, M.H.G.
2003-01-01
In this paper, a new iterative approach to probabilistic robust controller design is presented, which is applicable to any robust controller/filter design problem that can be represented as an LMI feasibility problem. Recently, a probabilistic Subgradient Iteration algorithm was proposed for solving
Pre-Processing Rules for Triangulation of Probabilistic Networks
Bodlaender, H.L.; Koster, A.M.C.A.; Eijkhof, F. van den
2003-01-01
The currently most efficient algorithm for inference with a probabilistic network builds upon a triangulation of a network’s graph. In this paper, we show that pre-processing can help in finding good triangulations for probabilistic networks, that is, triangulations with a minimal maximum clique size
Pre-processing for Triangulation of Probabilistic Networks
Bodlaender, H.L.; Koster, A.M.C.A.; Eijkhof, F. van den; Gaag, L.C. van der
2001-01-01
The currently most efficient algorithm for inference with a probabilistic network builds upon a triangulation of a networks graph. In this paper, we show that pre-processing can help in finding good triangulations for probabilistic networks, that is, triangulations with a minimal maximum clique
The JCSS probabilistic model code: Experience and recent developments
Chryssanthopoulos, M.; Diamantidis, D.; Vrouwenvelder, A.C.W.M.
2003-01-01
The JCSS Probabilistic Model Code (JCSS-PMC) has been available for public use on the JCSS website (www.jcss.ethz.ch) for over two years. During this period, several examples have been worked out and new probabilistic models have been added. Since the engineering community has already been exposed t
Solving stochastic multiobjective vehicle routing problem using probabilistic metaheuristic
Directory of Open Access Journals (Sweden)
Gannouni Asmae
2017-01-01
closed form expression. This novel approach is based on combinatorial probability and can be incorporated in a multiobjective evolutionary algorithm. (ii) Provide probabilistic approaches to elitism and diversification in multiobjective evolutionary algorithms. Finally, the behavior of the resulting Probabilistic Multi-objective Evolutionary Algorithms (PrMOEAs) is empirically investigated on the multi-objective stochastic VRP problem.
Probabilistic G-Metric space and some fixed point results
Directory of Open Access Journals (Sweden)
A. R. Janfada
2013-01-01
In this note we introduce the notions of generalized probabilistic metric spaces and generalized Menger probabilistic metric spaces. After making our elementary observations and proving some basic properties of these spaces, we prove some fixed point results in these spaces.
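A concrete Menger probabilistic metric space can be sketched by inducing one from an ordinary metric: the "distance" between two points becomes the step distribution function F_{x,y}(t) = 1 if t > d(x, y), else 0, and the triangle inequality becomes the Menger inequality under a t-norm, here T = min. This construction is a standard textbook example, not one from the note itself:

```python
def F(x, y, t):
    """Distribution function of the Menger PM-space induced by the
    ordinary metric d(x, y) = |x - y| on the reals:
    F_{x,y}(t) = 1 if t > d(x, y), else 0 (the 'simple space')."""
    return 1.0 if t > abs(x - y) else 0.0

def menger_triangle_holds(x, y, z, t, s):
    """Menger triangle inequality under the t-norm T = min:
    F_{x,z}(t + s) >= min(F_{x,y}(t), F_{y,z}(s))."""
    return F(x, z, t + s) >= min(F(x, y, t), F(y, z, s))

# Spot-check the inequality on a small grid of points and arguments;
# it holds because t > |x-y| and s > |y-z| force t + s > |x-z|.
ok = all(menger_triangle_holds(x, y, z, t, s)
         for x in (0, 1, 3) for y in (0, 2) for z in (1, 4)
         for t in (0.5, 2.0, 5.0) for s in (0.5, 2.0, 5.0))
```

The generalized (G-metric) spaces of the note replace the two-point distribution F_{x,y} with a three-point one, but the same pattern of checking a t-norm inequality applies.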
The Role of Language in Building Probabilistic Thinking
Nacarato, Adair Mendes; Grando, Regina Célia
2014-01-01
This paper is based on research that investigated the development of probabilistic language and thinking by students 10-12 years old. The focus was on the adequate use of probabilistic terms in social practice. A series of tasks was developed for the investigation and completed by the students working in groups. The discussions were video recorded…
Hybrid Probabilistic Logics: Theoretical Aspects, Algorithms and Experiments
Michels, S.
2016-01-01
Steffen Michels Hybrid Probabilistic Logics: Theoretical Aspects, Algorithms and Experiments Probabilistic logics aim at combining the properties of logic, that is they provide a structured way of expressing knowledge and a mechanical way of reasoning about such knowledge, with the ability of prob
Error Immune Logic for Low-Power Probabilistic Computing
Directory of Open Access Journals (Sweden)
Bo Marr
2010-01-01
design for the maximum amount of energy savings per a given error rate. Spice simulation results using a commercially available and well-tested 0.25 μm technology are given verifying the ultra-low power, probabilistic full-adder designs. Further, close to 6X energy savings is achieved for a probabilistic full-adder over the deterministic case.
Extension of contractive maps in the Menger probabilistic metric space
Energy Technology Data Exchange (ETDEWEB)
Razani, Abdolrahman [Department of Mathematics, Faculty of Science, Imam Khomeini International University, P.O. Box 34194-288 Qazvin (Iran, Islamic Republic of)]. E-mail: razani@ipm.ir; Fouladgar, Kaveh [Stanford University, Mathematics Building 380, 450 Serra Mall, Stanford, CA 94305-2125 (United States)]. E-mail: kfouladgar@yahoo.com
2007-12-15
In this article, the topological properties of the Menger probabilistic metric spaces and the mappings between these spaces are studied. In addition, contractive and k-contractive mappings are introduced. As an application, a new fixed point theorem in a chainable Menger probabilistic metric space is proved.
Understanding Probabilistic Thinking: The Legacy of Efraim Fischbein.
Greer, Brian
2001-01-01
Honors the contribution of Efraim Fischbein to the study and analysis of probabilistic thinking. Summarizes Fischbein's early work, then focuses on the role of intuition in mathematical and scientific thinking; the development of probabilistic thinking; and the influence of instruction on that development. (Author/MM)
Győri, Erzsébet; Gráczer, Zoltán; Tóth, László; Bán, Zoltán; Horváth, Tibor
2017-04-01
Liquefaction potential evaluations are generally made to assess the hazard from specific scenario earthquakes. These evaluations may estimate the potential in a binary fashion (yes/no), define a factor of safety, or predict the probability of liquefaction given a scenario event. Usually the level of ground shaking is obtained from the results of PSHA; although it is determined probabilistically, a single level of ground shaking is selected and used within the liquefaction potential evaluation. In contrast, fully probabilistic liquefaction potential assessment methods provide a complete picture of the liquefaction hazard, namely by taking into account the joint probability distribution of PGA and magnitude over earthquake scenarios, both of which are key inputs to the stress-based simplified methods. Kramer and Mayfield (2007) developed a fully probabilistic liquefaction potential evaluation method within a performance-based earthquake engineering (PBEE) framework. The results of the procedure are a direct estimate of the return period of liquefaction and liquefaction hazard curves as a function of depth. The method combines the disaggregation matrices computed for different exceedance frequencies during probabilistic seismic hazard analysis with one of the recent models for the conditional probability of liquefaction. We have developed software for the assessment of performance-based liquefaction triggering based on the Kramer and Mayfield method. Originally, the SPT-based probabilistic method of Cetin et al. (2004) was built into the Kramer and Mayfield procedure to compute the conditional probability; however, there is no professional consensus about its applicability. We have therefore included not only Cetin's method but also the SPT-based procedure of Idriss and Boulanger (2012) and the CPT-based procedure of Boulanger and Idriss (2014) in our computer program. In 1956, a damaging earthquake of magnitude 5.6 occurred in Dunaharaszti, Hungary. Its epicenter was located
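The performance-based combination at the heart of the Kramer and Mayfield approach can be sketched as a sum over hazard disaggregation bins: the mean annual rate of liquefaction is the bin's annual rate times the conditional probability of liquefaction for that bin, summed over all (PGA, magnitude) bins. The bin rates and conditional probabilities below are hypothetical numbers, not output of any of the cited procedures:

```python
def liquefaction_return_period(rates, p_liq):
    """Performance-based combination in the spirit of Kramer and
    Mayfield (2007): mean annual rate of liquefaction =
    sum over (PGA, M) bins of annual_rate(bin) * P(liquefaction | bin).
    The return period is the reciprocal of that rate. Inputs are
    hypothetical illustration values."""
    lam = sum(rates[key] * p_liq[key] for key in rates)
    return 1.0 / lam

# Hypothetical 2x2 disaggregation: PGA in {0.1g, 0.3g}, M in {5.5, 6.5}.
rates = {(0.1, 5.5): 0.010, (0.1, 6.5): 0.004,
         (0.3, 5.5): 0.002, (0.3, 6.5): 0.001}
p_liq = {(0.1, 5.5): 0.02, (0.1, 6.5): 0.10,
         (0.3, 5.5): 0.30, (0.3, 6.5): 0.70}

T_liq = liquefaction_return_period(rates, p_liq)   # years
```

In the real procedure the conditional probabilities come from an SPT- or CPT-based model (Cetin et al. 2004; Idriss and Boulanger 2012; Boulanger and Idriss 2014) evaluated at each depth, which is what produces the liquefaction hazard curves as a function of depth.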
Stochastic Control - External Models
DEFF Research Database (Denmark)
Poulsen, Niels Kjølstad
2005-01-01
This note is devoted to control of stochastic systems described in discrete time. We are concerned with external descriptions or transfer function models, where we have a dynamic model for the input-output relation only (i.e., no direct internal information). The methods are based on LTI systems…
Productivity Change and Externalities
DEFF Research Database (Denmark)
Kravtsova, Victoria
2014-01-01
firms and the economy as a whole. The approach used in the current research accounts for different internal as well as external factors that individual firms face and evaluates the effect on changes in productivity, technology as well as the efficiency of domestic firms. The empirical analysis focuses… change in different types of firms and sectors of the economy
Multiple external root resorption.
Yusof, W Z; Ghazali, M N
1989-04-01
Presented is an unusual case of multiple external root resorption. Although the cause of this resorption was not determined, several possibilities are presented. Trauma from occlusion, periodontal and pulpal inflammation, and resorption of idiopathic origin are all discussed as possible causes.
Verification of a probabilistic flood forecasting system for an Alpine Region of northern Italy
Laiolo, P.; Gabellani, S.; Rebora, N.; Rudari, R.; Ferraris, L.; Ratto, S.; Stevenin, H.
2012-04-01
Probabilistic hydrometeorological forecasting chains are increasingly becoming an operational tool used by civil protection centres for issuing flood alerts. One of the most important requests of decision makers is to have reliable systems; for this reason, an accurate verification of their predictive performance becomes essential. The aim of this work is to validate a probabilistic flood forecasting system, Flood-PROOFS. The system has been running in real time since 2008 in an Alpine region of northern Italy, Valle d'Aosta. It is used by the regional Civil Protection service to issue warnings and by the local water company to protect its facilities. Flood-PROOFS uses as input the Quantitative Precipitation Forecast (QPF) derived from the Italian limited-area meteorological model (COSMO-I7) and forecasts issued by regional expert meteorologists. Furthermore, the system manages and uses both real-time meteorological and satellite data and real-time data on the maneuvers performed by the water company on dams and river devices. The main outputs produced by the computational chain are deterministic and probabilistic discharge forecasts at different cross sections of the considered river network. The validation of the flood prediction system has been conducted over a 25-month period using different statistical methods such as the Brier score, rank histograms and verification scores. The results highlight good performance of the system as a support system for issuing warnings, although statistics are lacking, especially for extreme discharge events.
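The Brier score mentioned among the verification methods is straightforward to compute; a minimal sketch with hypothetical forecast probabilities and observed flood/no-flood outcomes:

```python
def brier_score(probs, outcomes):
    """Mean squared difference between forecast probabilities and binary
    outcomes: 0 is a perfect probabilistic forecast, 1 the worst possible."""
    assert len(probs) == len(outcomes)
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

# Hypothetical ensemble-derived exceedance probabilities vs. observed events
forecast_probs = [0.9, 0.2, 0.7, 0.1, 0.4]
observed = [1, 0, 1, 0, 0]
print(round(brier_score(forecast_probs, observed), 3))  # 0.062
```

The score's sample-size sensitivity is exactly why the abstract notes a lack of statistics for rare extreme-discharge events: with few observed exceedances, the verification metrics have large sampling uncertainty.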
Energy Technology Data Exchange (ETDEWEB)
Grabaskas, David; Bucknor, Matthew; Brunett, Acacia; Grelle, Austin
2015-06-28
Many advanced small modular reactor designs rely on passive systems to fulfill safety functions during accident sequences. These systems depend heavily on boundary conditions to induce a motive force, meaning the system can fail to operate as intended due to deviations in boundary conditions rather than as the result of physical failures. Furthermore, passive systems may operate in intermediate or degraded modes. These factors make passive system operation difficult to characterize with a traditional probabilistic framework that only recognizes discrete operating modes and does not allow for the explicit consideration of time-dependent boundary conditions. Argonne National Laboratory has been examining various methodologies for assessing passive system reliability within a probabilistic risk assessment for a station blackout event at an advanced small modular reactor. This paper describes the most promising options: mechanistic techniques, which share qualities with conventional probabilistic methods, and simulation-based techniques, which explicitly account for time-dependent processes. The primary intention of this paper is to describe the strengths and weaknesses of each methodology and to highlight the lessons learned while applying the two techniques, while also providing high-level results. This includes the global benefits and deficiencies of the methods and the practical problems encountered during the implementation of each technique.
Directory of Open Access Journals (Sweden)
AHMER ALI
2014-10-01
The probabilistic seismic performance of a standard Korean nuclear power plant (NPP) with an idealized isolation system is investigated in the present work. A probabilistic seismic hazard analysis (PSHA) of the Wolsong site on the Korean peninsula is performed by considering peak ground acceleration (PGA) as the earthquake intensity measure. A procedure is reported for the categorization and selection of two sets of ground motions of the Tohoku earthquake, i.e. long-period and common, as Set A and Set B respectively, for the nonlinear time-history response analysis of the base-isolated NPP. Limit-state values, taken as multiples of the displacement responses of the NPP base isolation, are considered for the fragility estimation. The seismic risk of the NPP is further assessed by incorporating the rate of frequency exceedance and conditional failure probability curves. Furthermore, this framework attempts to show the unacceptable performance of the isolated NPP in terms of the probabilistic distribution and annual probability of limit states. The comparative results for long-period and common ground motions are discussed to contribute to the future safety of nuclear facilities against drastic events like Tohoku.
A note on probabilistic models over strings: the linear algebra approach.
Bouchard-Côté, Alexandre
2013-12-01
Probabilistic models over strings have played a key role in developing methods that take indels into consideration as phylogenetically informative events. There is an extensive literature on using automata and transducers on phylogenies to do inference on these probabilistic models, in which an important theoretical question is the complexity of computing the normalization of a class of string-valued graphical models. This question has been investigated using tools from combinatorics, dynamic programming, and graph theory, and has practical applications in Bayesian phylogenetics. In this work, we revisit this theoretical question from a different point of view, based on linear algebra. The main contribution is a set of results based on this linear-algebra view that facilitate the analysis and design of inference algorithms on string-valued graphical models. As an illustration, we use this method to give a new elementary proof of a known result on the complexity of inference on the "TKF91" model, a well-known probabilistic model over strings. Compared to previous work, our proof method is easier to extend to other models, since it relies on a novel weak condition, triangular transducers, which is easy to establish in practice. The linear-algebra view provides a concise way of describing transducer algorithms and their compositions, and opens the possibility of transferring fast linear algebra libraries (for example, based on GPUs), as well as low-rank matrix approximation methods, to string-valued inference problems.
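The core identity behind this linear-algebra view, that the normalization (total weight over all strings) of a simple weighted automaton equals init * (I - T)^-1 * final for a substochastic transition matrix T, can be checked numerically on a toy two-state model; all matrix entries below are illustrative:

```python
# Toy two-state weighted automaton; T[i][j] is the total weight of emitting
# one more symbol while moving from state i to state j (spectral radius < 1).
T = [[0.3, 0.2],
     [0.1, 0.4]]
init = [1.0, 0.0]
final = [0.5, 0.5]

# Closed form: sum over all string lengths k of init . T^k . final
# equals init . (I - T)^{-1} . final. Invert the 2x2 matrix (I - T) by hand.
a, b = 1 - T[0][0], -T[0][1]
c, d = -T[1][0], 1 - T[1][1]
det = a * d - b * c
inv = [[d / det, -b / det], [-c / det, a / det]]
closed = sum(init[i] * inv[i][j] * final[j] for i in range(2) for j in range(2))

# Cross-check against a truncated power series (converges geometrically).
def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(2)) for i in range(2)]

v, series = final[:], 0.0
for _ in range(60):
    series += sum(init[i] * v[i] for i in range(2))
    v = matvec(T, v)

print(abs(closed - series) < 1e-9)  # True
```

Real transducer compositions on phylogenies involve much larger (and structured) matrices, which is where fast linear-algebra libraries and low-rank approximations become relevant.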
Abstract Interpretation for Probabilistic Termination of Biological Systems
Gori, Roberta; 10.4204/EPTCS.11.9
2009-01-01
In a previous paper, the authors applied the Abstract Interpretation approach to approximating the probabilistic semantics of biological systems, modeled using the Chemical Ground Form calculus. The methodology is based on the idea of representing a set of experiments, which differ only in their initial concentrations, by abstracting the multiplicity of reagents present in a solution using intervals. In this paper, we refine the approach in order to address probabilistic termination properties. In more detail, we introduce a refinement of the abstract LTS semantics and abstract the probabilistic semantics using a variant of Interval Markov Chains. The abstract probabilistic model safely approximates a set of concrete experiments and reports conservative lower and upper bounds for probabilistic termination.
Probabilistic programming in Python using PyMC3
Directory of Open Access Journals (Sweden)
John Salvatier
2016-04-01
Probabilistic programming allows for automatic Bayesian inference on user-defined probabilistic models. Recent advances in Markov chain Monte Carlo (MCMC) sampling allow inference on increasingly complex models. This class of MCMC, known as Hamiltonian Monte Carlo, requires gradient information that is often not readily available. PyMC3 is a new open-source probabilistic programming framework written in Python that uses Theano to compute gradients via automatic differentiation, as well as to compile probabilistic programs on the fly to C for increased speed. In contrast to other probabilistic programming languages, PyMC3 allows model specification directly in Python code. The lack of a domain-specific language allows for great flexibility and direct interaction with the model. This paper is a tutorial-style introduction to this software package.
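As a rough illustration of the inference such frameworks automate, here is a hand-rolled random-walk Metropolis sampler for the mean of a normal model. This is deliberately not the PyMC3 API: PyMC3 replaces loops like this with gradient-based Hamiltonian Monte Carlo, and the data and priors below are invented for the sketch:

```python
import math, random

random.seed(42)

# Toy data: draws from a normal with unknown mean (sigma assumed known = 1)
data = [1.2, 0.8, 1.5, 0.9, 1.1, 1.3]

def log_posterior(mu):
    # weak N(0, 10^2) prior on mu plus normal log-likelihood (constants dropped)
    lp = -mu * mu / (2 * 10.0 ** 2)
    lp += sum(-(x - mu) ** 2 / 2.0 for x in data)
    return lp

# Random-walk Metropolis: accept a proposal with probability
# min(1, posterior ratio); HMC improves on this using gradient information.
mu, samples = 0.0, []
for _ in range(5000):
    prop = mu + random.gauss(0, 0.5)
    if math.log(random.random()) < log_posterior(prop) - log_posterior(mu):
        mu = prop
    samples.append(mu)

post_mean = sum(samples[1000:]) / len(samples[1000:])
print(round(post_mean, 2))  # close to the data mean (~1.13), prior is weak
```

In PyMC3 the same model is a few declarative lines inside a model context, with the sampler, tuning and gradient computation handled automatically.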
Directory of Open Access Journals (Sweden)
Andrej Robida
2004-09-01
Background. The objective of the article is to present two years of statistics on sentinel events in hospitals. Results of a survey on sentinel events and the attitudes of hospital leaders and staff are also included, and some recommendations regarding patient safety and the handling of sentinel events are given. Methods. In March 2002 the Ministry of Health introduced a voluntary reporting system for sentinel events in Slovenian hospitals. Sentinel events were analyzed according to the place of the event, its content, and root causes. To present the results of the first year, a conference for hospital directors and medical directors was organized, and a survey was conducted among the participants to gather information about their views on sentinel events. One hundred questionnaires were distributed. Results. Sentinel events: there were 14 reports of sentinel events in the first year and 7 in the second. In 4 cases reports were received only after written reminders were sent to the responsible persons; in one case no report was obtained. There were 14 deaths: 5 of these were in-hospital suicides, 6 were due to an adverse event, and 3 were unexplained. Events not leading to death were a suicide attempt, a wrong-side surgery, paraplegia after spinal anaesthesia, a fall with a femoral neck fracture, damage to the spleen during pleural space drainage, inadvertent embolization of absolute alcohol into a femoral artery, and a physical attack on a physician by a patient. Analysis of the root causes of sentinel events showed that in most cases processes were inadequate. Survey: one quarter of those surveyed did not know about the sentinel event reporting system, 16% had actual problems when reporting events, and 47% believed that there was an attempt to blame individuals. Obstacles to reporting events openly were fear of consequences, moral shame, fear of public disclosure of the names of participants in the event, and exposure in the mass media. The majority of
Initiating Events for Multi-Reactor Plant Sites
Energy Technology Data Exchange (ETDEWEB)
Muhlheim, Michael David [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Flanagan, George F. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Poore, III, Willis P. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)
2014-09-01
Inherent in the design of modular reactors is the increased likelihood of events that initiate at a single reactor affecting another reactor. Because of the increased level of interactions between reactors, it is apparent that the Probabilistic Risk Assessments (PRAs) for modular reactor designs need to specifically address the increased interactions and dependencies.
Migration to Risk Spectrum PSA Level 2 internal events
Energy Technology Data Exchange (ETDEWEB)
Crusells Girona, M.
2012-07-01
The paper is part of the development of the Level 2 internal-events PSA model. According to Spanish regulatory requirements, and especially the recent safety instruction IS-25, Level 2 models are becoming increasingly important within the probabilistic safety analyses of the Spanish nuclear power plants, thus requiring an accurate and efficient implementation of such models, together with a systematic analysis thereof.
Non-stationary condition monitoring through event alignment
DEFF Research Database (Denmark)
Pontoppidan, Niels Henrik; Larsen, Jan
2004-01-01
In this paper we apply the technique to non-stationary condition monitoring of large diesel engines based on acoustical emission sensor signals. The performance of the event alignment is analyzed in an unsupervised probabilistic detection framework based on outlier detection with either Principal Component...
Probabilistic seismic hazard estimates for two cities in Ecuador
Beauval, C.; Yepes, H.; Monelli, D.; Alvarado, A.; Audin, L.
2013-05-01
The whole territory of Ecuador is exposed to seismic hazard. Great earthquakes can occur in the subduction zone (e.g. Esmeraldas, 1906, Mw 8.8), whereas lower-magnitude but shallower and potentially more destructive earthquakes can occur in the highlands. This study focuses on the estimation of probabilistic seismic hazard for two cities: the capital Quito (~2.5 million inhabitants) in the Interandean Valley, and the city of Esmeraldas on the coast close to the subduction trench (location of the oil refineries and export facilities that are key to Ecuador's economy). The analysis relies on a seismotectonic model developed for the Ecuadorian territory and its borders (Alvarado, 2012; Yepes et al. in prep). Seismic parameters are determined using a recently published unified earthquake catalog extending over five centuries in the Cordillera region and over 110 years in the subduction zone (Beauval, Yepes, et al. 2010, 2013). Uncertainties are explored through a simple logic tree. All uncertainties identified in the process are taken into account: source zone limits, recurrence times of large earthquakes, equivalent moment magnitudes of historical events, maximum magnitudes, declustering algorithm, decisions for homogenizing magnitudes, seismic parameters, and ground-motion prediction equations. The aim is to quantify the resulting uncertainty on the hazard curves and to identify the controlling parameters. Mean hazard estimates for the PGA at 475 years reach around 0.4-0.45 g in Quito and 0.9-1.0 g in Esmeraldas.
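The core PSHA computation behind such estimates, integrating an earthquake recurrence model against a ground-motion model to build a hazard curve and reading off the 475-year PGA, can be sketched with toy numbers. The Gutenberg-Richter parameters and the stand-in GMPE below are hypothetical, not the Ecuadorian model:

```python
import math

# Minimal PSHA sketch for a single areal source (all values hypothetical).
a_rate, b, mmin, mmax = 0.5, 1.0, 5.0, 8.0   # 0.5 events/yr with M >= 5

def gmpe_ln_pga(mag, dist_km):
    """Toy ground-motion model returning ln PGA(g); stand-in for a real GMPE."""
    return -3.5 + 0.9 * mag - 1.2 * math.log(dist_km + 10.0)

def rate_exceeding(target_pga, dist_km=30.0, sigma=0.6, n=300):
    """Annual rate of PGA > target: integrate rate(M) * P(PGA > target | M)."""
    total, dm = 0.0, (mmax - mmin) / n
    beta = b * math.log(10.0)
    for i in range(n):
        m = mmin + (i + 0.5) * dm
        # truncated-exponential Gutenberg-Richter magnitude density
        dens = beta * math.exp(-beta * (m - mmin)) / (1 - math.exp(-beta * (mmax - mmin)))
        z = (math.log(target_pga) - gmpe_ln_pga(m, dist_km)) / sigma
        p_exceed = 0.5 * math.erfc(z / math.sqrt(2.0))  # lognormal GMPE scatter
        total += a_rate * dens * p_exceed * dm
    return total

# PGA with a 475-yr return period (10% in 50 yr): bisect on the hazard curve
lo_, hi_ = 0.01, 2.0
for _ in range(60):
    mid = 0.5 * (lo_ + hi_)
    if rate_exceeding(mid) > 1.0 / 475.0:
        lo_ = mid
    else:
        hi_ = mid
print(f"PGA(475 yr) ~ {lo_:.2f} g (toy model)")
```

A real study sums such integrals over many sources and GMPEs organized in a logic tree, which is how the epistemic uncertainties listed in the abstract propagate to the hazard curves.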
En-INCA: Towards an integrated probabilistic nowcasting system
Suklitsch, Martin; Stuhl, Barbora; Kann, Alexander; Bica, Benedikt
2014-05-01
INCA (Integrated Nowcasting through Comprehensive Analysis), the analysis and nowcasting system operated by ZAMG, is based on blending observations with NWP data. Its performance is extremely high in the nowcasting range. However, uncertainties can be large even in the very short term and limit its practical use. Severe weather conditions are particularly demanding, which is why quantifying uncertainties and determining the probabilities of event occurrences add value for various applications. The nowcasting ensemble system En-INCA achieves this by coupling the INCA nowcast with ALADIN-LAEF, the EPS of the limited-area model ALADIN, which has been operated successfully at ZAMG for years. In En-INCA, the nowcasting approach of INCA is blended with different EPS members in order to derive an ensemble of forecasts in the nowcasting range. In addition to NWP-based uncertainties, specific perturbations with respect to observations and to the analysis and nowcasting techniques are discussed, and the influence of learning from errors in previous nowcasts is shown. En-INCA links INCA and ALADIN-LAEF by merging the advantages of both systems: observation-based nowcasting at very high resolution on the one hand, and the uncertainty estimation of a state-of-the-art LAM-EPS on the other. Probabilistic nowcasting products can support various end users, e.g. civil protection agencies and the power industry, in optimizing their decision-making processes.
The New Algorithm for Fast Probabilistic Hypocenter Locations
Directory of Open Access Journals (Sweden)
Dębski Wojciech
2016-12-01
The spatial location of sources of seismic waves is one of the first tasks when transient waves from natural (uncontrolled) sources are analysed in many branches of physics, including seismology and oceanology, to name a few. It is well recognised that there is no single universal location algorithm which performs equally well in all situations. Source activity and its spatial variability in time, the geometry of the recording network, and the complexity and heterogeneity of the wave velocity distribution are all factors influencing the performance of location algorithms. In this paper we propose a new location algorithm which exploits the reciprocity and time-inverse invariance properties of the wave equation. Based on these symmetries and using a modern finite-difference-type eikonal solver, we have developed a new, very fast algorithm performing full probabilistic (Bayesian) source location. We illustrate the efficiency of the algorithm by performing an advanced error analysis for 1647 seismic events from the Rudna copper mine operating in southwestern Poland.
Source processes for the probabilistic assessment of tsunami hazards
Geist, Eric L.; Lynett, Patrick J.
2014-01-01
The importance of tsunami hazard assessment has increased in recent years as a result of catastrophic consequences from events such as the 2004 Indian Ocean and 2011 Japan tsunamis. In particular, probabilistic tsunami hazard assessment (PTHA) methods have been emphasized to include all possible ways a tsunami could be generated. Owing to the scarcity of tsunami observations, a computational approach is used to define the hazard. This approach includes all relevant sources that may cause a tsunami to impact a site and all quantifiable uncertainty. Although only earthquakes were initially considered for PTHA, recent efforts have also attempted to include landslide tsunami sources. Including these sources into PTHA is considerably more difficult because of a general lack of information on relating landslide area and volume to mean return period. The large variety of failure types and rheologies associated with submarine landslides translates to considerable uncertainty in determining the efficiency of tsunami generation. Resolution of these and several other outstanding problems are described that will further advance PTHA methodologies leading to a more accurate understanding of tsunami hazard.
Probabilistic hazard for seismically induced tsunamis: accuracy and feasibility of inundation maps
Lorito, S.; Selva, J.; Basili, R.; Romano, F.; Tiberti, M. M.; Piatanesi, A.
2015-01-01
Probabilistic tsunami hazard analysis (PTHA) relies on computationally demanding numerical simulations of tsunami generation, propagation, and non-linear inundation on high-resolution topo-bathymetric models. Here we focus on tsunamis generated by co-seismic sea floor displacement, that is, on Seismic PTHA (SPTHA). A very large number of tsunami simulations are typically needed to incorporate in SPTHA the full expected variability of seismic sources (the aleatory uncertainty). We propose an approach for reducing their number. To this end, we (i) introduce a simplified event tree to achieve an effective and consistent exploration of the seismic source parameter space; (ii) use the computationally inexpensive linear approximation for tsunami propagation to construct a preliminary SPTHA that calculates the probability of maximum offshore tsunami wave height (HMax) at a given target site; (iii) apply a two-stage filtering procedure to these `linear' SPTHA results to select a reduced set of sources; and (iv) calculate `non-linear' probabilistic inundation maps at the target site using only the selected sources. We find that the selection of the important sources needed for approximating probabilistic inundation maps can be made based on the offshore HMax values only. The filtering procedure is semi-automatic and can easily be repeated for any target site. We describe and test the performance of our approach with a case study in the Mediterranean that considers potential subduction earthquakes on a section of the Hellenic Arc, three target sites on the coast of eastern Sicily and one site on the coast of southern Crete. The comparison between the filtered SPTHA results and those obtained for the full set of sources indicates that our approach allows for a 75-80 per cent reduction in the number of numerical simulations needed, while preserving the accuracy of the probabilistic inundation maps to a reasonable degree.
DEFF Research Database (Denmark)
Bækgaard, Lars
2001-01-01
The purpose of this chapter is to discuss conceptual event modeling within a context of information modeling. Traditionally, information modeling has been concerned with the modeling of a universe of discourse in terms of information structures. However, most interesting universes of discourse...... are dynamic and we present a modeling approach that can be used to model such dynamics. We characterize events as both information objects and change agents (Bækgaard 1997). When viewed as information objects events are phenomena that can be observed and described. For example, borrow events in a library can...
Extremely Intense Magnetospheric Substorms : External Triggering? Preconditioning?
Tsurutani, Bruce; Echer, Ezequiel; Hajra, Rajkumar
2016-07-01
We study particularly intense substorms using a variety of near-Earth spacecraft data and ground observations. We will relate the solar-cycle dependences of the events, determine whether the supersubstorms are externally or internally triggered, and examine their relationship to other factors such as magnetospheric preconditioning. If time permits, we will explore the details of the events and whether or not they are similar to regular (Akasofu, 1964) substorms. These intense substorms are an important feature of space weather since they may be responsible for power outages.
External legitimation in international new ventures
DEFF Research Database (Denmark)
Turcan, Romeo V.
2012-01-01
This paper explores within the framework of new venture legitimation how and why international new ventures acquire external legitimacy and strive for survival in the face of critical events. Following a longitudinal multiple-case study methodology that was adopted for the purpose of theory...... threshold, and on the valuation of the venture are discussed and respective propositions are put forward to guide future research....
A probabilistic method for leak-before-break analysis of CANDU reactor pressure tubes
Energy Technology Data Exchange (ETDEWEB)
Puls, M.P.; Wilkins, B.J.S.; Rigby, G.L. [Whiteshell Labs., Pinawa (Canada)] [and others]
1997-04-01
A probabilistic code for predicting the cumulative probability of pressure tube rupture in CANDU-type reactors is described. Ruptures are assumed to result from axial crack growth by delayed hydride cracking. The BLOOM code models the major phenomena that affect crack length and critical crack length during the sequence of reactor events following the first indication of leakage. BLOOM can be used to develop unit-specific estimates of the actual probability of pressure tube rupture in operating CANDU reactors and to supplement the existing leak-before-break analysis.
Institute of Scientific and Technical Information of China (English)
Yang Zhen-Biao; Wu Huai-Zhi; Su Wan-Jun; Zhong Zhi-Rong; Zheng Shi-Biao
2007-01-01
This paper shows that, based on the single-photon JC model describing the resonant interaction of a two-level atom with a single cavity mode, an unknown atomic state and a cavity-photon superposition state can be faithfully teleported with only a single measurement. The scheme is probabilistic: its success relies on the event that the sender atom (or the medi-atom, for teleportation of the cavity field state) is detected in the higher state. The scheme contrasts with previous ones that use a maximally entangled two-particle state as the quantum channel.
Opportunities of probabilistic flood loss models
Schröter, Kai; Kreibich, Heidi; Lüdtke, Stefan; Vogel, Kristin; Merz, Bruno
2016-04-01
Traditional uni-variate damage models, such as depth-damage curves, often fail to reproduce the variability of observed flood damage. However, reliable flood damage models are a prerequisite for the practical usefulness of model results. Innovative multi-variate probabilistic modelling approaches are promising candidates to capture and quantify the uncertainty involved and thus to improve the basis for decision making. In this study we compare the predictive capability of two probabilistic modelling approaches, namely Bagging Decision Trees and Bayesian Networks, with traditional stage-damage functions. For model evaluation we use empirical damage data from computer-aided telephone interviews compiled after the floods of 2002, 2005, 2006 and 2013 in the Elbe and Danube catchments in Germany. We carry out a split-sample test by sub-setting the damage records: one sub-set is used to derive the models and the remaining records are used to evaluate the predictive performance of the model. Further, we stratify the sample according to catchments, which allows studying model performance in a spatial-transfer context. Flood damage estimation is carried out at the scale of individual buildings in terms of relative damage. The predictive performance of the models is assessed in terms of systematic deviations (mean bias), precision (mean absolute error), and sharpness and reliability of the predictions, the latter represented by the proportion of observations that fall within the 5- to 95-quantile predictive interval. The comparison of the uni-variable stage-damage function and the multi-variable model approaches emphasises the importance of quantifying predictive uncertainty. With each explanatory variable, the multi-variable model reveals an additional source of uncertainty. However, the predictive performance in terms of bias (MBE), accuracy (MAE) and reliability (HR) is clearly improved
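The evaluation metrics used in the study, mean bias error, mean absolute error, and a hit rate based on a predictive interval, can be illustrated with a toy stage-damage function; the curve, depths and observed losses below are invented for the sketch:

```python
# Toy evaluation of a stage-damage model with the metrics from the study:
# mean bias error (MBE), mean absolute error (MAE) and hit rate (HR).
def stage_damage(depth_m):
    """Square-root depth-damage curve (illustrative), relative loss in [0, 1]."""
    return min(1.0, 0.25 * depth_m ** 0.5)

depths = [0.2, 0.5, 1.0, 2.0, 3.5]
observed = [0.10, 0.15, 0.28, 0.40, 0.55]
pred = [stage_damage(d) for d in depths]

mbe = sum(p - o for p, o in zip(pred, observed)) / len(pred)
mae = sum(abs(p - o) for p, o in zip(pred, observed)) / len(pred)
# HR: share of observations inside a +/-0.15 band around the prediction,
# standing in for the 5-95 quantile interval of a probabilistic model
hr = sum(abs(p - o) <= 0.15 for p, o in zip(pred, observed)) / len(pred)
print(f"MBE={mbe:+.3f} MAE={mae:.3f} HR={hr:.1f}")
```

A probabilistic model such as a Bayesian Network would replace the fixed band with a genuine predictive interval per building, which is precisely the sharpness/reliability trade-off the abstract evaluates.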
Development of probabilistic internal dosimetry computer code
Noh, Siwan; Kwon, Tae-Eun; Lee, Jai-Ki
2017-02-01
Internal radiation dose assessment involves biokinetic models, the corresponding parameters, measured data, and many assumptions. Every component considered in the internal dose assessment has its own uncertainty, which is propagated into the intake activity and internal dose estimates. For research or scientific purposes, and for retrospective dose reconstruction for accident scenarios occurring in workplaces having a large quantity of unsealed radionuclides, such as nuclear power plants, nuclear fuel cycle facilities, and facilities in which nuclear medicine is practiced, a quantitative uncertainty assessment of the internal dose is often required. However, no calculation tools or computer codes that incorporate all the relevant processes and their corresponding uncertainties, i.e., from the measured data to the committed dose, are available. Thus, the objective of the present study is to develop an integrated probabilistic internal-dose-assessment computer code. First, the uncertainty components in internal dosimetry are identified, and quantitative uncertainty data are collected. Then, an uncertainty database is established for each component. In order to propagate these uncertainties in an internal dose assessment, a probabilistic internal-dose-assessment system that employs Bayesian and Monte Carlo methods was constructed. Based on this system, we developed a probabilistic internal-dose-assessment code in MATLAB to estimate the dose distributions from the measured data with uncertainty. Using the developed code, we calculated the internal dose distribution and statistical values (e.g. the 2.5th, 5th, median, 95th, and 97.5th percentiles) for three sample scenarios. On the basis of the distributions, we performed a sensitivity analysis to determine the influence of each component on the resulting dose, in order to identify the major component of the uncertainty in a bioassay. The results of this study can be applied to various situations. In cases of
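The Monte Carlo propagation step described here, sampling each uncertain component and reading percentiles off the resulting dose distribution, can be sketched as follows; all distributions and coefficients are hypothetical placeholders rather than ICRP biokinetic values:

```python
import math, random

random.seed(1)

# Monte Carlo sketch of probabilistic internal dosimetry: an intake is
# inferred from a bioassay measurement via a retention fraction, then
# converted to committed dose. Each component carries lognormal uncertainty.
N = 20000
doses = []
for _ in range(N):
    measured = random.lognormvariate(math.log(50.0), 0.2)    # Bq in bioassay
    m_t = random.lognormvariate(math.log(0.02), 0.3)         # retention fraction
    dose_coeff = random.lognormvariate(math.log(1e-5), 0.1)  # mSv per Bq intake
    intake = measured / m_t
    doses.append(intake * dose_coeff)

doses.sort()
pct = {p: doses[int(p / 100 * N)] for p in (2.5, 5, 50, 95, 97.5)}
print({k: round(v, 3) for k, v in pct.items()})
```

The same samples support a crude sensitivity analysis: re-running with one component's sigma set to zero shows how much of the spread between the 2.5th and 97.5th percentiles that component contributes.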
ExternE transport methodology for external cost evaluation of air pollution
DEFF Research Database (Denmark)
Jensen, S. S.; Berkowicz, R.; Brandt, J.
The report describes how the human exposure estimates based on NERI's human exposure modelling system (AirGIS) can improve the Danish data used for exposure factors in the ExternE Transport methodology. Initially, a brief description of the ExternE Transport methodology is given and it is summaris...
DEFF Research Database (Denmark)
The External Mind: an Introduction by Riccardo Fusaroli, Claudio Paolucci, pp. 3-31
The Sign of the Hand: Symbolic Practices and the Extended Mind by Massimiliano Cappuccio, Michael Wheeler, pp. 33-55
The Overextended Mind by Shaun Gallagher, pp. 57-68
The "External Mind": Semiotics, Pragmatism, Extended Mind and Distributed Cognition by Claudio Paolucci, pp. 69-96
The Social Horizon of Embodied Language and Material Symbols by Riccardo Fusaroli, pp. 97-123
Semiotics and Theories of Situated/Distributed Action and Cognition: a Dialogue and Many Intersections by Tommaso Granelli, pp. 125-167
Building Action in Public Environments with Diverse Semiotic Resources by Charles Goodwin, pp. 169-182
How Marking in Dance Constitutes Thinking with the Body by David Kirsh, pp. 183-214
Ambiguous Coordination: Collaboration in Informal Science Education Research by Ivan Rosero, Robert Lecusay, Michael Cole, pp. 215-240...
Angelino, Elaine; Mitzenmacher, Michael; Thaler, Justin
2011-01-01
Many data structures support dictionaries, also known as maps or associative arrays, which store and manage a set of key-value pairs. A multimap is a generalization that allows multiple values to be associated with the same key. For example, the inverted-file data structure used prevalently in the infrastructure supporting search engines is a type of multimap, where words are used as keys and document pointers are used as values. We study the multimap abstract data type and how it can be implemented efficiently, online, in external-memory frameworks with constant expected I/O performance. The key technique used to achieve our results is a combination of cuckoo hashing using buckets that hold multiple items with a multiqueue implementation to cope with varying numbers of values per key. Our external-memory results are for the standard two-level memory model.
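The multimap abstract data type itself can be sketched with a simple in-memory dict-of-lists; note that this stand-in captures only the interface, not the paper's external-memory bucketed cuckoo-hashing construction or its I/O guarantees:

```python
from collections import defaultdict

class Multimap:
    """Minimal multimap: each key maps to a list of associated values."""

    def __init__(self):
        self._buckets = defaultdict(list)

    def insert(self, key, value):
        self._buckets[key].append(value)

    def get(self, key):
        # return a copy so callers cannot mutate the internal bucket
        return list(self._buckets[key])

    def remove(self, key, value):
        self._buckets[key].remove(value)

# Inverted-file usage from the abstract: words map to documents containing them
index = Multimap()
for doc_id, text in enumerate(["the cat sat", "the dog ran"]):
    for word in text.split():
        index.insert(word, doc_id)
print(index.get("the"))  # [0, 1]
```

The external-memory difficulty the paper addresses is that bucket sizes vary wildly per key, which is where the multiqueue on top of bucketed cuckoo hashing comes in.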
Trending analysis of precursor events
Energy Technology Data Exchange (ETDEWEB)
Watanabe, Norio [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment
1998-01-01
The Accident Sequence Precursor (ASP) Program of the United States Nuclear Regulatory Commission (U.S. NRC) identifies and categorizes operational events at nuclear power plants in terms of their potential for core damage. The ASP analysis has been performed on a yearly basis and the results have been published in annual reports. This paper describes the trends in initiating events and dominant sequences for the 459 precursors identified in the ASP Program during the 1969-94 period and also discusses a comparison with the dominant sequences predicted in past Probabilistic Risk Assessment (PRA) studies. These trends were examined for three time periods: 1969-81, 1984-87 and 1988-94. Although different models were used in the ASP analyses for these three periods, the distributions of precursors by dominant sequence show similar trends to each other. For example, sequences involving loss of both main and auxiliary feedwater were identified in many PWR events, and those involving loss of both high- and low-pressure coolant injection were found in many BWR events. It was also found that these dominant sequences were comparable to those determined to be dominant in the predictions of past PRAs. A list of the 459 precursors identified is provided in the appendix, indicating initiating event types, unavailable systems, dominant sequences, conditional core damage probabilities, and so on. (author)
Probabilistic seismic vulnerability and risk assessment of stone masonry structures
Abo El Ezz, Ahmad
Earthquakes represent major natural hazards that regularly impact the built environment in seismic-prone areas worldwide and cause considerable social and economic losses. The high losses incurred following past destructive earthquakes promoted the need for assessment of the seismic vulnerability and risk of existing buildings. Many historic buildings in the old urban centers of Eastern Canada, such as Old Quebec City, are built of stone masonry and represent immeasurable architectural and cultural heritage. These buildings were built to resist gravity loads only and generally offer poor resistance to lateral seismic loads. Seismic vulnerability assessment of stone masonry buildings is therefore the first necessary step in developing seismic retrofitting and pre-disaster mitigation plans. The objective of this study is to develop a set of probability-based analytical tools for efficient seismic vulnerability and uncertainty analysis of stone masonry buildings. A simplified probabilistic analytical methodology for vulnerability modelling of stone masonry buildings, with systematic treatment of uncertainties throughout the modelling process, is developed in the first part of this study. Building capacity curves are developed using a simplified mechanical model. A displacement-based procedure is used to develop damage-state fragility functions in terms of spectral displacement response, based on drift thresholds of stone masonry walls. A simplified probabilistic seismic demand analysis is proposed to capture the combined uncertainty in capacity and demand on fragility functions. In the second part, a robust analytical procedure for the development of seismic-hazard-compatible fragility and vulnerability functions is proposed. The results are given by sets of seismic-hazard-compatible vulnerability functions in terms of a structure-independent intensity measure (e.g. spectral acceleration) that can be used for seismic risk analysis. The procedure is very efficient for...
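Damage-state fragility functions of the kind described above are conventionally taken as lognormal in the intensity measure. The following sketch uses that standard form with made-up median and dispersion values; the thesis' actual parameters for stone masonry walls are not reproduced here.

```python
import math

def normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def fragility(sd, sd_median, beta):
    """P(damage state reached | spectral displacement sd).

    Standard lognormal fragility form: sd_median is the median demand at
    which the damage state is reached, beta the lognormal dispersion.
    Parameter values are illustrative, not taken from the study.
    """
    return normal_cdf(math.log(sd / sd_median) / beta)

# At the median displacement the exceedance probability is 0.5 by construction.
p_at_median = fragility(2.0, sd_median=2.0, beta=0.5)
```

Increasing the dispersion `beta` flattens the curve, which is how the combined capacity-and-demand uncertainty mentioned in the abstract shows up in the fragility function.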
Bloch, Isabelle
2010-01-01
The area of information fusion has grown considerably during the last few years, leading to a rapid and impressive evolution. In such fast-moving times, it is important to take stock of the changes that have occurred. As such, this book offers an overview of the general principles and specificities of information fusion in signal and image processing, as well as covering the main numerical methods (probabilistic approaches, fuzzy sets and possibility theory, and belief functions).
Denison, Stephanie; Trikutam, Pallavi; Xu, Fei
2014-01-01
A rich tradition in developmental psychology explores physical reasoning in infancy. However, no research to date has investigated whether infants can reason about physical objects that behave probabilistically, rather than deterministically. Physical events are often quite variable, in that similar-looking objects can be placed in similar…
Reduction Mappings between Probabilistic Boolean Networks
Directory of Open Access Journals (Sweden)
Ivan Ivanov
2004-01-01
Full Text Available Probabilistic Boolean networks (PBNs) comprise a model describing a directed graph with rule-based dependences between its nodes. The rules are selected based on a given probability distribution, which provides flexibility when dealing with the uncertainty typical of genetic regulatory networks. Given the computational complexity of the model, the characterization of mappings reducing the size of a given PBN becomes a critical issue. Mappings between PBNs are also important from a theoretical point of view: they provide means for developing a better understanding of the dynamics of PBNs. This paper considers two kinds of mappings, reduction and projection, and their effect on the original probability structure of a given PBN.
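The PBN update mechanism the abstract describes can be sketched directly: at each step, every node's Boolean update rule is drawn from its own distribution over candidate rules, and the chosen rules are applied to the current state. The two-node network, rules, and probabilities below are invented for illustration, not taken from the paper.

```python
import random

# Toy two-node PBN: each node has candidate Boolean update rules with
# selection probabilities (rules and weights are illustrative only).
rules = {
    0: [(lambda s: s[1],        0.7),   # copy node 1
        (lambda s: 1 - s[1],    0.3)],  # negate node 1
    1: [(lambda s: s[0] & s[1], 0.6),   # AND of both nodes
        (lambda s: s[0] | s[1], 0.4)],  # OR of both nodes
}

def step(state, rng):
    """One synchronous PBN transition: sample a rule per node, apply all."""
    new = []
    for node, candidates in sorted(rules.items()):
        fs, ps = zip(*candidates)
        f = rng.choices(fs, weights=ps)[0]  # rule chosen per its distribution
        new.append(f(state))
    return tuple(new)

rng = random.Random(0)
state = (1, 0)
for _ in range(5):
    state = step(state, rng)
```

A reduction mapping in the paper's sense would collapse nodes of such a network while trying to preserve this transition probability structure.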
Probabilistic Catalogs for Crowded Stellar Fields
Brewer, Brendon J; Hogg, David W
2012-01-01
We introduce a probabilistic (Bayesian) method for producing catalogs from images of crowded stellar fields. The method is capable of inferring the number of sources (N) in the image and can also handle the challenges introduced by overlapping sources. The luminosity function of the stars can also be inferred even when the precise luminosity of each star is uncertain. This is in contrast with standard techniques which produce a single catalog, potentially underestimating the uncertainties in any study of the stellar population and discarding information about sources at or below the detection limit. The method is implemented using advanced Markov Chain Monte Carlo (MCMC) techniques including Reversible Jump and Nested Sampling. The computational feasibility of the method is demonstrated on simulated data where the luminosity function of the stars is a broken power-law. The parameters of the luminosity function can be recovered with moderate uncertainties. We compare the results obtained from our method with t...
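A concrete piece of the setup above is the broken power-law luminosity function. The sketch below only shows how to draw luminosities from such a law by rejection sampling (with invented slopes, break, and range); the paper's actual contribution, inferring N and the law's parameters jointly via RJMCMC and Nested Sampling, is well beyond a short example.

```python
import random

def broken_power_law(L, L_break=1.0, alpha1=1.2, alpha2=2.5):
    """Unnormalized broken power-law luminosity function (illustrative slopes)."""
    if L <= 0:
        return 0.0
    if L < L_break:
        return L ** -alpha1
    return L_break ** (alpha2 - alpha1) * L ** -alpha2  # continuous at L_break

def sample(n, L_min=0.1, L_max=10.0, rng=random.Random(1)):
    """Rejection-sample n luminosities on [L_min, L_max]."""
    fmax = broken_power_law(L_min)  # density is decreasing, so max at L_min
    out = []
    while len(out) < n:
        L = rng.uniform(L_min, L_max)
        if rng.uniform(0.0, fmax) < broken_power_law(L):
            out.append(L)
    return out

lums = sample(1000)
```

With slopes steeper than one on both sides of the break, faint stars dominate the draw, which is exactly the regime where sources near or below the detection limit carry information that a single hard-thresholded catalog would discard.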
Hierarchical probabilistic inference of cosmic shear
Schneider, Michael D; Marshall, Philip J; Dawson, William A; Meyers, Joshua; Bard, Deborah J; Lang, Dustin
2014-01-01
Point estimators for the shearing of galaxy images induced by gravitational lensing involve a complex inverse problem in the presence of noise, pixelization, and model uncertainties. We present a probabilistic forward modeling approach to gravitational lensing inference that has the potential to mitigate the biased inferences in most common point estimators and is practical for upcoming lensing surveys. The first part of our statistical framework requires specification of a likelihood function for the pixel data in an imaging survey given parameterized models for the galaxies in the images. We derive the lensing shear posterior by marginalizing over all intrinsic galaxy properties that contribute to the pixel data (i.e., not limited to galaxy ellipticities) and learn the distributions for the intrinsic galaxy properties via hierarchical inference with a suitably flexible conditional probability distribution specification. We use importance sampling to separate the modeling of small imaging areas from the glo...
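The importance-sampling step mentioned above reweights samples drawn under one (local) model so they target another (global) posterior. A minimal self-normalized importance sampling example, on a one-dimensional Gaussian toy problem unrelated to the paper's actual lensing models:

```python
import math
import random

rng = random.Random(42)

# Target: N(1.0, 0.5^2); proposal: broader N(0, 2^2). Both are illustrative.
def target_logpdf(x, mu=1.0, sigma=0.5):
    return -0.5 * ((x - mu) / sigma) ** 2 - math.log(sigma * math.sqrt(2 * math.pi))

def proposal_logpdf(x, sigma=2.0):
    return -0.5 * (x / sigma) ** 2 - math.log(sigma * math.sqrt(2 * math.pi))

xs = [rng.gauss(0.0, 2.0) for _ in range(20000)]
ws = [math.exp(target_logpdf(x) - proposal_logpdf(x)) for x in xs]

# Self-normalized estimate of the target mean: sum(w*x) / sum(w).
est = sum(w * x for w, x in zip(ws, xs)) / sum(ws)
```

The same pattern lets per-patch posterior samples be reused under a changing global (hierarchical) prior without re-running the expensive per-patch inference.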
Probabilistic approach to study the hydroformed sheet
Directory of Open Access Journals (Sweden)
Mohammed Nassraoui
2015-08-01
Full Text Available Under the leadership of the Kyoto agreements on reducing emissions of greenhouse gases, the automotive sector was forced to review its methods and production technologies in order to meet the new environmental standards. Reducing fuel consumption is an immediate way to reduce the emission of polluting gases. In this paper, a study of the formability of sheet metal subjected to the hydroforming process is proposed. Numerical results are given to validate the proposed approach. To show the influence of uncertainties in the process, we take some characteristics of the material as random and apply a probabilistic approach. The results show the effectiveness of the proposed approach.
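The probabilistic approach in the abstract, treating material characteristics as random, can be sketched as a Monte Carlo loop. The response function, distributions, and limit below are entirely hypothetical placeholders; the paper's finite-element formability model is not reproduced.

```python
import random

rng = random.Random(7)

# Toy Monte Carlo formability check: yield strength is random, and we
# estimate the probability that thinning exceeds a limit.
def max_thinning(pressure, yield_strength):
    # Hypothetical response: thinning grows with pressure, drops with strength.
    return pressure / yield_strength

n, failures = 20000, 0
for _ in range(n):
    sigma_y = rng.gauss(300.0, 15.0)        # MPa; assumed mean and scatter
    if max_thinning(75.0, sigma_y) > 0.27:  # 27 % thinning limit (assumed)
        failures += 1

p_fail = failures / n
```

Replacing the one-line response with a real hydroforming simulation turns this loop into the kind of uncertainty study the paper performs.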
Adaptive Probabilistic Flooding for Multipath Routing
Betoule, Christophe; Clavier, Remi; Rossi, Dario; Rossini, Giuseppe; Thouenon, Gilles
2011-01-01
In this work, we develop a distributed source routing algorithm for topology discovery suitable for ISP transport networks, inspired by opportunistic algorithms used in ad hoc wireless networks. We propose a plug-and-play control plane, able to find multiple paths toward the same destination, and introduce a novel algorithm, called adaptive probabilistic flooding, to achieve this goal. By keeping a small amount of state in routers taking part in the discovery process, our technique significantly limits the amount of control messages exchanged by flooding, and, at the same time, it only minimally affects the quality of the discovered multiple paths with respect to the optimal solution. Simple analytical bounds, confirmed by results gathered with extensive simulation on four realistic topologies, show our approach to be of high practical interest.
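The core mechanism can be sketched with plain (non-adaptive) probabilistic flooding on a toy topology: each node that receives a discovery message re-forwards it with probability p, trading message cost against coverage. The graph and fixed p below are invented; the paper's contribution is adapting p from router-local state.

```python
import random

rng = random.Random(3)

# Tiny illustrative topology: adjacency list of router ids.
graph = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 3], 3: [1, 2, 4], 4: [3]}

def flood(source, p):
    """Probabilistic flooding: nodes re-forward with probability p.

    Returns the set of nodes reached and the number of messages sent.
    """
    reached, frontier, messages = {source}, [source], 0
    while frontier:
        nxt = []
        for node in frontier:
            for nb in graph[node]:
                messages += 1
                if nb not in reached:
                    reached.add(nb)
                    if rng.random() < p:   # re-forward with probability p
                        nxt.append(nb)
        frontier = nxt
    return reached, messages

covered, cost = flood(0, 0.9)
```

Lowering p cuts the message count but risks leaving parts of the topology undiscovered, which is precisely the trade-off the adaptive scheme navigates.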
Probabilistic recognition of human faces from video
DEFF Research Database (Denmark)
Zhou, Saohua; Krüger, Volker; Chellappa, Rama
2003-01-01
Recognition of human faces using a gallery of still or video images and a probe set of videos is systematically investigated using a probabilistic framework. In still-to-video recognition, where the gallery consists of still images, a time series state space model is proposed to fuse temporal...... of the identity variable produces the recognition result. The model formulation is very general and it allows a variety of image representations and transformations. Experimental results using videos collected by NIST/USF and CMU illustrate the effectiveness of this approach for both still-to-video and video...... demonstrate that, due to the propagation of the identity variable over time, a degeneracy in posterior probability of the identity variable is achieved to give improved recognition. The gallery is generalized to videos in order to realize video-to-video recognition. An exemplar-based learning strategy...
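The propagation of the identity variable described above amounts to a sequential Bayesian update: per-frame likelihoods are multiplied into the posterior over gallery identities, which then concentrates ("degenerates") on one identity as frames accumulate. The likelihood values below are invented for illustration; the paper obtains them from its state-space observation model.

```python
# Sequential Bayesian update of an identity posterior over video frames.
def update(posterior, likelihoods):
    """One Bayes step: multiply in frame likelihoods, renormalize."""
    post = [p * l for p, l in zip(posterior, likelihoods)]
    z = sum(post)
    return [p / z for p in post]

posterior = [1 / 3, 1 / 3, 1 / 3]    # three gallery identities, uniform prior
frame_likelihoods = [                # made-up per-frame likelihoods
    [0.6, 0.3, 0.1],
    [0.7, 0.2, 0.1],
    [0.5, 0.4, 0.1],
]
for lik in frame_likelihoods:
    posterior = update(posterior, lik)

# The argmax of the posterior is the recognized identity.
```

Even with only mildly discriminative per-frame likelihoods, three frames are enough here to push the leading identity's posterior mass close to one.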
Probabilistic solution of relative entropy weighted control
Bierkens, Joris
2012-01-01
We show that stochastic control problems with a particular cost structure involving a relative entropy term admit a purely probabilistic solution, without the necessity of applying the dynamic programming principle. The argument is as follows. Minimization of the expectation of a random variable with respect to the underlying probability measure, penalized by relative entropy, may be solved exactly. In the case where the randomness is generated by a standard Brownian motion, this exact solution can be written as a Girsanov density. The stochastic process appearing in the Girsanov exponent has the role of control process, and the relative entropy of the change of probability measure is equal to the integral of the square of this process. An explicit expression for the control process may be obtained in terms of the Malliavin derivative of the density process. The theory is applied to the problem of minimizing the maximum of a Brownian motion (penalized by the relative entropy), leading to an explicit expressio...
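In its static form, the exact solvability invoked above is the classical Gibbs variational principle (a standard result, stated here in the abstract's notation rather than quoted from the paper): minimizing an expected cost penalized by relative entropy has a closed-form value and minimizer,

```latex
\min_{Q \ll P}\;\Bigl(\mathbb{E}_Q[f] + H(Q \mid P)\Bigr)
  \;=\; -\log \mathbb{E}_P\!\left[e^{-f}\right],
\qquad
\frac{dQ^{\ast}}{dP} \;=\; \frac{e^{-f}}{\mathbb{E}_P\!\left[e^{-f}\right]},
```

where $H(Q \mid P)$ denotes the relative entropy of $Q$ with respect to $P$. When the underlying randomness is a Brownian motion, the optimal density $dQ^{\ast}/dP$ is a Girsanov density, and the process in its exponent is the optimal control, as the abstract describes.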
Probabilistic sampling of finite renewal processes
Antunes, Nelson; 10.3150/10-BEJ321
2012-01-01
Consider a finite renewal process in the sense that interrenewal times are positive i.i.d. variables and the total number of renewals is a random variable, independent of interrenewal times. A finite point process can be obtained by probabilistic sampling of the finite renewal process, where each renewal is sampled with a fixed probability and independently of other renewals. The problem addressed in this work concerns statistical inference of the original distributions of the total number of renewals and interrenewal times from a sample of i.i.d. finite point processes obtained by sampling finite renewal processes. This problem is motivated by traffic measurements in the Internet intended to characterize flows of packets (which can be seen as finite renewal processes), where the use of packet sampling is becoming prevalent due to increasing link speeds and limited storage and processing capacities.
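The sampling model above is easy to simulate, and even a crude moment estimator recovers the mean of the original renewal count: if each renewal is kept independently with probability q, the observed count K satisfies E[K] = q E[N]. The flow-size distribution and q below are invented; the paper develops full distributional inference, not just this mean.

```python
import random

rng = random.Random(11)

# Each renewal (packet) in a flow is kept independently with probability q.
q = 0.25
true_mean_N = 8.0  # N uniform on 1..15 below, so E[N] = 8 (illustrative)

sampled_counts = []
for _ in range(50000):
    n = rng.randint(1, 15)                            # original flow size N
    k = sum(1 for _ in range(n) if rng.random() < q)  # thinned count K
    sampled_counts.append(k)

# Moment estimator from E[K] = q * E[N].
est_mean_N = (sum(sampled_counts) / len(sampled_counts)) / q
```

Recovering the full distributions of N and of the interrenewal times from such thinned observations is the harder inverse problem the paper addresses.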