Directory of Open Access Journals (Sweden)
Borna Müller
Full Text Available BACKGROUND: Bovine tuberculosis (BTB) today primarily affects developing countries. In Africa, the disease is present essentially across the whole continent; however, little accurate information on its distribution and prevalence is available. Attempts to evaluate diagnostic tests for BTB in naturally infected cattle are also scarce and mostly complicated by the absence of knowledge of the true disease status of the tested animals. However, diagnostic test evaluation in a given setting is a prerequisite for the implementation of local surveillance schemes and control measures. METHODOLOGY/PRINCIPAL FINDINGS: We subjected a slaughterhouse population of 954 Chadian cattle to single intra-dermal comparative cervical tuberculin (SICCT) testing and two recently developed fluorescence polarization assays (FPA). Using a Bayesian modeling approach we computed the receiver operating characteristic (ROC) curve of each diagnostic test, the true disease prevalence in the sampled population, and the disease status of all sampled animals, without knowledge of the true disease status of any animal. In our Chadian setting, SICCT performed better when the cut-off for positive test interpretation was lowered from >4 mm (the OIE standard cut-off) to >2 mm. Using this cut-off, SICCT showed a sensitivity and specificity of 66% and 89%, respectively. Both FPA tests showed sensitivities below 50% but specificities above 90%. The true disease prevalence was estimated at 8%. Altogether, 11% of the sampled animals showed gross visible tuberculous lesions. However, modeling of the BTB disease status of the sampled animals indicated that 72% of the suspected tuberculosis lesions detected during standard meat inspection were due to pathogens other than Mycobacterium bovis. CONCLUSIONS/SIGNIFICANCE: Our results have important implications for BTB diagnosis in a high-incidence sub-Saharan African setting and demonstrate the practicability of our Bayesian approach for
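The figures reported here combine through Bayes' rule into predictive values for an individual animal; a minimal sketch using the abstract's numbers (Se 66%, Sp 89%, prevalence 8%):

```python
# Predictive values from sensitivity, specificity and prevalence (Bayes' rule).
# Figures taken from the abstract; SICCT at the >2 mm cut-off.
se, sp, prev = 0.66, 0.89, 0.08

# P(diseased | test positive)
ppv = se * prev / (se * prev + (1 - sp) * (1 - prev))
# P(healthy | test negative)
npv = sp * (1 - prev) / (sp * (1 - prev) + (1 - se) * prev)

print(round(ppv, 3), round(npv, 3))  # → 0.343 0.968
```

At 8% prevalence, even the better cut-off leaves a positive SICCT result only about one-third predictive of true disease, which is why latent-status modelling of the kind the paper describes matters.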
Bayesian approach and application to operation safety
International Nuclear Information System (INIS)
Procaccia, H.; Suhner, M.Ch.
2003-01-01
The management of industrial risks requires the development of statistical and probabilistic analyses that use all the conveniently available information in order to compensate for the insufficient experience feedback in a domain where accidents and incidents remain too scarce to permit a classical frequency-based statistical analysis. The Bayesian decision approach is well adapted to this problem because it integrates both expertise and experience feedback. The domain of knowledge is widened, forecasting studies become possible, and decisions and remedial actions are strengthened thanks to risk-cost-benefit optimization analyses. This book presents the bases of the Bayesian approach and its concrete applications in various industrial domains. After a mathematical presentation of the industrial operation safety concepts and of the principles of the Bayesian approach, the book addresses some of the problems that can be solved with it: software reliability, controls linked with equipment warranty, dynamic updating of databases, expertise modeling and weighting, and Bayesian optimization in the domains of maintenance, quality control, testing, and design of new equipment. A synthesis of the mathematical formulae used in this approach is given in conclusion. (J.S.)
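The basic mechanism the book builds on, combining an expert prior with sparse experience feedback, can be illustrated with a conjugate Gamma-Poisson update of a failure rate (a generic sketch, not an example from the book; all numbers are hypothetical):

```python
# Conjugate Bayesian update of a failure rate (failures ~ Poisson(rate * T)).
# Expert prior: Gamma(shape=a0, rate=b0), here encoding roughly "2 failures
# per 1000 h" of prior belief. All figures are illustrative.
a0, b0 = 2.0, 1000.0            # prior mean a0 / b0 = 2e-3 failures/h
n_fail, t_obs = 1, 4000.0       # observed experience feedback: 1 failure in 4000 h

a1, b1 = a0 + n_fail, b0 + t_obs   # posterior is again Gamma
posterior_mean = a1 / b1           # updated failure-rate estimate
print(posterior_mean)              # → 0.0006
```

The scarce observed data pull the expert's 2e-3 failures/h down to 6e-4, exactly the blend of expertise and feedback the Bayesian decision approach formalizes.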
Receiver-based recovery of clipped ofdm signals for papr reduction: A bayesian approach
Ali, Anum; Al-Rabah, Abdullatif R.; Masood, Mudassir; Al-Naffouri, Tareq Y.
2014-01-01
at the receiver for information restoration. In this paper, we acknowledge the sparse nature of the clipping signal and propose a low-complexity Bayesian clipping estimation scheme. The proposed scheme utilizes a priori information about the sparsity rate
Bayesian network modeling of operator's state recognition process
International Nuclear Information System (INIS)
Hatakeyama, Naoki; Furuta, Kazuo
2000-01-01
Nowadays we face the difficult problem of establishing a good relationship between humans and machines. To solve this problem, we suppose that a machine system needs to have a model of human behavior. In this study we model the state recognition process of a PWR plant operator as an example. We use a Bayesian network as an inference engine, incorporate the knowledge hierarchy in the Bayesian network, and confirm its validity using the example of a PWR plant operator. (author)
OFDM receiver for fast time-varying channels using block-sparse Bayesian learning
DEFF Research Database (Denmark)
Barbu, Oana-Elena; Manchón, Carles Navarro; Rom, Christian
2016-01-01
characterized with a basis expansion model using a small number of terms. As a result, the channel estimation problem is posed as that of estimating a vector of complex coefficients that exhibits a block-sparse structure, which we solve with tools from block-sparse Bayesian learning. Using variational Bayesian inference, we embed the channel estimator in a receiver structure that performs iterative channel and noise precision estimation, intercarrier interference cancellation, detection and decoding. Simulation results illustrate the superior performance of the proposed receiver over state-of-the-art receivers.
Use of bayesian operations for diagnosing accidents
International Nuclear Information System (INIS)
Kang, K.M.; Jae, M.; Suh, K.Y.
2005-01-01
In complex systems, it is necessary to model a logical representation of the overall system interaction with respect to the individual subsystems. Operators follow EOPs (Emergency Operating Procedures) when the reactor trips because of an accident. However, it is very difficult to diagnose an accident and find the appropriate procedures to mitigate it within the short time available, and even when operators do reach a diagnosis, it may be wrong; the TMI accident is a well-known example of such operator error. A methodology using influence diagrams has been developed and applied to represent the dependency and uncertain behaviors of complex systems. An example of diagnosing accidents with similar symptoms, such as SLOCA and SGTR, is introduced. With the constructed model, operators can diagnose an accident at any stage of its progression. The model offers information about accidents given the observed symptoms, which may help operators diagnose correctly and rapidly and thereby reduce human error. The approach is also applicable to diagnosing other accidents with similar symptoms and to analyzing the causes of reactor trips. (authors)
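The diagnosis step described here, updating belief over accident types as symptoms arrive, reduces to Bayes' rule over hypotheses. A toy sketch with invented symptom likelihoods for two accidents with similar symptoms (the numbers and symptom names are hypothetical, not from the paper):

```python
# Toy Bayesian diagnosis over two accident hypotheses with similar symptoms.
# Likelihoods P(symptom | accident) are invented for illustration only.
priors = {"SLOCA": 0.5, "SGTR": 0.5}
likelihood = {
    "SLOCA": {"press_drop": 0.9, "sg_rad_high": 0.1},
    "SGTR":  {"press_drop": 0.8, "sg_rad_high": 0.9},
}

def update(posterior, symptom):
    # Multiply each hypothesis by the likelihood of the new symptom, renormalize.
    post = {h: p * likelihood[h][symptom] for h, p in posterior.items()}
    z = sum(post.values())
    return {h: p / z for h, p in post.items()}

post = update(priors, "press_drop")   # ambiguous: both accidents fit
post = update(post, "sg_rad_high")    # SG radiation points to a tube rupture
print(max(post, key=post.get))        # → SGTR
```

Each new symptom sharpens the posterior, which is how such a model can support a diagnosis at any stage of an accident's progression.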
Operational modal analysis modeling, Bayesian inference, uncertainty laws
Au, Siu-Kui
2017-01-01
This book presents operational modal analysis (OMA), employing a coherent and comprehensive Bayesian framework for modal identification and covering stochastic modeling, theoretical formulations, computational algorithms, and practical applications. Mathematical similarities and philosophical differences between Bayesian and classical statistical approaches to system identification are discussed, allowing their mathematical tools to be shared and their results correctly interpreted. Many chapters can be used as lecture notes for the general topic they cover beyond the OMA context. After an introductory chapter (1), Chapters 2–7 present the general theory of stochastic modeling and analysis of ambient vibrations. Readers are first introduced to the spectral analysis of deterministic time series (2) and structural dynamics (3), which do not require the use of probability concepts. The concepts and techniques in these chapters are subsequently extended to a probabilistic context in Chapter 4 (on stochastic pro...
Modeling operational risks of the nuclear industry with Bayesian networks
International Nuclear Information System (INIS)
Wieland, Patricia; Lustosa, Leonardo J.
2009-01-01
Planning a new industrial plant requires information on industrial management, regulations, site selection, the definition of initial and planned capacity, and the estimation of potential demand. However, this is far from enough to assure the success of an industrial enterprise: unexpected and extremely damaging events may occur that deviate from the original plan. The so-called operational risks lie not only in system, equipment, process or human (technical or managerial) failures. They also lie in intentional events such as fraud and sabotage, in extreme events like terrorist attacks or radiological accidents, and even in public reaction to perceived impacts on the environment or on future generations. For the nuclear industry, identifying and assessing operational risks and their various sources is a challenge. Early identification of operational risks can help in preparing contingency plans and in deciding whether to invest in or approve a project that could, at an extreme, affect the public perception of nuclear energy. A major problem in modeling operational risk losses is the lack of the internal data that are essential, for example, to apply the loss distribution approach. As an alternative, methods that incorporate qualitative and subjective information can be applied, for example fuzzy logic, neural networks, system dynamics or Bayesian networks. An advantage of applying Bayesian networks to model operational risk is the possibility of including expert opinions and variables of interest, structuring the model via causal dependencies among these variables, and specifying subjective prior and conditional probability distributions at each network node. This paper suggests a classification of operational risks in industry and discusses the benefits and obstacles of the Bayesian network approach to modeling those risks. (author)
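The modelling pattern advocated here, causal structure plus subjective conditional probabilities, can be sketched with a two-node network and inference by enumeration (the structure and all numbers below are hypothetical, of the kind an expert would elicit):

```python
# Minimal Bayesian network: Sabotage -> Outage.
# Probabilities are subjective illustrative numbers, not data.
p_sab = 0.01                      # prior P(sabotage)
p_out = {True: 0.7, False: 0.05}  # P(outage | sabotage)

# Diagnostic query by enumeration: P(sabotage | outage observed).
num = p_sab * p_out[True]
den = num + (1 - p_sab) * p_out[False]
print(round(num / den, 3))        # → 0.124
```

Observing an outage multiplies the subjective prior on sabotage by more than an order of magnitude, which is the kind of expert-opinion-driven inference the paper argues Bayesian networks make possible when loss data are lacking.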
Modeling operational risks of the nuclear industry with Bayesian networks
Energy Technology Data Exchange (ETDEWEB)
Wieland, Patricia [Pontificia Univ. Catolica do Rio de Janeiro (PUC-Rio), RJ (Brazil). Dept. de Engenharia Industrial; Comissao Nacional de Energia Nuclear (CNEN), Rio de Janeiro, RJ (Brazil)], e-mail: pwieland@cnen.gov.br; Lustosa, Leonardo J. [Pontificia Univ. Catolica do Rio de Janeiro (PUC-Rio), RJ (Brazil). Dept. de Engenharia Industrial], e-mail: ljl@puc-rio.br
2009-07-01
Planning a new industrial plant requires information on industrial management, regulations, site selection, the definition of initial and planned capacity, and the estimation of potential demand. However, this is far from enough to assure the success of an industrial enterprise: unexpected and extremely damaging events may occur that deviate from the original plan. The so-called operational risks lie not only in system, equipment, process or human (technical or managerial) failures. They also lie in intentional events such as fraud and sabotage, in extreme events like terrorist attacks or radiological accidents, and even in public reaction to perceived impacts on the environment or on future generations. For the nuclear industry, identifying and assessing operational risks and their various sources is a challenge. Early identification of operational risks can help in preparing contingency plans and in deciding whether to invest in or approve a project that could, at an extreme, affect the public perception of nuclear energy. A major problem in modeling operational risk losses is the lack of the internal data that are essential, for example, to apply the loss distribution approach. As an alternative, methods that incorporate qualitative and subjective information can be applied, for example fuzzy logic, neural networks, system dynamics or Bayesian networks. An advantage of applying Bayesian networks to model operational risk is the possibility of including expert opinions and variables of interest, structuring the model via causal dependencies among these variables, and specifying subjective prior and conditional probability distributions at each network node. This paper suggests a classification of operational risks in industry and discusses the benefits and obstacles of the Bayesian network approach to modeling those risks. (author)
Bayesian Recovery of Clipped OFDM Signals: A Receiver-based Approach
Al-Rabah, Abdullatif R.
2013-05-01
Recently, orthogonal frequency-division multiplexing (OFDM) has been adopted for high-speed wireless communications due to its robustness against multipath fading. However, one of the fundamental drawbacks of OFDM systems is their high peak-to-average power ratio (PAPR). Several techniques have been proposed for PAPR reduction, most of which require transmitter-based (pre-compensation) processing. Receiver-based alternatives, on the other hand, save power and reduce transmitter complexity. With this in mind, a possible approach is to limit the amplitude of the OFDM signal to a predetermined threshold, which is equivalent to adding a sparse clipping signal; this clipping signal is then estimated at the receiver to recover the original signal. In this work, we propose a Bayesian receiver-based low-complexity clipping-signal recovery method for PAPR reduction. The method i) effectively reduces the PAPR via a simple clipping scheme at the transmitter side, ii) uses a Bayesian recovery algorithm to reconstruct the clipping signal at the receiver side by measuring part of the subcarriers, iii) performs well in the absence of statistical information about the signal (e.g. clipping level) and the noise (e.g. noise variance), and at the same time iv) is energy-efficient due to its low complexity. Specifically, the proposed recovery technique is data-aided: it collects clipping information by measuring reliable data subcarriers, thus making full use of the spectrum for data transmission without the need for tone reservation. The study further discusses how to improve the recovery of the clipping signal by exploiting features of practical OFDM systems, i.e., oversampling and the presence of multiple receivers. Simulation results demonstrate the superiority of the proposed technique over other recovery algorithms. The overall objective is to show that the receiver-based Bayesian technique is highly
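The key observation, that amplitude clipping adds a signal which is nonzero only where peaks exceed the threshold, can be seen in a few lines. This is a generic illustration under invented parameters, not the thesis code:

```python
import cmath
import random

random.seed(0)
N = 64
# Random QPSK symbols mapped to a time-domain OFDM-like signal via an IDFT.
syms = [random.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) for _ in range(N)]
x = [sum(syms[k] * cmath.exp(2j * cmath.pi * k * n / N) for k in range(N)) / N
     for n in range(N)]

thr = 0.25  # clipping threshold (illustrative)
# Amplitude clipping preserves phase; the clipping signal c = xc - x is
# nonzero only at the few samples whose amplitude exceeded the threshold.
xc = [s if abs(s) <= thr else thr * s / abs(s) for s in x]
c = [a - b for a, b in zip(xc, x)]

n_clipped = sum(abs(v) > 1e-9 for v in c)
print(n_clipped, "of", N, "samples clipped")  # c is sparse
```

It is this sparsity, together with the undistorted phase of the clipped samples, that makes compressed-sensing-style Bayesian recovery of c at the receiver feasible.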
Receiver operator characteristic (ROC) analysis without truth
International Nuclear Information System (INIS)
Henkelman, R.M.; Kay, I.; Bronskill, M.J.
1990-01-01
Receiver operator characteristic (ROC) analysis, the preferred method of evaluating diagnostic imaging tests, requires an independent assessment of the true state of disease, which can be difficult to obtain and is often of questionable accuracy. A new method of analysis is described that does not require independent truth data and that can be used when several accurate tests are being compared. This method uses correlative information to estimate the underlying model of multivariate normal distributions of disease-positive and disease-negative patients. The method is shown to give results equivalent to conventional ROC analysis in a comparison of computed tomography, radionuclide scintigraphy, and magnetic resonance imaging for liver metastasis. When independent truth is available, the method can be extended to incorporate truth data or to evaluate the consistency of the truth data with the imaging data.
Sequential Bayesian geoacoustic inversion for mobile and compact source-receiver configuration.
Carrière, Olivier; Hermand, Jean-Pierre
2012-04-01
Geoacoustic characterization of wide areas through inversion requires easily deployable configurations, including free-drifting platforms, underwater gliders and autonomous vehicles, typically performing repeated transmissions during their course. In this paper, the inverse problem is formulated as sequential Bayesian filtering to take advantage of repeated transmission measurements. Nonlinear Kalman filters implement a random-walk model for geometry and environment, and an acoustic propagation code in the measurement model. Data from the MREA/BP07 sea trials are tested, consisting of multitone and frequency-modulated signals (bands: 0.25-0.8 and 0.8-1.6 kHz) received on a shallow vertical array of four hydrophones spaced 5 m apart, drifting over a 0.7-1.6 km range. Space- and time-coherent processing are applied to the respective signal types. Kalman filter outputs are compared to a sequence of global optimizations performed independently on each received signal. For both signal types, the sequential approach is both more accurate and more efficient. Due to frequency diversity, the processing of modulated signals produces more stable tracking. Although an extended Kalman filter provides comparable estimates of the tracked parameters, the ensemble Kalman filter is necessary to properly assess uncertainty. In spite of mild range dependence and a simplified bottom model, all tracked geoacoustic parameters are consistent with high-resolution seismic profiling, core-logged P-wave velocity, and previous inversion results with fixed geometries.
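The filtering scheme described, a random-walk state model driven by repeated noisy measurements, reduces in its simplest form to a scalar Kalman filter. A minimal sketch tracking one slowly varying parameter; the noise variances and observations are invented, and the real measurement model would be an acoustic propagation code rather than a direct observation:

```python
# Scalar Kalman filter with a random-walk state model. All noise variances
# and measurements below are illustrative, not from the MREA/BP07 data.
q, r = 0.01, 0.25        # process (random-walk) and measurement noise variances
x_est, p_est = 0.0, 1.0  # initial state estimate and its variance

measurements = [1.1, 0.9, 1.2, 1.0, 1.05, 0.95]  # synthetic observations
for z in measurements:
    p_pred = p_est + q                # predict: random walk inflates variance
    k = p_pred / (p_pred + r)         # Kalman gain
    x_est = x_est + k * (z - x_est)   # correct with the new measurement
    p_est = (1 - k) * p_pred

print(round(x_est, 2))  # estimate settles near the level of the measurements
```

Each transmission tightens the posterior variance p_est, which is the "sequential is more accurate and more efficient" effect reported above; the ensemble variant replaces p_est with a population of samples to assess uncertainty in nonlinear settings.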
Receiver-based recovery of clipped ofdm signals for papr reduction: A bayesian approach
Ali, Anum
2014-01-01
Clipping is one of the simplest peak-to-average power ratio reduction schemes for orthogonal frequency division multiplexing (OFDM). Deliberately clipping the transmission signal degrades system performance, and clipping mitigation is required at the receiver for information restoration. In this paper, we acknowledge the sparse nature of the clipping signal and propose a low-complexity Bayesian clipping estimation scheme. The proposed scheme utilizes a priori information about the sparsity rate and noise variance for enhanced recovery. At the same time, the proposed scheme is robust against inaccurate estimates of the clipping signal statistics. The undistorted phase property of the clipped signal, as well as the clipping likelihood, is utilized for enhanced reconstruction. Furthermore, motivated by the nature of modern OFDM-based communication systems, we extend our clipping reconstruction approach to multiple antenna receivers and multi-user OFDM. We also address the problem of channel estimation from pilots contaminated by the clipping distortion. Numerical findings are presented that depict favorable results for the proposed scheme compared to the established sparse reconstruction schemes.
Smooth time-dependent receiver operating characteristic curve estimators.
Martínez-Camblor, Pablo; Pardo-Fernández, Juan Carlos
2018-03-01
The receiver operating characteristic curve is a popular graphical method often used to study the diagnostic capacity of continuous (bio)markers. When the considered outcome is a time-dependent variable, two main extensions have been proposed: the cumulative/dynamic receiver operating characteristic curve and the incident/dynamic receiver operating characteristic curve. In both cases, the main problem in developing appropriate estimators is the estimation of the joint distribution of the time-to-event and marker variables. As usual, different approximations lead to different estimators. In this article, the authors explore the use of a bivariate kernel density estimator which accounts for censored observations in the sample and produces smooth estimators of the time-dependent receiver operating characteristic curves. The performance of the resulting cumulative/dynamic and incident/dynamic receiver operating characteristic curves is studied by means of Monte Carlo simulations. Additionally, the influence of the choice of the required smoothing parameters is explored. Finally, two real applications are considered. An R package is also provided as a complement to this article.
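For readers unfamiliar with the underlying object, the empirical (non-smoothed) ROC curve's area can be computed directly from marker values via the rank form; a small sketch with invented data, ignoring the time-dependence and censoring that the paper's estimators handle:

```python
# Empirical AUC via the Mann-Whitney form: P(case marker > control marker),
# with ties counted as 1/2. Marker values are invented for illustration.
cases    = [2.1, 3.4, 2.8, 3.9]   # diseased subjects
controls = [1.2, 2.5, 1.9, 2.2]   # healthy subjects

pairs = [(x, y) for x in cases for y in controls]
auc = sum(1.0 if x > y else 0.5 if x == y else 0.0 for x, y in pairs) / len(pairs)
print(auc)  # → 0.875
```

The time-dependent extensions discussed above replace this fixed case/control split with one that changes as subjects' event times pass, which is why estimating the joint distribution of time-to-event and marker becomes the central difficulty.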
Receiver Operating Characteristics (ROCs) in Recognition Memory: A Review
Yonelinas, Andrew P.; Parks, Colleen M.
2007-01-01
Receiver operating characteristic (ROC) analysis is being used increasingly to examine the memory processes underlying recognition memory. The authors discuss the methodological issues involved in conducting and analyzing ROC results, describe the various models that have been developed to account for these results, review the behavioral empirical…
Dynamic Bayesian modeling for risk prediction in credit operations
DEFF Research Database (Denmark)
Borchani, Hanen; Martinez, Ana Maria; Masegosa, Andres
2015-01-01
Our goal is to do risk prediction in credit operations, and as data is collected continuously and reported on a monthly basis, this gives rise to a streaming data classification problem. Our analysis reveals some practical problems that have not previously been thoroughly analyzed in the context...
Bayesian Recovery of Clipped OFDM Signals: A Receiver-based Approach
Al-Rabah, Abdullatif R.
2013-01-01
recovery algorithm to reconstruct the clipping signal at the receiver side by measuring part of subcarriers, iii) perform well in the absence of statistical information about the signal (e.g. clipping level) and the noise (e.g. noise variance
Sebastian, Nita; Kim, Seongryong; Tkalčić, Hrvoje; Sippl, Christian
2017-04-01
The purpose of this study is to develop an integrated inference on the lithospheric structure of NE China using three passive seismic networks comprising 92 stations. The NE China plain consists of complex lithospheric domains characterised by the co-existence of complex geodynamic processes such as crustal thinning, active intraplate Cenozoic volcanism and low-velocity anomalies. To estimate lithospheric structures in greater detail, we perform the joint inversion of independent data sets, namely receiver functions and surface wave dispersion curves (group and phase velocity), based on principles of Bayesian transdimensional optimisation techniques (Kim et al., 2016). Unlike in previous studies of NE China, the complexity of the model is determined from the data in the first stage of the inversion, and the data uncertainty is computed based on Bayesian statistics in the second stage. The computed crustal properties are retrieved from an ensemble of probable models. We obtain major structural inferences with well-constrained absolute velocity estimates, which are vital for inferring properties of the lithosphere and the bulk crustal Vp/Vs ratio. The Vp/Vs estimate obtained from the joint inversions confirms the high Vp/Vs ratio (~1.98) obtained using the H-Kappa method beneath some stations. Moreover, we could confirm the existence of a lower crustal velocity beneath several stations (e.g., station SHS) within the NE China plain. Based on these findings we attempt to identify a plausible origin for the structural complexity. We compile a high-resolution 3D image of the lithospheric architecture of the NE China plain.
Impact of signal scattering and parametric uncertainties on receiver operating characteristics
Wilson, D. Keith; Breton, Daniel J.; Hart, Carl R.; Pettit, Chris L.
2017-05-01
The receiver operating characteristic (ROC) curve, a plot of the probability of detection as a function of the probability of false alarm, plays a key role in the classical analysis of detector performance. However, meaningful characterization of the ROC curve is challenging when practically important complications are considered, such as variations in source emissions, environmental impacts on signal propagation, uncertainties in the sensor response, and multiple sources of interference. In this paper, a relatively simple but realistic model for scattered signals is employed to explore how parametric uncertainties impact the ROC curve. In particular, we show that parametric uncertainties in the mean signal and noise power substantially raise the tails of the distributions; since receiver operation with a very low probability of false alarm and a high probability of detection is normally desired, these tails lead to severely degraded performance. Because full a priori knowledge of such parametric uncertainties is rarely available in practice, analyses must typically be based on a finite sample of environmental states, which only partially characterizes the range of parameter variations. We show how this effect can lead to misleading assessments of system performance. For the cases considered, approximately 64 or more statistically independent samples of the uncertain parameters are needed to accurately predict the probabilities of detection and false alarm. A connection is also described between the selection of suitable distributions for the uncertain parameters and Bayesian adaptive methods for inferring the parameters.
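The effect quantified here, heavier false-alarm tails when the noise power is itself uncertain, can be reproduced in a small Monte Carlo sketch. This uses a generic amplitude detector with invented parameters, not the paper's scattering model:

```python
import random

random.seed(1)

def pfa(threshold, noise_sigma, trials=20000):
    # False-alarm probability of a simple amplitude detector under noise only.
    hits = sum(abs(random.gauss(0.0, noise_sigma)) > threshold
               for _ in range(trials))
    return hits / trials

thr = 3.0
fixed = pfa(thr, 1.0)  # noise power known exactly

# Uncertain noise power: sigma redrawn per batch, e.g. 1.0 +/- 50% (invented).
mixed = sum(pfa(thr, random.uniform(0.5, 1.5), trials=200)
            for _ in range(100)) / 100

print(fixed < mixed)  # uncertainty in noise power raises the false-alarm tail
```

Averaging over uncertain noise levels is dominated by the occasional high-noise states, so the mixture's false-alarm rate far exceeds the fixed-parameter prediction precisely in the low-Pfa operating region that matters most.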
Wavelet and receiver operating characteristic analysis of heart rate variability
McCaffery, G.; Griffith, T. M.; Naka, K.; Frennaux, M. P.; Matthai, C. C.
2002-02-01
Multiresolution wavelet analysis has been used to study the heart rate variability in two classes of patients with different pathological conditions. The scale-dependent measure of Thurner et al. was found to be statistically significant in discriminating patients suffering from hypertrophic cardiomyopathy from a control set of normal subjects. We have performed Receiver Operating Characteristic (ROC) analysis and found the ROC area to be a useful measure by which to quantify the significance of the discrimination, as well as to describe the severity of heart dysfunction.
International Nuclear Information System (INIS)
Lin, Yufei; Chen, Maoyin; Zhou, Donghua
2013-01-01
In the past decades, engineering systems have become more and more complex and generally work in different operational modes. Since an incipient fault can lead to dangerous accidents, it is crucial to develop strategies for online operational safety assessment. However, the existing online assessment methods for multi-mode engineering systems commonly assume that samples are independent, which does not hold in practical cases. This paper proposes a probabilistic framework for online operational safety assessment of multi-mode engineering systems with sample dependency. To begin with, a Gaussian mixture model (GMM) is used to characterize multiple operating modes. Then, based on the definition of the safety index (SI), the SI for a single mode is calculated. Finally, a Bayesian method is presented to calculate the posterior probabilities of belonging to each operating mode under sample dependency. The proposed assessment strategy is applied to two examples: one is an aircraft gas turbine, the other an industrial dryer. Both examples illustrate the efficiency of the proposed method.
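The mode-recognition step, computing posterior probabilities that a new sample belongs to each operating mode, is a standard Gaussian-mixture responsibility calculation. A 1-D sketch with invented mode parameters (the paper's models are multivariate and account for sample dependency, which this omits):

```python
import math

# Two operating modes as 1-D Gaussians; weights, means, std devs are invented.
modes = [  # (prior weight, mean, std)
    (0.7, 10.0, 1.0),   # nominal operation
    (0.3, 15.0, 2.0),   # high-load operation
]

def gauss_pdf(x, mu, sd):
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

def mode_posteriors(x):
    # Bayes: P(mode | x) proportional to weight * N(x; mean, std).
    joint = [w * gauss_pdf(x, mu, sd) for w, mu, sd in modes]
    z = sum(joint)
    return [j / z for j in joint]

print([round(p, 3) for p in mode_posteriors(14.0)])  # → [0.002, 0.998]
```

Once each sample's mode posteriors are available, the per-mode safety indices can be combined into an overall assessment weighted by these probabilities.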
A Three-Dimensional Receiver Operator Characteristic Surface Diagnostic Metric
Simon, Donald L.
2011-01-01
Receiver Operator Characteristic (ROC) curves are commonly applied as metrics for quantifying the performance of binary fault detection systems. An ROC curve provides a visual representation of a detection system's True Positive Rate versus False Positive Rate sensitivity as the detection threshold is varied. The area under the curve provides a measure of fault detection performance independent of the applied detection threshold. While the standard ROC curve is well suited for quantifying binary fault detection performance, it is not suitable for quantifying the classification performance of multi-fault classification problems. Furthermore, it does not provide a measure of diagnostic latency. To address these shortcomings, a novel three-dimensional receiver operator characteristic (3D ROC) surface metric has been developed. This is done by generating and applying two separate curves: the standard ROC curve reflecting fault detection performance, and a second curve reflecting fault classification performance. A third dimension, diagnostic latency, is added, giving rise to 3D ROC surfaces. Applying numerical integration techniques, the volumes under and between the surfaces are calculated to produce metrics of the diagnostic system's detection and classification performance. This paper describes the 3D ROC surface metric in detail and presents an example of its application for quantifying the performance of aircraft engine gas path diagnostic methods. Metric limitations and potential enhancements are also discussed.
Receiver Operating Characteristic Analysis for Detecting Explosives-related Threats
Energy Technology Data Exchange (ETDEWEB)
Oxley, Mark E; Venzin, Alexander M
2012-11-14
The Department of Homeland Security (DHS) and the Transportation Security Administration (TSA) are interested in developing a standardized testing procedure for determining the performance of candidate detection systems. This document outlines a potential method for judging detection system performance, as well as for determining whether combining the information from a legacy system with a new system can significantly improve performance. In this document, performance corresponds to the Neyman-Pearson criterion applied to the Receiver Operating Characteristic (ROC) curves of the detection systems in question. A simulation was developed to investigate how the amount of data provided by the vendor in the form of the ROC curve affects the performance of the combined detection system. Furthermore, the simulation also takes into account the potential effects of correlation and how this information can impact the performance of the combined system.
Risk-Based Operation and Maintenance of Offshore Wind Turbines using Bayesian Networks
DEFF Research Database (Denmark)
Nielsen, Jannie Jessen; Sørensen, John Dalsgaard
2011-01-01
For offshore wind farms, the costs due to operation and maintenance are large, and more optimal planning has the potential of reducing these costs. This paper presents how Bayesian networks can be used for risk-based inspection planning, where the inspection plans are updated each year through the lifetime. Two different approaches are used; one uses a threshold value of the failure probability, and one uses a Limited Memory Influence Diagram. Both methods are tested for an application example using Monte Carlo sampling, and they are both found to be efficient and equally good.
Energy Technology Data Exchange (ETDEWEB)
Dongiovanni, Danilo Nicola, E-mail: danilo.dongiovanni@enea.it [ENEA, Nuclear Fusion and Safety Technologies Department, via Enrico Fermi 45, Frascati 00040 (Italy); Iesmantas, Tomas [LEI, Breslaujos str. 3 Kaunas (Lithuania)
2016-11-01
Highlights: • RAMI (Reliability, Availability, Maintainability and Inspectability) assessment of the secondary heat transfer loop for a DEMO nuclear fusion plant. • Definition of a fault tree for a nuclear steam turbine operated in pulsed mode. • Turbine failure rate models updated by means of a Bayesian network reflecting the fault tree analysis in the considered scenario. • Sensitivity analysis on system availability performance. - Abstract: Availability will play an important role in the Demonstration Power Plant (DEMO) success from an economic and safety perspective. Availability performance is commonly assessed by Reliability Availability Maintainability Inspectability (RAMI) analysis, which relies strongly on the accurate definition of system components' failure modes (FM) and failure rates (FR). Little component experience is available in fusion applications, therefore requiring the adaptation of literature FR to fusion plant operating conditions, which may differ in several aspects. As a possible solution to this problem, a new methodology to extrapolate/estimate component failure rates under different operating conditions is presented. The DEMO balance-of-plant nuclear steam turbine operated in pulsed mode is considered as a case study. The methodology starts from the definition of a fault tree taking into account failure modes possibly enhanced by pulsed operation. The fault tree is then translated into a Bayesian network. A statistical model for the turbine system failure rate in terms of subcomponents' FR is hence obtained, allowing for sensitivity analyses on the structured mixture of literature and unknown FR data, for which plausible value intervals are investigated to assess their impact on the whole turbine system FR. Finally, the impact of the resulting turbine system FR on plant availability is assessed by exploiting a Reliability Block Diagram (RBD) model for a typical secondary cooling system implementing a Rankine cycle. Mean inherent availability
International Nuclear Information System (INIS)
Dongiovanni, Danilo Nicola; Iesmantas, Tomas
2016-01-01
International Nuclear Information System (INIS)
Zheng, Xiaoyu; Ishikawa, Jun; Sugiyama, Tomoyuki; Maryyama, Yu
2017-01-01
Containment venting is one of several essential measures to protect the integrity of the final barrier of a nuclear reactor during severe accidents, by which the uncontrolled release of fission products can be avoided. The authors seek to develop an optimization approach to venting operations, from a simulation-based perspective, using an integrated severe accident code, THALES2/KICHE. The effectiveness of containment-venting strategies needs to be verified via numerical simulations based on various settings of the venting conditions. The number of iterations, however, needs to be controlled to avoid the cumbersome computational burden of integrated codes. Bayesian optimization is an efficient global optimization approach. Using Gaussian process regression, a surrogate model of the “black-box” code is constructed; it can be updated whenever new simulation results are acquired. With predictions from the surrogate model, the most probable locations of the optimum can be revealed, making the sampling procedure adaptive. Compared with pure random search, the number of code queries required to find the optimum is greatly reduced. One typical severe accident scenario of a boiling water reactor is chosen as an example. The research demonstrates the applicability of the Bayesian optimization approach to the design and establishment of containment-venting strategies during severe accidents.
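The adaptive sampling idea described above — a Gaussian-process surrogate refitted after each code run, with the next query chosen from the surrogate's predictions — can be sketched in a few lines. The quadratic toy objective stands in for an expensive severe-accident code; the RBF kernel, its length scale, and the confidence-bound acquisition rule are illustrative choices, not those of the study:

```python
import numpy as np

def expensive_code(x):
    # Toy stand-in for an integrated severe-accident code: the optimizer treats
    # it as a black box that is costly to evaluate (true minimiser at 0.3).
    return (x - 0.3) ** 2

def rbf(a, b, length=0.2):
    """Squared-exponential kernel between two 1-D point sets."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length ** 2)

def gp_posterior(xs, ys, grid, noise=1e-6):
    """GP posterior mean/std on a grid (zero-mean prior, unit-variance RBF)."""
    K = rbf(xs, xs) + noise * np.eye(len(xs))
    Ks = rbf(grid, xs)
    Kinv = np.linalg.inv(K)
    mean = Ks @ Kinv @ ys
    var = 1.0 - np.einsum("ij,jk,ik->i", Ks, Kinv, Ks)
    return mean, np.sqrt(np.clip(var, 0.0, None))

grid = np.linspace(0.0, 1.0, 201)
xs = np.array([0.0, 1.0])               # initial design: two code runs
ys = expensive_code(xs)
for _ in range(8):                      # adaptive sampling loop
    mean, std = gp_posterior(xs, ys, grid)
    acq = mean - 1.96 * std             # lower confidence bound (minimisation)
    x_next = grid[np.argmin(acq)]
    xs = np.append(xs, x_next)
    ys = np.append(ys, expensive_code(x_next))

best = xs[np.argmin(ys)]
print(best)   # typically lands near the true minimiser 0.3 after ~10 queries
```

The point of the surrogate is exactly the one made in the abstract: ten evaluations of the "code" replace the hundreds a pure random search would need for comparable accuracy.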
Energy Technology Data Exchange (ETDEWEB)
Zheng, Xiaoyu; Ishikawa, Jun; Sugiyama, Tomoyuki; Maryyama, Yu [Nuclear Safety Research Center, Japan Atomic Energy Agency, Ibaraki (Japan)
2017-03-15
An extension of the receiver operating characteristic curve and AUC-optimal classification.
Takenouchi, Takashi; Komori, Osamu; Eguchi, Shinto
2012-10-01
While most proposed methods for solving classification problems focus on minimization of the classification error rate, we are interested in the receiver operating characteristic (ROC) curve, which provides more information about classification performance than the error rate does. The area under the ROC curve (AUC) is a natural measure for overall assessment of a classifier based on the ROC curve. We discuss a class of concave functions for AUC maximization in which a boosting-type algorithm including RankBoost is considered, and the Bayesian risk consistency and the lower bound of the optimum function are discussed. A procedure derived by maximizing a specific optimum function has high robustness, based on gross error sensitivity. Additionally, we focus on the partial AUC, which is the partial area under the ROC curve. For example, in medical screening, a high true-positive rate to the fixed lower false-positive rate is preferable and thus the partial AUC corresponding to lower false-positive rates is much more important than the remaining AUC. We extend the class of concave optimum functions for partial AUC optimality with the boosting algorithm. We investigated the validity of the proposed method through several experiments with data sets in the UCI repository.
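Both quantities discussed here — the AUC and the partial AUC over low false-positive rates — have simple empirical estimators. The sketch below uses the Mann-Whitney form of the AUC; `partial_auc` is a hypothetical helper that normalises the area over the hardest negatives, under the simplifying assumption that the FPR cut-off falls on a multiple of 1/N:

```python
from itertools import product

def auc(pos_scores, neg_scores):
    """AUC equals the probability that a randomly drawn positive scores higher
    than a randomly drawn negative (ties count 1/2): the Mann-Whitney U
    statistic rescaled to [0, 1]."""
    wins = sum(
        1.0 if p > n else 0.5 if p == n else 0.0
        for p, n in product(pos_scores, neg_scores)
    )
    return wins / (len(pos_scores) * len(neg_scores))

def partial_auc(pos_scores, neg_scores, fpr_max):
    """Normalised partial AUC: restrict attention to the top-scoring negatives,
    i.e. those responsible for errors at low false-positive rates."""
    neg_sorted = sorted(neg_scores, reverse=True)
    k = max(1, int(round(fpr_max * len(neg_sorted))))
    return auc(pos_scores, neg_sorted[:k])

print(auc([0.9, 0.8, 0.7], [0.6, 0.5]))   # 1.0: perfect separation
print(auc([0.9, 0.4], [0.6, 0.5]))
```

A boosting-type AUC maximiser such as RankBoost optimises a smooth surrogate of exactly these pairwise comparisons, since the 0/1 "wins" above are not differentiable.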
Receiver operating characteristic analysis improves diagnosis by radionuclide ventriculography
International Nuclear Information System (INIS)
Dickinson, C.Z.; Forman, M.B.; Vaugh, W.K.; Sandler, M.P.; Kronenberg, M.W.
1985-01-01
Receiver operating characteristic (ROC) analysis evaluates continuous variables to define diagnostic criteria for the optimal sensitivity (SENS) and specificity (SPEC) of a test. The authors studied exercise-induced chest pain (CP), ST-changes on electrocardiography (ECG) and rest-exercise gated radionuclide ventriculography (RVG) using ROC analysis to clarify the optimal criteria for detecting myocardial ischemia due to coronary atherosclerosis (CAD). The data of 95 consecutive patients studied with coronary angiography, rest-exercise RVG and ECG were reviewed; 77 patients had "significant" CAD (≥50% lesions). Exercise-induced CP, ECG abnormalities (ST-T shifts) and RVG abnormalities (change in ejection fraction, 2-view regional wall motion change and relative end-systolic volume) were evaluated to define the optimal SENS/SPEC of each and of the combined data. ROC curves were constructed by multiple logistic regression (MLR). By MLR, RVG alone was superior to ECG and CP. The combination of all three produced the best ROC curve for the entire group and for clinical subsets based on the number of diseased vessels and the presence or absence of prior myocardial infarction. When CP, ECG and RVG were combined, the optimal SENS/SPEC for detection of single vessel disease was 88/86. The SENS/SPEC for 3 vessel disease was 93/95. Thus, the application of RVG for the diagnosis of myocardial ischemia is improved by including ECG and CP data through a multiple logistic regression model. ROC analysis allows clinical application of multiple data for diagnosing CAD at a desired SENS/SPEC rather than by arbitrary single-standard criteria
Autonomous Operation of Super-Regenerative Receiver in BAN
Kalyanasundaram, P.; Huang, L.; Dolmans, G.; Imamura, K.
2012-01-01
Super-regenerative receiver is one of the potential candidates to achieve ultra low power wireless communication in body area network (BAN). The main limitations of the super-regenerative receiver include the difficulty in choosing a good quench waveform to optimize its sensitivity and selectivity,
International Nuclear Information System (INIS)
Martins, Marcelo Ramos; Maturana, Marcos Coelho
2013-01-01
During the last three decades, several techniques have been developed for the quantitative study of human reliability. In the 1980s, techniques were developed to model systems by means of binary trees, which did not allow for the representation of the context in which human actions occur. Thus, these techniques cannot model the representation of individuals, their interrelationships, and the dynamics of a system. These issues make the improvement of methods for Human Reliability Analysis (HRA) a pressing need. To eliminate or at least attenuate these limitations, some authors have proposed modeling systems using Bayesian Belief Networks (BBNs). The application of these tools is expected to address many of the deficiencies in current approaches to modeling human actions with binary trees. This paper presents a methodology based on BBN for analyzing human reliability and applies this method to the operation of an oil tanker, focusing on the risk of collision accidents. The obtained model was used to determine the most likely sequence of hazardous events and thus isolate critical activities in the operation of the ship to study Internal Factors (IFs), Skills, and Management and Organizational Factors (MOFs) that should receive more attention for risk reduction.
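A minimal example of the kind of inference a BBN supports, unlike a binary tree: a three-node chain (fatigue → operator error → collision) evaluated by exact enumeration, supporting both predictive and diagnostic queries. The structure and all probabilities below are illustrative assumptions, not the calibrated IFs/MOFs of the tanker model:

```python
# A three-node chain BBN: P(fatigue), P(error | fatigue), P(collision | error).
# All numbers are illustrative, not the tanker study's calibrated values.
P_FATIGUE = 0.3
P_ERROR = {True: 0.20, False: 0.05}        # P(operator error | fatigue)
P_COLLISION = {True: 0.10, False: 0.001}   # P(collision | operator error)

def marginal_collision():
    """Predictive query: exact inference by enumerating the latent variables."""
    total = 0.0
    for fatigue in (True, False):
        p_f = P_FATIGUE if fatigue else 1 - P_FATIGUE
        for error in (True, False):
            p_e = P_ERROR[fatigue] if error else 1 - P_ERROR[fatigue]
            total += p_f * p_e * P_COLLISION[error]
    return total

def posterior_fatigue_given_collision():
    """Diagnostic query: which upstream factor most likely led to a collision?"""
    num = sum(
        P_FATIGUE * (P_ERROR[True] if e else 1 - P_ERROR[True]) * P_COLLISION[e]
        for e in (True, False)
    )
    return num / marginal_collision()

print(marginal_collision())
print(posterior_fatigue_given_collision())
```

The diagnostic direction is what a binary-tree model cannot express: observing the accident raises the probability of the contextual factor, which is how critical activities and MOFs are isolated for risk reduction.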
Partial inversion of elliptic operator to speed up computation of likelihood in Bayesian inference
Litvinenko, Alexander
2017-08-09
In this paper, we speed up the solution of inverse problems in Bayesian settings. In computing the likelihood, the most expensive part of the Bayesian formula, one compares the available measurement data with the simulated data. To obtain simulated data, repeated solution of the forward problem is required, which can be a great challenge. Often, the available measurement is a functional $F(u)$ of the solution $u$, or a small part of $u$. Typical examples of $F(u)$ are the solution at a point, the solution on a coarser grid or in a small subdomain, or the mean value over a subdomain. It is a waste of computational resources to evaluate the whole solution first and then compute a part of it. In this work, we compute the functional $F(u)$ directly, without computing the full inverse operator and without computing the whole solution $u$. The main ingredients of the developed approach are the hierarchical domain decomposition technique, the finite element method, and Schur complements. To speed up computations and to reduce the storage cost, we approximate the forward operator and the Schur complement in the hierarchical matrix format. Applying the hierarchical matrix technique, we reduce the computing cost to $\mathcal{O}(k^2 n \log^2 n)$, where $k \ll n$ and $n$ is the number of degrees of freedom. Up to the $\mathcal{H}$-matrix accuracy, the computation of the functional $F(u)$ is exact. To reduce the computational resources further, we can approximate $F(u)$ on, for instance, multiple coarse meshes. The proposed method is well suited for solving multiscale problems. A disadvantage of this method is the assumption that one must have access to the discretisation and to the procedure of assembling the Galerkin matrix.
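The core trick — obtaining only the observed part of the solution through a Schur complement, without forming the full inverse — can be shown on a dense toy system. The paper does this with hierarchical matrices and FEM; plain NumPy on a small random system is used here purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 8, 2                           # interior size, size of the observed part
A = rng.normal(size=(n + m, n + m)) + (n + m) * np.eye(n + m)  # well-conditioned
f = rng.normal(size=n + m)

I, B = slice(0, n), slice(n, n + m)   # interior / observed partition of u

# Block elimination: S = A_BB - A_BI A_II^{-1} A_IB, and
# S u_B = f_B - A_BI A_II^{-1} f_I gives the observed entries alone.
S = A[B, B] - A[B, I] @ np.linalg.solve(A[I, I], A[I, B])
rhs = f[B] - A[B, I] @ np.linalg.solve(A[I, I], f[I])
u_B = np.linalg.solve(S, rhs)         # F(u): the small observed part only

u_full = np.linalg.solve(A, f)        # reference: full solve
print(np.allclose(u_B, u_full[B]))    # True
```

The payoff in the paper comes from never forming `A_II^{-1}` densely: both the forward operator and the Schur complement are kept in compressed hierarchical-matrix form, so the cost scales almost linearly in $n$.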
Post-operative neuromuscular function of patients receiving non ...
African Journals Online (AJOL)
Objectives: To determine the number of patients whose non-depolarising muscle relaxation is adequately reversed. To define factors that contribute to reversal. Design: A cross sectional study. Setting: Universitas Hospital recovery room over a 2 month period. Subjects: Patients that received non-depolarising muscle ...
Energy Technology Data Exchange (ETDEWEB)
Kang, Seongkeun; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of)
2014-05-15
The purpose of this paper is to confirm whether Bayesian inference can properly reflect the situation awareness of real human operators, to find the differences between ideal and practical operators, and to investigate the factors contributing to those differences. The results show that humans do not think like computers. If humans could memorize all of the information, and if their thinking process worked like a computer's CPU, the success rates in these two experiments would exceed 99%. In fact, the probability of humans finding the right malfunction was only 64.52% in the simple experiment and 51.61% in the complex experiment. Cognition is the mental processing that includes the attention of working memory, comprehending and producing language, calculating, reasoning, problem solving, and decision making. There are many reasons why human thinking differs from a computer's, but in this experiment we suggest that working memory is the most important factor. Humans have limited working memory, with a capacity of only about seven chunks, often called the magic number. When more than seven sequential pieces of information arrive, people start to forget the earlier information because their working memory capacity is exceeded. The simple experiment shows how strongly working memory affects the result. What if we neglect the effect of working memory? The total number of subjects with incorrect memory was 7 (subjects 3, 5, 6, 7, 8, 15, and 25). They could have found the right malfunction had their memory not been corrupted by the lack of working memory capacity; the probability of finding the correct malfunction would then increase from 64.52% to 87.10%. The complex experiment gave a similar result: eight subjects (1, 5, 8, 9, 15, 17, 18, and 30) had altered memories, which affected their ability to find the right malfunction. Accounting for this, the probability would be (16+8)/31 = 77.42%.
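The gap described here between an ideal Bayesian updater and a working-memory-limited human can be reproduced with a toy simulation. The two hypothetical malfunctions, their alarm likelihoods, and the fixed alarm log below are illustrative assumptions, not data from the CNS experiments:

```python
# Two candidate malfunctions; each produces an alarm with a different probability.
LIKELIHOOD = {"leak": 0.8, "pump_trip": 0.4}   # P(alarm | malfunction), illustrative
obs = ["A", "A", "-", "A", "A", "A", "A", "-", "A", "A"] * 3  # 30 observations
# "A" = alarm present, "-" = alarm absent; the true fault here is "leak".

def posterior(observations):
    """Bayes' rule with a uniform prior over the two hypotheses."""
    post = dict.fromkeys(LIKELIHOOD, 1.0)
    for o in observations:
        for h, p in LIKELIHOOD.items():
            post[h] *= p if o == "A" else (1 - p)
    total = sum(post.values())
    return {h: v / total for h, v in post.items()}

ideal = posterior(obs)         # ideal operator: integrates all 30 observations
limited = posterior(obs[-7:])  # 7-chunk working memory: keeps only the last 7

print(round(ideal["leak"], 4), round(limited["leak"], 4))
```

With the full record the posterior for the true fault is near certainty, while the seven-chunk observer is markedly less confident: a simple mechanism consistent with the drop from near-perfect to roughly 65% diagnosis rates reported above.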
International Nuclear Information System (INIS)
Kang, Seongkeun; Seong, Poong Hyun
2014-01-01
Toroody, Ahmad Bahoo; Abaiee, Mohammad Mahdi; Gholamnia, Reza; Ketabdari, Mohammad Javad
2016-09-01
Owing to the increase in unprecedented accidents with new root causes in almost all operational areas, the importance of risk management has dramatically risen. Risk assessment, one of the most significant aspects of risk management, has a substantial impact on the system-safety level of organizations, industries, and operations. If the causes of all kinds of failure and the interactions between them are considered, effective risk assessment can be highly accurate. A combination of traditional risk assessment approaches and modern scientific probability methods can help in realizing better quantitative risk assessment methods. Most researchers face the problem of minimal field data with respect to the probability and frequency of each failure. Because of this limitation in the availability of epistemic knowledge, it is important to conduct epistemic estimations by applying the Bayesian theory for identifying plausible outcomes. In this paper, we propose an algorithm and demonstrate its application in a case study for a light-weight lifting operation in the Persian Gulf of Iran. First, we identify potential accident scenarios and present them in an event tree format. Next, excluding human error, we use the event tree to roughly estimate the prior probability of other hazard-promoting factors using a minimal amount of field data. We then use the Success Likelihood Index Method (SLIM) to calculate the probability of human error. On the basis of the proposed event tree, we use the Bayesian network of the provided scenarios to compensate for the lack of data. Finally, we determine the resulting probability of each event based on its evidence in the epistemic estimation format by building on two Bayesian network types: the probability of hazard promotion factors and the Bayesian theory. The study results indicate that despite the lack of available information on the operation of floating objects, a satisfactory result can be achieved using epistemic data.
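The compensation for sparse field data described above can be illustrated with the conjugate Beta-Bernoulli update, a standard Bayesian treatment of an uncertain failure probability, whose posterior mean then feeds a scenario probability. The prior, the field counts, the SLIM-style human error probability, and the two-event scenario below are all illustrative assumptions, not the paper's event tree:

```python
# Epistemic update of a basic-event probability from sparse field data:
# a Beta(a, b) prior is conjugate to Bernoulli failure observations, so the
# posterior after k failures in n trials is Beta(a + k, b + n - k).
def beta_posterior_mean(a, b, failures, trials):
    return (a + failures) / (a + b + trials)

# Illustrative prior and sparse field data for a lifting-gear failure.
p_sling_failure = beta_posterior_mean(1, 99, failures=1, trials=50)

# Probability of the initiating human error, taken as given here
# (in the paper it comes from the Success Likelihood Index Method).
p_human_error = 0.02

# Two-branch scenario: a dropped load requires the initiating human error
# AND failure of the sling to hold it (independence assumed for simplicity).
p_dropped_load = p_human_error * p_sling_failure
print(p_sling_failure, p_dropped_load)
```

The appeal of the conjugate form is that a handful of observed trials shifts the prior smoothly, which is exactly what is needed when field data are too scarce to estimate frequencies directly.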
International Nuclear Information System (INIS)
1985-10-01
The operating characteristics of a repository receiving facility structured around current technology and practices have been reviewed. Cask turnaround times and operator doses were estimated. Large throughput and long-term receiving operations at a nuclear waste repository result in an unprecedented number of casks being handled. While the current generation of material-handling equipment is adequate to process the casks, personnel radiation exposures for the generic facility analyzed are unacceptably high. This emphasizes the need for development of occupational radiation exposure control concepts for application in repository receiving facilities. 3 refs., 22 figs., 6 tabs
Energy Technology Data Exchange (ETDEWEB)
Kang, Seong Keun; Seong, Poong Hyun [KAIST, Daejeon (Korea, Republic of)
2014-08-15
Bayesian methodology has been widely used in various research fields. It is a method of inference that uses Bayes' rule to update the estimated probability of a hypothesis as additional evidence is acquired. According to current research, malfunctions of a nuclear power plant can be detected by Bayesian inference that consistently accumulates newly incoming data and updates its estimate. However, such research assumes that people perform exactly like computers, an assumption that can be criticized and may cause problems in real-world applications. Studies in cognitive psychology indicate that when the amount of information becomes large, people cannot retain all of the data: they have a limited memory capacity, well known as working memory, and they also have attention limitations. The purpose of this paper is to consider these psychological factors and confirm how much working memory and attention affect the estimates derived from Bayesian inference. Confirming this requires experiments on humans, and the experimental tool is a Compact Nuclear Simulator (CNS)
International Nuclear Information System (INIS)
Kang, Seong Keun; Seong, Poong Hyun
2014-01-01
47 CFR 25.220 - Non-conforming transmit/receive earth station operations.
2010-10-01
... 47 Telecommunication 2 2010-10-01 2010-10-01 false Non-conforming transmit/receive earth station operations. 25.220 Section 25.220 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES SATELLITE COMMUNICATIONS Technical Standards § 25.220 Non-conforming transmit/receive...
Robust Biometric Score Fusion by Naive Likelihood Ratio via Receiver Operating Characteristics
Tao, Q.; Veldhuis, Raymond N.J.
This paper presents a novel method of fusing multiple biometrics on the matching score level. We estimate the likelihood ratios of the fused biometric scores, via individual receiver operating characteristics (ROC) which construct the Naive Bayes classifier. Using a limited number of operation
Correlator receiver architecture with PnpN optical thyristor operating as optical hard-limiter
Kang, Tae-Gu; Ho Lee, Su; Park, Soonchul
2011-07-01
We propose a novel correlator receiver architecture with a PnpN optical thyristor operating as an optical hard-limiter, and demonstrate the multiple-access interference rejection of the proposed receiver. The proposed correlator receiver is composed of a 1×2 splitter, an optical delay line, a 2×1 combiner, and a fabricated PnpN optical thyristor. It enhances system performance because it prevents some combinations of multiple-access interference patterns from causing errors, unlike the conventional optical receivers used in previous optical code-division multiple-access systems. It is found that the proposed correlator receiver can fully reject the interference signals generated by decoding processing and multiple access for two simultaneous users.
Nishikawa, Hiroki; Nishijima, Norihiro; Enomoto, Hirayuki; Sakamoto, Azusa; Nasu, Akihiro; Komekado, Hideyuki; Nishimura, Takashi; Kita, Ryuichi; Kimura, Toru; Iijima, Hiroko; Nishiguchi, Shuhei; Osaki, Yukio
2017-01-01
To investigate variables before sorafenib therapy on the clinical outcomes in hepatocellular carcinoma (HCC) patients receiving sorafenib and to further assess and compare the predictive performance of continuous parameters using time-dependent receiver operating characteristics (ROC) analysis. A total of 225 HCC patients were analyzed. We retrospectively examined factors related to overall survival (OS) and progression free survival (PFS) using univariate and multivariate analyses. Subsequently, we performed time-dependent ROC analysis of continuous parameters which were significant in the multivariate analysis in terms of OS and PFS. Total sum of area under the ROC in all time points (defined as TAAT score) in each case was calculated. Our cohort included 175 male and 50 female patients (median age, 72 years) and included 158 Child-Pugh A and 67 Child-Pugh B patients. The median OS time was 0.68 years, while the median PFS time was 0.24 years. On multivariate analysis, gender, body mass index (BMI), Child-Pugh classification, extrahepatic metastases, tumor burden, aspartate aminotransferase (AST) and alpha-fetoprotein (AFP) were identified as significant predictors of OS and ECOG-performance status, Child-Pugh classification and extrahepatic metastases were identified as significant predictors of PFS. Among three continuous variables (i.e., BMI, AST and AFP), AFP had the highest TAAT score for the entire cohort. In subgroup analyses, AFP had the highest TAAT score except for Child-Pugh B and female among three continuous variables. In continuous variables, AFP could have higher predictive accuracy for survival in HCC patients undergoing sorafenib therapy.
Rodríguez-Álvarez, María Xosé; Roca-Pardiñas, Javier; Cadarso-Suárez, Carmen; Tahoces, Pablo G
2018-03-01
Prior to using a diagnostic test in a routine clinical setting, the rigorous evaluation of its diagnostic accuracy is essential. The receiver-operating characteristic curve is the measure of accuracy most widely used for continuous diagnostic tests. However, the possible impact of extra information about the patient (or even the environment) on diagnostic accuracy also needs to be assessed. In this paper, we focus on an estimator for the covariate-specific receiver-operating characteristic curve based on direct regression modelling and nonparametric smoothing techniques. This approach defines the class of generalised additive models for the receiver-operating characteristic curve. The main aim of the paper is to offer new inferential procedures for testing the effect of covariates on the conditional receiver-operating characteristic curve within the above-mentioned class. Specifically, two different bootstrap-based tests are suggested to check (a) the possible effect of continuous covariates on the receiver-operating characteristic curve and (b) the presence of factor-by-curve interaction terms. The validity of the proposed bootstrap-based procedures is supported by simulations. To facilitate the application of these new procedures in practice, an R-package, known as npROCRegression, is provided and briefly described. Finally, data derived from a computer-aided diagnostic system for the automatic detection of tumour masses in breast cancer is analysed.
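One of the simplest versions of a bootstrap-based check for a covariate effect is a percentile interval for the difference in AUC between two covariate levels. This is only a sketch of the idea; the paper's tests operate on the full covariate-specific ROC curve within a generalised additive model, and the data below are invented for illustration:

```python
import random

random.seed(7)

def auc(pos, neg):
    """Probability a random positive outscores a random negative (ties = 1/2)."""
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0 for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def bootstrap_auc_difference(group_a, group_b, n_boot=2000):
    """Percentile bootstrap for the AUC difference between two covariate levels."""
    (pos_a, neg_a), (pos_b, neg_b) = group_a, group_b
    observed = auc(pos_a, neg_a) - auc(pos_b, neg_b)
    diffs = []
    for _ in range(n_boot):
        ra = auc(random.choices(pos_a, k=len(pos_a)),
                 random.choices(neg_a, k=len(neg_a)))
        rb = auc(random.choices(pos_b, k=len(pos_b)),
                 random.choices(neg_b, k=len(neg_b)))
        diffs.append(ra - rb)
    diffs.sort()
    return observed, (diffs[int(0.025 * n_boot)], diffs[int(0.975 * n_boot)])

# Invented test scores for diseased/healthy subjects at two covariate levels.
young = ([0.9, 0.8, 0.85, 0.7, 0.95], [0.2, 0.3, 0.1, 0.4, 0.25])
old = ([0.6, 0.7, 0.5, 0.65, 0.8], [0.4, 0.55, 0.6, 0.3, 0.45])

obs, ci = bootstrap_auc_difference(young, old)
print(obs, ci)   # if 0 lies outside the interval, the covariate matters
```

Collapsing each curve to its AUC loses information, which is why the paper tests the conditional ROC curve itself; the resampling logic, however, is the same.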
Receiver operating characteristics of perceptrons : Influence of sample size and prevalence
Freking, Ansgar; Biehl, Michael; Braun, Christian; Kinzel, Wolfgang; Meesmann, Malte
1999-01-01
In many practical classification problems it is important to distinguish false positive from false negative results when evaluating the performance of the classifier. This is of particular importance for medical diagnostic tests. In this context, receiver operating characteristic (ROC) curves have
International Nuclear Information System (INIS)
DALE, R.N.
2000-01-01
A system to accommodate the removal of long-length contaminated equipment (LLCE) from Hanford underground radioactive waste storage tanks was designed, procured, and demonstrated, via a project activity during the 1990s. The system is the Long Length Contaminated Equipment Removal System (LLCERS). LLCERS will be maintained and operated by Tank Farms Engineering and Operations organizations and other varied projects having a need for the system. The responsibility for the operation and maintenance of the LLCERS Receiver Trailer (RT) and Transport Trailer (TT) resides with the RPP Characterization Project Operations organization. The purpose of this document is to provide vendor supplied operating and maintenance (O and M) information for the RT and TT in a readily retrievable form. This information is provided this way instead of in a vendor information (VI) file to maintain configuration control of the operations baseline as described in RPP-6085, ''Configuration Management Plan for Long Length Contaminated Equipment Receiver and Transport Trailers''. Additional Operations Baseline documents are identified in RPP-6085
Energy Technology Data Exchange (ETDEWEB)
DALE, R.N.
2000-05-01
Su, Chiu-Wen; Yen, Amy Ming-Fang; Lai, Hongmin; Chen, Hsiu-Hsi; Chen, Sam Li-Sheng
2017-12-01
The accuracy of a prediction model for periodontal disease using the community periodontal index (CPI) has previously been assessed using the area under a receiver operating characteristic (AUROC) curve. How an uncalibrated CPI, as measured by general dentists trained by periodontists in a large epidemiologic study, affects the performance of such a prediction model has not yet been researched. A two-stage design was conducted, beginning with a validation study to calibrate the CPI between a senior periodontal specialist and the trained general dentists who measured CPIs in the main study of a nationwide survey. A Bayesian hierarchical logistic regression model was applied to estimate the non-updated and updated clinical weights used for building up risk scores. How the calibrated CPI affected the performance of the updated prediction model was quantified by comparing AUROC curves between the original and updated models. Estimates of CPI calibration obtained from the validation study were 66% sensitivity and 85% specificity. After updating, the clinical weights of each predictor were inflated, and the risk score for the highest risk category was elevated from 434 to 630. This update improved the AUROC from 62.6% (95% confidence interval [CI]: 61.7% to 63.6%) for the non-updated model to 68.9% (95% CI: 68.0% to 69.6%) for the updated one, a statistically significant difference. The value of updating a prediction model for periodontal disease, as measured by the calibrated CPI derived from a large epidemiologic survey, was thus demonstrated.
Jeong, Young Mi; Lee, Eunsook; Kim, Kwang-Il; Chung, Jee Eun; In Park, Hae; Lee, Byung Koo; Gwak, Hye Sun
2016-07-07
Older patients undergoing surgery tend to have a higher frequency of delirium, and delirium is strongly associated with poor surgical outcomes. This study evaluated the association between pre-operative medication use and post-operative delirium (POD) in surgical oncology patients receiving comprehensive geriatric assessment (CGA). A total of 475 patients who were scheduled for cancer surgery and received CGA from January 2014 to June 2015 were included. The pre-operative medication review through CGA covered polypharmacy (≥5 medications), delirium-inducing medications (DIMs), fall-inducing medications (FIMs), and potentially inappropriate medications (PIMs). POD was confirmed by psychiatric consultation, and DSM-V criteria were used for diagnosing delirium. Model fit of the prediction model was assessed with the Hosmer-Lemeshow goodness-of-fit test, effect size was measured using the Nagelkerke R², and discrimination was assessed by the area under the receiver operating characteristic curve (AUROC). Two models were constructed for multivariate analysis based on the univariate analysis: model I included dementia and DIM in addition to age and sex, and model II included PIM instead of the DIM of model I. Each one-year increase in age increased the risk of POD by about 1.1-fold. DIM was a significant factor for POD after adjusting for confounders (AOR 12.78, 95% CI 2.83-57.74), as was PIM (AOR 5.53, 95% CI 2.03-15.05). The Hosmer-Lemeshow test results revealed good fits for both models (χ² = 3.842, p = 0.871 for model I and χ² = 8.130, p = 0.421 for model II). The Nagelkerke R² effect size and AUROC were 0.215 and 0.833 for model I, and 0.174 and 0.819 for model II. These results suggest that pharmacists' comprehensive review of pre-operative medication use is critical for post-operative outcomes such as delirium in older patients.
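The Hosmer-Lemeshow goodness-of-fit test used above can be sketched in a few lines of pure Python. This is an illustrative sketch, not the study's code; the decile grouping and the convention of comparing the statistic against a chi-squared distribution with (groups - 2) degrees of freedom are the usual textbook assumptions.

```python
def hosmer_lemeshow(probs, outcomes, groups=10):
    """Hosmer-Lemeshow chi-squared: sort subjects by predicted risk, split
    them into `groups` bins, and compare observed vs expected event counts.
    (Conventionally tested against chi-squared with groups - 2 df.)"""
    paired = sorted(zip(probs, outcomes))
    n = len(paired)
    chi2 = 0.0
    for g in range(groups):
        chunk = paired[g * n // groups:(g + 1) * n // groups]
        if not chunk:
            continue
        m = len(chunk)
        expected = sum(p for p, _ in chunk)   # expected events in this bin
        observed = sum(y for _, y in chunk)   # observed events in this bin
        denom = expected * (1 - expected / m)
        if denom > 0:
            chi2 += (observed - expected) ** 2 / denom
    return chi2
```

A small chi-squared (relative to its degrees of freedom) indicates good calibration, which is how the p = 0.871 and p = 0.421 results above are read.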
Kim, Minsoo; Kim, Yejin; Kim, Hyosoo; Piao, Wenhua; Kim, Changwon
2016-06-01
An operator decision support system (ODSS) is proposed to support operators of wastewater treatment plants (WWTPs) in making appropriate decisions. This system accounts for water quality (WQ) variations in WWTP influent and effluent and in the receiving water body (RWB). The proposed system comprises two diagnosis modules, three prediction modules, and a scenario-based supporting module (SSM). In the diagnosis modules, the WQs of the WWTP influent and effluent and of the RWB are assessed via multivariate analysis. Three prediction modules, based on the k-nearest neighbors (k-NN) method, the activated sludge model no. 2d (ASM2d), and the QUAL2E model, are used to forecast WQs 3 days in advance. To compare operating alternatives, the SSM is applied to test various predetermined operating conditions in terms of the overall oxygen transfer coefficient (Kla), waste sludge flow rate (Qw), return sludge flow rate (Qr), and internal recycle flow rate (Qir). In the case of unacceptable total phosphorus (TP), the SSM provides appropriate information for chemical treatment. The constructed ODSS was tested using data collected from the Geumho River, which was the RWB, and S WWTP in Daegu City, South Korea. The results demonstrate the capability of the proposed ODSS to provide WWTP operators with more objective qualitative and quantitative assessments of WWTP and RWB WQs. Moreover, the current study shows that the ODSS, using data collected from the study area, can be used to identify operational alternatives through the SSM at an integrated urban wastewater management level.
Learning Bayesian networks for discrete data
Liang, Faming; Zhang, Jian
2009-01-01
Bayesian networks have received much attention in the recent literature. In this article, we propose an approach to learn Bayesian networks using the stochastic approximation Monte Carlo (SAMC) algorithm. Our approach has two nice features. Firstly
International Nuclear Information System (INIS)
Morioka, C.; Brown, K.; Dalter, S.; Milos, M.J.; Huang, H.K.; Kangarloo, H.; Boechat, I.M.; Batra, P.
1988-01-01
Receiver operating characteristic (ROC) analysis was used to compare the image quality of films obtained digitally using computed radiography (CR) with that of conventional analog films obtained following fluoroscopic examination. Twenty-four cases, some with a solitary noncalcified nodule and/or a pneumothorax, were collected. Ten radiologists were tested viewing the analog and CR digital films separately. Preliminary results indicate no significant difference in the ability to detect either a pneumothorax or a solitary noncalcified nodule when comparing CR digital film with conventional analog film. A comparison of the CR digital image displayed on a 2,048-line monitor against analog and CR digital film is in progress.
International Nuclear Information System (INIS)
1987-01-01
The DOE Office of Storage and Transportation Systems is responsible for the development and management of a transportation system to provide all the necessary services for the transportation of the spent fuel and wastes from reactor sites to repositories. DOE/ORO has requested Oak Ridge Associated Universities (ORAU) to assist DOE in developing rosters of sources of transportation expertise in: (1) carrier operations; (2) transportation management, planning, and logistics; (3) transportation equipment; (4) transportation facilities design and operation; (5) vehicle safety; and (6) transportation operations quality assurance; as related to truck, rail, barge, and intermodal transportation. Persons or organizations with experience in shipping of non-hazardous materials, spent nuclear fuel, other radioactive materials, and/or other hazardous materials were included in the information system. A mailed inquiry was sent to over 2300 potential sources of transportation expertise. Responses were received from 207 persons and 254 organizations. Section 1 contains the identification numbers of the individuals and organizations that responded. Section 2 contains identification codes, names, addresses, and phone numbers of each of the individual and organization respondents. The reader can refer to Section 2 for the name and address of the respondents for the identification codes listed for each technical area/experience base in Section 1
Obuchowski, Nancy A.; Bullen, Jennifer A.
2018-04-01
Receiver operating characteristic (ROC) analysis is a tool used to describe the discrimination accuracy of a diagnostic test or prediction model. While sensitivity and specificity are the basic metrics of accuracy, they have many limitations when characterizing test accuracy, particularly when comparing the accuracies of competing tests. In this article we review the basic study design features of ROC studies, illustrate sample size calculations, present statistical methods for measuring and comparing accuracy, and highlight commonly used ROC software. We include descriptions of multi-reader ROC study design and analysis, address frequently seen problems of verification and location bias, discuss clustered data, and provide strategies for testing endpoints in ROC studies. The methods are illustrated with a study of transmission ultrasound for diagnosing breast lesions.
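The basic ROC machinery reviewed above can be made concrete in a short pure-Python sketch (illustrative only, not from the article): sweeping a "score at or above threshold is positive" rule over every observed score traces the empirical ROC curve, and the area under it equals the Mann-Whitney probability that a random diseased case outscores a random non-diseased case.

```python
def roc_points(neg, pos):
    """Empirical ROC curve: sweep a 'score >= threshold' decision rule
    over every observed score, from strictest to most lenient."""
    thresholds = sorted(set(neg) | set(pos), reverse=True)
    pts = [(0.0, 0.0)]
    for t in thresholds:
        fpr = sum(s >= t for s in neg) / len(neg)   # 1 - specificity
        tpr = sum(s >= t for s in pos) / len(pos)   # sensitivity
        pts.append((fpr, tpr))
    return pts

def auc(neg, pos):
    """Area under the empirical ROC curve, computed directly as the
    Mann-Whitney probability P(positive score > negative score),
    with ties counted as half."""
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

For perfectly separated groups the AUC is 1.0; for completely overlapping groups it falls to 0.5, the chance level against which diagnostic tests are compared.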
Xu, Lili; Luo, Shuqian
2010-11-01
Microaneurysms (MAs) are the first manifestations of diabetic retinopathy (DR) as well as an indicator of its progression. Their automatic detection plays a key role in both mass screening and monitoring and is therefore at the core of any system for computer-assisted diagnosis of DR. The algorithm comprises the following stages: candidate detection, aiming to extract the patterns possibly corresponding to MAs based on the mathematical-morphology black top hat; feature extraction, to characterize these candidates; and classification based on a support vector machine (SVM), to validate MAs. Selection of the feature vector and of the SVM kernel function is very important to the algorithm. We use the receiver operating characteristic (ROC) curve to evaluate the distinguishing performance of different feature vectors and different SVM kernel functions. The ROC analysis indicates that the quadratic polynomial SVM with a combination of features as input shows the best discriminating performance.
Receiver Operator Characteristic Tools Graphic User Interface Extension for R Commander
Directory of Open Access Journals (Sweden)
Daniel Corneliu LEUCUŢA
2015-12-01
Full Text Available Background: A Receiver Operator Characteristic (ROC) curve is a graphical plot that presents the performance of a binary classifier as the discrimination cutoff is varied. The aim of this work was to create an extension for R Commander that offers a graphical user interface for the Receiver Operator Characteristic tools provided by several existing command-line packages such as pROC and ROCR. Material and Methods: The extension was built and tested with R version 3.2.0 and R Commander 2.1-7. Results: We built an extension called RcmdrPlugin.ROC that we uploaded to the CRAN servers. The extension adds a new menu called ROC, along with two submenus, pROC and ROCR, that broadly correspond to the commands available to access the functions of these packages. The pROC menu offers several commands: to plot a ROC curve for a dataset or for a logistic regression model, and to compare paired and unpaired ROC curves, each providing the following tabs: General (to select the variables for the analysis, with options for switching cases with controls); Smoothing (allowing the user to select different types of smoothing: binormal, density, or fitted distributions such as normal and lognormal); AUC (to specify the partial area under the curve (AUC) options); CI (to select the confidence interval (CI) options: the level and the computing method, e.g., DeLong or bootstrap); and Plot (for the plotting options). The ROCR dialogue window offers more options for choosing the performance measures for the plot. Conclusion: The RcmdrPlugin.ROC extension helps less advanced users of R access ROC tools in a friendly graphical user interface.
25 CFR 47.3 - How does a Bureau-operated school find out how much funding it will receive?
2010-04-01
... EDUCATION UNIFORM DIRECT FUNDING AND SUPPORT FOR BUREAU-OPERATED SCHOOLS § 47.3 How does a Bureau-operated school find out how much funding it will receive? The Office of Indian Education Programs (OIEP) will...
STUDY ON THE ACCOUNTING OF ADJUSTMENTS FOR RECEIVABLES DEPRECIATION IN OPERATIONS WITH CLIENTS
Directory of Open Access Journals (Sweden)
ILIE RĂSCOLEAN
2015-12-01
Full Text Available The present work addresses an important topic for credit institutions: limiting credit risk, determining the prudential value adjustments for the depreciation of client credits, and reflecting these adjustments in the accounting statements. The Chart of Accounts applicable to credit institutions includes the class-two accounts (Operations with clients) as well as expense and revenue accounts, which record the adjustments for receivables depreciation in operations with clients. To constitute, diminish, or annul depreciations, credit institutions are obliged to classify their credits into categories according to three criteria: financial performance, debt service, and the initiation of juridical procedures. In the present study, we show how the credit portfolio is structured according to these criteria and how the prudential value adjustments are reflected in accounting. The article ends with the authors' conclusions regarding how credits are classified into risk categories as the classification criteria evolve.
DEFF Research Database (Denmark)
Jensen, Kasper Lynge; Toftum, Jørn; Friis-Hansen, Peter
2009-01-01
A Bayesian Network approach has been developed that can compare different building designs by estimating the effects of the thermal indoor environment on the mental performance of office workers. A part of this network is based on the compilation of subjective thermal sensation data and the assoc...
Lesaffre, Emmanuel
2012-01-01
The growth of biostatistics has been phenomenal in recent years and has been marked by considerable technical innovation in both methodology and computational practicality. One area that has experienced significant growth is Bayesian methods. The growing use of Bayesian methodology has taken place partly due to an increasing number of practitioners valuing the Bayesian paradigm as matching that of scientific discovery. In addition, computational advances have allowed for more complex models to be fitted routinely to realistic data sets. Through examples, exercises and a combination of introd
Receiver operating characteristic curve evaluation on computed radiography: an experimental study
International Nuclear Information System (INIS)
Yu Zixi; Wang Changyuan; Xu Yue; Xie Jindong; Zhang Menglong; Wang Jian
2003-01-01
Objective: To get the maximum information from computed radiography (CR) output images by changing post-processing parameters. Methods: Six experimental photos of polypropylene balls 2.0 mm in diameter were taken by means of a single X-ray exposure of an imaging plate (IP), with different post-processing parameters including rotation amount (GA), gray gradation type (GT), rotation center (GC), shifting amount (GS), frequency rank (RN), frequency type (RT), and degree of enhancement (RE). The six photos were viewed by three students and one radiologist on a 6000 lx illuminance viewbox. Receiver operating characteristic (ROC) curves were constructed by means of the 5-value-differentiation method. Results: The largest mean area under the ROC curve (Az), for a low-contrast experimental photo with post-processing parameters GA=1.0, GT=A, GC=1.6, GS=0.3, RN=4.0, RT=R, and RE=3.0, was 0.96, yielding the maximum information. The smallest mean area value (Az) was 0.78 with the changed post-processing parameters GA=0.8, GS=-0.2, and RE=0.5, the other parameters being unchanged; this photo yielded the minimum information. Conclusion: In order to get the maximum information from a CR output image, the post-processing parameters should be suitably selected.
Linden, Ariel
2006-04-01
Diagnostic or predictive accuracy concerns are common in all phases of a disease management (DM) programme, and ultimately play an influential role in the assessment of programme effectiveness. Areas such as the identification of diseased patients, predictive modelling of future health status and costs, and risk stratification are just a few of the domains in which assessment of accuracy is beneficial, if not critical. The most commonly used analytical model for this purpose is the standard 2 x 2 table method in which sensitivity and specificity are calculated. However, there are several limitations to this approach, including the reliance on a single defined criterion or cut-off for determining a true-positive result, the use of non-standardized measurement instruments, and sensitivity to outcome prevalence. This paper introduces receiver operator characteristic (ROC) analysis as a more appropriate and useful technique for assessing diagnostic and predictive accuracy in DM. Its advantages include: testing accuracy across the entire range of scores, thereby not requiring a predetermined cut-off point; easily examined visual and statistical comparisons across tests or scores; and independence from outcome prevalence. Therefore, the implementation of ROC as an evaluation tool should be strongly considered in the various phases of a DM programme.
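The contrast drawn above, a single 2 x 2 table versus the full ROC sweep, can be sketched as follows (an illustrative pure-Python sketch, not code from the paper): the first function is the single-cutoff view, and the second simply repeats it at every observed score so that no cut-off has to be fixed in advance.

```python
def sens_spec_at(scores, labels, cutoff):
    """The single-cutoff 2 x 2 view: sensitivity and specificity for one
    fixed decision threshold ('score >= cutoff' is called positive)."""
    tp = sum(1 for s, y in zip(scores, labels) if y == 1 and s >= cutoff)
    fn = sum(1 for s, y in zip(scores, labels) if y == 1 and s < cutoff)
    tn = sum(1 for s, y in zip(scores, labels) if y == 0 and s < cutoff)
    fp = sum(1 for s, y in zip(scores, labels) if y == 0 and s >= cutoff)
    return tp / (tp + fn), tn / (tn + fp)

def roc_sweep(scores, labels):
    """The ROC view: the same calculation repeated at every observed score,
    yielding (cutoff, sensitivity, specificity) triples for the whole range."""
    return [(c, *sens_spec_at(scores, labels, c)) for c in sorted(set(scores))]
```

Reporting the whole sweep rather than one row of it is precisely what makes ROC analysis independent of a predetermined cut-off.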
Design of a receiver operating characteristic (ROC) study of 10:1 lossy image compression
Collins, Cary A.; Lane, David; Frank, Mark S.; Hardy, Michael E.; Haynor, David R.; Smith, Donald V.; Parker, James E.; Bender, Gregory N.; Kim, Yongmin
1994-04-01
The digital archiving system at Madigan Army Medical Center (MAMC) uses a 10:1 lossy data compression algorithm for most forms of computed radiography. A systematic study on the potential effect of lossy image compression on patient care has been initiated with a series of studies focused on specific diagnostic tasks. The studies are based upon the receiver operating characteristic (ROC) method of analysis for diagnostic systems. The null hypothesis is that observer performance with approximately 10:1 compressed and decompressed images is not different from using original, uncompressed images for detecting subtle pathologic findings seen on computed radiographs of bone, chest, or abdomen, when viewed on a high-resolution monitor. Our design involves collecting cases from eight pathologic categories. Truth is determined by committee using confirmatory studies performed during routine clinical practice whenever possible. Software has been developed to aid in case collection and to allow reading of the cases for the study using stand-alone Siemens Litebox workstations. Data analysis uses two methods, ROC analysis and free-response ROC (FROC) methods. This study will be one of the largest ROC/FROC studies of its kind and could benefit clinical radiology practice using PACS technology. The study design and results from a pilot FROC study are presented.
Thompson, J; Hogg, P; Thompson, S; Manning, D; Szczepura, K
2012-01-01
ROCView has been developed as an image display and response capture (IDRC) solution for the consistent recording of reader responses under the free-response receiver operating characteristic paradigm. A web-based solution to IDRC for observer response studies allows observations to be completed from any location, assuming that display performance and viewing conditions are consistent with the study being completed. The simple functionality of the software allows observations to be completed without supervision. ROCView can display images from multiple modalities, in a randomised order if required. Following registration, observers are prompted to begin their image evaluation. All data are recorded via mouse clicks, one to localise (mark) and one to score confidence (rate) using either an ordinal or a continuous rating scale. Up to nine "mark-rating" pairs can be made per image. Unmarked images are given a default score of zero. Upon completion of the study, both true-positive and false-positive reports can be downloaded and adapted for analysis. ROCView has the potential to be a useful tool in the assessment of modality performance differences for a range of imaging methods. PMID:22573294
Receiver operating characteristic analysis of age-related changes in lineup performance.
Humphries, Joyce E; Flowe, Heather D
2015-04-01
In the basic face memory literature, support has been found for the late maturation hypothesis, which holds that face recognition ability is not fully developed until at least adolescence. Support for the late maturation hypothesis in the criminal lineup identification literature, however, has been equivocal because of the analytic approach that has been used to examine age-related changes in identification performance. Recently, receiver operator characteristic (ROC) analysis was applied for the first time in the adult eyewitness memory literature to examine whether memory sensitivity differs across different types of lineup tests. ROC analysis allows for the separation of memory sensitivity from response bias in the analysis of recognition data. Here, we have made the first ROC-based comparison of adults' and children's (5- and 6-year-olds and 9- and 10-year-olds) memory performance on lineups by reanalyzing data from Humphries, Holliday, and Flowe (2012). In line with the late maturation hypothesis, memory sensitivity was significantly greater for adults compared with young children. Memory sensitivity for older children was similar to that for adults. The results indicate that the late maturation hypothesis can be generalized to account for age-related performance differences on an eyewitness memory task. The implications for developmental eyewitness memory research are discussed. Copyright © 2014 Elsevier Inc. All rights reserved.
Mickes, Laura; Flowe, Heather D; Wixted, John T
2012-12-01
A police lineup presents a real-world signal-detection problem because there are two possible states of the world (the suspect is either innocent or guilty), some degree of information about the true state of the world is available (the eyewitness has some degree of memory for the perpetrator), and a decision is made (identifying the suspect or not). A similar state of affairs applies to diagnostic tests in medicine because, in a patient, the disease is either present or absent, a diagnostic test yields some degree of information about the true state of affairs, and a decision is made about the presence or absence of the disease. In medicine, receiver operating characteristic (ROC) analysis is the standard method for assessing diagnostic accuracy. By contrast, in the eyewitness memory literature, this powerful technique has never been used. Instead, researchers have attempted to assess the diagnostic performance of different lineup procedures using methods that cannot identify the better procedure (e.g., by computing a diagnosticity ratio). Here, we describe the basics of ROC analysis, explaining why it is needed and showing how to use it to measure the performance of different lineup procedures. To illustrate the unique advantages of this technique, we also report 3 ROC experiments that were designed to investigate the diagnostic accuracy of simultaneous versus sequential lineups. According to our findings, the sequential procedure appears to be inferior to the simultaneous procedure in discriminating between the presence versus absence of a guilty suspect in a lineup.
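The confidence-based ROC construction the authors advocate can be sketched minimally in Python (this is an illustration, not the authors' code). Identification counts are binned by confidence level, highest confidence first, and accumulated into (false-ID rate, correct-ID rate) points. Normalizing by total identifications is a simplifying assumption here; published lineup ROCs are typically partial curves normalized by the number of lineups of each type.

```python
def ratings_roc(guilty_counts, innocent_counts):
    """Cumulative ROC points from suspect-ID counts binned by confidence,
    ordered from highest to lowest confidence level."""
    n_g, n_i = sum(guilty_counts), sum(innocent_counts)
    hits = false_ids = 0
    points = []
    for g, i in zip(guilty_counts, innocent_counts):
        hits += g          # guilty-suspect IDs at this confidence or higher
        false_ids += i     # innocent-suspect IDs at this confidence or higher
        points.append((false_ids / n_i, hits / n_g))
    return points
```

A procedure whose curve lies above another's at matched false-ID rates discriminates better, which is the comparison the diagnosticity ratio cannot make.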
Slotnick, Scott D; Jeye, Brittany M; Dodson, Chad S
2016-01-01
Is recollection a continuous/graded process or a threshold/all-or-none process? Receiver operating characteristic (ROC) analysis can answer this question as the continuous model and the threshold model predict curved and linear recollection ROCs, respectively. As memory for plurality, an item's previous singular or plural form, is assumed to rely on recollection, the nature of recollection can be investigated by evaluating plurality memory ROCs. The present study consisted of four experiments. During encoding, words (singular or plural) or objects (single/singular or duplicate/plural) were presented. During retrieval, old items with the same plurality or different plurality were presented. For each item, participants made a confidence rating ranging from "very sure old", which was correct for same plurality items, to "very sure new", which was correct for different plurality items. Each plurality memory ROC was the proportion of same versus different plurality items classified as "old" (i.e., hits versus false alarms). Chi-squared analysis revealed that all of the plurality memory ROCs were adequately fit by the continuous unequal variance model, whereas none of the ROCs were adequately fit by the two-high threshold model. These plurality memory ROC results indicate recollection is a continuous process, which complements previous source memory and associative memory ROC findings.
Quilty, Lena C; Avila Murati, Daniela; Bagby, R Michael
2014-03-01
Many gamblers would prefer to reduce gambling on their own rather than to adopt an abstinence approach within the context of a gambling treatment program. Yet responsible gambling guidelines lack quantifiable markers to guide gamblers in wagering safely. To address these issues, the current investigation implemented receiver operating characteristic (ROC) analysis to identify behavioral indicators of harmful and problem gambling. Gambling involvement was assessed in 503 participants (275 psychiatric outpatients and 228 community gamblers) with the Canadian Problem Gambling Index. Overall gambling frequency, duration, and expenditure were able to distinguish harmful and problematic gambling at a moderate level. Indicators of harmful gambling were generated for engagement in specific gambling activities: frequency of tickets and casino; duration of bingo, casino, and investments; and expenditures on bingo, casino, sports betting, games of skill, and investments. Indicators of problem gambling were similarly produced for frequency of tickets and casino, and expenditures on bingo, casino, games of skill, and investments. Logistic regression analyses revealed that overall gambling frequency uniquely predicted the presence of harmful and problem gambling. Furthermore, frequency indicators for tickets and casino uniquely predicted the presence of both harmful and problem gambling. Together, these findings contribute to the development of an empirically based method enabling the minimization of harmful or problem gambling through self-control rather than abstinence.
Bredemeier, Keith; Spielberg, Jeffrey M.; Silton, Rebecca Levin; Berenbaum, Howard; Heller, Wendy; Miller, Gregory A.
2010-01-01
The present study examined the utility of the anhedonic depression scale from the Mood and Anxiety Symptoms Questionnaire (MASQ-AD) as a way to screen for depressive disorders. Using receiver-operator characteristic analysis, the sensitivity and specificity of the full 22-item MASQ-AD scale, as well as of the 8- and 14-item subscales, were examined in relation to both current and lifetime DSM-IV depressive disorder diagnoses in two nonpatient samples. As a means of comparison, the sensitivity and specificity of a measure of a relevant personality dimension, neuroticism, were also examined. Results from both samples support the clinical utility of the MASQ-AD scale as a means of screening for depressive disorders. Findings were strongest for the MASQ-AD 8-item subscale and when predicting current depression status. Furthermore, the MASQ-AD 8-item subscale outperformed the neuroticism measure under certain conditions. The overall usefulness of the MASQ-AD scale as a screening device is discussed, as well as possible cutoff scores for use in research. PMID:20822283
Directory of Open Access Journals (Sweden)
Heidi L. Weiss
2004-01-01
Full Text Available The role of biomarkers in disease prognosis continues to be an important investigation in many cancer studies. In order for these biomarkers to have practical application in clinical decision making regarding patient treatment and follow-up, it is common to dichotomize patients into those with low vs. high expression levels. In this study, receiver operating characteristic (ROC) curves, the area under the ROC curve (AUC), sensitivity, specificity, and likelihood ratios were calculated to determine the levels of growth factor biomarkers that best differentiate lung cancer cases from control subjects. Selected cut-off points for p185erbB-2 and EGFR membrane appear to have good discriminating power to differentiate control tissues from uninvolved tissues of patients with lung cancer (AUC = 89% and 90%, respectively), while the AUC increased to at least 90% for selected cut-off points for p185erbB-2 membrane, EGFR membrane, and FASE when comparing control versus carcinoma tissues from lung cancer cases. Using data from control subjects compared to patients with lung cancer, we present a simple and intuitive approach to determine dichotomized levels of biomarkers and validate the value of these biomarkers as surrogate endpoints for cancer outcome.
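The dichotomization and likelihood-ratio quantities used above reduce to two small helpers, sketched here in pure Python as an illustration (not the study's code): once a cut-off fixes a sensitivity and specificity, LR+ and LR- follow directly.

```python
def likelihood_ratios(sensitivity, specificity):
    """Likelihood ratios for a dichotomized biomarker at a chosen cut-off:
    LR+ = sens / (1 - spec)  (how much a 'high' result raises the odds),
    LR- = (1 - sens) / spec  (how much a 'low' result lowers them)."""
    return sensitivity / (1 - specificity), (1 - sensitivity) / specificity

def dichotomize(expression_levels, cutoff):
    """Split patients into 'low' vs 'high' expression at the chosen cut-off."""
    return ["high" if x >= cutoff else "low" for x in expression_levels]
```

For example, a cut-off giving sensitivity 0.9 and specificity 0.8 yields LR+ = 4.5, meaning a high-expression result multiplies the pre-test odds of disease by 4.5.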
Receiver operating characteristic analysis of regional cerebral blood flow in Alzheimer's disease
International Nuclear Information System (INIS)
Zemcov, A.; Barclay, L.L.; Sansone, J.; Metz, C.E.
1985-01-01
Receiver operating characteristic (ROC) curves were used to quantitatively assess the ability of individual detectors in a 32-detector ¹³³Xe inhalation system to discriminate between two populations over the range of regional cerebral blood flow (rCBF) values. These populations were clinically evaluated as normal (age 63.1 ± 13.1, n = 23) and presumed Alzheimer's disease (age 72.7 ± 7.0, n = 82). Summary statistics showed that for homologous detectors the average blood flow in the normal group was greater than that in the group with Alzheimer's disease. Inferences about hemisphere asymmetries drawn from single flow values or from mean hemispheric flow can be erroneous. However, the relationship between correct identifications (true positives) and incorrect identifications (false positives) of Alzheimer's disease at each detector varies over the range of blood flow values, and quantitative characterization of this relationship as an ROC curve provides more insight into the structure of the data. Detectors approximating the speech, auditory, and association cortex were most effective in discriminating between groups; frontal detectors were only marginally useful diagnostically.
DEFF Research Database (Denmark)
Liu, Dedi; Li, Xiang; Guo, Shenglian
2015-01-01
Dynamic control of the flood limiting water level (FLWL) is a valuable and effective way to maximize the benefits from reservoir operation without exceeding the design risk. In order to analyze the impacts of input uncertainty, a Bayesian forecasting system (BFS) is adopted. Applying quantile water...... inflow values and their uncertainties obtained from the BFS, the reservoir operation results from different schemes can be analyzed in terms of benefits, dam safety, and downstream impacts during the flood season. When the reservoir FLWL dynamic control operation is implemented, there are two fundamental......, also deterministic water inflow was tested. The proposed model in the paper emphasizes the importance of analyzing the uncertainties of the water inflow forecasting system for real-time dynamic control of the FLWL for reservoir operation. For the case study, the selected quantile inflow from...
ZHOU, Lin
1996-01-01
In this paper I consider social choices under uncertainty. I prove that any social choice rule that satisfies independence of irrelevant alternatives, translation invariance, and weak anonymity is consistent with ex post Bayesian utilitarianism
International Nuclear Information System (INIS)
Lai, C J; Shaw, Chris C; Whitman, Gary J; Yang, Wei T; Dempsey, Peter J; Nguyen, Victoria; Ice, Mary F
2006-01-01
The aim of this study was to compare mammography systems based on three different detectors, namely a conventional screen-film (SF) combination, an a-Si/CsI flat-panel (FP) detector, and a charge-coupled device (CCD)-based x-ray phosphor detector, for their performance in detecting simulated microcalcifications (MCs). Calcium carbonate grains of 112-150 μm were used to simulate MCs and were overlapped with a slab phantom of simulated 50% adipose/50% glandular breast-tissue-equivalent material, referred to as the uniform background. For the tissue structure background, 200-250 μm calcium carbonate grains were used and overlapped with an anthropomorphic breast phantom. All MC phantom images were acquired with and without magnification (1.8X). The hardcopy images were reviewed by five mammographers, and a five-point confidence rating was used to score each detection task. Receiver operating characteristic (ROC) analysis was performed, and the areas under the ROC curves (Az values) were used to compare the performances of the three mammography systems under various conditions. The results showed that, with a uniform background and contact images, the FP-based system performed significantly better than the SF and CCD-based systems. For magnified images with a uniform background, the SF and FP-based systems performed equally well and significantly better than the CCD-based system. With a tissue structure background and contact images, the SF system performed significantly better than the FP and CCD-based systems. With magnified images and a tissue structure background, the SF and CCD-based systems performed equally well and significantly better than the FP-based system. In the detection of MCs in the fibroglandular and the heterogeneously dense regions, no significant differences were found except that the SF system performed significantly better than the CCD-based system in the fibroglandular regions for the contact images.
Koen, Joshua D; Barrett, Frederick S; Harlow, Iain M; Yonelinas, Andrew P
2017-08-01
Signal-detection theory, and the analysis of receiver-operating characteristics (ROCs), has played a critical role in the development of theories of episodic memory and perception. The purpose of the current paper is to present the ROC Toolbox. This toolbox is a set of functions written in the Matlab programming language that can be used to fit various common signal detection models to ROC data obtained from confidence rating experiments. The goals for developing the ROC Toolbox were to create a tool (1) that is easy to use and easy for researchers to implement with their own data, (2) that can flexibly define models based on varying study parameters, such as the number of response options (e.g., confidence ratings) and experimental conditions, and (3) that provides optimization routines (e.g., maximum likelihood estimation) to obtain parameter estimates and numerous goodness-of-fit measures. The ROC Toolbox allows for various confidence scales and currently includes the models commonly used in recognition memory and perception: (1) the unequal variance signal detection (UVSD) model, (2) the dual process signal detection (DPSD) model, and (3) the mixture signal detection (MSD) model. For each model fit to a given data set, the ROC Toolbox plots summary information about the best-fitting model parameters and various goodness-of-fit measures. Here, we present an overview of the ROC Toolbox, illustrate how it can be used to input and analyse real data, and finish with a brief discussion of features that can be added to the toolbox.
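The ROC Toolbox itself is written in Matlab; as a language-agnostic illustration of the kind of input it fits, the sketch below builds empirical ROC points from confidence-rating counts by cumulating responses from most to least confident "old". The rating counts are invented for illustration, not taken from the paper.

```python
# Illustrative sketch: empirical ROC points from confidence-rating counts,
# the kind of data signal detection models are fit to. Counts are hypothetical.

def roc_points(target_counts, lure_counts):
    """Cumulate rating counts from most to least confident 'old' response to
    obtain (false-alarm rate, hit rate) pairs along the empirical ROC."""
    n_targets, n_lures = sum(target_counts), sum(lure_counts)
    points, hits, false_alarms = [], 0, 0
    for t, l in zip(target_counts, lure_counts):
        hits += t
        false_alarms += l
        points.append((false_alarms / n_lures, hits / n_targets))
    return points

# 6-point scale, ordered from "sure old" to "sure new" (hypothetical data)
targets = [60, 30, 20, 15, 10, 5]
lures = [5, 10, 15, 20, 30, 60]
for far, hr in roc_points(targets, lures):
    print(f"FAR={far:.3f}  HR={hr:.3f}")
```

By construction the points are monotonically non-decreasing in both coordinates, ending at (1, 1); a model such as UVSD or DPSD is then fit to these points by maximum likelihood.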
ROC-ing along: Evaluation and interpretation of receiver operating characteristic curves.
Carter, Jane V; Pan, Jianmin; Rai, Shesh N; Galandiuk, Susan
2016-06-01
It is vital for clinicians to correctly understand and interpret the medical statistics used in clinical studies. In this review, we address current issues and focus on delivering a simple, yet comprehensive, explanation of common research methodology involving receiver operating characteristic (ROC) curves. ROC curves are used most commonly in medicine as a means of evaluating diagnostic tests. Sample data from a plasma test for the diagnosis of colorectal cancer were used to generate a prediction model. These are actual, unpublished data that have been used to describe the calculation of sensitivity, specificity, positive and negative predictive values, and accuracy. The ROC curves were generated to determine the accuracy of this plasma test. These curves are generated by plotting the sensitivity (true-positive rate) on the y axis and 1 - specificity (false-positive rate) on the x axis. Curves that approach closest to the coordinate (x = 0, y = 1) are more highly predictive, whereas ROC curves that lie close to the line of equality indicate that the result is no better than that obtained by chance. The optimum sensitivity and specificity can be determined from the graph as the point where the minimum distance line crosses the ROC curve. This point corresponds to the Youden index (J), a function of sensitivity and specificity commonly used to rate diagnostic tests. The area under the curve is used to quantify the overall ability of a test to discriminate between 2 outcomes. By following these simple guidelines, ROC curves will be less difficult to interpret and can be used more reliably when writing, reviewing, or analyzing scientific papers. Copyright © 2016 Elsevier Inc. All rights reserved.
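The screening statistics this review walks through all come from a 2x2 table of test outcome versus disease status. A minimal sketch, using hypothetical counts rather than the paper's plasma-test data:

```python
# Diagnostic test statistics from a 2x2 table (counts are hypothetical).
def diagnostic_stats(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),   # true-positive rate
        "specificity": tn / (tn + fp),   # 1 - false-positive rate
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
    }

stats = diagnostic_stats(tp=80, fp=20, fn=10, tn=90)
for name, value in stats.items():
    print(f"{name}: {value:.3f}")
```

Sensitivity and 1 - specificity computed at each candidate test threshold are exactly the (y, x) pairs that trace out the ROC curve described above.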
Directory of Open Access Journals (Sweden)
Gabriella Ferruzzi
2013-02-01
Full Text Available A new short-term probabilistic forecasting method is proposed to predict the probability density function of the hourly active power generated by a photovoltaic system. Firstly, the probability density function of the hourly clearness index is forecasted making use of a Bayesian auto regressive time series model; the model takes into account the dependence of the solar radiation on some meteorological variables, such as the cloud cover and humidity. Then, a Monte Carlo simulation procedure is used to evaluate the predictive probability density function of the hourly active power by applying the photovoltaic system model to the random sampling of the clearness index distribution. A numerical application demonstrates the effectiveness and advantages of the proposed forecasting method.
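The Monte Carlo step described above can be sketched as follows. This is a toy stand-in, not the paper's method: a Beta distribution plays the role of the Bayesian autoregressive predictive distribution of the clearness index, and the PV system model is reduced to a linear irradiance-to-power conversion; all parameter values are hypothetical.

```python
# Monte Carlo sketch: sample an (assumed) predictive distribution of the
# hourly clearness index, push each sample through a simplified PV model,
# and summarise the resulting power distribution. Parameters are invented.
import random
import statistics

random.seed(0)

def pv_power(clearness, clear_sky_irr=1000.0, area_m2=10.0, eff=0.18):
    """Very simplified PV model: power (W) proportional to irradiance,
    panel area, and conversion efficiency."""
    return clearness * clear_sky_irr * area_m2 * eff

# Stand-in for the Bayesian AR forecast of the clearness index
samples = [random.betavariate(5, 3) for _ in range(10_000)]
power = sorted(pv_power(k) for k in samples)

print(f"mean power = {statistics.mean(power):.0f} W")
print(f"5th-95th percentile = {power[500]:.0f} - {power[9500]:.0f} W")
```

The empirical distribution of `power` approximates the predictive probability density function of the hourly active power; percentiles of it give probabilistic forecast intervals.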
Riggio, V.; Pesce, L.L.; Morreale, S.; Portolano, B.
2013-01-01
Using receiver-operating characteristic (ROC) curve methodology, this study was designed to assess the diagnostic effectiveness of somatic cell count (SCC) and the California mastitis test (CMT) in Valle del Belice sheep, and to propose and evaluate threshold values for those tests that would
Bredemeier, Keith; Spielberg, Jeffery M.; Silton, Rebecca Levin; Berenbaum, Howard; Heller, Wendy; Miller, Gregory A.
2010-01-01
The present study examined the utility of the anhedonic depression scale from the Mood and Anxiety Symptoms Questionnaire (MASQ-AD scale) as a way to screen for depressive disorders. Using receiver-operating characteristic analysis, we examined the sensitivity and specificity of the full 22-item MASQ-AD scale, as well as the 8- and 14-item…
The functional improvement and reduction of operators' work at LNG receiving terminal
International Nuclear Information System (INIS)
Tomiyama, H.
1997-01-01
The Tokyo Gas Negishi Terminal has undergone a series of major changes since starting operation in 1966, including a change in the main feedstock from oil to LNG, and expansion of processing volume and scale. Control of the terminal has been in the form of centralized control and monitoring from a central control room. High technical levels have been maintained, this being one of the first terminals to adopt direct digital control (DDC) as the technology became available. In 1995, a distributed control system (DCS) was introduced as part of a large-scale redevelopment project at the Negishi Terminal, extending the scope of operations and monitoring by operators through full automation of controls, and improvement of functions including integration and upgrading of monitoring. The result has been a significant reduction in the workload on operators. The installation of these functions required further investment of around 1 billion yen, in addition to the cost of renewal of the facility. In spite of the major expansion of the range of facilities under control, the number of operators working 24-hour shifts has been reduced, and over 15 years cost reductions equivalent to around twice the investment cost are expected to be made. (au)
International Nuclear Information System (INIS)
Le Roux, W.G.; Bello-Ochende, T.; Meyer, J.P.
2011-01-01
The small-scale open and direct solar thermal Brayton cycle with recuperator has several advantages, including low cost and low operation and maintenance costs, and is highly recommended. The main disadvantages of this cycle are the pressure losses in the recuperator and receiver, the turbomachine efficiencies and the recuperator effectiveness, which limit the net power output of such a system. The irreversibilities of the solar thermal Brayton cycle are mainly due to heat transfer across a finite temperature difference and fluid friction. In this paper, thermodynamic optimisation is applied to these disadvantages in order to optimise the receiver and recuperator and to maximise the net power output of the system at various steady-state conditions, subject to various constraints. The effects of wind, receiver inclination, rim angle, atmospheric temperature and pressure, recuperator height, solar irradiance and concentration ratio on the optimum geometries and performance were investigated. The dynamic trajectory optimisation method was applied. Operating points of a standard micro-turbine operating at its highest compressor efficiency and a parabolic dish concentrator diameter of 16 m were considered. The optimum geometries, minimum irreversibility rates and maximum receiver surface temperatures of the optimised systems are shown. For an environment with specific conditions and constraints, there exists an optimum receiver and recuperator geometry such that the system produces maximum net power output. -- Highlights: → Optimum geometries exist such that the system produces maximum net power output. → Optimum operating conditions are shown. → Minimum irreversibility rates and minimum entropy generation rates are shown. → Net power output was described in terms of total entropy generation rate. → Effects such as wind, recuperator height and irradiance were investigated.
Bibliographic study of doses received by operators with non-protected organs
International Nuclear Information System (INIS)
Aubert, B.; Rehel, J.L.
2009-01-01
Based on a literature survey, the authors present and discuss the various levels of doses received by different organs during interventional radiology. These doses depend on the procedures (close or remote), on the part of the body, on the protocols, and on the apparatus. They raise the issue of the use of a single dosimeter (worn under the protective apron, as is already required in France) or two dosimeters (a second one on a non-protected area, as is the case in some other countries).
Detector evaluation for improved situational awareness: Receiver operator characteristic curve based
Wuijckhuijse, A.L. van; Nieuwenhuizen, M.S.
2016-01-01
In military and civilian operations good situational awareness is a prerequisite to make proper decisions. The situational awareness is among others based upon intelligence, threat analysis and detection, altogether element of the so-called DIM (detection, identification, monitoring) system. In case
2011-07-25
5.10 Signal Strength Telemetry Output (SSTO) (Test Number 10). a. Purpose. This test verifies that the signal strength telemetry output (SSTO) voltage is monotonic and directly related to the
Approximate Bayesian recursive estimation
Czech Academy of Sciences Publication Activity Database
Kárný, Miroslav
2014-01-01
Roč. 285, č. 1 (2014), s. 100-111 ISSN 0020-0255 R&D Projects: GA ČR GA13-13502S Institutional support: RVO:67985556 Keywords : Approximate parameter estimation * Bayesian recursive estimation * Kullback–Leibler divergence * Forgetting Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 4.038, year: 2014 http://library.utia.cas.cz/separaty/2014/AS/karny-0425539.pdf
Part 5: Receiver Operating Characteristic Curve and Area under the Curve
Directory of Open Access Journals (Sweden)
Saeed Safari
2016-04-01
Full Text Available Multiple diagnostic tools are used by emergency physicians every day. In addition, new tools are evaluated to obtain more accurate methods and to reduce the time or cost of conventional ones. In the previous parts of this educational series, we described the performance characteristics of diagnostic tests, including sensitivity, specificity, positive and negative predictive values, and likelihood ratios. The receiver operating characteristic (ROC) curve is a graphical presentation of screening characteristics. The ROC curve is used to determine the best cutoff point and to compare two or more tests or observers by measuring the area under the curve (AUC). In this part of our educational series, we explain the ROC curve and two methods to determine the best cutoff value.
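The two common cutoff-selection methods can be sketched concretely. Given (sensitivity, specificity) pairs at candidate cutoffs (the values below are invented for illustration), one method maximises the Youden index J = sensitivity + specificity - 1, and the other minimises the distance from the ideal corner (FPR = 0, TPR = 1) of the ROC plot:

```python
# Two cut-off selection criteria on hypothetical (cutoff, sens, spec) triples.
import math

points = [  # (cutoff, sensitivity, specificity) -- invented for illustration
    (1, 0.95, 0.40), (2, 0.90, 0.60), (3, 0.80, 0.78),
    (4, 0.65, 0.88), (5, 0.45, 0.95),
]

# Method 1: maximise the Youden index J = sensitivity + specificity - 1
best_youden = max(points, key=lambda p: p[1] + p[2] - 1)

# Method 2: minimise distance from the ROC point to the corner (FPR=0, TPR=1)
best_dist = min(points, key=lambda p: math.hypot(1 - p[2], 1 - p[1]))

print("Youden cut-off:", best_youden[0])
print("min-distance cut-off:", best_dist[0])
```

On this toy data both criteria select the same cutoff, but in general they can disagree, which is why both are worth reporting.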
International Nuclear Information System (INIS)
1976-01-01
The proposed action is to issue a materials license, pursuant to 10 CFR Parts 30, 40 and 70 of the Commission's regulations, authorizing Allied-General Nuclear Services to receive and handle fuel casks containing spent reactor fuel elements and to store spent reactor fuel at the Barnwell Nuclear Fuel Plant (BNFP), in the Barnwell Fuel Receiving and Storage Station (BFRSS). The BFRSS is a part of, and contiguous to, the BNFP-Separations Facility which is being constructed on a small portion of a 1700 acre site about six miles west of the city of Barnwell in Barnwell County, South Carolina. Construction of the BFRSS facility has been completed and the BNFP Separations Facility is more than 90% complete. A Uranium Hexafluoride Facility is being constructed on the same site, and a Plutonium Product Facility is proposed to be constructed adjacent to the Separations Facility. The license that is the subject of this action will, if issued, allow the use of the BFRSS separately from the operation of the Separations Facility. Impacts resulting from the construction of the BFRSS have already occurred and mitigating measures have been and are being implemented to offset any adverse impacts. Operation of the BFRSS will not interfere with water sources, and should cause no noticeable damage to the terrestrial or aquatic environments. Operating experience at other fuel receiving and storage facilities has shown that radioactive concentrations discharged to the environs (the more significant process effluents) have been well below applicable state and federal limits. The small quantities to be released during operation of the BFRSS will result in negligible environmental impact. 20 figs
Bessiere, Pierre; Ahuactzin, Juan Manuel; Mekhnacha, Kamel
2013-01-01
Probability as an Alternative to Boolean Logic. While logic is the mathematical foundation of rational reasoning and the fundamental principle of computing, it is restricted to problems where information is both complete and certain. However, many real-world problems, from financial investments to email filtering, are incomplete or uncertain in nature. Probability theory and Bayesian computing together provide an alternative framework to deal with incomplete and uncertain data. Decision-Making Tools and Methods for Incomplete and Uncertain Data. Emphasizing probability as an alternative to Boolean
Generazio, Edward R.
2014-01-01
Unknown risks are introduced into failure critical systems when probability of detection (POD) capabilities are accepted without a complete understanding of the statistical method applied and the interpretation of the statistical results. The presence of this risk in the nondestructive evaluation (NDE) community is revealed in common statements about POD. These statements are often interpreted in a variety of ways and therefore, the very existence of the statements identifies the need for a more comprehensive understanding of POD methodologies. Statistical methodologies have data requirements to be met, procedures to be followed, and requirements for validation or demonstration of adequacy of the POD estimates. Risks are further enhanced due to the wide range of statistical methodologies used for determining the POD capability. Receiver/Relative Operating Characteristics (ROC) Display, simple binomial, logistic regression, and Bayes' rule POD methodologies are widely used in determining POD capability. This work focuses on Hit-Miss data to reveal the framework of the interrelationships between Receiver/Relative Operating Characteristics Display, simple binomial, logistic regression, and Bayes' Rule methodologies as they are applied to POD. Knowledge of these interrelationships leads to an intuitive and global understanding of the statistical data, procedural and validation requirements for establishing credible POD estimates.
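Of the POD methodologies named above, logistic regression on Hit/Miss data is the most common. The sketch below fits a logistic POD(a) curve to binary hit/miss outcomes versus flaw size with plain gradient ascent on the Bernoulli log-likelihood; the inspection data and learning settings are invented for illustration, and production POD work follows the full statistical procedures (e.g., MIL-HDBK-1823A) rather than this toy fit.

```python
# Hit/Miss POD sketch: logistic POD(a) fit by gradient ascent (toy example).
import math

# (flaw size in mm, 1 = hit, 0 = miss) -- hypothetical inspection results
data = [(0.5, 0), (0.8, 0), (1.0, 0), (1.2, 1), (1.5, 0), (1.8, 1),
        (2.0, 1), (2.4, 1), (2.8, 1), (3.0, 1), (3.5, 1), (4.0, 1)]

b0, b1 = 0.0, 0.0  # model: logit POD(a) = b0 + b1 * a
lr = 0.05
for _ in range(20_000):  # maximise the Bernoulli log-likelihood
    g0 = g1 = 0.0
    for a, hit in data:
        p = 1.0 / (1.0 + math.exp(-(b0 + b1 * a)))
        g0 += hit - p            # gradient w.r.t. intercept
        g1 += (hit - p) * a      # gradient w.r.t. slope
    b0 += lr * g0
    b1 += lr * g1

a50 = -b0 / b1  # flaw size with 50% estimated POD
print(f"b0={b0:.2f}, b1={b1:.2f}, a50={a50:.2f} mm")
```

The fitted curve rises with flaw size (b1 > 0), and quantities like a50 (or a90/95, with confidence bounds) are read off it; the validation and sample-size requirements stressed in the abstract are exactly what this toy omits.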
Directory of Open Access Journals (Sweden)
Hiroshi eSaito
2014-03-01
Full Text Available The decision making behaviors of humans and animals adapt and then satisfy an "operant matching law" in certain types of tasks. This was first pointed out by Herrnstein in his foraging experiments on pigeons. The matching law has been one landmark for elucidating the underlying processes of decision making and its learning in the brain. An interesting question is whether decisions are made deterministically or probabilistically. Conventional learning models of the matching law are based on the latter idea; they assume that subjects learn the choice probabilities of the respective alternatives and decide stochastically with those probabilities. However, it is unknown whether the matching law can be accounted for by a deterministic strategy or not. To answer this question, we propose several deterministic Bayesian decision making models that hold certain incorrect beliefs about the environment. We claim that a simple model produces behavior satisfying the matching law in static settings of a foraging task but not in dynamic settings. We found that the model holding the belief that the environment is volatile works well in the dynamic foraging task and exhibits undermatching, a slight deviation from the matching law observed in many experiments. This model also demonstrates the double-exponential reward history dependency of a choice and a heavier-tailed run-length distribution, as has recently been reported in experiments on monkeys.
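The distinction the abstract draws can be made concrete with a toy agent. The sketch below is not one of the paper's models: it is a minimal deterministic Bayesian chooser that keeps a Beta posterior over each alternative's reward probability and always picks the alternative with the higher posterior mean, so every choice is determined by its beliefs rather than sampled from learned choice probabilities. Reward probabilities and trial counts are invented.

```python
# Toy deterministic Bayesian chooser in a two-alternative Bernoulli task.
import random

random.seed(1)
true_p = [0.7, 0.3]        # hypothetical reward probabilities per alternative
alpha = [1.0, 1.0]         # Beta(alpha, beta) posterior parameters per arm
beta = [1.0, 1.0]
counts = [0, 0]

for _ in range(1000):
    means = [alpha[i] / (alpha[i] + beta[i]) for i in range(2)]
    choice = 0 if means[0] >= means[1] else 1   # deterministic: no sampling
    counts[choice] += 1
    if random.random() < true_p[choice]:        # Bernoulli reward
        alpha[choice] += 1
    else:
        beta[choice] += 1

print("choice counts:", counts)
```

Because the choice rule is greedy, the environment's stochastic rewards are the only source of variability in behavior, which is the sense in which a deterministic strategy can still generate matching-like choice statistics.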
DEFF Research Database (Denmark)
Jensen, Finn Verner; Nielsen, Thomas Dyhre
2016-01-01
Mathematically, a Bayesian graphical model is a compact representation of the joint probability distribution for a set of variables. The most frequently used type of Bayesian graphical model is the Bayesian network. The structural part of a Bayesian graphical model is a graph consisting of nodes...
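The compactness claim can be illustrated with a tiny concrete instance: a three-node network A → B, A → C, where the joint distribution factorises as P(A, B, C) = P(A) · P(B | A) · P(C | A), so three small tables replace one eight-entry joint table. The probability values below are invented.

```python
# Tiny Bayesian network: A -> B, A -> C, with hypothetical probability tables.
p_a = {True: 0.3, False: 0.7}
p_b_given_a = {True: {True: 0.9, False: 0.1},
               False: {True: 0.2, False: 0.8}}
p_c_given_a = {True: {True: 0.6, False: 0.4},
               False: {True: 0.1, False: 0.9}}

def joint(a, b, c):
    """P(A=a, B=b, C=c) via the network factorisation."""
    return p_a[a] * p_b_given_a[a][b] * p_c_given_a[a][c]

# The factorisation defines a proper distribution: entries sum to 1.
total = sum(joint(a, b, c)
            for a in (True, False)
            for b in (True, False)
            for c in (True, False))
print(f"P(A=T, B=T, C=T) = {joint(True, True, True):.3f}, sum = {total:.6f}")
```

With n binary variables, the full joint needs 2^n - 1 parameters, while a sparse network needs only one small conditional table per node, which is the practical point of the representation.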
Chang, Szu-Ling; Lin, Wen-Li; Weng, Chien-Hsiang; Wu, Shye-Jao; Tsai, Hsin-Jung; Wang, Shwu-Meei; Peng, Chun-Chih; Chang, Jui-Hsing
2018-04-01
Patent ductus arteriosus (PDA) is one of the most common cardiac conditions in preterm infants. Closure of the PDA in symptomatic patients can be achieved medically or surgically. Atropine is commonly administered in general anesthesia as a premedication in this age group, but with limited evidence addressing the effect of its use. Our study examined the association between the use of atropine as a premedication in PDA ligation and the risk of post-operative respiratory complications. This retrospective cohort study included 150 newborns who had failed medical treatment for PDA and received PDA ligation during 2008-2012 in a single tertiary medical center. Ninety-two of them (61.3%) received atropine as premedication for general anesthesia while 58 (38.7%) did not. Post-operative respiratory condition, the need for cardiopulmonary resuscitation and the presence of bradycardia were measured. Patients with atropine use were associated with increased odds of respiratory acidosis in both univariate analysis (22.9% vs 7.3%; OR = 3.785, 95% CI = 1.211-11.826, p = 0.022) and multivariate analysis (OR = 4.030, 95% CI = 1.230-13.202, p = 0.021), with even higher odds of respiratory acidosis in patients receiving both atropine and ketamine. The use of atropine as premedication in general anesthesia for neonatal PDA ligation is associated with higher risk of respiratory acidosis, which worsens with the combined use of ketamine. Copyright © 2017. Published by Elsevier B.V.
Energy Technology Data Exchange (ETDEWEB)
Allen, P M
1983-09-01
Analysis of the Savannah River Plant RBOF and RRF included an evaluation of the reliability of process equipment and controls, administrative controls, and engineered safety features. The evaluation also identified potential scenarios and radiological consequences. Risks were calculated in terms of 50-year population dose commitment per year (man-rem/year) to the onsite and offsite population within an 80 km radius of RBOF and RRF, and to an individual at the plant boundary. The total 50-year onsite and offsite population radiological risks of operating the RBOF and RRF were estimated to be 1.0 man-rem/year. These risks are significantly less than the population dose of 54,000 man-rem/yr from natural background radiation in a 50-mile radius. The 50-year maximum offsite individual risk from operating the facility was estimated to be 2.1 × 10⁻⁵ rem/yr. This risk is significantly lower than the 93 mrem/yr an individual is expected to receive from natural background radiation in this area. The analysis shows that the RBOF and RRF can be operated without undue risk to onsite personnel or to the general public.
International Nuclear Information System (INIS)
Patterson, H.; Clarke, G.H.; Lombardo, P.; McKay, W.J.; Austin and Repatriation Medical Centre, Heidelberg, VIC
1999-01-01
Full text: We outline an example of receiver operating characteristic (ROC) analysis in the assessment of image quality. ROC analysis is a measure of image quality that accounts for the consequences of the decision and the role of the observer. Kim and Haynie (Nuclear Diagnostic Imaging: Practical Clinical Applications. Melbourne: Macmillan, 1987) describe ROC analysis as an 'objective approach to the evaluation of diagnostic decision making'. ROC analysis is an ideal technique for evaluating images of a Hoffman brain phantom obtained using positron emission tomography. Images have been acquired with the phantom in different positions. The position of the phantom and the time the phantom remained in each position was based on the measurements of head movement during simulated brain imaging (Patterson et al., Technologists Symposium, ANZSNM, 1998). This study was undertaken to explore the potential of ROC analysis in determining the effect of movement on the ability to detect lesions of various sizes
2014-01-01
Objective: To offer a practical demonstration of receiver operating characteristic (ROC) analyses, diagnostic efficiency statistics, and their application to clinical decision making, using a popular parent checklist to assess for potential mood disorder. Method: Secondary analyses of data from 589 families seeking outpatient mental health services, completing the Child Behavior Checklist and semi-structured diagnostic interviews. Results: Internalizing Problems raw scores discriminated mood disorders significantly better than did age- and gender-normed T scores, or an Affective Problems score. Internalizing scores ≥ 30 had a diagnostic likelihood ratio of 7.4. Conclusions: This study illustrates a series of steps in defining a clinical problem, operationalizing it, selecting a valid study design, and using ROC analyses to generate statistics that support clinical decisions. The ROC framework offers important advantages for clinical interpretation. Appendices include sample scripts using SPSS and R to check assumptions and conduct ROC analyses. PMID:23965298
Liu, Pengbo; Mongelli, Max; Mondry, Adrian
2004-07-01
The purpose of this study is to verify, by receiver operating characteristic (ROC) analysis, a mathematical model supporting the hypothesis that intrauterine growth restriction (IUGR) can be diagnosed by estimating growth velocity. The ROC curves compare computerized simulation results with clinical data from 325 pregnant British women. Each patient had 6 consecutive ultrasound examinations for fetal abdominal circumference (fac). Customized and un-customized fetal weights were calculated according to Hadlock's formula. IUGR was diagnosed by the clinical standard, i.e. estimated weight below the tenth percentile. Growth velocity was estimated by calculating the change of fac (Δfac/Δt) at various time intervals from 3 to 10 weeks. Finally, ROC analysis was used to compare the methods. At a 3-4 week scan interval, the area under the ROC curve is 0.68 for customized data and 0.66 for the uncustomized data with 95% confidence interval. Comparison between simulation data and real pregnancies verified that the model is clinically acceptable.
Directory of Open Access Journals (Sweden)
Pešić Milan P.
2012-01-01
Full Text Available A numerical simulation of the radiological consequences of the RB reactor reactivity excursion accident, which occurred on October 15, 1958, and an estimation of the total doses received by the operators were carried out with the MCNP5 computer code. The simulation was run under the same assumptions as those used in the 1960 IAEA-organized experimental simulation of the accident: total fission energy of 80 MJ released in the accident and the frozen positions of the operators. The time interval of exposure to the high doses received by the operators has been estimated. Data on the RB1/1958 reactor core relevant to the accident are given, and a short summary of the accident scenario has been updated. A 3-D model of the reactor room and the RB reactor tank, with all the details of the core, was created. For dose determination, simplified, homogenised, sexless and faceless 3-D phantoms, placed inside the reactor room, were developed. The code was run for a number of neutron histories that gave a dose rate uncertainty of less than 2%. For the determination of the radiation spectra escaping the reactor core and of the radiation interaction in the tissue of the phantoms, the MCNP5 code was run in the KCODE option and "mode n p e", with 55-group neutron spectra, 35-group gamma ray spectra and 10-group electron spectra. The doses were determined by converting the flux density in the phantoms (obtained by the F4 tally) to doses using factors taken from ICRP-74, and from the deposited energy of neutrons and gamma rays in the phantoms' tissue (obtained by the F6 tally). A rough estimate of the moment at which the operators sensed the odour of ozone is given for the first time in Appendix A.1. Calculated total absorbed and equivalent doses are compared to those previously reported, and an attempt is made to understand and explain the reasons for the differences obtained. A Root Cause Analysis of the accident was done and
Shiraishi, Junji; Pesce, Lorenzo L.; Metz, Charles E.; Doi, Kunio
2009-01-01
Purpose: To provide a broad perspective concerning the recent use of receiver operating characteristic (ROC) analysis in medical imaging by reviewing ROC studies published in Radiology between 1997 and 2006 for experimental design, imaging modality, medical condition, and ROC paradigm. Materials and Methods: Two hundred ninety-five studies were obtained by conducting a literature search with PubMed with two criteria: publication in Radiology between 1997 and 2006 and occurrence of the phrase “receiver operating characteristic.” Studies returned by the query that were not diagnostic imaging procedure performance evaluations were excluded. Characteristics of the remaining studies were tabulated. Results: Two hundred thirty-three (79.0%) of the 295 studies reported findings based on observers' diagnostic judgments or objective measurements. Forty-three (14.6%) did not include human observers, with most of these reporting an evaluation of a computer-aided diagnosis system or functional data obtained with computed tomography (CT) or magnetic resonance (MR) imaging. The remaining 19 (6.4%) studies were classified as reviews or meta-analyses and were excluded from our subsequent analysis. Among the various imaging modalities, MR imaging (46.0%) and CT (25.7%) were investigated most frequently. Approximately 60% (144 of 233) of ROC studies with human observers published in Radiology included three or fewer observers. Conclusion: ROC analysis is widely used in radiologic research, confirming its fundamental role in assessing diagnostic performance. However, the ROC studies reported in Radiology were not always adequate to support clear and clinically relevant conclusions. © RSNA, 2009 Supplemental material: http://radiology.rsna.org/lookup/suppl/doi:10.1148/radiol.2533081632/-/DC1 PMID:19864510
Introduction to Bayesian statistics
Bolstad, William M
2017-01-01
There is a strong upsurge in the use of Bayesian methods in applied statistical analysis, yet most introductory statistics texts present only frequentist methods. Bayesian statistics has many important advantages that students should learn about if they are going into fields where statistics will be used. In this Third Edition, four newly added chapters address topics that reflect the rapid advances in the field of Bayesian statistics. The author continues to provide a Bayesian treatment of introductory statistical topics, such as scientific data gathering, discrete random variables, robust Bayesian methods, and Bayesian approaches to inference for discrete random variables, binomial proportion, Poisson, normal mean, and simple linear regression. In addition, newly developing topics in the field are presented in four new chapters: Bayesian inference with unknown mean and variance; Bayesian inference for the Multivariate Normal mean vector; Bayesian inference for the Multiple Linear Regression Model; and Computati...
Bayesian artificial intelligence
Korb, Kevin B
2010-01-01
Updated and expanded, Bayesian Artificial Intelligence, Second Edition provides a practical and accessible introduction to the main concepts, foundations, and applications of Bayesian networks. It focuses on both the causal discovery of networks and Bayesian inference procedures. Adopting a causal interpretation of Bayesian networks, the authors discuss the use of Bayesian networks for causal modeling. They also draw on their own applied research to illustrate various applications of the technology. New to the Second Edition: a new chapter on Bayesian network classifiers and a new section on object-oriented
Bayesian artificial intelligence
Korb, Kevin B
2003-01-01
As the power of Bayesian techniques has become more fully realized, the field of artificial intelligence has embraced Bayesian methodology and integrated it to the point where an introduction to Bayesian techniques is now a core course in many computer science programs. Unlike other books on the subject, Bayesian Artificial Intelligence keeps mathematical detail to a minimum and covers a broad range of topics. The authors integrate Bayesian network technology and Bayesian network learning, applying both to knowledge engineering. They emphasize understanding and intuition but also provide the algorithms and technical background needed for applications. Software, exercises, and solutions are available on the authors' website.
Bayesian image reconstruction for improving detection performance of muon tomography.
Wang, Guobao; Schultz, Larry J; Qi, Jinyi
2009-05-01
Muon tomography is a novel technology that is being developed for detecting high-Z materials in vehicles or cargo containers. Maximum likelihood methods have been developed for reconstructing the scattering density image from muon measurements. However, the instability of maximum likelihood estimation often results in noisy images and low detectability of high-Z targets. In this paper, we propose using regularization to improve the image quality of muon tomography. We formulate the muon reconstruction problem in a Bayesian framework by introducing a prior distribution on scattering density images. An iterative shrinkage algorithm is derived to maximize the log posterior distribution. At each iteration, the algorithm obtains the maximum a posteriori update by shrinking an unregularized maximum likelihood update. Inverse quadratic shrinkage functions are derived for generalized Laplacian priors and inverse cubic shrinkage functions are derived for generalized Gaussian priors. Receiver operating characteristic studies using simulated data demonstrate that the Bayesian reconstruction can greatly improve the detection performance of muon tomography.
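The shrink-the-ML-update idea can be illustrated with the simplest member of this family. The sketch below is a toy stand-in, not the paper's inverse-quadratic or inverse-cubic rules: it applies the soft-threshold shrinkage associated with a Laplacian prior to a hypothetical vector of unregularised maximum-likelihood scattering-density updates.

```python
# Shrinkage sketch: soft-thresholding of an unregularised ML update, the MAP
# update for a Laplacian prior (toy stand-in for the paper's shrinkage rules).
def soft_shrink(x, threshold):
    """Soft-thresholding: shrink x toward zero by `threshold`, clipping at 0."""
    if x > threshold:
        return x - threshold
    if x < -threshold:
        return x + threshold
    return 0.0

ml_update = [0.05, -0.2, 1.3, 0.6, -0.02]  # hypothetical noisy ML values
map_update = [soft_shrink(x, 0.1) for x in ml_update]
print(map_update)  # small noise-level values go to zero, large values survive
```

Suppressing small noisy values while retaining large ones is exactly the mechanism by which the regularised reconstruction reduces image noise and improves target detectability in the ROC studies described above.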
Yuan, Ying; MacKinnon, David P.
2009-01-01
This article proposes Bayesian analysis of mediation effects. Compared to conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian mediation analysis, inference is straightforward and exact, which makes it appealing for studies with small samples. Third, the Bayesian approach is conceptua...
Evans, Kris; Rotello, Caren M; Li, Xingshan; Rayner, Keith
2009-02-01
Cultural differences have been observed in scene perception and memory: Chinese participants purportedly attend to the background information more than American participants do. We investigated the influence of culture by recording eye movements during scene perception and while participants made recognition memory judgements. Real-world pictures with a focal object on a background were shown to both American and Chinese participants while their eye movements were recorded. Later, memory for the focal object in each scene was tested, and the relationship between the focal object (studied, new) and the background context (studied, new) was manipulated. Receiver-operating characteristic (ROC) curves show that both sensitivity and response bias were changed when objects were tested in new contexts. However, neither the decrease in accuracy nor the response bias shift differed with culture. The eye movement patterns were also similar across cultural groups. Both groups made longer and more frequent fixations on the focal objects than on the contexts. The similarity of eye movement patterns and recognition memory behaviour suggests that both Americans and Chinese use the same strategies in scene perception and memory.
Energy Technology Data Exchange (ETDEWEB)
Johnsen, Boel [Haukeland University Hospital, Centre for Nuclear Medicine and PET, Department of Radiology, P.O. Box 1400, Bergen (Norway); Fasmer, Kristine Eldevik [Haukeland University Hospital, Department of Oncology, Medical Physics Section, Bergen (Norway); Boye, Kjetil [Norwegian Radium Hospital, Oslo University Hospital, Department of Oncology, Oslo (Norway); Rosendahl, Karen; Aukland, Stein Magnus [Haukeland University Hospital, Department of Radiology, Paediatric Section, Bergen (Norway); University of Bergen, Department of Clinical Medicine, Bergen (Norway); Trovik, Clement [University of Bergen, Department of Clinical Medicine, Bergen (Norway); Haukeland University Hospital, Department of Surgery, Orthopaedic Section, Bergen (Norway); Biermann, Martin [Haukeland University Hospital, Centre for Nuclear Medicine and PET, Department of Radiology, P.O. Box 1400, Bergen (Norway); University of Bergen, Department of Clinical Medicine, Bergen (Norway)
2017-01-15
Patients with Ewing sarcoma are subject to various diagnostic procedures that incur exposure to ionising radiation. To estimate the radiation doses received from all radiologic and nuclear imaging episodes during diagnosis and treatment, and to determine whether 18F-fluorodeoxyglucose positron emission tomography - computed tomography (18F-FDG PET-CT) is a major contributor of radiation. Twenty Ewing sarcoma patients diagnosed in Norway in 2005-2012 met the inclusion criteria (age <30 years, operable disease, uncomplicated chemotherapy and surgery, no metastasis or residual disease within a year of diagnosis). Radiation doses from all imaging during the first year were calculated for each patient. The mean estimated cumulative radiation dose for all patients was 34 mSv (range: 6-70), radiography accounting for 3 mSv (range: 0.2-12), CT for 13 mSv (range: 2-28) and nuclear medicine for 18 mSv (range: 2-47). For the patients examined with PET-CT, the mean estimated cumulative effective dose was 38 mSv, of which PET-CT accounted for 14 mSv (37%). There was large variation in number and type of examinations performed and also in estimated cumulative radiation dose. The mean radiation dose for patients examined with PET-CT was 23% higher than for patients not examined with PET-CT. (orig.)
International Nuclear Information System (INIS)
Johnsen, Boel; Fasmer, Kristine Eldevik; Boye, Kjetil; Rosendahl, Karen; Aukland, Stein Magnus; Trovik, Clement; Biermann, Martin
2017-01-01
Patients with Ewing sarcoma are subject to various diagnostic procedures that incur exposure to ionising radiation. To estimate the radiation doses received from all radiologic and nuclear imaging episodes during diagnosis and treatment, and to determine whether 18F-fluorodeoxyglucose positron emission tomography - computed tomography (18F-FDG PET-CT) is a major contributor of radiation. Twenty Ewing sarcoma patients diagnosed in Norway in 2005-2012 met the inclusion criteria (age <30 years, operable disease, uncomplicated chemotherapy and surgery, no metastasis or residual disease within a year of diagnosis). Radiation doses from all imaging during the first year were calculated for each patient. The mean estimated cumulative radiation dose for all patients was 34 mSv (range: 6-70), radiography accounting for 3 mSv (range: 0.2-12), CT for 13 mSv (range: 2-28) and nuclear medicine for 18 mSv (range: 2-47). For the patients examined with PET-CT, the mean estimated cumulative effective dose was 38 mSv, of which PET-CT accounted for 14 mSv (37%). There was large variation in number and type of examinations performed and also in estimated cumulative radiation dose. The mean radiation dose for patients examined with PET-CT was 23% higher than for patients not examined with PET-CT. (orig.)
Jamal Talabani, A; Endreseth, B H; Lydersen, S; Edna, T-H
2017-01-01
The study investigated the capability of clinical findings, temperature, C-reactive protein (CRP), and white blood cell (WBC) count to discern patients with acute colonic diverticulitis from all other patients admitted with acute abdominal pain. The probability of acute diverticulitis was assessed by the examining doctor, using a scale from 0 (zero probability) to 10 (100% probability). Receiver operating characteristic (ROC) curves were used to assess the clinical diagnostic accuracy of acute colonic diverticulitis in patients admitted with acute abdominal pain. Of 833 patients admitted with acute abdominal pain, 95 had acute colonic diverticulitis. ROC curve analysis gave an area under the ROC curve (AUC) of 0.95 (CI 0.92 to 0.97) for all patients. Separate analysis showed an AUC = 0.83 (CI 0.80 to 0.86) for CRP alone. White blood cell count and temperature were almost useless for discriminating acute colonic diverticulitis from other types of acute abdominal pain, AUC = 0.59 (CI 0.53 to 0.65) for white blood cell count and AUC = 0.57 (0.50 to 0.63) for temperature, respectively. This prospective study demonstrates that standard clinical evaluation by non-specialist doctors based on history, physical examination, and initial blood tests on admission provides a high degree of diagnostic precision in patients with acute colonic diverticulitis.
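The AUC reported above has a direct probabilistic reading: it is the probability that a randomly chosen diverticulitis patient receives a higher clinician score than a randomly chosen patient with other abdominal pain (the Mann-Whitney statistic). A minimal sketch with hypothetical 0-10 scores:

```python
def auc(scores_pos, scores_neg):
    """AUC = P(score of a random diseased case > score of a random
    non-diseased case), counting ties as 1/2 (Mann-Whitney U)."""
    wins = sum((p > n) + 0.5 * (p == n)
               for p in scores_pos for n in scores_neg)
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical clinician probability scores (0-10 scale as in the study)
diverticulitis = [8, 9, 7, 10, 6]
other_pain = [2, 5, 1, 7, 3]
print(auc(diverticulitis, other_pain))
```

This pairwise definition is O(n*m) but makes the meaning of an AUC of 0.95 concrete; rank-based formulas give the same value faster on large samples.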
International Nuclear Information System (INIS)
Llacer, J.; Veklerov, E.; Nolan, D.; Grafton, S.T.; Mazziotta, J.C.; Hawkins, R.A.; Hoh, C.K.; Hoffman, E.J.
1990-10-01
This paper will report on the progress to date in carrying out Receiver Operating Characteristics (ROC) studies comparing Maximum Likelihood Estimator (MLE) and Filtered Backprojection (FBP) reconstructions of normal and abnormal human brain PET data in a clinical setting. A previous statistical study of reconstructions of the Hoffman brain phantom with real data indicated that the pixel-to-pixel standard deviation in feasible MLE images is approximately proportional to the square root of the number of counts in a region, as opposed to a standard deviation which is high and largely independent of the number of counts in FBP. A preliminary ROC study carried out with 10 non-medical observers performing a relatively simple detectability task indicates that, for the majority of observers, lower standard deviation translates itself into a statistically significant detectability advantage in MLE reconstructions. The initial results of ongoing tests with four experienced neurologists/nuclear medicine physicians are presented. Normal cases of 18F-fluorodeoxyglucose (FDG) cerebral metabolism studies and abnormal cases in which a variety of lesions have been introduced into normal data sets have been evaluated. We report on the results of reading the reconstructions of 90 data sets, each corresponding to a single brain slice. It has become apparent that the design of the study based on reading single brain slices is too insensitive and we propose a variation based on reading three consecutive slices at a time, rating only the center slice. 9 refs., 2 figs., 1 tab
Marsman, M.; Wagenmakers, E.-J.
2017-01-01
We illustrate the Bayesian approach to data analysis using the newly developed statistical software program JASP. With JASP, researchers are able to take advantage of the benefits that the Bayesian framework has to offer in terms of parameter estimation and hypothesis testing. The Bayesian
1982-07-01
Plant and system level operating instructions are provided for the Barstow Solar Pilot Plant. Individual status instructions are given that identify plant conditions, process controller responsibilities, process conditions and control accuracies, operating envelopes, and operator cautions appropriate to the operating condition. Transition operating instructions identify the sequence of activities to be carried out to accomplish the indicated transition. Most transitions involve the startup or shutdown of an individual flowpath. Background information is provided on collector field operations, and the heliostat groupings and specific commands used in support of receiver startup are defined.
International Nuclear Information System (INIS)
Yang, Kai-Lin; Chang, Shih-Ching; Chu, Lee-Shing; Wang, Ling-Wei; Yang, Shung-Haur; Liang, Wen-Yih; Kuo, Ying-Ju; Lin, Jen-Kou; Lin, Tzu-Chen; Chen, Wei-Shone; Jiang, Jeng-Kae; Wang, Huann-Sheng
2013-01-01
To investigate serum carcinoembryonic antigen (CEA) as a prognostic factor for rectal cancer patients receiving pre-operative chemoradiotherapy (CRT). Between 2000 and 2009, 138 patients with advanced rectal cancer receiving CRT before surgery at our hospital were retrospectively classified into 3 groups: pre-CRT CEA <6 ng/ml (group L; n = 87); pre-CRT CEA ≥ 6 ng/ml and post-CRT CEA <6 ng/ml (group H-L; n = 32); and both pre- and post-CRT CEA ≥ 6 ng/ml (group H-H; n = 19). CEA ratio (defined as post-CRT CEA divided by pre-CRT CEA), post-CRT CEA level and other factors were reviewed for prediction of pathologic complete response (pCR). Five-year disease-free survival (DFS) was better in groups L (69.0%) and H-L (74.5%) than in group H-H (44.9%) (p = 0.024). Pathologic complete response was observed in 19.5%, 21.9% and 5.3% of groups L, H-L and H-H respectively (p = 0.281). Multivariate analysis showed that ypN stage and pCR were independent prognostic factors for DFS and that post-CRT CEA level was independently predictive of pCR. As a whole, post-CRT CEA <2.61 ng/ml predicted pCR (sensitivity 76.0%; specificity 58.4%). For those with pre-CRT CEA ≥6 ng/ml, post-CRT CEA and CEA ratio both predicted pCR (sensitivity 87.5%, specificity 76.7%). In patients with pre-CRT serum CEA ≥6 ng/ml, those with “normalized” CEA levels after CRT may have similar DFS to those with “normal” (<6 ng/ml) pre-CRT values. Post-CRT CEA level is a predictor for pCR, especially in those with pre-CRT CEA ≥6 ng/ml
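The sensitivity/specificity pairs quoted above come from applying a cutoff rule ("pCR predicted if post-CRT CEA is below the cutoff") to each patient. A minimal sketch of that computation, using invented CEA values and pCR statuses rather than the study's data:

```python
def sens_spec(values, truth, cutoff):
    """Sensitivity and specificity of the rule: predict positive
    (here, pathologic complete response) when value < cutoff."""
    tp = sum(v < cutoff and t for v, t in zip(values, truth))
    fn = sum(v >= cutoff and t for v, t in zip(values, truth))
    tn = sum(v >= cutoff and not t for v, t in zip(values, truth))
    fp = sum(v < cutoff and not t for v, t in zip(values, truth))
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical post-CRT CEA values (ng/ml) and pCR outcomes
cea = [1.2, 2.0, 3.5, 0.9, 4.8, 2.4, 1.8, 6.1]
pcr = [True, True, False, True, False, False, True, False]
print(sens_spec(cea, pcr, cutoff=2.61))
```

Sweeping the cutoff over all observed values and plotting sensitivity against 1 - specificity is exactly how the ROC curve behind such reported operating points is constructed.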
Sun, Yuhang; Wang, Bo; Shu, Shi; Zhang, Hongyou; Xu, Chuang; Wu, Ling; Xia, Cheng
2015-01-01
Fatty liver syndrome and ketosis are important metabolic disorders in high-producing cows during early lactation with fatty liver usually preceding ketosis. To date, parameters for early prediction of the risk of ketosis have not been investigated in China. To determine the predictive value of some parameters on the risk of ketosis in China. In a descriptive study, 48 control and 32 ketotic Holstein Friesian cows were randomly selected from one farm with a serum β-hydroxybutyrate (BHBA) concentration of 1.20 mmol/L as cutoff point. The risk prediction thresholds for ketosis were determined by receiver operating characteristic (ROC) analysis. In line with a high BHBA concentration, blood glucose concentration was significantly lower in ketotic cows compared to control animals (2.77 ± 0.24 versus 3.34 ± 0.03 mmol/L; P = 0.02). Thresholds were more than 0.76 mmol/L for nonesterified fatty acids (NEFA, with 65% sensitivity and 92% specificity), more than 104 U/L for aspartate aminotransferase (AST, 74% and 85%, respectively), less than 140 U/L for cholinesterase (CHE, 75% and 59%, respectively), and more than 3.3 µmol/L for total bilirubin (TBIL, 58% and 83%, respectively). There were significant correlations between BHBA and glucose (R = -4.74), or CHE (R = -0.262), BHBA and NEFA (R = 0.520), or AST (R = 0.525), or TBIL (R = 0.278), or direct bilirubin (DBIL, R = 0.348). AST, CHE, TBIL and NEFA may be useful parameters for risk prediction of ketosis. This study might be of value in addressing novel directions for future research on the connection between ketosis and liver dysfunction.
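ROC-derived prediction thresholds like those above are commonly chosen by maximizing Youden's J = sensitivity + specificity - 1 over candidate cutoffs. A minimal sketch with hypothetical NEFA-like values (the abstract does not state which criterion was used, so treat this as one standard option):

```python
def youden_cutoff(values_sick, values_healthy, candidates):
    """Pick the cutoff (test positive if value > cutoff) that
    maximizes Youden's J = sensitivity + specificity - 1."""
    best = None
    for c in candidates:
        sens = sum(v > c for v in values_sick) / len(values_sick)
        spec = sum(v <= c for v in values_healthy) / len(values_healthy)
        j = sens + spec - 1
        if best is None or j > best[0]:
            best = (j, c, sens, spec)
    return best

# Hypothetical NEFA values (mmol/L) for ketotic vs. healthy cows
ketotic = [0.9, 1.1, 0.8, 1.4, 0.7]
healthy = [0.3, 0.5, 0.6, 0.4, 0.8]
j, cut, sens, spec = youden_cutoff(ketotic, healthy, sorted(ketotic + healthy))
print(cut, sens, spec)
```

Using the observed values themselves as candidate cutoffs is sufficient, since sensitivity and specificity only change at those points.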
Riggio, Valentina; Pesce, Lorenzo L; Morreale, Salvatore; Portolano, Baldassare
2013-06-01
Using receiver-operating characteristic (ROC) curve methodology this study was designed to assess the diagnostic effectiveness of somatic cell count (SCC) and the California mastitis test (CMT) in Valle del Belice sheep, and to propose and evaluate threshold values for those tests that would optimally discriminate between healthy and infected udders. Milk samples (n=1357) were collected from 684 sheep in four flocks. The prevalence of infection, as determined by positive bacterial culture, was 0.36; 87.7% of the isolates were minor and 12.3% major pathogens. Of the culture-negative samples, 83.7% had an SCC below the threshold. When overall infection status (culture negative vs. infected), minor pathogen status (culture negative vs. infected with minor pathogens), major pathogen status (culture negative vs. infected with major pathogens), and CMT results were evaluated, the estimated area under the ROC curve was greater for glands infected with major compared to minor pathogens (0.88 vs. 0.73), whereas the area under the curve considering all pathogens was similar to the one for minor pathogens (0.75). The estimated optimal thresholds were 3.00 (CMT), 2.81 (SCS for the whole sample), 2.81 (SCS for minor pathogens), and 3.33 (SCS for major pathogens). These correctly classified, respectively, 69.0%, 73.5%, 72.6% and 91.0% of infected udders in the samples. The CMT appeared only to discriminate udders infected with major pathogens. In this population, SCS appeared to be the best indirect test of the bacteriological status of the udder. Copyright © 2012 Elsevier Ltd. All rights reserved.
Fujita, Takaaki; Sato, Atsushi; Tsuchiya, Kenji; Ohashi, Takuro; Yamane, Kazuhiro; Yamamoto, Yuichi; Iokawa, Kazuaki; Ohira, Yoko; Otsuki, Koji; Tozato, Fusae
2017-12-01
This study aimed to elucidate the relationship between grooming performance of stroke patients and various motor and cognitive functions and to examine the cognitive and physical functional standards required for grooming independence. We retrospectively analyzed the data of 96 hospitalized patients with first stroke in a rehabilitation hospital ward. Logistic regression analysis and receiver operating characteristic curves were used to investigate the related cognitive and motor functions with grooming performance and to calculate the cutoff values for independence and supervision levels in grooming. For analysis between the independent and supervision-dependent groups, the only item with an area under the curve (AUC) of .9 or higher was the Berg Balance Scale, and the calculated cutoff value was 41/40 (sensitivity, 83.6%; specificity, 87.8%). For analysis between the independent-supervision and dependent groups, the items with an AUC of .9 or higher were the Simple Test for Evaluating Hand Function (STEF) on the nonaffected side, Vitality Index (VI), and FIM ® cognition. The cutoff values were 68/67 for the STEF (sensitivity, 100%; specificity, 72.2%), 9/8 points for the VI (sensitivity, 92.3%; specificity, 88.9%), and 23/22 points for FIM ® cognition (sensitivity, 91.0%; specificity, 88.9%). Our results suggest that upper-extremity functions on the nonaffected side, motivation, and cognitive functions are particularly important to achieve the supervision level and that balance is important to reach the independence level. The effective improvement of grooming performance is possible by performing therapeutic or compensatory intervention on functions that have not achieved these cutoff values. Copyright © 2017 National Stroke Association. Published by Elsevier Inc. All rights reserved.
Space Shuttle RTOS Bayesian Network
Morris, A. Terry; Beling, Peter A.
2001-01-01
With shrinking budgets and the requirements to increase reliability and operational life of the existing orbiter fleet, NASA has proposed various upgrades for the Space Shuttle that are consistent with national space policy. The cockpit avionics upgrade (CAU), a high priority item, has been selected as the next major upgrade. The primary functions of cockpit avionics include flight control, guidance and navigation, communication, and orbiter landing support. Secondary functions include the provision of operational services for non-avionics systems such as data handling for the payloads and caution and warning alerts to the crew. Recently, a process to select the optimal commercial-off-the-shelf (COTS) real-time operating system (RTOS) for the CAU was conducted by United Space Alliance (USA) Corporation, which is a joint venture between Boeing and Lockheed Martin, the prime contractor for space shuttle operations. In order to independently assess the RTOS selection, NASA has used the Bayesian network-based scoring methodology described in this paper. Our two-stage methodology addresses the issue of RTOS acceptability by incorporating functional, performance and non-functional software measures related to reliability, interoperability, certifiability, efficiency, correctness, business, legal, product history, cost and life cycle. The first stage of the methodology involves obtaining scores for the various measures using a Bayesian network. The Bayesian network incorporates the causal relationships between the various and often competing measures of interest while also assisting the inherently complex decision analysis process with its ability to reason under uncertainty. The structure and selection of prior probabilities for the network are extracted from experts in the field of real-time operating systems. Scores for the various measures are computed using Bayesian probability. In the second stage, multi-criteria trade-off analyses are performed between the scores
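The "reasoning under uncertainty" a Bayesian network performs reduces, in the two-node case, to Bayes' rule: a prior belief about a hidden quality is updated by an observed indicator. A deliberately tiny sketch with invented probabilities (the actual CAU network and its expert-elicited priors are far larger):

```python
# Two-node Bayesian network: hidden node "RTOS is reliable",
# observed node "certification test passed". All probabilities
# here are invented for illustration.
p_reliable = 0.7
p_pass_given = {True: 0.95, False: 0.40}   # P(pass | reliable?)

# Bayes' rule: P(reliable | test passed)
num = p_reliable * p_pass_given[True]
den = num + (1 - p_reliable) * p_pass_given[False]
print(round(num / den, 3))
```

A full network chains many such conditional tables (reliability, certifiability, cost, product history, ...) and propagates evidence through all of them at once, which is what makes the approach useful for competing measures.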
Learning Bayesian networks for discrete data
Liang, Faming
2009-02-01
Bayesian networks have received much attention in the recent literature. In this article, we propose an approach to learn Bayesian networks using the stochastic approximation Monte Carlo (SAMC) algorithm. Our approach has two nice features. Firstly, it possesses a self-adjusting mechanism and thus essentially avoids the local-trap problem suffered by conventional MCMC simulation-based approaches in learning Bayesian networks. Secondly, it falls into the class of dynamic importance sampling algorithms; the network features can be inferred by dynamically weighted averaging of the samples generated in the learning process, and the resulting estimates can have much lower variation than the single model-based estimates. The numerical results indicate that our approach can mix much faster over the space of Bayesian networks than the conventional MCMC simulation-based approaches. © 2008 Elsevier B.V. All rights reserved.
Fully probabilistic design of hierarchical Bayesian models
Czech Academy of Sciences Publication Activity Database
Quinn, A.; Kárný, Miroslav; Guy, Tatiana Valentine
2016-01-01
Roč. 369, č. 1 (2016), s. 532-547 ISSN 0020-0255 R&D Projects: GA ČR GA13-13502S Institutional support: RVO:67985556 Keywords : Fully probabilistic design * Ideal distribution * Minimum cross-entropy principle * Bayesian conditioning * Kullback-Leibler divergence * Bayesian nonparametric modelling Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 4.832, year: 2016 http://library.utia.cas.cz/separaty/2016/AS/karny-0463052.pdf
Predicting Hospital Admission for Emergency Department Patients using a Bayesian Network
Leegon, Jeffrey; Jones, Ian; Lanaghan, Kevin; Aronsky, Dominik
2005-01-01
Hospital admission delays in the Emergency Department (ED) reduce volume capacity and contribute to the nation’s ED diversion problem. This study evaluated the accuracy of a Bayesian network for the early prediction of hospital admission status using data from 16,900 ED encounters. The final model included nine variables that are commonly available in many ED settings. The area under the receiver operating characteristic curve was 0.894 (95% CI: 0.887-0.902) for the validati...
1995-04-01
This document is a checklist designed to assist Federal Aviation Administration (FAA) certification personnel and global positioning system (GPS) receiver manufacturers in the evaluation of the pilot-system interface characteristics of GPS receivers.
Understanding Computational Bayesian Statistics
Bolstad, William M
2011-01-01
A hands-on introduction to computational statistics from a Bayesian point of view Providing a solid grounding in statistics while uniquely covering the topics from a Bayesian perspective, Understanding Computational Bayesian Statistics successfully guides readers through this new, cutting-edge approach. With its hands-on treatment of the topic, the book shows how samples can be drawn from the posterior distribution when the formula giving its shape is all that is known, and how Bayesian inferences can be based on these samples from the posterior. These ideas are illustrated on common statistic
Bayesian statistics an introduction
Lee, Peter M
2012-01-01
Bayesian Statistics is the school of thought that combines prior beliefs with the likelihood of a hypothesis to arrive at posterior beliefs. The first edition of Peter Lee’s book appeared in 1989, but the subject has moved ever onwards, with increasing emphasis on Monte Carlo based techniques. This new fourth edition looks at recent techniques such as variational methods, Bayesian importance sampling, approximate Bayesian computation and Reversible Jump Markov Chain Monte Carlo (RJMCMC), providing a concise account of the way in which the Bayesian approach to statistics develops as wel
System performance of a 4-channel PHASAR WDM receiver operating at 1.2 Gbit/s
Steenbergen, C.A.M.; van Deventer, M.O.; Vreede, de L.C.N.; Dam, van C.; Smit, M.K.; Verbeek, B.H.
1996-01-01
Phased arrays are important key components in wavelength-division multiplexing (WDM) systems. We have realized a 4-channel WDM receiver combining a phased array with photodetectors on InP with a Si bipolar transimpedance amplifier. The channels are spaced at 2.0 nm with a 1.0-nm flat passband. On
2010-04-01
20 CFR § 670.300, Employees' Benefits (2010-04-01): What entities are eligible to receive funds... Employment and Training Administration, Department of Labor; The Job Corps under Title I of the Workforce Investment Act; Funding and Selection of Service Providers.
Koen, Joshua D.; Yonelinas, Andrew P.
2011-01-01
Receiver operating characteristics (ROCs) have been used extensively to study the processes underlying human recognition memory, and this method has recently been applied in studies of rats. However, the extent to which the results from human and animal studies converge is neither entirely clear, nor is it known how the different methods used to…
International Nuclear Information System (INIS)
2007-01-01
The Director General has received a communication dated 7 June 2007 from the Resident Representative of the Russian Federation, with an attachment entitled 'Establishment, Structure and Operation of the International Uranium Enrichment Centre'. As requested in that communication, the letter and its attachment are circulated for the information of Member States
deVries, SO; Hunink, MGM; Polak, JF
Rationale and Objectives. We summarized and compared the diagnostic performance of duplex and color-guided duplex ultrasonography in the evaluation of peripheral arterial disease. We present our research as an example of the use of summary receiver operating characteristic (ROC) curves in a
International Nuclear Information System (INIS)
Dennis, A.W.; Mulkin, R.
1984-01-01
The Nevada Nuclear Waste Storage Investigations Project, directed by the Nevada Operations Office of the Department of Energy, is currently developing conceptual designs for a commercial nuclear waste repository. In this paper, the preliminary repository operating plans are identified and the proposed repository waste inventory is discussed. The receipt rates for truck and rail car shipments of waste are determined as are the required repository waste emplacement rates
Yuan, Ying; MacKinnon, David P.
2009-01-01
In this article, we propose Bayesian analysis of mediation effects. Compared with conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian…
Kleibergen, F.R.; Kleijn, R.; Paap, R.
2000-01-01
We propose a novel Bayesian test under a (noninformative) Jeffreys' prior specification. We check whether the fixed scalar value of the so-called Bayesian Score Statistic (BSS) under the null hypothesis is a plausible realization from its known and standardized distribution under the alternative. Unlike
von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo
2014-06-01
Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramér-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.
Fast Bayesian optimal experimental design for seismic source inversion
Long, Quan; Motamed, Mohammad; Tempone, Raul
2015-01-01
of receivers. Consequently, the posterior distribution of the parameters, in a Bayesian setting, concentrates around the "true" parameters, and we can employ Laplace approximation and speed up the estimation of the expected Kullback-Leibler divergence (expected
Albert, Jim
2009-01-01
There has been a dramatic growth in the development and application of Bayesian inferential methods. Some of this growth is due to the availability of powerful simulation-based algorithms to summarize posterior distributions. There has been also a growing interest in the use of the system R for statistical analyses. R's open source nature, free availability, and large number of contributor packages have made R the software of choice for many statisticians in education and industry. Bayesian Computation with R introduces Bayesian modeling by the use of computation using the R language. The earl
Bayesian data analysis for newcomers.
Kruschke, John K; Liddell, Torrin M
2018-02-01
This article explains the foundational concepts of Bayesian data analysis using virtually no mathematical notation. Bayesian ideas already match your intuitions from everyday reasoning and from traditional data analysis. Simple examples of Bayesian data analysis are presented that illustrate how the information delivered by a Bayesian analysis can be directly interpreted. Bayesian approaches to null-value assessment are discussed. The article clarifies misconceptions about Bayesian methods that newcomers might have acquired elsewhere. We discuss prior distributions and explain how they are not a liability but an important asset. We discuss the relation of Bayesian data analysis to Bayesian models of mind, and we briefly discuss what methodological problems Bayesian data analysis is not meant to solve. After you have read this article, you should have a clear sense of how Bayesian data analysis works and the sort of information it delivers, and why that information is so intuitive and useful for drawing conclusions from data.
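The directly interpretable updating the article describes is easiest to see in the conjugate beta-binomial case, where the posterior is available with simple arithmetic. A minimal sketch with made-up data:

```python
# Beta-binomial conjugate updating: with prior Beta(a, b) and
# k successes observed in n trials, the posterior is
# Beta(a + k, b + n - k). The data below are hypothetical.
a, b = 1, 1          # flat prior: every success rate equally plausible
k, n = 7, 10         # observe 7 successes in 10 trials
a_post, b_post = a + k, b + n - k
posterior_mean = a_post / (a_post + b_post)
print(a_post, b_post, round(posterior_mean, 3))
```

The posterior mean (here 8/12) sits between the prior mean (1/2) and the sample proportion (7/10), showing concretely how the prior is an asset that regularizes small-sample estimates rather than a liability.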
Bayesian methods for data analysis
Carlin, Bradley P.
2009-01-01
Approaches for Statistical Inference: Introduction; Motivating Vignettes; Defining the Approaches; The Bayes-Frequentist Controversy; Some Basic Bayesian Models. The Bayes Approach: Introduction; Prior Distributions; Bayesian Inference; Hierarchical Modeling; Model Assessment; Nonparametric Methods. Bayesian Computation: Introduction; Asymptotic Methods; Noniterative Monte Carlo Methods; Markov Chain Monte Carlo Methods. Model Criticism and Selection: Bayesian Modeling; Bayesian Robustness; Model Assessment; Bayes Factors via Marginal Density Estimation; Bayes Factors
Noncausal Bayesian Vector Autoregression
DEFF Research Database (Denmark)
Lanne, Markku; Luoto, Jani
We propose a Bayesian inferential procedure for the noncausal vector autoregressive (VAR) model that is capable of capturing nonlinearities and incorporating effects of missing variables. In particular, we devise a fast and reliable posterior simulator that yields the predictive distribution...
Statistics: a Bayesian perspective
National Research Council Canada - National Science Library
Berry, Donald A
1996-01-01
...: it is the only introductory textbook based on Bayesian ideas, it combines concepts and methods, it presents statistics as a means of integrating data into the scientific process, it develops ideas...
Fox, Gerardus J.A.; van den Berg, Stéphanie Martine; Veldkamp, Bernard P.; Irwing, P.; Booth, T.; Hughes, D.
2015-01-01
In educational and psychological studies, psychometric methods are involved in the measurement of constructs, and in constructing and validating measurement instruments. Assessment results are typically used to measure student proficiency levels and test characteristics. Recently, Bayesian item
Indian Academy of Sciences (India)
Home; Journals; Resonance – Journal of Science Education. Books Received. Articles in Resonance – Journal of Science Education. Volume 1 Issue 1 January 1996 pp 118-118 Books Received. Volume 1 Issue 2 February 1996 pp 120-120 Books Received.
Bayesian Networks An Introduction
Koski, Timo
2009-01-01
Bayesian Networks: An Introduction provides a self-contained introduction to the theory and applications of Bayesian networks, a topic of interest and importance for statisticians, computer scientists and those involved in modelling complex data sets. The material has been extensively tested in classroom teaching and assumes a basic knowledge of probability, statistics and mathematics. All notions are carefully explained and feature exercises throughout. Features include:.: An introduction to Dirichlet Distribution, Exponential Families and their applications.; A detailed description of learni
Maeda, Shin-ichi
2014-01-01
Dropout is one of the key techniques for preventing learning from overfitting. It has been explained as working as a kind of modified L2 regularization. Here, we shed light on dropout from a Bayesian standpoint. The Bayesian interpretation enables us to optimize the dropout rate, which is beneficial both for learning the weight parameters and for prediction after learning. The experimental results also encourage optimizing the dropout rate.
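The mechanism being reinterpreted is simple: during training, each unit is zeroed with probability `rate`, and (in the common "inverted" variant) survivors are rescaled so the expected activation matches test time. A minimal sketch, independent of any particular framework:

```python
import numpy as np

def dropout(x, rate, rng, train=True):
    """Inverted dropout: zero each unit with probability `rate`
    during training and rescale survivors by 1/(1-rate), so the
    expected activation equals the test-time activation."""
    if not train or rate == 0.0:
        return x
    mask = rng.random(x.shape) >= rate   # Bernoulli keep-mask
    return x * mask / (1.0 - rate)

rng = np.random.default_rng(42)
x = np.ones(100_000)
y = dropout(x, rate=0.5, rng=rng)
print(round(float(y.mean()), 2))  # expectation preserved, close to 1.0
```

Treating the Bernoulli mask as a random variable in a probabilistic model is what opens the door to tuning `rate` itself, as the abstract proposes.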
Ghosh, Sujit K
2010-01-01
Bayesian methods are rapidly becoming popular tools for making statistical inference in various fields of science including biology, engineering, finance, and genetics. One of the key aspects of Bayesian inferential method is its logical foundation that provides a coherent framework to utilize not only empirical but also scientific information available to a researcher. Prior knowledge arising from scientific background, expert judgment, or previously collected data is used to build a prior distribution which is then combined with current data via the likelihood function to characterize the current state of knowledge using the so-called posterior distribution. Bayesian methods allow the use of models of complex physical phenomena that were previously too difficult to estimate (e.g., using asymptotic approximations). Bayesian methods offer a means of more fully understanding issues that are central to many practical problems by allowing researchers to build integrated models based on hierarchical conditional distributions that can be estimated even with limited amounts of data. Furthermore, advances in numerical integration methods, particularly those based on Monte Carlo methods, have made it possible to compute the optimal Bayes estimators. However, there is a reasonably wide gap between the background of the empirically trained scientists and the full weight of Bayesian statistical inference. Hence, one of the goals of this chapter is to bridge the gap by offering elementary to advanced concepts that emphasize linkages between standard approaches and full probability modeling via Bayesian methods.
Ishin, Artem; Voeykov, Sergey; Perevalova, Natalia; Khakhinov, Vitaliy
2017-12-01
As a part of the Plasma-Progress and Radar-Progress space experiments conducted from 2006 to 2014, effects of the Progress spacecraft engines on the ionosphere have been studied using data from Global Navigation Satellite System (GNSS) receivers. 72 experiments have been carried out. All these experiments were based on data from the International GNSS Service (IGS) to record ionospheric plasma irregularities caused by engine operation. 35 experiments used data from the ISTP SB RAS network SibNet. The analysis of the spatio-temporal structure of total electron content (TEC) variations has shown that the problem of identifying the TEC response to engine operation is complicated by a number of factors: 1) the engine effect on ionospheric plasma is strongly localized in space and has a relatively low intensity; 2) the number of satellite-receiver radio rays is small due to the limited number of GNSS stations, particularly before 2013; 3) a potential TEC response is masked by background ionospheric disturbances of various intensities. However, TEC responses are identified with certainty when a satellite-receiver radio ray crosses a disturbed region within minutes after the impact. TEC responses have been registered in 7 experiments (10% of cases). The amplitude of the ionospheric response (0.3-0.16 TECU) exceeded the background TEC variations (~0.25 TECU) several times. The TEC data indicate that the ionospheric irregularity lifetime is from 4 to 10 minutes. According to our estimates, the transverse size of irregularities is from 12 to 30 km.
Bayesian Methods and Universal Darwinism
Campbell, John
2009-12-01
Bayesian methods since the time of Laplace have been understood by their practitioners as closely aligned to the scientific method. Indeed, a recent champion of Bayesian methods, E. T. Jaynes, titled his textbook on the subject Probability Theory: The Logic of Science. Many philosophers of science, including Karl Popper and Donald Campbell, have interpreted the evolution of science as a Darwinian process consisting of a `copy with selective retention' algorithm abstracted from Darwin's theory of natural selection. Arguments are presented for an isomorphism between Bayesian methods and Darwinian processes. Universal Darwinism, as the term has been developed by Richard Dawkins, Daniel Dennett and Susan Blackmore, is the collection of scientific theories which explain the creation and evolution of their subject matter as due to the operation of Darwinian processes. These subject matters span the fields of atomic physics, chemistry, biology and the social sciences. The principle of maximum entropy states that systems will evolve to states of highest entropy subject to the constraints of scientific law. This principle may be inverted to provide illumination as to the nature of scientific law. Our best cosmological theories suggest the universe contained much less complexity during the period shortly after the Big Bang than it does at present. The scientific subject matter of atomic physics, chemistry, biology and the social sciences has been created since that time. An explanation is proposed for the existence of this subject matter as due to the evolution of constraints, in the form of adaptations, imposed on maximum entropy. It is argued these adaptations were discovered and instantiated through the operations of a succession of Darwinian processes.
Universal Darwinism As a Process of Bayesian Inference.
Campbell, John O
2016-01-01
Many of the mathematical frameworks describing natural selection are equivalent to Bayes' Theorem, also known as Bayesian updating. By definition, a process of Bayesian Inference is one which involves a Bayesian update, so we may conclude that these frameworks describe natural selection as a process of Bayesian inference. Thus, natural selection serves as a counter example to a widely-held interpretation that restricts Bayesian Inference to human mental processes (including the endeavors of statisticians). As Bayesian inference can always be cast in terms of (variational) free energy minimization, natural selection can be viewed as comprising two components: a generative model of an "experiment" in the external world environment, and the results of that "experiment" or the "surprise" entailed by predicted and actual outcomes of the "experiment." Minimization of free energy implies that the implicit measure of "surprise" experienced serves to update the generative model in a Bayesian manner. This description closely accords with the mechanisms of generalized Darwinian process proposed both by Dawkins, in terms of replicators and vehicles, and Campbell, in terms of inferential systems. Bayesian inference is an algorithm for the accumulation of evidence-based knowledge. This algorithm is now seen to operate over a wide range of evolutionary processes, including natural selection, the evolution of mental models and cultural evolutionary processes, notably including science itself. The variational principle of free energy minimization may thus serve as a unifying mathematical framework for universal Darwinism, the study of evolutionary processes operating throughout nature.
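The claimed equivalence between natural selection and Bayesian updating can be checked numerically in a toy setting (the frequencies and fitness values below are invented for illustration): one generation of discrete replicator dynamics is term-by-term the same computation as a Bayesian update with the current type frequencies as prior and relative fitness as likelihood.

```python
import numpy as np

# Prior: current population frequencies of three competing types (invented numbers).
prior = np.array([0.5, 0.3, 0.2])
# "Likelihood": relative fitness of each type in the current environment (invented).
fitness = np.array([1.0, 2.0, 0.5])

# One generation of discrete replicator dynamics...
replicator = prior * fitness / np.sum(prior * fitness)

# ...is exactly Bayes' rule: posterior proportional to prior times likelihood.
posterior = prior * fitness
posterior /= posterior.sum()

assert np.allclose(replicator, posterior)
# Both give [0.4167, 0.5, 0.0833]: the fitter type gains frequency.
```

The normalizing constant in the replicator equation (mean population fitness) plays the role of the Bayesian evidence term.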
Bayesian networks with examples in R
Scutari, Marco
2014-01-01
Introduction. The Discrete Case: Multinomial Bayesian Networks. The Continuous Case: Gaussian Bayesian Networks. More Complex Cases. Theory and Algorithms for Bayesian Networks. Real-World Applications of Bayesian Networks. Appendices. Bibliography.
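As a minimal illustration of the discrete (multinomial) case listed above, a sketch in Python rather than the book's R (the two-node network and its probability tables are hypothetical):

```python
# Two-node discrete Bayesian network: Rain -> WetGrass.
# The joint factorizes as P(R, W) = P(R) * P(W | R); query P(R | W = 1) by enumeration.
p_rain = {1: 0.2, 0: 0.8}                       # hypothetical prior on Rain
p_wet_given_rain = {1: {1: 0.9, 0: 0.1},        # hypothetical CPT for P(WetGrass | Rain)
                    0: {1: 0.2, 0: 0.8}}

def posterior_rain(wet):
    joint = {r: p_rain[r] * p_wet_given_rain[r][wet] for r in (0, 1)}
    z = sum(joint.values())                     # evidence P(W = wet)
    return {r: pr / z for r, pr in joint.items()}

post = posterior_rain(wet=1)
# P(Rain = 1 | Wet = 1) = 0.18 / (0.18 + 0.16), about 0.53: wet grass raises belief in rain
```

Exact enumeration like this only scales to tiny networks; the algorithms chapters of such texts cover the general machinery (variable elimination, sampling).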
Jones, Matt; Love, Bradley C
2011-08-01
The prominence of Bayesian modeling of cognition has increased recently largely because of mathematical advances in specifying and deriving predictions from complex probabilistic models. Much of this research aims to demonstrate that cognitive behavior can be explained from rational principles alone, without recourse to psychological or neurological processes and representations. We note commonalities between this rational approach and other movements in psychology - namely, Behaviorism and evolutionary psychology - that set aside mechanistic explanations or make use of optimality assumptions. Through these comparisons, we identify a number of challenges that limit the rational program's potential contribution to psychological theory. Specifically, rational Bayesian models are significantly unconstrained, both because they are uninformed by a wide range of process-level data and because their assumptions about the environment are generally not grounded in empirical measurement. The psychological implications of most Bayesian models are also unclear. Bayesian inference itself is conceptually trivial, but strong assumptions are often embedded in the hypothesis sets and the approximation algorithms used to derive model predictions, without a clear delineation between psychological commitments and implementational details. Comparing multiple Bayesian models of the same task is rare, as is the realization that many Bayesian models recapitulate existing (mechanistic level) theories. Despite the expressive power of current Bayesian models, we argue they must be developed in conjunction with mechanistic considerations to offer substantive explanations of cognition. We lay out several means for such an integration, which take into account the representations on which Bayesian inference operates, as well as the algorithms and heuristics that carry it out. We argue this unification will better facilitate lasting contributions to psychological theory, avoiding the pitfalls
Bayesian methods in reliability
Sander, P.; Badoux, R.
1991-11-01
The present proceedings from a course on Bayesian methods in reliability encompasses Bayesian statistical methods and their computational implementation, models for analyzing censored data from nonrepairable systems, the traits of repairable systems and growth models, the use of expert judgment, and a review of the problem of forecasting software reliability. Specific issues addressed include the use of Bayesian methods to estimate the leak rate of a gas pipeline, approximate analyses under great prior uncertainty, reliability estimation techniques, and a nonhomogeneous Poisson process. Also addressed are the calibration sets and seed variables of expert judgment systems for risk assessment, experimental illustrations of the use of expert judgment for reliability testing, and analyses of the predictive quality of software-reliability growth models such as the Weibull order statistics.
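The leak-rate estimation mentioned above can be sketched with the standard conjugate Gamma-Poisson update (the prior and observed counts here are invented for illustration, not taken from the proceedings):

```python
# Conjugate Gamma-Poisson update for an event rate (e.g. leaks per year).
# Prior: lambda ~ Gamma(alpha0, beta0); data: k events observed over t years.
# Posterior: lambda | data ~ Gamma(alpha0 + k, beta0 + t).
alpha0, beta0 = 2.0, 1.0      # invented prior: mean rate alpha0 / beta0 = 2.0 per year
k, t = 3, 10.0                # invented data: 3 events in 10 years

alpha_post, beta_post = alpha0 + k, beta0 + t
posterior_mean = alpha_post / beta_post   # 5/11, about 0.45 events per year

# The posterior mean sits between the prior mean (2.0) and the raw rate k/t (0.3),
# shrinking toward the data as the observation period t grows.
```

This shrinkage behavior is what makes Bayesian estimates attractive for reliability work with sparse failure data.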
CSIR Research Space (South Africa)
Rosman, Benjamin
2016-02-01
Full Text Available Keywords: Policy Reuse · Reinforcement Learning · Online Learning · Online Bandits · Transfer Learning · Bayesian Optimisation · Bayesian Decision Theory. 1 Introduction. As robots and software agents are becoming more ubiquitous in many applications... The agent has access to a library of policies (π1, π2 and π3), and has previously experienced a set of task instances (τ1, τ2, τ3, τ4), as well as samples of the utilities of the library policies on these instances (the black dots indicate the means)...
International Nuclear Information System (INIS)
Fintel, D.J.; Links, J.M.; Brinker, J.A.; Frank, T.L.; Parker, M.; Becker, L.C.
1989-01-01
Qualitative interpretation of tomographic and planar scintigrams, a five point rating scale and receiver operating characteristic analysis were utilized to compare single photon emission computed tomography and conventional planar imaging of myocardial thallium-201 uptake in the accuracy of the diagnosis of coronary artery disease and individual vessel involvement. One hundred twelve patients undergoing cardiac catheterization and 23 normal volunteers performed symptom-limited treadmill exercise, followed by stress and redistribution imaging by both tomographic and planar techniques, with the order determined randomly. Paired receiver operating characteristic curves revealed that single photon emission computed tomography was more accurate than planar imaging over the entire range of decision thresholds for the overall detection and exclusion of coronary artery disease and involvement of the left anterior descending and left circumflex coronary arteries. Tomography offered relatively greater advantages in male patients and in patients with milder forms of coronary artery disease, who had no prior myocardial infarction, only single vessel involvement or no lesion greater than or equal to 50 to 69%. Tomography did not appear to provide improved diagnosis in women or in detection of disease in the right coronary artery. Although overall detection of coronary artery disease was not improved in patients with prior myocardial infarction, tomography provided improved identification of normal and abnormal vascular regions. These results indicate that single photon emission computed tomography provides improved diagnostic performance compared with planar imaging in many clinical subgroups
International Nuclear Information System (INIS)
Hong, Hui; Liu, Qibin; Jin, Hongguang
2012-01-01
Highlights: ► A 15 kW solar chemical receiver/reactor for hydrogen production was developed. ► The solar thermochemical efficiency of the receiver/reactor was in the range of 20–28%. ► Hydrogen production exceeding 80% was achieved. ► The research results extend the application of mid-temperature solar thermal energy. -- Abstract: In this paper, we report the operational performance and energy conversion efficiency of a developed 15 kW solar chemical receiver/reactor for hydrogen production. A concentrated solar heat of around 200–300 °C was utilized to provide process heat to drive methanol steam reforming. A modified 15 kW direct-irradiation solar reactor coupled with a linear receiver positioned along the focal line of a one-axis parabolic trough concentrator was used. The experiments were conducted from 200 to 300 °C under a mean solar flux of 300–800 W/m² and a reactant feeding rate of 6 kg/h. Reactants were continuously fed, and the attained conversion rate of methanol was more than 70% at 700 W/m². The typical solar thermochemical efficiency of solar thermal energy converted into chemical energy was in the 20–28% range. The overall energy efficiency of input solar power conversion into chemical energy reached up to 17% and may be further increased by improving solar field efficiency. Hydrogen production exceeding 80% was achieved. In addition, preliminary economic evaluation was performed, and methods for further improvement were proposed. This paper proves that solar hydrogen production is feasible by combining solar thermal energy with alternative fuel at around 200–300 °C, which is much lower than the temperature of other solar thermochemical processes. This may offer an economic approach to solar fuel production and extend the application of mid-temperature solar thermal energy.
DPpackage: Bayesian Semi- and Nonparametric Modeling in R
Directory of Open Access Journals (Sweden)
Alejandro Jara
2011-04-01
Full Text Available Data analysis sometimes requires the relaxation of parametric assumptions in order to gain modeling flexibility and robustness against mis-specification of the probability model. In the Bayesian context, this is accomplished by placing a prior distribution on a function space, such as the space of all probability distributions or the space of all regression functions. Unfortunately, posterior distributions ranging over function spaces are highly complex and hence sampling methods play a key role. This paper provides an introduction to a simple, yet comprehensive, set of programs for the implementation of some Bayesian nonparametric and semiparametric models in R, DPpackage. Currently, DPpackage includes models for marginal and conditional density estimation, receiver operating characteristic curve analysis, interval-censored data, binary regression data, item response data, longitudinal and clustered data using generalized linear mixed models, and regression data using generalized additive models. The package also contains functions to compute pseudo-Bayes factors for model comparison and for eliciting the precision parameter of the Dirichlet process prior, and a general purpose Metropolis sampling algorithm. To maximize computational efficiency, the actual sampling for each model is carried out using compiled C, C++ or Fortran code.
Bayesian methods for hackers probabilistic programming and Bayesian inference
Davidson-Pilon, Cameron
2016-01-01
Bayesian methods of inference are deeply natural and extremely powerful. However, most discussions of Bayesian inference rely on intensely complex mathematical analyses and artificial examples, making it inaccessible to anyone without a strong mathematical background. Now, though, Cameron Davidson-Pilon introduces Bayesian inference from a computational perspective, bridging theory to practice–freeing you to get results using computing power. Bayesian Methods for Hackers illuminates Bayesian inference through probabilistic programming with the powerful PyMC language and the closely related Python tools NumPy, SciPy, and Matplotlib. Using this approach, you can reach effective solutions in small increments, without extensive mathematical intervention. Davidson-Pilon begins by introducing the concepts underlying Bayesian inference, comparing it with other techniques and guiding you through building and training your first Bayesian model. Next, he introduces PyMC through a series of detailed examples a...
Bayesian logistic regression analysis
Van Erp, H.R.N.; Van Gelder, P.H.A.J.M.
2012-01-01
In this paper we present a Bayesian logistic regression analysis. It is found that if one wishes to derive the posterior distribution of the probability of some event, then, together with the traditional Bayes Theorem and the integrating out of nuissance parameters, the Jacobian transformation is an
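For the simplest case of a posterior over an event probability, as discussed above, here is a sketch of the conjugate Beta-binomial setting (which sidesteps the Jacobian machinery the paper actually develops; the data are invented):

```python
import math

# Posterior for an event probability theta under a uniform Beta(1, 1) prior,
# after s successes in n Bernoulli trials: theta | data ~ Beta(1 + s, 1 + n - s).
s, n = 7, 10                            # invented data
a, b = 1 + s, 1 + (n - s)

posterior_mean = a / (a + b)            # (s + 1) / (n + 2): Laplace's rule of succession
posterior_mode = (a - 1) / (a + b - 2)  # equals the maximum-likelihood estimate s / n

def beta_pdf(theta, a, b):
    """Beta density, written out from the definition."""
    const = math.gamma(a + b) / (math.gamma(a) * math.gamma(b))
    return const * theta ** (a - 1) * (1.0 - theta) ** (b - 1)
```

In logistic regression proper, the posterior over regression coefficients has no such closed form, which is why transformations and numerical methods are needed.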
Korattikara, A.; Rathod, V.; Murphy, K.; Welling, M.; Cortes, C.; Lawrence, N.D.; Lee, D.D.; Sugiyama, M.; Garnett, R.
2015-01-01
We consider the problem of Bayesian parameter estimation for deep neural networks, which is important in problem settings where we may have little data, and/ or where we need accurate posterior predictive densities p(y|x, D), e.g., for applications involving bandits or active learning. One simple
Bayesian Geostatistical Design
DEFF Research Database (Denmark)
Diggle, Peter; Lophaven, Søren Nymand
2006-01-01
locations to, or deletion of locations from, an existing design, and prospective design, which consists of choosing positions for a new set of sampling locations. We propose a Bayesian design criterion which focuses on the goal of efficient spatial prediction whilst allowing for the fact that model...
Bayesian statistical inference
Directory of Open Access Journals (Sweden)
Bruno De Finetti
2017-04-01
Full Text Available This work was translated into English and published in the volume: Bruno De Finetti, Induction and Probability, Biblioteca di Statistica, eds. P. Monari, D. Cocchi, Clueb, Bologna, 1993. Bayesian statistical inference is one of the last fundamental philosophical papers in which we can find De Finetti's essential approach to statistical inference.
DEFF Research Database (Denmark)
Hartelius, Karsten; Carstensen, Jens Michael
2003-01-01
A method for locating distorted grid structures in images is presented. The method is based on the theories of template matching and Bayesian image restoration. The grid is modeled as a deformable template. Prior knowledge of the grid is described through a Markov random field (MRF) model which r...
Bayesian Independent Component Analysis
DEFF Research Database (Denmark)
Winther, Ole; Petersen, Kaare Brandt
2007-01-01
In this paper we present an empirical Bayesian framework for independent component analysis. The framework provides estimates of the sources, the mixing matrix and the noise parameters, and is flexible with respect to choice of source prior and the number of sources and sensors. Inside the engine...
Bayesian Exponential Smoothing.
Forbes, C.S.; Snyder, R.D.; Shami, R.S.
2000-01-01
In this paper, a Bayesian version of the exponential smoothing method of forecasting is proposed. The approach is based on a state space model containing only a single source of error for each time interval. This model allows us to improve current practices surrounding exponential smoothing by providing both point predictions and measures of the uncertainty surrounding them.
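A single-source-of-error exponential smoothing recursion of the kind described above can be sketched as follows (the series and smoothing constant are hypothetical, and the Gaussian interval is a simple stand-in for the paper's fully Bayesian predictive measures):

```python
import math

def ses_forecast(y, alpha, z=1.96):
    """Simple exponential smoothing in single-source-of-error state-space form:
    level[t] = level[t-1] + alpha * e[t], with e[t] = y[t] - level[t-1].
    Returns the one-step point forecast and a Gaussian interval whose width
    is estimated from the in-sample one-step-ahead errors."""
    level = y[0]
    errors = []
    for obs in y[1:]:
        e = obs - level          # one-step-ahead forecast error
        errors.append(e)
        level += alpha * e       # the single error source drives the state
    sigma = math.sqrt(sum(e * e for e in errors) / len(errors))
    return level, (level - z * sigma, level + z * sigma)

series = [10.0, 12.0, 11.0, 13.0, 12.5]
point, (low, high) = ses_forecast(series, alpha=0.3)
# The point forecast is the final smoothed level; (low, high) quantifies its uncertainty.
```

The Bayesian version goes further by placing priors on the smoothing parameter and initial level instead of fixing them.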
Bayesian optimization for materials science
Packwood, Daniel
2017-01-01
This book provides a short and concise introduction to Bayesian optimization specifically for experimental and computational materials scientists. After explaining the basic idea behind Bayesian optimization and some applications to materials science in Chapter 1, the mathematical theory of Bayesian optimization is outlined in Chapter 2. Finally, Chapter 3 discusses an application of Bayesian optimization to a complicated structure optimization problem in computational surface science. Bayesian optimization is a promising global optimization technique that originates in the field of machine learning and is starting to gain attention in materials science. For the purpose of materials design, Bayesian optimization can be used to predict new materials with novel properties without extensive screening of candidate materials. For the purpose of computational materials science, Bayesian optimization can be incorporated into first-principles calculations to perform efficient, global structure optimizations. While re...
International Nuclear Information System (INIS)
Kotlyarov, E.V.; Reba, R.C.; Lindsay, J.
1983-01-01
Receiver operating characteristic (ROC) analysis of left ventricular performance at rest was applied to evaluate the diagnostic utility of a non-imaging nuclear detector (the "Nuclear Stethoscope") for screening patients with coronary artery disease (CAD). Thirty-one patients without CAD and with normal rest and stress radionuclide ventriculography (MUGA) were used as a control group. Another 62 patients with abnormal left ventricular reserve and segmental wall motion abnormalities at rest were also studied. All 93 patients were studied with the Nuclear Stethoscope (30 minutes after conventional MUGA testing) in both beat-to-beat and gated equilibrium modes. ROC analysis showed that, along with ejection fraction and stroke and end-diastolic volumes, evaluation of the left ventricular filling phase has great potential for identifying patients with a segmental wall motion abnormality and, therefore, significant CAD.
International Nuclear Information System (INIS)
McNeil, B.J.; Hanley, J.A.; Funkenstein, H.H.; Wallman, J.
1983-01-01
The use of a statistical technique for paired comparisons using receiver operating characteristic (ROC) curves is illustrated by studying the extent to which clinical history altered the interpretation of computed tomographic (CT) examinations of the head. Eighty-nine CT examinations of the head were presented in random order to four readers, first with minimal history (age and sex) and then, several weeks later, with the complete neurological history as of the time the CT examination had been obtained. Using a paired ROC analysis, a small but significant (p < .05) improvement was detected for interpretations in the presence of complete history; the average area was 94.4% for readings without history and 97.7% for readings with history.
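The area-under-the-ROC-curve figures quoted above can be computed nonparametrically as a Mann-Whitney statistic; here is a sketch with invented 5-point rating data (not the study's data):

```python
def auc(neg_scores, pos_scores):
    """Nonparametric AUC: the probability that a randomly chosen positive case
    outscores a randomly chosen negative case, counting ties as 1/2
    (the Mann-Whitney estimate of the area under the ROC curve)."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

# Invented 5-point confidence ratings (1 = definitely normal ... 5 = definitely abnormal).
normals = [1, 2, 2, 3, 1]
patients = [3, 4, 5, 2, 5]
area = auc(normals, patients)  # 0.9 for these ratings
```

Paired designs like the one above additionally require modeling the correlation between the two readings of the same cases, which this simple estimator does not capture.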
DEFF Research Database (Denmark)
Gardner, Ian A.; Greiner, Matthias
2006-01-01
Receiver-operating characteristic (ROC) curves provide a cutoff-independent method for the evaluation of continuous or ordinal tests used in clinical pathology laboratories. The area under the curve is a useful overall measure of test accuracy and can be used to compare different tests (or different equipment) used by the same tester, as well as the accuracy of different diagnosticians that use the same test material. To date, ROC analysis has not been widely used in veterinary clinical pathology studies, although it should be considered a useful complement to estimates of sensitivity and specificity in test evaluation studies. In addition, calculation of likelihood ratios can potentially improve the clinical utility of such studies because likelihood ratios provide an indication of how the post-test probability changes as a function of the magnitude of the test results. For ordinal test...
Huang, Si-Si; Xie, Dong-Mei; Cai, Yi-Jing; Wu, Jian-Min; Chen, Rui-Chong; Wang, Xiao-Dong; Song, Mei; Zheng, Ming-Hua; Wang, Yu-Qun; Lin, Zhuo; Shi, Ke-Qing
2017-04-01
Hepatitis B virus (HBV) infection remains a major health problem and HBV-related-decompensated cirrhosis (HBV-DC) usually leads to a poor prognosis. Our aim was to determine the utility of inflammatory biomarkers in predicting mortality of HBV-DC. A total of 329 HBV-DC patients were enrolled. Survival estimates for the entire study population were generated using the Kaplan-Meier method. The prognostic values for model for end-stage liver disease (MELD) score, Child-Pugh score, and inflammatory biomarkers neutrophil/lymphocyte ratio, C-reactive protein-to-albumin ratio (CAR), and lymphocyte-to-monocyte ratio (LMR) for HBV-DC were compared using time-dependent receiver operating characteristic curves and time-dependent decision curves. The survival time was 23.1±15.8 months. Multivariate analysis identified age, CAR, LMR, and platelet count as prognostic independent risk factors. Kaplan-Meier analysis indicated that CAR of at least 1.0 (hazard ratio, 7.19; 95% confidence interval, 4.69-11.03), and LMR less than 1.9 (hazard ratio, 2.40; 95% confidence interval, 1.69-3.41) were independently associated with mortality of HBV-DC. The time-dependent receiver operating characteristic indicated that CAR showed the best performance in predicting mortality of HBV-DC compared with LMR, MELD score, and Child-Pugh score. The results were also confirmed by time-dependent decision curves. CAR and LMR were associated with the prognosis of HBV-DC. CAR was superior to LMR, MELD score, and Child-Pugh score in HBV-DC mortality prediction.
International Nuclear Information System (INIS)
Zanca, Federica; Hillis, Stephen L.; Claus, Filip; Van Ongeval, Chantal; Celis, Valerie; Provoost, Veerle; Yoon, Hong-Jun; Bosmans, Hilde
2012-01-01
Purpose: From independently conducted free-response receiver operating characteristic (FROC) and receiver operating characteristic (ROC) experiments, to study fixed-reader associations between three estimators: the area under the alternative FROC (AFROC) curve computed from FROC data, the area under the ROC curve computed from FROC highest rating data, and the area under the ROC curve computed from confidence-of-disease ratings. Methods: Two hundred mammograms, 100 of which were abnormal, were processed by two image-processing algorithms and interpreted by four radiologists under the FROC paradigm. From the FROC data, inferred-ROC data were derived, using the highest rating assumption. Eighteen months afterwards, the images were interpreted by the same radiologists under the conventional ROC paradigm; conventional-ROC data (in contrast to inferred-ROC data) were obtained. FROC and ROC (inferred, conventional) data were analyzed using the nonparametric area-under-the-curve (AUC), (AFROC and ROC curve, respectively). Pearson correlation was used to quantify the degree of association between the modality-specific AUC indices and standard errors were computed using the bootstrap-after-bootstrap method. The magnitude of the correlations was assessed by comparison with computed Obuchowski-Rockette fixed reader correlations. Results: Average Pearson correlations (with 95% confidence intervals in square brackets) were: Corr(FROC, inferred ROC) = 0.76[0.64, 0.84] > Corr(inferred ROC, conventional ROC) = 0.40[0.18, 0.58] > Corr (FROC, conventional ROC) = 0.32[0.16, 0.46]. Conclusions: Correlation between FROC and inferred-ROC data AUC estimates was high. Correlation between inferred- and conventional-ROC AUC was similar to the correlation between two modalities for a single reader using one estimation method, suggesting that the highest rating assumption might be questionable.
Terluin, Berend; Eekhout, Iris; Terwee, Caroline B
2017-03-01
Patients have their individual minimal important changes (iMICs) as their personal benchmarks to determine whether a perceived health-related quality of life (HRQOL) change constitutes a (minimally) important change for them. We denote the mean iMIC in a group of patients as the "genuine MIC" (gMIC). The aims of this paper are (1) to examine the relationship between the gMIC and the anchor-based minimal important change (MIC), determined by receiver operating characteristic analysis or by predictive modeling; (2) to examine the impact of the proportion of improved patients on these MICs; and (3) to explore the possibility to adjust the MIC for the influence of the proportion of improved patients. Multiple simulations of patient samples involved in anchor-based MIC studies with different characteristics of HRQOL (change) scores and distributions of iMICs. In addition, a real data set is analyzed for illustration. The receiver operating characteristic-based and predictive modeling MICs equal the gMIC when the proportion of improved patients equals 0.5. The MIC is estimated higher than the gMIC when the proportion improved is greater than 0.5, and the MIC is estimated lower than the gMIC when the proportion improved is less than 0.5. Using an equation including the predictive modeling MIC, the log-odds of improvement, the standard deviation of the HRQOL change score, and the correlation between the HRQOL change score and the anchor results in an adjusted MIC reflecting the gMIC irrespective of the proportion of improved patients. Adjusting the predictive modeling MIC for the proportion of improved patients assures that the adjusted MIC reflects the gMIC. We assumed normal distributions and global perceived change scores that were independent on the follow-up score. Additionally, floor and ceiling effects were not taken into account. Copyright © 2017 Elsevier Inc. All rights reserved.
Non-linear Bayesian update of PCE coefficients
Litvinenko, Alexander
2014-01-06
Given: a physical system modeled by a PDE or ODE with uncertain coefficient q(ω), and a measurement operator Y(u(q), q), where u(q, ω) is the uncertain solution. Aim: to identify q(ω). The mapping from parameters to observations is usually not invertible, hence this inverse identification problem is generally ill-posed. To identify q(ω) we derive a non-linear Bayesian update from the variational problem associated with conditional expectation. To reduce the cost of the Bayesian update we offer a functional approximation, e.g. polynomial chaos expansion (PCE). New: we apply the Bayesian update to the PCE coefficients of the random coefficient q(ω) (not to the probability density function of q).
Non-linear Bayesian update of PCE coefficients
Litvinenko, Alexander; Matthies, Hermann G.; Pojonk, Oliver; Rosic, Bojana V.; Zander, Elmar
2014-01-01
Given: a physical system modeled by a PDE or ODE with uncertain coefficient q(ω), and a measurement operator Y(u(q), q), where u(q, ω) is the uncertain solution. Aim: to identify q(ω). The mapping from parameters to observations is usually not invertible, hence this inverse identification problem is generally ill-posed. To identify q(ω) we derive a non-linear Bayesian update from the variational problem associated with conditional expectation. To reduce the cost of the Bayesian update we offer a functional approximation, e.g. a polynomial chaos expansion (PCE). New: we apply the Bayesian update to the PCE coefficients of the random coefficient q(ω) (not to the probability density function of q).
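In the linear-Gaussian special case, the coefficient-wise update the abstract describes reduces to a Kalman-type gain applied directly to the PCE coefficients. The sketch below uses illustrative numbers and is only the linear special case (the paper's update is non-linear and more general): q and the measurement noise are represented by first-order Hermite PCEs in two independent germs, and the coefficient update reproduces the exact Gaussian posterior.

```python
import numpy as np

# Prior PCE of q in a standard-normal germ xi: q = 2.0 + 1.5*xi
# Measurement model y = q + eps with eps = r*eta (eta a second germ)
r = 0.5
y_obs = 3.1

# PCE coefficients in the joint basis (1, xi, eta)
q_c  = np.array([2.0, 1.5, 0.0])   # the uncertain coefficient q
yp_c = np.array([2.0, 1.5, r])     # predicted measurement y = q + eps

# Kalman gain from (co)variances read directly off the PCE coefficients
cov_qy = q_c[1:] @ yp_c[1:]
var_y  = yp_c[1:] @ yp_c[1:]
K = cov_qy / var_y

# linear Bayesian update applied coefficient-wise:
# posterior PCE  q' = q + K * (y_obs - y_predicted)
q_post = q_c + K * (np.array([y_obs, 0.0, 0.0]) - yp_c)

post_mean = q_post[0]
post_std  = np.sqrt(q_post[1:] @ q_post[1:])
```

Here post_mean and post_std agree exactly with the conjugate Gaussian posterior N(2.99, 0.225), which is the point of updating coefficients rather than densities.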
A Bayesian nonparametric approach to causal inference on quantiles.
Xu, Dandan; Daniels, Michael J; Winterstein, Almut G
2018-02-25
We propose a Bayesian nonparametric approach (BNP) for causal inference on quantiles in the presence of many confounders. In particular, we define relevant causal quantities and specify BNP models to avoid bias from restrictive parametric assumptions. We first use Bayesian additive regression trees (BART) to model the propensity score and then construct the distribution of potential outcomes given the propensity score using a Dirichlet process mixture (DPM) of normals model. We thoroughly evaluate the operating characteristics of our approach and compare it to Bayesian and frequentist competitors. We use our approach to answer an important clinical question involving acute kidney injury using electronic health records. © 2018, The International Biometric Society.
2005-01-01
The invention is directed to the reception of high rate radio signals (for example DVB-T signals) while the receiver is moving at a high speed (for example in or with a car). Two or more antennas (12, 16) are closely spaced and arranged behind each other in the direction of motion (v) for receiving
Shen, Jin-Chun; Sun, He-Liang; Zhang, Ming-Qiang; Liu, Xiao-Yu; Wang, Zhong- Yun; Yang, Jian-Jun
2014-08-01
Acute pain can lead to immune dysfunction, which can be partly ameliorated by successful pain management. Opioids, which are widely used for analgesia, can result in the deterioration of immune function. This study aimed to investigate the influence of morphine with or without flurbiprofen as post-operative analgesics on the immune systems of patients undergoing gastric cancer surgery. Sixty patients undergoing gastric cancer surgery were equally randomized into two groups. They received post-operative patient-controlled intravenous (IV) analgesia using morphine either with or without flurbiprofen. Visual analogue scale (VAS) scores, Bruggemann comfort scale (BCS) scores, morphine consumption, time of first flatus, incidence of nausea/vomiting, and T-lymphocyte subsets (CD3⁺, CD4⁺, and CD8⁺) and natural killer cells (CD3⁻CD16⁺CD56⁺) were evaluated. No significant difference was observed in the VAS scores, BCS scores, and nausea/vomiting incidence between groups. Less morphine was consumed and the time of first flatus was earlier in patients receiving morphine with flurbiprofen than morphine alone. The expression of CD3⁺, CD4⁺, CD4⁺/CD8⁺, and CD3⁻CD16⁺CD56⁺ decreased at 2 hours after incision and, except for CD3⁻CD16⁺CD56⁺, returned to baseline at 120 hours after surgery. Moreover, the expression of CD3⁻CD16⁺CD56⁺ at 2 hours after incision and the expression of CD3⁺, CD4⁺, CD4⁺/CD8⁺, and CD3⁻CD16⁺CD56⁺ at 24 hours after surgery were higher in patients receiving morphine with flurbiprofen than morphine alone. The combination of morphine and flurbiprofen ameliorates the immune depression in T-lymphocyte subsets and natural killer cells and provides a similar analgesic efficacy to morphine alone in patients undergoing gastric cancer surgery.
On Bayesian System Reliability Analysis
Energy Technology Data Exchange (ETDEWEB)
Soerensen Ringi, M
1995-05-01
The view taken in this thesis is that reliability, the probability that a system will perform a required function for a stated period of time, depends on a person's state of knowledge. Reliability changes as this state of knowledge changes, i.e. when new relevant information becomes available. Most existing models for system reliability prediction are developed in a classical framework of probability theory and they overlook some information that is always present. Probability is just an analytical tool to handle uncertainty, based on judgement and subjective opinions. It is argued that the Bayesian approach gives a much more comprehensive understanding of the foundations of probability than the so-called frequentist school. A new model for system reliability prediction is given in two papers. The model captures the fact that component failures are dependent because of a shared operational environment. The suggested model also naturally permits learning from failure data of similar components in non-identical environments. 85 refs.
On Bayesian System Reliability Analysis
International Nuclear Information System (INIS)
Soerensen Ringi, M.
1995-01-01
The view taken in this thesis is that reliability, the probability that a system will perform a required function for a stated period of time, depends on a person's state of knowledge. Reliability changes as this state of knowledge changes, i.e. when new relevant information becomes available. Most existing models for system reliability prediction are developed in a classical framework of probability theory and they overlook some information that is always present. Probability is just an analytical tool to handle uncertainty, based on judgement and subjective opinions. It is argued that the Bayesian approach gives a much more comprehensive understanding of the foundations of probability than the so-called frequentist school. A new model for system reliability prediction is given in two papers. The model captures the fact that component failures are dependent because of a shared operational environment. The suggested model also naturally permits learning from failure data of similar components in non-identical environments. 85 refs.
Thomas, Claudine
1994-01-01
The Global Positioning System is an outstanding tool for the dissemination of time. Using mono-channel C/A-code GPS time receivers, the restitution of GPS time through the satellite constellation presents a peak-to-peak discrepancy of several tens of nanoseconds without SA but may be as high as several hundred nanoseconds with SA. As a consequence, civil users are increasingly interested in implementing hardware and software methods for efficient restitution of GPS time, especially in the framework of the project of a real-time prediction of UTC (UTCp) which could be available in the form of time differences (UTCp - GPS time). Previous work on improving the real-time restitution of GPS time with SA to the level obtained without SA focused on the implementation of a Kalman filter based on past data and updated at each new observation. An alternative solution relies upon the statistical features of the noise brought about by SA; it has already been shown that the SA noise is efficiently reduced by averaging data from numerous satellites observed simultaneously over a sufficiently long time. This method was successfully applied to data from a GPS time receiver, model AOA TTR-4P, connected to the cesium clock kept at the BIPM. This device, a multi-channel, dual-frequency, P-code GPS time receiver, is one of the first TTR-4P units in operation in a civil laboratory. Preliminary comparative studies of this new equipment with conventional GPS time receivers are described in this paper. The results of an experimental restitution of GPS time, obtained in June 1993, are also detailed: 3 to 6 satellites were observed simultaneously with a sample interval of 15 s, and an efficient smoothing of SA noise was realized by averaging data on all observed satellites over more than 1 hour. When the GPS system is complete in 1994, 8 satellites will be observable continuously from anywhere in the world and the same level of uncertainty will be obtained using a shorter averaging
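The satellite-averaging idea is simple to illustrate numerically. Treating the SA noise on each satellite and epoch as independent (real SA noise is time-correlated, so actual gains are smaller), averaging 6 satellites sampled every 15 s for an hour shrinks the noise by roughly the square root of the number of readings. All numbers below are illustrative, not measured values:

```python
import numpy as np

rng = np.random.default_rng(1)

n_sats, n_epochs = 6, 240     # 6 satellites, 15 s sampling over 1 hour
sa_std = 80e-9                # ~80 ns of SA-induced noise (illustrative)
true_offset = 25e-9           # assumed UTC - GPS time offset (illustrative)

# each satellite/epoch gives a noisy reading of the common clock offset
readings = true_offset + rng.normal(0.0, sa_std, size=(n_sats, n_epochs))

single = readings[0, 0]       # one satellite, one epoch
averaged = readings.mean()    # average over all satellites and epochs

err_single = abs(single - true_offset)
err_avg = abs(averaged - true_offset)   # expected ~ sa_std / sqrt(6 * 240)
```

With 1440 readings the expected error of the average is about 80 ns / √1440 ≈ 2 ns, consistent with the few-nanosecond restitution the paper aims for.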
Accelerated aging tests on ENEA-ASE solar coating for receiver tube suitable to operate up to 550 °C
Antonaia, A.; D'Angelo, A.; Esposito, S.; Addonizio, M. L.; Castaldo, A.; Ferrara, M.; Guglielmo, A.; Maccari, A.
2016-05-01
A patented solar coating for evacuated receivers, based on an innovative graded WN-AlN cermet layer, has been optically designed and optimized to operate at high temperature with high performance and high thermal stability. This solar coating, being designed to operate in a solar field with molten salt as heat transfer fluid, has to be thermally stable up to the maximum temperature of 550 °C. With the aim of determining the degradation behaviour and lifetime prediction of the solar coating, we chose to monitor the variation of the solar absorptance αs after each thermal annealing cycle carried out at accelerated temperatures under vacuum. This prediction method was coupled with a preliminary Differential Thermal Analysis (DTA) in order to give evidence of any chemical-physical coating modification in the temperature range of interest before performing accelerated aging tests. In the accelerated aging tests we assumed that the temperature dependence of the degradation processes could be described by Arrhenius behaviour, and we hypothesized that a linear correlation occurs between the optical parameter variation rate (specifically, Δαs/Δt) and the degradation process rate. Starting from Δαs/Δt values evaluated at 650 and 690 °C, the Arrhenius plot gave an activation energy of 325 kJ mol-1 for the degradation phenomenon, and the prediction on the coating degradation gave a solar absorptance decrease of only 1.65% after 25 years at 550 °C. This very low αs decrease gave evidence of the excellent stability of our solar coating, even when employed at the maximum temperature (550 °C) of a solar field operating with molten salt as heat transfer fluid.
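The Arrhenius extrapolation step can be reproduced with a few lines of arithmetic. The rates below are hypothetical placeholders (the paper derives Ea = 325 kJ/mol from its measured Δαs/Δt values); the point is the mechanics of extracting an activation energy from two accelerated temperatures and extrapolating the rate to the 550 °C operating point:

```python
import math

R = 8.314  # gas constant, J / (mol K)

# hypothetical degradation rates (delta alpha_s per hour) at the two
# accelerated test temperatures; only their ratio matters for Ea
T1, T2 = 650 + 273.15, 690 + 273.15
r1, r2 = 1.0e-4, 3.2e-4    # illustrative, not the paper's raw data

# Arrhenius: r = A*exp(-Ea/(R*T))  =>  ln(r2/r1) = (Ea/R)*(1/T1 - 1/T2)
Ea = R * math.log(r2 / r1) / (1.0 / T1 - 1.0 / T2)

# extrapolate the rate down to the 550 C operating temperature
T_op = 550 + 273.15
r_op = r1 * math.exp(-Ea / R * (1.0 / T_op - 1.0 / T1))
```

With these invented rates Ea comes out near 215 kJ/mol and the operating-temperature rate is roughly 30 times slower than at 650 °C, illustrating why a coating that degrades measurably in accelerated tests can still lose only ~1.65% absorptance over 25 years in service.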
Probability and Bayesian statistics
1987-01-01
This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985, the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are included especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure theoretical considerations, contributions to theoretical statistics an...
DEFF Research Database (Denmark)
Mørup, Morten; Schmidt, Mikkel N
2012-01-01
Many networks of scientific interest naturally decompose into clusters or communities with comparatively fewer external than internal links; however, current Bayesian models of network communities do not capture this intuitive notion of communities. We formulate a nonparametric Bayesian model for community detection consistent with an intuitive definition of communities and present a Markov chain Monte Carlo procedure for inferring the community structure. A Matlab toolbox with the proposed inference procedure is available for download. On synthetic and real networks, our model detects communities consistent with ground truth, and on real networks, it outperforms existing approaches in predicting missing links. This suggests that community structure is an important structural property of networks that should be explicitly modeled.
An Analysis of Construction Accident Factors Based on Bayesian Network
Yunsheng Zhao; Jinyong Pei
2013-01-01
In this study, we analyze construction accident factors based on a Bayesian network. Firstly, accident cases are analyzed with the fault tree method, which can find all the factors causing the accidents; the factors are then analyzed qualitatively and quantitatively with the Bayesian network method; finally, a safety management program is determined to guide safety operations. The results of this study show that bad condition of geological environment has the largest posterio...
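The qualitative-then-quantitative step, turning a fault-tree factor into a posterior probability given an observed accident, is just Bayes' theorem on the network's conditional probability tables. A minimal two-node sketch with invented probabilities (not the study's data):

```python
# Hypothetical root cause ("bad geological conditions") and an accident node.
p_geo = 0.2              # prior: geological environment is in bad condition
p_acc_given_geo = 0.30   # accident probability if conditions are bad
p_acc_given_ok = 0.05    # accident probability otherwise

# total probability of an accident
p_acc = p_geo * p_acc_given_geo + (1 - p_geo) * p_acc_given_ok

# posterior of the factor given that an accident occurred (Bayes' theorem)
p_geo_given_acc = p_geo * p_acc_given_geo / p_acc
```

Here the posterior rises from the 0.2 prior to 0.6 once an accident is observed; ranking factors by such posteriors is what lets the network prioritize a safety management program.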
DEFF Research Database (Denmark)
Antoniou, Constantinos; Harrison, Glenn W.; Lau, Morten I.
2015-01-01
A large literature suggests that many individuals do not apply Bayes' Rule when making decisions that depend on them correctly pooling prior information and sample data. We replicate and extend a classic experimental study of Bayesian updating from psychology, employing the methods of experimental economics, with careful controls for the confounding effects of risk aversion. Our results show that risk aversion significantly alters inferences on deviations from Bayes' Rule.
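The normative benchmark in such experiments is exact Bayesian updating. A minimal version of the classic two-urn task from this literature (illustrative probabilities, not the study's design), computed exactly with rational arithmetic:

```python
from fractions import Fraction

# urn A holds 70% red chips, urn B 30%; one urn is chosen at random and we
# draw red, red, blue -- what is the posterior probability of urn A?
p_red_a, p_red_b = Fraction(7, 10), Fraction(3, 10)
prior_a = Fraction(1, 2)

draws = ["red", "red", "blue"]
like_a = like_b = Fraction(1)
for d in draws:
    like_a *= p_red_a if d == "red" else 1 - p_red_a
    like_b *= p_red_b if d == "red" else 1 - p_red_b

# Bayes' Rule: posterior proportional to prior times likelihood
posterior_a = prior_a * like_a / (prior_a * like_a + (1 - prior_a) * like_b)
```

The exact answer is 7/10; experimental subjects' deviations from such values are what the study measures, after controlling for risk aversion.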
Energy Technology Data Exchange (ETDEWEB)
Andrews, Stephen A. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Sigeti, David E. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2017-11-15
These are a set of slides about Bayesian hypothesis testing, where many hypotheses are tested. The conclusions are the following: the value of the Bayes factor obtained when using the median of the posterior marginal is almost the minimum value of the Bayes factor; the value of τ^{2} which minimizes the Bayes factor is a reasonable choice for this parameter. This allows a likelihood ratio to be computed which is the least favorable to H_{0}.
Introduction to Bayesian statistics
Koch, Karl-Rudolf
2007-01-01
This book presents Bayes' theorem, the estimation of unknown parameters, the determination of confidence regions and the derivation of tests of hypotheses for the unknown parameters. It does so in a simple manner that is easy to comprehend. The book compares traditional and Bayesian methods with the rules of probability presented in a logical way allowing an intuitive understanding of random variables and their probability distributions to be formed.
Bayesian ARTMAP for regression.
Sasu, L M; Andonie, R
2013-10-01
Bayesian ARTMAP (BA) is a recently introduced neural architecture which uses a combination of Fuzzy ARTMAP competitive learning and Bayesian learning. Training is generally performed online, in a single-epoch. During training, BA creates input data clusters as Gaussian categories, and also infers the conditional probabilities between input patterns and categories, and between categories and classes. During prediction, BA uses Bayesian posterior probability estimation. So far, BA was used only for classification. The goal of this paper is to analyze the efficiency of BA for regression problems. Our contributions are: (i) we generalize the BA algorithm using the clustering functionality of both ART modules, and name it BA for Regression (BAR); (ii) we prove that BAR is a universal approximator with the best approximation property. In other words, BAR approximates arbitrarily well any continuous function (universal approximation) and, for every given continuous function, there is one in the set of BAR approximators situated at minimum distance (best approximation); (iii) we experimentally compare the online trained BAR with several neural models, on the following standard regression benchmarks: CPU Computer Hardware, Boston Housing, Wisconsin Breast Cancer, and Communities and Crime. Our results show that BAR is an appropriate tool for regression tasks, both for theoretical and practical reasons. Copyright © 2013 Elsevier Ltd. All rights reserved.
Bayesian theory and applications
Dellaportas, Petros; Polson, Nicholas G; Stephens, David A
2013-01-01
The development of hierarchical models and Markov chain Monte Carlo (MCMC) techniques forms one of the most profound advances in Bayesian analysis since the 1970s and provides the basis for advances in virtually all areas of applied and theoretical Bayesian statistics. This volume guides the reader along a statistical journey that begins with the basic structure of Bayesian theory, and then provides details on most of the past and present advances in this field. The book has a unique format. There is an explanatory chapter devoted to each conceptual advance followed by journal-style chapters that provide applications or further advances on the concept. Thus, the volume is both a textbook and a compendium of papers covering a vast range of topics. It is appropriate for a well-informed novice interested in understanding the basic approach, methods and recent applications. Because of its advanced chapters and recent work, it is also appropriate for a more mature reader interested in recent applications and devel...
Bayesian estimation of Weibull distribution parameters
International Nuclear Information System (INIS)
Bacha, M.; Celeux, G.; Idee, E.; Lannoy, A.; Vasseur, D.
1994-11-01
In this paper, we expose SEM (Stochastic Expectation Maximization) and WLB-SIR (Weighted Likelihood Bootstrap - Sampling Importance Re-sampling) methods which are used to estimate Weibull distribution parameters when data are heavily censored. The second method is based on Bayesian inference and allows available prior information on the parameters to be taken into account. An application of this method has been realized with real data provided by nuclear power plant operation feedback analysis. (authors). 8 refs., 2 figs., 2 tabs
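A simple way to see what Bayesian Weibull estimation with heavy right-censoring involves is a flat-prior grid posterior over the shape and scale, with censored observations contributing the survival function instead of the density. This is a toy stand-in for SEM/WLB-SIR on simulated data, not the authors' method:

```python
import numpy as np

rng = np.random.default_rng(2)

# simulated heavily censored lifetimes (shape k=1.5, scale lam=100),
# a stand-in for plant feedback data; right-censoring at t=80
k_true, lam_true, t_cens = 1.5, 100.0, 80.0
t = lam_true * rng.weibull(k_true, 200)
obs = np.minimum(t, t_cens)
event = t <= t_cens                       # False = right-censored

def log_lik(k, lam):
    # Weibull log-likelihood with right censoring:
    # events contribute log f(t), censored points log S(t_cens)
    z = (obs / lam) ** k
    ll_event = np.log(k / lam) + (k - 1) * np.log(obs / lam) - z
    ll_cens = -z
    return np.where(event, ll_event, ll_cens).sum()

# flat-prior grid posterior over (k, lam)
ks = np.linspace(0.5, 3.0, 60)
lams = np.linspace(50.0, 200.0, 60)
logp = np.array([[log_lik(k, lam) for lam in lams] for k in ks])
post = np.exp(logp - logp.max())
post /= post.sum()

k_mean = (post.sum(axis=1) * ks).sum()      # posterior mean of the shape
lam_mean = (post.sum(axis=0) * lams).sum()  # posterior mean of the scale
```

Replacing the flat prior with an informative one on (k, lam) is where the expert knowledge from operational feedback would enter, which is the selling point of the Bayesian WLB-SIR route.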
Narrowband interference parameterization for sparse Bayesian recovery
Ali, Anum
2015-09-11
This paper addresses the problem of narrowband interference (NBI) in SC-FDMA systems by using tools from compressed sensing and stochastic geometry. The proposed NBI cancellation scheme exploits the frequency domain sparsity of the unknown signal and adopts a Bayesian sparse recovery procedure. This is done by keeping a few randomly chosen sub-carriers data free to sense the NBI signal at the receiver. As Bayesian recovery requires knowledge of some NBI parameters (i.e., mean, variance and sparsity rate), we use tools from stochastic geometry to obtain analytical expressions for the required parameters. Our simulation results validate the analysis and depict suitability of the proposed recovery method for NBI mitigation. © 2015 IEEE.
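The sensing step, leaving a few sub-carriers data free and recovering a frequency-sparse NBI signal from them, can be sketched with a partial-Fourier measurement matrix. For brevity this sketch uses orthogonal matching pursuit as a stand-in for the paper's Bayesian recovery, with illustrative dimensions:

```python
import numpy as np

rng = np.random.default_rng(3)

n, n_free, k = 128, 32, 3     # subcarriers, data-free ones, NBI sparsity

# NBI is sparse in the frequency domain
x = np.zeros(n, dtype=complex)
x[rng.choice(n, k, replace=False)] = rng.normal(3, 1, k)

# measurements: the NBI seen on the randomly chosen data-free positions,
# modeled here as a partial unitary Fourier sensing matrix
F = np.fft.fft(np.eye(n)) / np.sqrt(n)
A = F[rng.choice(n, n_free, replace=False)]
y = A @ x + 0.01 * rng.normal(size=n_free)

# orthogonal matching pursuit (stand-in for the Bayesian sparse recovery)
support, residual = [], y.copy()
for _ in range(k):
    j = int(np.argmax(np.abs(A.conj().T @ residual)))
    support.append(j)
    sol, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
    residual = y - A[:, support] @ sol

x_hat = np.zeros(n, dtype=complex)
x_hat[support] = sol
err = np.linalg.norm(x_hat - x) / np.linalg.norm(x)
```

The paper's contribution sits on top of this picture: it derives the NBI mean, variance, and sparsity rate from stochastic geometry so the Bayesian recovery needs no training.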
Bayesian analysis in plant pathology.
Mila, A L; Carriquiry, A L
2004-09-01
Bayesian methods are currently much discussed and applied in several disciplines from molecular biology to engineering. Bayesian inference is the process of fitting a probability model to a set of data and summarizing the results via probability distributions on the parameters of the model and unobserved quantities such as predictions for new observations. In this paper, after a short introduction to Bayesian inference, we present the basic features of Bayesian methodology using examples from sequencing genomic fragments and analyzing microarray gene-expression levels, reconstructing disease maps, and designing experiments.
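The "fit a probability model, then summarize via distributions on parameters and predictions" workflow has a one-screen conjugate example: a Beta prior on a disease incidence rate updated with plant counts (all numbers invented for illustration):

```python
a, b = 1, 1                  # uniform Beta(1, 1) prior on the incidence rate
diseased, healthy = 12, 88   # hypothetical field survey counts

a_post, b_post = a + diseased, b + healthy   # conjugate Beta posterior
post_mean = a_post / (a_post + b_post)       # posterior mean incidence
# posterior predictive probability that the next sampled plant is diseased
p_next = post_mean
```

The posterior Beta(13, 89) summarizes both the estimate (mean ≈ 0.127) and its uncertainty, and the same object directly yields predictions for new observations, which is the summarization step the abstract describes.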
International Nuclear Information System (INIS)
Iwai, Naomichi; Yamaguchi, Yutaka
1991-01-01
MRI was performed in 78 primary lung cancer cases to evaluate the optimal diagnostic criteria for regional lymph node metastases. Receiver operating characteristic (ROC) curve analysis for 262 lymph nodes of the hilar and mediastinal regions showed that the optimal size criterion is 10 mm in the mean axis of nodal diameter. Employing this criterion, the diagnostic rates for hilar and mediastinal lymph nodes had a sensitivity of 75%, a specificity of 82%, and an overall accuracy of 79%. However, the diagnostic rates for subaortic, paraaortic, and hilar lymph nodes using the same criterion showed lower specificities than those for other nodes. It was suggested that evaluation by coronal section made the diagnosis for subaortic lymph nodes more precise. In the ROC curve analysis for each histologic type, it was thought that the optimal criterion for adenocarcinoma was 10 mm in the mean axis, and that the criteria for squamous cell carcinoma were 11 mm in the mean axis and 10 mm in the short axis. (author)
Tomita, Tetsu; Yasui-Furukori, Norio; Norio, Yasui-Furukori; Sato, Yasushi; Nakagami, Taku; Tsuchimine, Shoko; Kaneda, Ayako; Kaneko, Sunao
2014-01-01
We investigated cutoff values for the early response of patients with major depressive disorder to paroxetine and their sex differences by using a receiver operating characteristic (ROC) curve analysis to predict the effectiveness of paroxetine. In total, 120 patients with major depressive disorder were enrolled and treated with 10-40 mg/day paroxetine for 6 weeks; 89 patients completed the protocol. A clinical evaluation using the Montgomery-Asberg Depression Rating Scale (MADRS) was performed at weeks 0, 1, 2, 4, and 6. In male subjects, the cutoff values for MADRS improvement rating in week 1, week 2, and week 4 were 20.9%, 34.9%, and 33.3%, respectively. The sensitivities and the specificities were 83.3% and 80.0%, 83.3% and 80.0%, and 100% and 90%, respectively. The areas under the curve (AUC) were 0.908, 0.821, and 0.979, respectively. In female subjects, the cutoff values for the MADRS improvement rating in week 1, week 2, and week 4 were 21.4%, 35.7%, and 32.3%, respectively. The sensitivities and the specificities were 71.4% and 84.6%, 73.8% and 76.9%, and 90.5% and 76.9%, respectively. The AUCs were 0.781, 0.735, and 0.904, respectively. Early improvement with paroxetine may predict the long-term response. The accuracy of the prediction for the response is higher in male subjects.
Mohammadi, Hassanreza R; Azimi, Parisa; Benzel, Edward C; Shahzadi, Sohrab; Azhari, Shirzad
2016-09-01
The aim of this study was to elucidate independent factors that predict surgical satisfaction in lumbar spinal canal stenosis (LSCS) patients. Patients who underwent surgery were grouped based on age, gender, duration of symptoms, walking distance, Neurogenic Claudication Outcome Score (NCOS) and the stenosis ratio (SR) described by Lurencin. We recorded 2-year patient satisfaction using a standardized measure. The optimal cut-off points in SR, NCOS and walking distance for predicting surgical satisfaction were estimated from sensitivity and specificity calculations and receiver operator characteristic (ROC) curves. One hundred fifty consecutive patients (51 male, 99 female, mean age 62.4±10.9 years) were followed up for 34±13 months (range 24-49). One, two, three and four level stenosis was observed in 10.7%, 39.3%, 36.0% and 14.0% of patients, respectively. Post-surgical satisfaction was 78.5% at the 2-year follow-up. In the ROC curve analysis, the asymptotic significance was less than 0.05 for SR, and the optimal cut-off value of SR to predict worsening surgical satisfaction was measured as more than 0.52, with 85.4% sensitivity and 77.4% specificity (AUC 0.798, 95% CI 0.73-0.90) in patients with degenerative lumbar stenosis considered for surgical treatment. Using a ROC curve analysis, a radiological feature, the SR, demonstrated superiority in predicting patient satisfaction compared to functional and clinical characteristics such as walking distance and NCOS.
International Nuclear Information System (INIS)
Halligan, Steve; Altman, Douglas G.; Mallett, Susan
2015-01-01
The objectives are to describe the disadvantages of the area under the receiver operating characteristic curve (ROC AUC) to measure diagnostic test performance and to propose an alternative based on net benefit. We use a narrative review supplemented by data from a study of computer-assisted detection for CT colonography. We identified problems with ROC AUC. Confidence scoring by readers was highly non-normal, and score distribution was bimodal. Consequently, ROC curves were highly extrapolated with AUC mostly dependent on areas without patient data. AUC depended on the method used for curve fitting. ROC AUC does not account for prevalence or different misclassification costs arising from false-negative and false-positive diagnoses. Change in ROC AUC has little direct clinical meaning for clinicians. An alternative analysis based on net benefit is proposed, based on the change in sensitivity and specificity at clinically relevant thresholds. Net benefit incorporates estimates of prevalence and misclassification costs, and it is clinically interpretable since it reflects changes in correct and incorrect diagnoses when a new diagnostic test is introduced. ROC AUC is most useful in the early stages of test assessment whereas methods based on net benefit are more useful to assess radiological tests where the clinical context is known. Net benefit is more useful for assessing clinical impact. (orig.)
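One common formulation of net benefit (decision-curve style) weighs false positives by the odds of the chosen probability threshold, which is exactly the prevalence- and cost-awareness the abstract argues ROC AUC lacks. All counts below are illustrative:

```python
# Net benefit at a clinically chosen probability threshold p_t:
#   NB = TP/n - FP/n * (p_t / (1 - p_t))
n = 1000
tp, fp = 80, 150    # true/false positives at the test's working threshold
p_t = 0.10          # acting at a 10% risk is deemed clinically worthwhile

net_benefit = tp / n - (fp / n) * (p_t / (1 - p_t))

# benchmark policy: treat everyone (prevalence here set to 0.10)
prevalence = 0.10
nb_treat_all = prevalence - (1 - prevalence) * (p_t / (1 - p_t))
```

Unlike AUC, the resulting number has a direct clinical reading: net true positives per patient at the stated threshold, so comparing net_benefit against nb_treat_all (and against 0, "treat none") tells a clinician whether using the test changes decisions for the better.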
Directory of Open Access Journals (Sweden)
Provvidenza M. Abruzzo
2015-01-01
Autism Spectrum Disorders (ASD) are a heterogeneous group of neurodevelopmental disorders. Recognized causes of ASD include genetic factors, metabolic diseases, toxic and environmental factors, and a combination of these. Available tests fail to recognize genetic abnormalities in about 70% of ASD children, where diagnosis is solely based on behavioral signs and symptoms, which are difficult to evaluate in very young children. Although it is advisable that specific psychotherapeutic and pedagogic interventions are initiated as early as possible, early diagnosis is hampered by the lack of nongenetic specific biological markers. In the past ten years, the scientific literature has reported dozens of neurophysiological and biochemical alterations in ASD children; however no real biomarker has emerged. Such literature is here reviewed in the light of Receiver Operating Characteristic (ROC) analysis, a very valuable statistical tool, which evaluates the sensitivity and the specificity of biomarkers to be used in diagnostic decision making. We also apply ROC analysis to some of our previously published data and discuss the increased diagnostic value of combining more variables in one ROC curve analysis. We also discuss the use of biomarkers as a tool for advancing our understanding of nonsyndromic ASD.
Bayesian nonparametric data analysis
Müller, Peter; Jara, Alejandro; Hanson, Tim
2015-01-01
This book reviews nonparametric Bayesian methods and models that have proven useful in the context of data analysis. Rather than providing an encyclopedic review of probability models, the book’s structure follows a data analysis perspective. As such, the chapters are organized by traditional data analysis problems. In selecting specific nonparametric models, simpler and more traditional models are favored over specialized ones. The discussed methods are illustrated with a wealth of examples, including applications ranging from stylized examples to case studies from recent literature. The book also includes an extensive discussion of computational methods and details on their implementation. R code for many examples is included in on-line software pages.
Congdon, Peter
2014-01-01
This book provides an accessible approach to Bayesian computing and data analysis, with an emphasis on the interpretation of real data sets. Following in the tradition of the successful first edition, this book aims to make a wide range of statistical modeling applications accessible using tested code that can be readily adapted to the reader's own applications. The second edition has been thoroughly reworked and updated to take account of advances in the field. A new set of worked examples is included. The novel aspect of the first edition was the coverage of statistical modeling using WinBU
Bayesian-based localization in inhomogeneous transmission media
DEFF Research Database (Denmark)
Nadimi, E. S.; Blanes-Vidal, V.; Johansen, P. M.
2013-01-01
In this paper, we propose a novel robust probabilistic approach based on Bayesian inference using received-signal-strength (RSS) measurements with varying path-loss exponent. We derived the probability density function (pdf) of the distance between any two sensors in the network with heterogeneous transmission medium as a function of the given RSS measurements and the characteristics of the heterogeneous medium. The results of this study show that the localization mean square error (MSE) of the Bayesian-based method outperformed all other existing localization approaches. © 2013 ACM.
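The core of an RSS-based distance model is the log-distance path-loss law; with a varying path-loss exponent, inverting it yields a distribution over distance rather than a point estimate. A Monte Carlo sketch with invented parameters (not the paper's derived pdf):

```python
import random

random.seed(4)

# log-distance path-loss model: RSS(d) = P0 - 10*n*log10(d / d0) + noise
P0, d0 = -40.0, 1.0   # RSS at the 1 m reference distance (dBm); illustrative
rss = -70.0           # measured RSS (dBm)

# heterogeneous medium: the path-loss exponent is uncertain, say n ~ U(2, 4)
samples = []
for _ in range(10_000):
    n = random.uniform(2.0, 4.0)
    d = d0 * 10 ** ((P0 - rss) / (10.0 * n))   # invert the model for distance
    samples.append(d)

d_mean = sum(samples) / len(samples)   # Monte Carlo mean of the distance pdf
```

The spread of the samples (here roughly 5.6 m to 31.6 m for the same RSS reading) is why a Bayesian treatment of the exponent matters: a single assumed exponent would collapse that entire range to one confident, and possibly wrong, distance.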
The NIFTY way of Bayesian signal inference
International Nuclear Information System (INIS)
Selig, Marco
2014-01-01
We introduce NIFTY, 'Numerical Information Field Theory', a software package for the development of Bayesian signal inference algorithms that operate independently from any underlying spatial grid and its resolution. A large number of Bayesian and Maximum Entropy methods for 1D signal reconstruction, 2D imaging, as well as 3D tomography, appear formally similar, but one often finds individualized implementations that are neither flexible nor easily transferable. Signal inference in the framework of NIFTY can be done in an abstract way, such that algorithms, prototyped in 1D, can be applied to real world problems in higher-dimensional settings. NIFTY as a versatile library is applicable and already has been applied in 1D, 2D, 3D and spherical settings. A recent application is the D3PO algorithm targeting the non-trivial task of denoising, deconvolving, and decomposing photon observations in high energy astronomy.
The NIFTy way of Bayesian signal inference
Selig, Marco
2014-12-01
We introduce NIFTy, "Numerical Information Field Theory", a software package for the development of Bayesian signal inference algorithms that operate independently from any underlying spatial grid and its resolution. A large number of Bayesian and Maximum Entropy methods for 1D signal reconstruction, 2D imaging, as well as 3D tomography, appear formally similar, but one often finds individualized implementations that are neither flexible nor easily transferable. Signal inference in the framework of NIFTy can be done in an abstract way, such that algorithms, prototyped in 1D, can be applied to real world problems in higher-dimensional settings. NIFTy as a versatile library is applicable and already has been applied in 1D, 2D, 3D and spherical settings. A recent application is the D3PO algorithm targeting the non-trivial task of denoising, deconvolving, and decomposing photon observations in high energy astronomy.
Software Health Management with Bayesian Networks
Mengshoel, Ole; Schumann, JOhann
2011-01-01
Most modern aircraft, as well as other complex machinery, are equipped with diagnostics systems for their major subsystems. During operation, sensors provide important information about the subsystem (e.g., the engine) and that information is used to detect and diagnose faults. Most of these systems focus on the monitoring of a mechanical, hydraulic, or electromechanical subsystem of the vehicle or machinery. Only recently have health management systems that monitor software been developed. In this paper, we discuss our approach of using Bayesian networks for Software Health Management (SWHM). We discuss SWHM requirements, which make advanced reasoning capabilities for detection and diagnosis important. We then present our approach to using Bayesian networks for the construction of health models that dynamically monitor a software system and are capable of detecting and diagnosing faults.
Inference in hybrid Bayesian networks
DEFF Research Database (Denmark)
Langseth, Helge; Nielsen, Thomas Dyhre; Rumí, Rafael
2009-01-01
Since the 1980s, Bayesian Networks (BNs) have become increasingly popular for building statistical models of complex systems. This is particularly true for Boolean systems, where BNs often prove to be a more efficient modelling framework than traditional reliability techniques (like fault trees)...... decade's research on inference in hybrid Bayesian networks. The discussions are linked to an example model for estimating human reliability....
Searching Algorithm Using Bayesian Updates
Caudle, Kyle
2010-01-01
In late October 1968, the USS Scorpion was lost at sea somewhere between the Azores and Norfolk, Virginia. Dr. Craven of the U.S. Navy's Special Projects Division is credited with using Bayesian Search Theory to locate the submarine. Bayesian Search Theory is a straightforward and interesting application of Bayes' theorem which involves searching…
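The Bayesian search update at the heart of this approach has a simple closed form: after an unsuccessful search of one cell, that cell's probability is discounted by the detection probability and the whole map is renormalized. A minimal sketch (the grid and numbers are illustrative assumptions, not the Scorpion data):

```python
def update_after_failed_search(prior, cell, p_detect):
    """Posterior cell probabilities after an unsuccessful search of `cell`,
    where p_detect is the chance of finding the object if it is there."""
    miss = 1.0 - prior[cell] * p_detect          # P(no detection overall)
    post = [p / miss for p in prior]             # renormalize every cell
    post[cell] = prior[cell] * (1.0 - p_detect) / miss
    return post

prior = [0.25, 0.25, 0.25, 0.25]
post = update_after_failed_search(prior, 0, 0.8)
print(post)  # cell 0 drops to 0.0625; the other cells rise to 0.3125
```

Repeating this update after each sortie, always searching the currently most probable cell, is the essence of Bayesian search theory.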
Bayesian Data Analysis (lecture 2)
CERN. Geneva
2018-01-01
framework, but we will also go into more detail and discuss, for example, the role of the prior. The second part of the lecture will cover further examples and applications that rely heavily on the Bayesian approach, as well as some computational tools needed to perform a Bayesian analysis.
Bayesian Data Analysis (lecture 1)
CERN. Geneva
2018-01-01
framework, but we will also go into more detail and discuss, for example, the role of the prior. The second part of the lecture will cover further examples and applications that rely heavily on the Bayesian approach, as well as some computational tools needed to perform a Bayesian analysis.
An Application of Bayesian Approach in Modeling Risk of Death in an Intensive Care Unit.
Wong, Rowena Syn Yin; Ismail, Noor Azina
2016-01-01
There are not many studies that attempt to model intensive care unit (ICU) risk of death in developing countries, especially in South East Asia. The aim of this study was to propose and describe the application of a Bayesian approach in modeling in-ICU deaths in a Malaysian ICU. This was a prospective study in a mixed medical-surgical ICU in a multidisciplinary tertiary referral hospital in Malaysia. Data collection included variables that were defined in the Acute Physiology and Chronic Health Evaluation IV (APACHE IV) model. A Bayesian Markov chain Monte Carlo (MCMC) simulation approach was applied in the development of four multivariate logistic regression predictive models for the ICU, where the main outcome measure was in-ICU mortality risk. The performance of the models was assessed through overall model fit, discrimination, and calibration measures. Results from the Bayesian models were also compared against results obtained using the frequentist maximum likelihood method. The study involved 1,286 consecutive ICU admissions between January 1, 2009 and June 30, 2010, of which 1,111 met the inclusion criteria. Patients who were admitted to the ICU were generally younger, predominantly male, with low co-morbidity load, and mostly under mechanical ventilation. The overall in-ICU mortality rate was 18.5% and the overall mean Acute Physiology Score (APS) was 68.5. All four models exhibited good discrimination, with area under receiver operating characteristic curve (AUC) values of approximately 0.8. Calibration was acceptable (Hosmer-Lemeshow p-values > 0.05) for all models except model M3. Model M1 was identified as the model with the best overall performance in this study. Four prediction models were proposed, and the best model was chosen based on its overall performance in this study. This study has also demonstrated the promising potential of the Bayesian MCMC approach as an alternative in the analysis and modeling of in-ICU mortality outcomes.
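As a hedged illustration of Bayesian MCMC logistic-regression modelling of the kind used in the ICU study, a much-simplified single-predictor random-walk Metropolis sampler might look as follows. The predictor, prior scale, and tuning constants are all assumptions for the sketch, not the study's actual APACHE IV model.

```python
import math
import random

def softplus(z):
    """Numerically stable log(1 + exp(z))."""
    return z + math.log1p(math.exp(-z)) if z > 0 else math.log1p(math.exp(z))

def log_post(b0, b1, xs, ys, tau=10.0):
    """Log posterior: Bernoulli log-likelihood plus N(0, tau^2) priors."""
    lp = -(b0 ** 2 + b1 ** 2) / (2 * tau ** 2)
    for x, y in zip(xs, ys):
        eta = b0 + b1 * x
        lp += y * eta - softplus(eta)
    return lp

def metropolis(xs, ys, n_iter=5000, step=0.3, seed=1):
    """Random-walk Metropolis over the intercept and slope."""
    random.seed(seed)
    b0 = b1 = 0.0
    cur = log_post(b0, b1, xs, ys)
    draws = []
    for _ in range(n_iter):
        c0, c1 = b0 + random.gauss(0, step), b1 + random.gauss(0, step)
        cand = log_post(c0, c1, xs, ys)
        if math.log(random.random()) < cand - cur:   # accept/reject
            b0, b1, cur = c0, c1, cand
        draws.append((b0, b1))
    return draws

# Toy data: the outcome becomes more likely as a severity score increases.
xs = [-3, -2, -1, 0, 1, 2, 3]
ys = [0, 0, 0, 0, 1, 1, 1]
draws = metropolis(xs, ys)
b1_mean = sum(b1 for _, b1 in draws[1000:]) / len(draws[1000:])
print(b1_mean > 0)  # posterior mass on the slope sits above zero
```

Posterior summaries (means, credible intervals) are then read off the post-burn-in draws, just as the study reads mortality-risk coefficients off its MCMC output.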
The Bayesian Covariance Lasso.
Khondker, Zakaria S; Zhu, Hongtu; Chu, Haitao; Lin, Weili; Ibrahim, Joseph G
2013-04-01
Estimation of sparse covariance matrices and their inverses subject to positive-definiteness constraints has drawn a lot of attention in recent years. The abundance of high-dimensional data, where the sample size (n) is less than the dimension (d), requires shrinkage estimation methods, since the maximum likelihood estimator is not positive definite in this case. Furthermore, when n is larger than d but not sufficiently larger, shrinkage estimation is more stable than maximum likelihood as it reduces the condition number of the precision matrix. Frequentist methods have utilized penalized likelihood methods, whereas Bayesian approaches rely on matrix decompositions or Wishart priors for shrinkage. In this paper we propose a new method, called the Bayesian Covariance Lasso (BCLASSO), for the shrinkage estimation of a precision (covariance) matrix. We consider a class of priors for the precision matrix that leads to the popular frequentist penalties as special cases, develop a Bayes estimator for the precision matrix, and propose an efficient sampling scheme that does not precalculate boundaries for positive definiteness. The proposed method is permutation invariant and performs shrinkage and estimation simultaneously for non-full-rank data. Simulations show that the proposed BCLASSO performs similarly to frequentist methods for non-full-rank data.
Bayesian dynamic mediation analysis.
Huang, Jing; Yuan, Ying
2017-12-01
Most existing methods for mediation analysis assume that mediation is a stationary, time-invariant process, which overlooks the inherently dynamic nature of many human psychological processes and behavioral activities. In this article, we consider mediation as a dynamic process that continuously changes over time. We propose Bayesian multilevel time-varying coefficient models to describe and estimate such dynamic mediation effects. By taking the nonparametric penalized spline approach, the proposed method is flexible and able to accommodate any shape of the relationship between time and mediation effects. Simulation studies show that the proposed method works well and faithfully reflects the true nature of the mediation process. By modeling mediation effect nonparametrically as a continuous function of time, our method provides a valuable tool to help researchers obtain a more complete understanding of the dynamic nature of the mediation process underlying psychological and behavioral phenomena. We also briefly discuss an alternative approach of using dynamic autoregressive mediation model to estimate the dynamic mediation effect. The computer code is provided to implement the proposed Bayesian dynamic mediation analysis. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Approximate Bayesian computation.
Directory of Open Access Journals (Sweden)
Mikael Sunnåker
Full Text Available Approximate Bayesian computation (ABC) constitutes a class of computational methods rooted in Bayesian statistics. In all model-based statistical inference, the likelihood function is of central importance, since it expresses the probability of the observed data under a particular statistical model, and thus quantifies the support the data lend to particular values of parameters and to choices among different models. For simple models, an analytical formula for the likelihood function can typically be derived. However, for more complex models, an analytical formula might be elusive or the likelihood function might be computationally very costly to evaluate. ABC methods bypass the evaluation of the likelihood function. In this way, ABC methods widen the realm of models for which statistical inference can be considered. ABC methods are mathematically well-founded, but they inevitably make assumptions and approximations whose impact needs to be carefully assessed. Furthermore, the wider application domain of ABC exacerbates the challenges of parameter estimation and model selection. ABC has rapidly gained popularity in recent years, in particular for the analysis of complex problems arising in the biological sciences (e.g., in population genetics, ecology, epidemiology, and systems biology).
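A minimal sketch of rejection ABC, the simplest variant of the class described above: draw parameters from the prior, simulate data, and keep only draws whose simulated summary matches the observed one. The binomial toy model is an illustrative assumption (its likelihood is of course tractable, which lets us check the answer).

```python
import random

def abc_rejection(observed_successes, n_trials, n_sims=20000, tol=0, seed=42):
    """Rejection ABC for a binomial success probability: theta ~ Uniform(0, 1),
    accept theta when the simulated success count is within `tol` of the data."""
    random.seed(seed)
    accepted = []
    for _ in range(n_sims):
        theta = random.random()                                  # prior draw
        sims = sum(random.random() < theta for _ in range(n_trials))
        if abs(sims - observed_successes) <= tol:
            accepted.append(theta)
    return accepted

post = abc_rejection(7, 10)
print(sum(post) / len(post))  # close to the exact Beta(8, 4) posterior mean 8/12
```

For models where the likelihood really is intractable, the simulator is the expensive black box and the summary statistic and tolerance become the critical modelling choices.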
Directory of Open Access Journals (Sweden)
Wendy Thanassi
2012-01-01
Full Text Available Objective. To find a statistically significant separation point for the QuantiFERON Gold In-Tube (QFT) interferon gamma release assay that could define an optimal “retesting zone” for use in serially tested low-risk populations who have test “reversions” from initially positive to subsequently negative results. Method. Using receiver operating characteristic (ROC) analysis to analyze retrospective data collected from 3 major hospitals, we searched for predictors of reversion until statistically significant separation points were revealed. A confirmatory regression analysis was performed on an additional sample. Results. In 575 initially positive US healthcare workers (HCWs), 300 (52.2%) had reversions, while 275 (47.8%) had two sequential positive tests. The most statistically significant (Kappa = 0.48, chi-square = 131.0, P<0.001) separation point identified by the ROC for predicting reversion was the tuberculosis antigen minus-nil (TBag-nil) value at 1.11 International Units per milliliter (IU/mL). The second separation point was found at TBag-nil at 0.72 IU/mL (Kappa = 0.16, chi-square = 8.2, P<0.01). The model was validated by the regression analysis of 287 HCWs. Conclusion. Reversion likelihood increases as the TBag-nil value approaches the manufacturer's cut-point of 0.35 IU/mL. The most statistically significant separation point between those who test repeatedly positive and those who revert is 1.11 IU/mL. Clinicians should retest low-risk individuals with initial QFT results < 1.11 IU/mL.
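The separation-point search used in ROC studies like this one can be sketched as a scan over candidate thresholds. The data below are invented for illustration, and Youden's J (sensitivity + specificity − 1) stands in for the study's kappa statistic as the selection criterion.

```python
def best_cutpoint(values, labels):
    """Scan observed values as thresholds ('positive' means value >= t)
    and return the one maximizing Youden's J = sensitivity + specificity - 1."""
    pos = [v for v, l in zip(values, labels) if l]
    neg = [v for v, l in zip(values, labels) if not l]
    best_t, best_j = None, -1.0
    for t in sorted(set(values)):
        sens = sum(v >= t for v in pos) / len(pos)
        spec = sum(v < t for v in neg) / len(neg)
        j = sens + spec - 1.0
        if j > best_j:
            best_t, best_j = t, j
    return best_t, best_j

vals = [0.2, 0.4, 0.5, 0.9, 1.2, 1.5, 2.0, 2.4]   # e.g. assay readings in IU/mL
labs = [0, 0, 0, 1, 0, 1, 1, 1]                    # 1 = repeatedly positive
print(best_cutpoint(vals, labs))  # → (0.9, 0.75): call readings >= 0.9 positive
```

A real analysis would additionally report confidence intervals for the chosen cutpoint, for instance by bootstrapping the scan.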
Yun-Chia Ku, Michelle; Lo, Lun-Jou; Chen, Min-Chi; Wen-Ching Ko, Ellen
2018-03-01
The purpose of this study was to predict the need for orthognathic surgery in patients with unilateral cleft lip and palate (UCLP) in the early permanent dentition. In this retrospective cohort study, we included 61 patients with complete UCLP (36 male, 25 female; mean age, 18.47 years; range, 16.92-26.17 years). The subjects were grouped into an orthognathic surgery group and a nonsurgery group at the time of growth completion. Lateral cephalograms obtained at the age of 11 years were analyzed to compare the 2 groups. The receiver operating characteristic analysis was applied to predict the probability of the need for orthognathic surgery in early adulthood by using the measurements obtained at the age of 11 years. SNB, ANB, SN, overbite, overjet, maxillary length, mandibular body length, and L1-MP were found to be significantly different between the 2 groups. For a person with a score of 2 in the 3-variable-based criteria, the sensitivity and specificity for determining the need for surgical treatment were 90.0% and 83.9%, respectively (ANB, ≤-0.45°; overjet, ≤-2.00 mm; maxillary length, ≤47.25 mm). Three cephalometric variables, the minimum number of discriminators required to obtain the optimum discriminant effectiveness, predicted the future need for orthognathic surgery with an accuracy of 86.9% in patients with UCLP. Copyright © 2017 American Association of Orthodontists. Published by Elsevier Inc. All rights reserved.
Main, Keith L; Soman, Salil; Pestilli, Franco; Furst, Ansgar; Noda, Art; Hernandez, Beatriz; Kong, Jennifer; Cheng, Jauhtai; Fairchild, Jennifer K; Taylor, Joy; Yesavage, Jerome; Wesson Ashford, J; Kraemer, Helena; Adamson, Maheen M
2017-01-01
Standard MRI methods are often inadequate for identifying mild traumatic brain injury (TBI). Advances in diffusion tensor imaging now provide potential biomarkers of TBI among white matter fascicles (tracts). However, it is still unclear which tracts are most pertinent to TBI diagnosis. This study ranked fiber tracts on their ability to discriminate patients with and without TBI. We acquired diffusion tensor imaging data from military veterans admitted to a polytrauma clinic (Overall n = 109; Age: M = 47.2, SD = 11.3; Male: 88%; TBI: 67%). TBI diagnosis was based on self-report and neurological examination. Fiber tractography analysis produced 20 fiber tracts per patient. Each tract yielded four clinically relevant measures (fractional anisotropy, mean diffusivity, radial diffusivity, and axial diffusivity). We applied receiver operating characteristic (ROC) analyses to identify the most diagnostic tract for each measure. The analyses produced an optimal cutpoint for each tract. We then used kappa coefficients to rate the agreement of each cutpoint with the neurologist's diagnosis. The tract with the highest kappa was most diagnostic. As a check on the ROC results, we performed a stepwise logistic regression on each measure using all 20 tracts as predictors. We also bootstrapped the ROC analyses to compute the 95% confidence intervals for sensitivity, specificity, and the highest kappa coefficients. The ROC analyses identified two fiber tracts as most diagnostic of TBI: the left cingulum (LCG) and the left inferior fronto-occipital fasciculus (LIF). Like ROC, logistic regression identified LCG as most predictive for the FA measure but identified the right anterior thalamic tract (RAT) for the MD, RD, and AD measures. These findings are potentially relevant to the development of TBI biomarkers. Our methods also demonstrate how ROC analysis may be used to identify clinically relevant variables in the TBI population.
Bayesian inference with ecological applications
Link, William A
2009-01-01
This text is written to provide a mathematically sound but accessible and engaging introduction to Bayesian inference specifically for environmental scientists, ecologists and wildlife biologists. It emphasizes the power and usefulness of Bayesian methods in an ecological context. The advent of fast personal computers and easily available software has simplified the use of Bayesian and hierarchical models. One obstacle remains for ecologists and wildlife biologists, namely the near absence of Bayesian texts written specifically for them. The book includes many relevant examples, is supported by software and examples on a companion website, and will become an essential grounding in this approach for students and research ecologists. Engagingly written text specifically designed to demystify a complex subject Examples drawn from ecology and wildlife research An essential grounding for graduate and research ecologists in the increasingly prevalent Bayesian approach to inference Companion website with analyt...
Bayesian Inference on Gravitational Waves
Directory of Open Access Journals (Sweden)
Asad Ali
2015-12-01
Full Text Available The Bayesian approach is becoming increasingly popular among the astrophysics data analysis communities. However, the statistics communities in Pakistan are unaware of this fertile interaction between the two disciplines. Bayesian methods have been in use to address astronomical problems since the very birth of Bayesian probability in the eighteenth century. Today, Bayesian methods for the detection and parameter estimation of gravitational waves have solid theoretical grounds and strong promise for realistic applications. This article aims to introduce the Pakistan statistics communities to the applications of Bayesian Monte Carlo methods in the analysis of gravitational wave data, with an overview of Bayesian signal detection and estimation methods and a demonstration with a couple of simplified examples.
Bayesian regression of piecewise homogeneous Poisson processes
Directory of Open Access Journals (Sweden)
Diego Sevilla
2015-12-01
Full Text Available In this paper, a Bayesian method for piecewise regression is adapted to handle counting-process data distributed as Poisson. A numerical code in Mathematica is developed and tested analyzing simulated data. The resulting method is valuable for detecting breaking points in the count rate of time series for Poisson processes. Received: 2 November 2015, Accepted: 27 November 2015; Edited by: R. Dickman; Reviewed by: M. Hutter, Australian National University, Canberra, Australia; DOI: http://dx.doi.org/10.4279/PIP.070018 Cite as: D J R Sevilla, Papers in Physics 7, 070018 (2015)
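A hedged sketch of single-breakpoint detection in Poisson count data, in the spirit of the method above: with a conjugate Gamma prior on each segment's rate, the segment marginal likelihood has a closed form, and a uniform prior over breakpoint positions gives the breakpoint posterior by normalization. The hyperparameters and the data are illustrative assumptions, not the paper's Mathematica implementation.

```python
import math

def seg_logml(counts, a=1.0, b=1.0):
    """Log marginal likelihood of Poisson counts with a Gamma(a, b) rate prior."""
    s, t = sum(counts), len(counts)
    return (a * math.log(b) - math.lgamma(a)
            + math.lgamma(a + s) - (a + s) * math.log(b + t)
            - sum(math.lgamma(n + 1) for n in counts))

def breakpoint_posterior(counts):
    """Posterior over a single breakpoint position (uniform prior over k)."""
    logs = [seg_logml(counts[:k]) + seg_logml(counts[k:])
            for k in range(1, len(counts))]
    m = max(logs)                      # stabilize before exponentiating
    w = [math.exp(x - m) for x in logs]
    z = sum(w)
    return [x / z for x in w]

counts = [1, 2, 0, 1, 2, 8, 9, 7, 10, 8]   # rate jump after observation 5
post = breakpoint_posterior(counts)
print(post.index(max(post)) + 1)
```

Multiple breakpoints require either recursing on the segments or a joint model over breakpoint configurations, which is where MCMC typically enters.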
Borsboom, D.; Haig, B.D.
2013-01-01
Unlike most other statistical frameworks, Bayesian statistical inference is wedded to a particular approach in the philosophy of science (see Howson & Urbach, 2006); this approach is called Bayesianism. Rather than being concerned with model fitting, this position in the philosophy of science
International Nuclear Information System (INIS)
Rajabalinejad, M.
2010-01-01
To reduce the cost of Monte Carlo (MC) simulations for time-consuming processes, Bayesian Monte Carlo (BMC) is introduced in this paper. The BMC method reduces the number of realizations in MC according to the desired accuracy level. BMC also provides the possibility of considering more priors; in other words, different priors can be integrated into one model by using BMC to further reduce the cost of simulations. This study suggests speeding up the simulation process by considering the logical dependence of neighboring points as prior information. This information is used in the BMC method to produce a predictive tool through the simulation process. The general methodology and algorithm of the BMC method are presented in this paper. The BMC method is applied to a simplified breakwater model as well as the finite element model of the 17th Street Canal in New Orleans, and the results are compared with the MC and Dynamic Bounds methods.
Bayesian nonparametric hierarchical modeling.
Dunson, David B
2009-04-01
In biomedical research, hierarchical models are very widely used to accommodate dependence in multivariate and longitudinal data and for borrowing of information across data from different sources. A primary concern in hierarchical modeling is sensitivity to parametric assumptions, such as linearity and normality of the random effects. Parametric assumptions on latent variable distributions can be challenging to check and are typically unwarranted, given available prior knowledge. This article reviews some recent developments in Bayesian nonparametric methods motivated by complex, multivariate and functional data collected in biomedical studies. The author provides a brief review of flexible parametric approaches relying on finite mixtures and latent class modeling. Dirichlet process mixture models are motivated by the need to generalize these approaches to avoid assuming a fixed finite number of classes. Focusing on an epidemiology application, the author illustrates the practical utility and potential of nonparametric Bayes methods.
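One building block behind the Dirichlet process mixtures reviewed above is the stick-breaking construction of the mixture weights, which avoids fixing the number of classes in advance. A truncated sketch (the truncation level and concentration value are illustrative choices):

```python
import random

def stick_breaking(alpha, n_atoms, seed=0):
    """Truncated stick-breaking weights for a Dirichlet process:
    v_k ~ Beta(1, alpha), w_k = v_k * prod_{j<k} (1 - v_j)."""
    random.seed(seed)
    weights, remaining = [], 1.0
    for _ in range(n_atoms):
        v = random.betavariate(1.0, alpha)
        weights.append(v * remaining)   # break off a piece of the stick
        remaining *= 1.0 - v            # what is left for later atoms
    return weights

w = stick_breaking(alpha=2.0, n_atoms=20)
print(sum(w))  # close to 1; the truncated tail carries the remainder
```

Smaller alpha concentrates mass on a few atoms (few effective classes); larger alpha spreads it out, which is how the model lets the data choose its own complexity.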
Bayesian analyses of seasonal runoff forecasts
Krzysztofowicz, R.; Reese, S.
1991-12-01
Forecasts of seasonal snowmelt runoff volume provide indispensable information for rational decision making by water project operators, irrigation district managers, and farmers in the western United States. Bayesian statistical models and communication frames have been researched in order to enhance the forecast information disseminated to the users, and to characterize forecast skill from the decision maker's point of view. Four products are presented: (i) a Bayesian Processor of Forecasts, which provides a statistical filter for calibrating the forecasts, and a procedure for estimating the posterior probability distribution of the seasonal runoff; (ii) the Bayesian Correlation Score, a new measure of forecast skill, which is related monotonically to the ex ante economic value of forecasts for decision making; (iii) a statistical predictor of monthly cumulative runoffs within the snowmelt season, conditional on the total seasonal runoff forecast; and (iv) a framing of the forecast message that conveys the uncertainty associated with the forecast estimates to the users. All analyses are illustrated with numerical examples of forecasts for six gauging stations from the period 1971–1988.
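The calibration idea behind a Bayesian Processor of Forecasts can be hedged down to its conjugate-normal core: treat climatology as a Gaussian prior on the seasonal runoff and the forecast as a noisy Gaussian observation of it, then combine the two by precision weighting. This is a scalar sketch under assumed normality, not the authors' full model; all numbers are invented.

```python
def calibrate_forecast(prior_mean, prior_sd, forecast, forecast_sd):
    """Posterior of runoff combining a climatological N(prior_mean, prior_sd^2)
    with a forecast modeled as runoff + Gaussian error of sd forecast_sd."""
    w_prior = 1.0 / prior_sd ** 2            # precision of climatology
    w_fcst = 1.0 / forecast_sd ** 2          # precision of the forecast
    mean = (w_prior * prior_mean + w_fcst * forecast) / (w_prior + w_fcst)
    sd = (w_prior + w_fcst) ** -0.5
    return mean, sd

# A sharp forecast (sd 15) pulls the climatological mean 100 toward 140.
print(calibrate_forecast(100.0, 30.0, 140.0, 15.0))
```

The posterior distribution, rather than the raw forecast, is what gets framed and communicated to the decision maker, with its standard deviation expressing the residual uncertainty.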
Book review: Bayesian analysis for population ecology
Link, William A.
2011-01-01
Brian Dennis described the field of ecology as “fertile, uncolonized ground for Bayesian ideas.” He continued: “The Bayesian propagule has arrived at the shore. Ecologists need to think long and hard about the consequences of a Bayesian ecology. The Bayesian outlook is a successful competitor, but is it a weed? I think so.” (Dennis 2004)
Oliveira, Luis
2015-01-01
This book demonstrates how to design a wideband receiver operating in current mode, in which the noise and non-linearity are reduced, implemented in a low cost single chip, using standard CMOS technology. The authors present a solution to remove the transimpedance amplifier (TIA) block and connect directly the mixer’s output to a passive second-order continuous-time Σ∆ analog to digital converter (ADC), which operates in current-mode. These techniques enable the reduction of area, power consumption, and cost in modern CMOS receivers.
Current trends in Bayesian methodology with applications
Upadhyay, Satyanshu K; Dey, Dipak K; Loganathan, Appaia
2015-01-01
Collecting Bayesian material scattered throughout the literature, Current Trends in Bayesian Methodology with Applications examines the latest methodological and applied aspects of Bayesian statistics. The book covers biostatistics, econometrics, reliability and risk analysis, spatial statistics, image analysis, shape analysis, Bayesian computation, clustering, uncertainty assessment, high-energy astrophysics, neural networking, fuzzy information, objective Bayesian methodologies, empirical Bayes methods, small area estimation, and many more topics. Each chapter is self-contained and focuses on
Bayesian image restoration, using configurations
Thorarinsdottir, Thordis
2006-01-01
In this paper, we develop a Bayesian procedure for removing noise from images that can be viewed as noisy realisations of random sets in the plane. The procedure utilises recent advances in configuration theory for noise free random sets, where the probabilities of observing the different boundary configurations are expressed in terms of the mean normal measure of the random set. These probabilities are used as prior probabilities in a Bayesian image restoration approach. Estimation of the re...
Bayesian Networks and Influence Diagrams
DEFF Research Database (Denmark)
Kjærulff, Uffe Bro; Madsen, Anders Læsø
Probabilistic networks, also known as Bayesian networks and influence diagrams, have become one of the most promising technologies in the area of applied artificial intelligence, offering intuitive, efficient, and reliable methods for diagnosis, prediction, decision making, classification......, troubleshooting, and data mining under uncertainty. Bayesian Networks and Influence Diagrams: A Guide to Construction and Analysis provides a comprehensive guide for practitioners who wish to understand, construct, and analyze intelligent systems for decision support based on probabilistic networks. Intended...
Bayesian tomographic reconstruction of microsystems
International Nuclear Information System (INIS)
Salem, Sofia Fekih; Vabre, Alexandre; Mohammad-Djafari, Ali
2007-01-01
X-ray transmission microtomography plays an increasingly dominant role in the study and understanding of microsystems. Within this framework, an experimental setup for high-resolution X-ray microtomography was developed at CEA-List to quantify the physical parameters related to fluid flow in microsystems. Several difficulties arise from the nature of the experimental data collected on this setup: enhanced measurement errors due to various physical phenomena occurring during image formation (diffusion, beam hardening), and specificities of the setup (limited angle, partial view of the object, weak contrast). To reconstruct the object we must solve an inverse problem. This inverse problem is known to be ill-posed. It therefore needs to be regularized by introducing prior information. The main prior information we account for is that the object is composed of a finite known number of different materials distributed in compact regions. This a priori information is introduced via a Gauss-Markov field for the contrast distributions, with a hidden Potts-Markov field for the material classes, in the Bayesian estimation framework. The computations are done by using an appropriate Markov Chain Monte Carlo (MCMC) technique. In this paper, we first present the basic steps of the proposed algorithms. Then we focus on one of the main steps in any iterative reconstruction method, which is the computation of the forward and adjoint operators (projection and backprojection). A fast implementation of these two operators is crucial for the real application of the method. We give some details on the fast computation of these steps and show some preliminary results of simulations
Bayesian nonparametric adaptive control using Gaussian processes.
Chowdhary, Girish; Kingravi, Hassan A; How, Jonathan P; Vela, Patricio A
2015-03-01
Most current model reference adaptive control (MRAC) methods rely on parametric adaptive elements, in which the number of parameters of the adaptive element are fixed a priori, often through expert judgment. An example of such an adaptive element is radial basis function networks (RBFNs), with RBF centers preallocated based on the expected operating domain. If the system operates outside of the expected operating domain, this adaptive element can become noneffective in capturing and canceling the uncertainty, thus rendering the adaptive controller only semiglobal in nature. This paper investigates a Gaussian process-based Bayesian MRAC architecture (GP-MRAC), which leverages the power and flexibility of GP Bayesian nonparametric models of uncertainty. The GP-MRAC does not require the centers to be preallocated, can inherently handle measurement noise, and enables MRAC to handle a broader set of uncertainties, including those that are defined as distributions over functions. We use stochastic stability arguments to show that GP-MRAC guarantees good closed-loop performance with no prior domain knowledge of the uncertainty. Online implementable GP inference methods are compared in numerical simulations against RBFN-MRAC with preallocated centers and are shown to provide better tracking and improved long-term learning.
Fatti, Geoffrey; Jackson, Debra; Goga, Ameena E; Shaikh, Najma; Eley, Brian; Nachega, Jean B; Grimwood, Ashraf
2018-02-01
Adolescents and youth receiving antiretroviral treatment (ART) in sub-Saharan Africa have high attrition and inadequate ART outcomes, and evaluations of interventions improving ART outcomes amongst adolescents are very limited. Sustainable Development Goal (SDG) target 3c is to substantially increase the health workforce in developing countries. We measured the effectiveness and cost-effectiveness of community-based support (CBS) provided by lay health workers for adolescents and youth receiving ART in South Africa. A retrospective cohort study including adolescents and youth who initiated ART at 47 facilities. Previously unemployed CBS-workers provided home-based ART-related education, psychosocial support, symptom screening for opportunistic infections and support to access government grants. Outcomes were compared between participants who received CBS plus standard clinic-based care versus participants who received standard care only. Cumulative incidences of all-cause mortality and loss to follow-up (LTFU), adherence measured using medication possession ratios (MPRs), CD4 count slope, and virological suppression were analysed using multivariable Cox, competing-risks regression, generalized estimating equations and mixed-effects models over five years of ART. An expenditure approach was used to determine the incremental cost of CBS to usual care from a provider perspective. Incremental cost-effectiveness ratios were calculated as annual cost per patient-loss (through death or LTFU) averted. Amongst 6706 participants included, 2100 (31.3%) received CBS. Participants who received CBS had reduced mortality, adjusted hazard ratio (aHR) = 0.52 (95% CI: 0.37 to 0.73; p effectiveness of CBS in reducing attrition ranged from 42.2% after one year to 35.9% after five years. Virological suppression was similar after three years, but after five years 18.8% CBS participants versus 37.2% non-CBS participants failed to achieve viral suppression, adjusted odds ratio = 0
Bayesian nonparametric estimation of hazard rate in monotone Aalen model
Czech Academy of Sciences Publication Activity Database
Timková, Jana
2014-01-01
Roč. 50, č. 6 (2014), s. 849-868 ISSN 0023-5954 Institutional support: RVO:67985556 Keywords : Aalen model * Bayesian estimation * MCMC Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.541, year: 2014 http://library.utia.cas.cz/separaty/2014/SI/timkova-0438210.pdf
Bayesian seismic AVO inversion
Energy Technology Data Exchange (ETDEWEB)
Buland, Arild
2002-07-01
A new linearized AVO inversion technique is developed in a Bayesian framework. The objective is to obtain posterior distributions for P-wave velocity, S-wave velocity and density. Distributions for other elastic parameters can also be assessed, for example acoustic impedance, shear impedance and P-wave to S-wave velocity ratio. The inversion algorithm is based on the convolutional model and a linearized weak contrast approximation of the Zoeppritz equation. The solution is represented by a Gaussian posterior distribution with explicit expressions for the posterior expectation and covariance, hence exact prediction intervals for the inverted parameters can be computed under the specified model. The explicit analytical form of the posterior distribution provides a computationally fast inversion method. Tests on synthetic data show that all inverted parameters were almost perfectly retrieved when the noise approached zero. With realistic noise levels, acoustic impedance was the best determined parameter, while the inversion provided practically no information about the density. The inversion algorithm has also been tested on a real 3-D dataset from the Sleipner Field. The results show good agreement with well logs but the uncertainty is high. The stochastic model includes uncertainties of both the elastic parameters, the wavelet and the seismic and well log data. The posterior distribution is explored by Markov chain Monte Carlo simulation using the Gibbs sampler algorithm. The inversion algorithm has been tested on a seismic line from the Heidrun Field with two wells located on the line. The uncertainty of the estimated wavelet is low. In the Heidrun examples the effect of including uncertainty of the wavelet and the noise level was marginal with respect to the AVO inversion results. We have developed a 3-D linearized AVO inversion method with spatially coupled model parameters where the objective is to obtain posterior distributions for P-wave velocity, S
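The explicit Gaussian posterior underlying this kind of linearized inversion can be illustrated in scalar form: for a linear forward model with Gaussian noise and a Gaussian prior, the posterior mean and standard deviation are available in closed form. The one-parameter reduction and all numbers below are illustrative; the paper's model is multivariate with spatial coupling.

```python
def linear_gaussian_posterior(g, d, noise_sd, prior_mean, prior_sd):
    """Exact posterior for m in d_i = g_i * m + noise, with a Gaussian prior
    on m -- the scalar analogue of an explicit linearized-inversion posterior."""
    prec = 1.0 / prior_sd ** 2 + sum(gi * gi for gi in g) / noise_sd ** 2
    mean = (prior_mean / prior_sd ** 2
            + sum(gi * di for gi, di in zip(g, d)) / noise_sd ** 2) / prec
    return mean, prec ** -0.5

# Nearly noise-free data generated around m = 2 recover the parameter,
# and the posterior sd quantifies the (small) remaining uncertainty.
mean, sd = linear_gaussian_posterior([1.0, 2.0, 3.0], [2.0, 4.0, 6.0],
                                     0.1, 0.0, 10.0)
print(round(mean, 3), round(sd, 4))
```

As the noise level grows, the posterior mean shrinks toward the prior mean and the posterior sd widens, mirroring the paper's observation that weakly informed parameters (such as density) stay close to their priors.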
Cortical hierarchies perform Bayesian causal inference in multisensory perception.
Directory of Open Access Journals (Sweden)
Tim Rohe
2015-02-01
Full Text Available To form a veridical percept of the environment, the brain needs to integrate sensory signals from a common source but segregate those from independent sources. Thus, perception inherently relies on solving the "causal inference problem." Behaviorally, humans solve this problem optimally as predicted by Bayesian Causal Inference; yet, the underlying neural mechanisms are unexplored. Combining psychophysics, Bayesian modeling, functional magnetic resonance imaging (fMRI), and multivariate decoding in an audiovisual spatial localization task, we demonstrate that Bayesian Causal Inference is performed by a hierarchy of multisensory processes in the human brain. At the bottom of the hierarchy, in auditory and visual areas, location is represented on the basis that the two signals are generated by independent sources (= segregation). At the next stage, in posterior intraparietal sulcus, location is estimated under the assumption that the two signals are from a common source (= forced fusion). Only at the top of the hierarchy, in anterior intraparietal sulcus, the uncertainty about the causal structure of the world is taken into account and sensory signals are combined as predicted by Bayesian Causal Inference. Characterizing the computational operations of signal interactions reveals the hierarchical nature of multisensory perception in human neocortex. It unravels how the brain accomplishes Bayesian Causal Inference, a statistical computation fundamental for perception and cognition. Our results demonstrate how the brain combines information in the face of uncertainty about the underlying causal structure of the world.
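The Bayesian Causal Inference computation behind such behavioral models can be sketched with Gaussian likelihoods: compute the posterior probability of a common cause, then average the forced-fusion and segregation estimates. All standard deviations and the common-cause prior below are illustrative values, not the study's fitted parameters:

```python
import numpy as np

# illustrative parameters: auditory/visual/prior SDs and the prior
# probability of a common cause (not the study's fitted values)
sA, sV, sP, p_common = 4.0, 1.0, 10.0, 0.5

def posterior_common(xA, xV):
    """P(common cause | signals), via the two marginal likelihoods."""
    varsum = sA**2 * sV**2 + sA**2 * sP**2 + sV**2 * sP**2
    L1 = np.exp(-0.5 * ((xA - xV)**2 * sP**2 + xA**2 * sV**2 + xV**2 * sA**2)
                / varsum) / (2 * np.pi * np.sqrt(varsum))
    L2 = (np.exp(-0.5 * xA**2 / (sA**2 + sP**2)) / np.sqrt(2 * np.pi * (sA**2 + sP**2))
          * np.exp(-0.5 * xV**2 / (sV**2 + sP**2)) / np.sqrt(2 * np.pi * (sV**2 + sP**2)))
    return p_common * L1 / (p_common * L1 + (1 - p_common) * L2)

def estimate_auditory(xA, xV):
    """Model-averaged auditory location estimate: fusion weighted by
    the posterior probability of a common cause, segregation otherwise."""
    pc = posterior_common(xA, xV)
    fused = (xA / sA**2 + xV / sV**2) / (1 / sA**2 + 1 / sV**2 + 1 / sP**2)
    segregated = xA * sP**2 / (sA**2 + sP**2)
    return pc * fused + (1 - pc) * segregated
```

Signals that land close together yield a high common-cause posterior (and hence fusion-like estimates); widely separated signals push the estimate toward segregation.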
Bayesian analogy with relational transformations.
Lu, Hongjing; Chen, Dawn; Holyoak, Keith J
2012-07-01
How can humans acquire relational representations that enable analogical inference and other forms of high-level reasoning? Using comparative relations as a model domain, we explore the possibility that bottom-up learning mechanisms applied to objects coded as feature vectors can yield representations of relations sufficient to solve analogy problems. We introduce Bayesian analogy with relational transformations (BART) and apply the model to the task of learning first-order comparative relations (e.g., larger, smaller, fiercer, meeker) from a set of animal pairs. Inputs are coded by vectors of continuous-valued features, based either on human magnitude ratings, normed feature ratings (De Deyne et al., 2008), or outputs of the topics model (Griffiths, Steyvers, & Tenenbaum, 2007). Bootstrapping from empirical priors, the model is able to induce first-order relations represented as probabilistic weight distributions, even when given positive examples only. These learned representations allow classification of novel instantiations of the relations and yield a symbolic distance effect of the sort obtained with both humans and other primates. BART then transforms its learned weight distributions by importance-guided mapping, thereby placing distinct dimensions into correspondence. These transformed representations allow BART to reliably solve 4-term analogies (e.g., larger:smaller::fiercer:meeker), a type of reasoning that is arguably specific to humans. Our results provide a proof-of-concept that structured analogies can be solved with representations induced from unstructured feature vectors by mechanisms that operate in a largely bottom-up fashion. We discuss potential implications for algorithmic and neural models of relational thinking, as well as for the evolution of abstract thought. Copyright 2012 APA, all rights reserved.
Bayesian networks improve causal environmental ...
Rule-based weight of evidence approaches to ecological risk assessment may not account for uncertainties and generally lack probabilistic integration of lines of evidence. Bayesian networks allow causal inferences to be made from evidence by including causal knowledge about the problem, using this knowledge with probabilistic calculus to combine multiple lines of evidence, and minimizing biases in predicting or diagnosing causal relationships. Too often, sources of uncertainty in conventional weight of evidence approaches are ignored, even though they can be accounted for with Bayesian networks. Specifying and propagating uncertainties improves the ability of models to incorporate the strength of the evidence in the risk management phase of an assessment. Probabilistic inference from a Bayesian network allows evaluation of changes in uncertainty for variables from the evidence. The network structure and probabilistic framework of a Bayesian approach provide advantages over qualitative approaches in weight of evidence for capturing the impacts of multiple sources of quantifiable uncertainty on predictions of ecological risk. Bayesian networks can facilitate the development of evidence-based policy under conditions of uncertainty by incorporating analytical inaccuracies or the implications of imperfect information, structuring and communicating causal issues through qualitative directed graph formulations, and quantitatively comparing the causal power of multiple stressors on value
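In the simplest discrete case, the probabilistic combination of lines of evidence reduces to Bayes' rule over a small network. A toy sketch with invented conditional probability tables (a cause node and two conditionally independent lines of evidence):

```python
# Toy two-evidence network: cause C, lines of evidence E1 and E2.
# All probabilities are invented for illustration.
p_c = 0.3                       # prior P(C = present)
p_e1 = {True: 0.8, False: 0.2}  # P(E1 observed | C present / C absent)
p_e2 = {True: 0.7, False: 0.4}  # P(E2 observed | C present / C absent)

# observe both lines of evidence; enumerate the two states of C
joint_present = p_c * p_e1[True] * p_e2[True]
joint_absent = (1 - p_c) * p_e1[False] * p_e2[False]
posterior = joint_present / (joint_present + joint_absent)  # 0.75
```

Two weak lines of evidence jointly raise a 0.3 prior to a 0.75 posterior; the same calculus also quantifies how much each line alone would move the belief, which is exactly the uncertainty bookkeeping that rule-based weight of evidence omits.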
Bayesian Latent Class Analysis Tutorial.
Li, Yuelin; Lord-Bessen, Jennifer; Shiyko, Mariya; Loeb, Rebecca
2018-01-01
This article is a how-to guide on Bayesian computation using Gibbs sampling, demonstrated in the context of Latent Class Analysis (LCA). It is written for students in quantitative psychology or related fields who have a working knowledge of Bayes Theorem and conditional probability and have experience in writing computer programs in the statistical language R . The overall goals are to provide an accessible and self-contained tutorial, along with a practical computation tool. We begin with how Bayesian computation is typically described in academic articles. Technical difficulties are addressed by a hypothetical, worked-out example. We show how Bayesian computation can be broken down into a series of simpler calculations, which can then be assembled together to complete a computationally more complex model. The details are described much more explicitly than what is typically available in elementary introductions to Bayesian modeling so that readers are not overwhelmed by the mathematics. Moreover, the provided computer program shows how Bayesian LCA can be implemented with relative ease. The computer program is then applied in a large, real-world data set and explained line-by-line. We outline the general steps in how to extend these considerations to other methodological applications. We conclude with suggestions for further readings.
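The "series of simpler calculations" that make up a Gibbs sampler for a two-class LCA with binary items can be sketched as follows. The data are simulated and the flat Beta(1,1) priors are our own choices, not necessarily those of the tutorial:

```python
import numpy as np

rng = np.random.default_rng(1)

# simulate 2 latent classes, 4 binary items (illustrative data)
n, J = 200, 4
true_theta = np.array([[0.9, 0.9, 0.1, 0.1],
                       [0.1, 0.1, 0.9, 0.9]])
z_true = (rng.random(n) < 0.5).astype(int)
X = (rng.random((n, J)) < true_theta[z_true]).astype(int)

# Gibbs sampler with flat Beta(1,1) priors on all probabilities
pi = 0.5                        # P(class 1)
theta = np.full((2, J), 0.5)    # item-response probabilities per class
for it in range(200):
    # 1) sample class memberships given current parameters
    like = np.stack([(theta[c]**X * (1 - theta[c])**(1 - X)).prod(axis=1)
                     for c in range(2)], axis=1)
    w = like * np.array([1 - pi, pi])
    p1 = w[:, 1] / w.sum(axis=1)
    zc = (rng.random(n) < p1).astype(int)
    # 2) sample parameters given memberships (conjugate Beta updates)
    pi = rng.beta(1 + zc.sum(), 1 + n - zc.sum())
    for c in range(2):
        Xc = X[zc == c]
        theta[c] = rng.beta(1 + Xc.sum(axis=0), 1 + len(Xc) - Xc.sum(axis=0))
```

Each full sweep is only two conditional draws, which is the decomposition the tutorial emphasizes: a hard model becomes a loop of easy conjugate updates.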
Kernel Bayesian ART and ARTMAP.
Masuyama, Naoki; Loo, Chu Kiong; Dawood, Farhan
2018-02-01
Adaptive Resonance Theory (ART) is one of the successful approaches to resolving "the plasticity-stability dilemma" in neural networks, and its supervised learning model called ARTMAP is a powerful tool for classification. Among several improvements, such as Fuzzy or Gaussian based models, the state-of-the-art model is the Bayesian-based one, which resolves the drawbacks of the others. However, it is known that the Bayesian approach for high-dimensional and large-scale data requires a high computational cost, and the covariance matrix in the likelihood becomes unstable. This paper introduces Kernel Bayesian ART (KBA) and ARTMAP (KBAM) by integrating Kernel Bayes' Rule (KBR) and the Correntropy Induced Metric (CIM) into Bayesian ART (BA) and ARTMAP (BAM), respectively, while maintaining the properties of BA and BAM. The kernel frameworks in KBA and KBAM are able to avoid the curse of dimensionality. In addition, the covariance-free Bayesian computation by KBR provides efficient and stable computational capability to KBA and KBAM. Furthermore, the correntropy-based similarity measurement improves the noise reduction ability, even in high-dimensional spaces. The simulation experiments show that KBA has a superior self-organizing capability compared to BA, and that KBAM provides superior classification ability compared to BAM. Copyright © 2017 Elsevier Ltd. All rights reserved.
Interactive Instruction in Bayesian Inference
DEFF Research Database (Denmark)
Khan, Azam; Breslav, Simon; Hornbæk, Kasper
2018-01-01
An instructional approach is presented to improve human performance in solving Bayesian inference problems. Starting from the original text of the classic Mammography Problem, the textual expression is modified and visualizations are added according to Mayer’s principles of instruction. These principles concern coherence, personalization, signaling, segmenting, multimedia, spatial contiguity, and pretraining. Principles of self-explanation and interactivity are also applied. Four experiments on the Mammography Problem showed that these principles help participants answer the questions, suggesting that an instructional approach to improving human performance in Bayesian inference is a promising direction.
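The Mammography Problem itself is a one-line application of Bayes' theorem. The numbers below are the classic figures from the literature (1% prevalence, 80% sensitivity, 9.6% false-positive rate), assumed here since the abstract does not restate them:

```python
# Classic Mammography Problem figures (assumed; the abstract does not
# restate them): 1% prevalence, 80% sensitivity, 9.6% false-positive rate.
prevalence, sensitivity, false_pos = 0.01, 0.80, 0.096

p_positive = prevalence * sensitivity + (1 - prevalence) * false_pos
posterior = prevalence * sensitivity / p_positive
# posterior is about 0.078: most positive mammograms are false alarms,
# which is exactly the intuition the instructional materials target
```

The counterintuitively low posterior is why the problem is a standard probe of base-rate neglect, and why textual and visual reformulations of it are worth studying.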
Probability biases as Bayesian inference
Directory of Open Access Journals (Sweden)
Andre; C. R. Martins
2006-11-01
Full Text Available In this article, I show how several observed biases in human probabilistic reasoning can be partially explained as good heuristics for making inferences in an environment where probabilities have uncertainties associated with them. Previous results show that the weight functions and the observed violations of coalescing and stochastic dominance can be understood from a Bayesian point of view. We review those results and see that Bayesian methods should also be used as part of the explanation behind other known biases. That means that, although the observed errors are still errors, they can be understood as adaptations to the solution of real-life problems. Heuristics that allow fast evaluations and mimic a Bayesian inference would be an evolutionary advantage, since they would give us an efficient way of making decisions. In that sense, it should be no surprise that humans reason with probability as has been observed.
Bayesian analysis of CCDM models
Jesus, J. F.; Valentim, R.; Andrade-Oliveira, F.
2017-09-01
Creation of Cold Dark Matter (CCDM), in the context of Einstein Field Equations, produces a negative pressure term which can be used to explain the accelerated expansion of the Universe. In this work we tested six different spatially flat models for matter creation using statistical criteria, in light of SNe Ia data: the Akaike Information Criterion (AIC), the Bayesian Information Criterion (BIC) and the Bayesian Evidence (BE). These criteria allow models to be compared on goodness of fit and number of free parameters, penalizing excess complexity. We find that the JO model is slightly favoured over the LJO/ΛCDM model; however, neither of these, nor the Γ = 3αH0 model, can be discarded by the current analysis. Three other scenarios are discarded either because of poor fitting or because of an excess of free parameters. A method of increasing the Bayesian evidence through reparameterization, in order to reduce parameter degeneracy, is also developed.
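The AIC/BIC bookkeeping works mechanically as follows; the best-fit chi-square values, parameter counts and model names below are invented for illustration, not the paper's fits:

```python
import numpy as np

def aic(chi2_min, k):
    """Akaike Information Criterion from a best-fit chi-square."""
    return chi2_min + 2 * k

def bic(chi2_min, k, n):
    """Bayesian Information Criterion; penalty grows with sample size."""
    return chi2_min + k * np.log(n)

# invented best-fit chi-square values and free-parameter counts
models = {"JO": (562.2, 2), "LJO": (562.5, 2), "Gamma3aH0": (565.0, 1)}
n = 580  # illustrative number of SNe Ia
scores = {name: (aic(c, k), bic(c, k, n)) for name, (c, k) in models.items()}
# lower AIC/BIC = preferred; differences of only a few units mean the
# disfavoured models cannot be discarded, as in the abstract
```

BIC penalizes free parameters more heavily than AIC for n > e², so a model with fewer parameters can lose on AIC yet win on BIC.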
Bayesian Network Induction via Local Neighborhoods
National Research Council Canada - National Science Library
Margaritis, Dimitris
1999-01-01
We present an efficient algorithm for learning Bayesian networks from data. Our approach constructs Bayesian networks by first identifying each node's Markov blanket, then connecting nodes in a consistent way...
Can a significance test be genuinely Bayesian?
Pereira, Carlos A. de B.; Stern, Julio Michael; Wechsler, Sergio
2008-01-01
The Full Bayesian Significance Test, FBST, is extensively reviewed. Its test statistic, a genuine Bayesian measure of evidence, is discussed in detail. Its behavior in some problems of statistical inference, like testing for independence in contingency tables, is also examined.
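The FBST evidence value (e-value) can be sketched by Monte Carlo for a binomial parameter: it is the posterior mass of the set where the posterior density does not exceed the density at the null point. The data and the uniform prior below are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(2)

# illustrative data: 14 successes in 20 trials, uniform Beta(1,1) prior
x, n = 14, 20
a, b = 1 + x, 1 + (n - x)          # Beta posterior parameters

def dens(theta):
    """Unnormalized Beta(a, b) posterior density."""
    return theta**(a - 1) * (1 - theta)**(b - 1)

draws = rng.beta(a, b, 50_000)

def e_value(theta0):
    # tangential set: posterior density strictly above the null point's
    tangential = dens(draws) > dens(theta0)
    return 1.0 - tangential.mean()  # evidence in favor of theta = theta0

ev_null = e_value(0.5)             # evidence for the sharp null theta = 0.5
ev_mode = e_value(x / n)           # at the posterior mode the e-value is 1
```

Unlike a p-value, the e-value is a genuinely posterior quantity: it equals 1 exactly at the posterior mode and decreases as the null point moves into the tails.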
Bayesian modeling using WinBUGS
Ntzoufras, Ioannis
2009-01-01
A hands-on introduction to the principles of Bayesian modeling using WinBUGS. Bayesian Modeling Using WinBUGS provides an easily accessible introduction to the use of WinBUGS programming techniques in a variety of Bayesian modeling settings. The author provides an accessible treatment of the topic, offering readers a smooth introduction to the principles of Bayesian modeling with detailed guidance on the practical implementation of key principles. The book begins with a basic introduction to Bayesian inference and the WinBUGS software and goes on to cover key topics, including: Markov Chain Monte Carlo algorithms in Bayesian inference; generalized linear models; Bayesian hierarchical models; predictive distribution and model checking; Bayesian model and variable evaluation. Computational notes and screen captures illustrate the use of both WinBUGS and R software to apply the discussed techniques. Exercises at the end of each chapter allow readers to test their understanding of the presented concepts and all ...
Inference in hybrid Bayesian networks
International Nuclear Information System (INIS)
Langseth, Helge; Nielsen, Thomas D.; Rumi, Rafael; Salmeron, Antonio
2009-01-01
Since the 1980s, Bayesian networks (BNs) have become increasingly popular for building statistical models of complex systems. This is particularly true for boolean systems, where BNs often prove to be a more efficient modelling framework than traditional reliability techniques (like fault trees and reliability block diagrams). However, limitations in the BNs' calculation engine have prevented BNs from becoming equally popular for domains containing mixtures of both discrete and continuous variables (the so-called hybrid domains). In this paper we focus on these difficulties, and summarize some of the last decade's research on inference in hybrid Bayesian networks. The discussions are linked to an example model for estimating human reliability.
3D Bayesian contextual classifiers
DEFF Research Database (Denmark)
Larsen, Rasmus
2000-01-01
We extend a series of multivariate Bayesian 2-D contextual classifiers to 3-D by specifying a simultaneous Gaussian distribution for the feature vectors as well as a prior distribution of the class variables of a pixel and its 6 nearest 3-D neighbours.
Bayesian methods for proteomic biomarker development
Directory of Open Access Journals (Sweden)
Belinda Hernández
2015-12-01
In this review we provide an introduction to Bayesian inference and demonstrate some of the advantages of using a Bayesian framework. We summarize how Bayesian methods have been used previously in proteomics and other areas of bioinformatics. Finally, we describe some popular and emerging Bayesian models from the statistical literature and provide a worked tutorial including code snippets to show how these methods may be applied for the evaluation of proteomic biomarkers.
Marazìa, Stefania; Barnabei, Luca; De Caterina, Raffaele
2008-01-01
A common problem in diagnostic medicine, when performing a diagnostic test, is to obtain an accurate discrimination between 'normal' cases and cases with disease, owing to the overlapping distributions of these populations. In clinical practice, it is exceedingly rare that a chosen cut point will achieve perfect discrimination between normal cases and those with disease, and one has to select the best compromise between sensitivity and specificity by comparing the diagnostic performance of different tests or diagnostic criteria available. Receiver operating characteristic (or receiver operator characteristic, ROC) curves allow systematic and intuitively appealing descriptions of the diagnostic performance of a test and a comparison of the performance of different tests or diagnostic criteria. This review will analyse the basic principles underlying ROC curves and their specific application to the choice of optimal parameters on exercise electrocardiographic stress testing. Part II will be devoted to the comparative analysis of various parameters derived from exercise stress testing for the diagnosis of underlying coronary artery disease.
Bayesian networks and food security - An introduction
Stein, A.
2004-01-01
This paper gives an introduction to Bayesian networks. Networks are defined and put into a Bayesian context. Directed acyclical graphs play a crucial role here. Two simple examples from food security are addressed. Possible uses of Bayesian networks for implementation and further use in decision
Plug & Play object oriented Bayesian networks
DEFF Research Database (Denmark)
Bangsø, Olav; Flores, J.; Jensen, Finn Verner
2003-01-01
Object oriented Bayesian networks have been shown to be quite suitable for dynamic domains as well. However, processing object oriented Bayesian networks in practice does not take advantage of their modular structure: normally the object oriented Bayesian network is transformed into a Bayesian network, and inference is performed on that network. The communication needed between instances is achieved by means of a fill-in propagation scheme.
A Bayesian framework for risk perception
van Erp, H.R.N.
2017-01-01
We present here a Bayesian framework of risk perception. This framework encompasses plausibility judgments, decision making, and question asking. Plausibility judgments are modeled by way of Bayesian probability theory, decision making is modeled by way of a Bayesian decision theory, and relevancy
Probabilistic Safety Analysis of High Speed and Conventional Lines Using Bayesian Networks
Energy Technology Data Exchange (ETDEWEB)
Grande Andrade, Z.; Castillo Ron, E.; O'Connor, A.; Nogal, M.
2016-07-01
A Bayesian network approach is presented for probabilistic safety analysis (PSA) of railway lines. The idea consists of identifying and reproducing all the elements that the train encounters when circulating along a railway line, such as light and speed limit signals, tunnel or viaduct entries or exits, cuttings and embankments, acoustic sounds received in the cabin, curves, switches, etc. In addition, since the human error is very relevant for safety evaluation, the automatic train protection (ATP) systems and the driver behavior and its time evolution are modelled and taken into account to determine the probabilities of human errors. The nodes of the Bayesian network, their links and the associated probability tables are automatically constructed based on the line data that need to be carefully given. The conditional probability tables are reproduced by closed formulas, which facilitate the modelling and the sensitivity analysis. A sorted list of the most dangerous elements in the line is obtained, which permits making decisions about the line safety and programming maintenance operations in order to optimize them and reduce the maintenance costs substantially. The proposed methodology is illustrated by its application to several cases that include real lines such as the Palencia-Santander and the Dublin-Belfast lines. (Author)
An Advanced Bayesian Method for Short-Term Probabilistic Forecasting of the Generation of Wind Power
Directory of Open Access Journals (Sweden)
Antonio Bracale
2015-09-01
Full Text Available Currently, among renewable distributed generation systems, wind generators are receiving a great deal of interest due to the great economic, technological, and environmental incentives they involve. However, the uncertainties due to the intermittent nature of wind energy make it difficult to operate electrical power systems optimally and to make decisions that satisfy the needs of all the stakeholders of the electricity energy market. Thus, there is increasing interest in determining how to forecast wind power production accurately. Most of the methods that have been published in the relevant literature provide deterministic forecasts, even though great interest has recently been focused on probabilistic forecast methods. In this paper, an advanced probabilistic method is proposed for short-term forecasting of wind power production. A mixture of two Weibull distributions was used as a probability function to model the uncertainties associated with wind speed. Then, a Bayesian inference approach with a particularly effective autoregressive integrated moving-average model was used to determine the parameters of the mixture Weibull distribution. Numerical applications also are presented to provide evidence of the forecasting performance of the Bayesian-based approach.
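A two-component Weibull mixture for wind speed can be written down and sampled directly; the mixture weight and the shape/scale parameters below are illustrative, not the fitted values from the paper:

```python
import numpy as np

rng = np.random.default_rng(3)

# illustrative mixture: weight w on component 1
w = 0.6
k1, l1 = 2.0, 6.0     # shape, scale of component 1 (m/s)
k2, l2 = 3.5, 11.0    # shape, scale of component 2 (m/s)

def mix_pdf(v):
    """Density of the two-component Weibull mixture."""
    p1 = (k1 / l1) * (v / l1)**(k1 - 1) * np.exp(-(v / l1)**k1)
    p2 = (k2 / l2) * (v / l2)**(k2 - 1) * np.exp(-(v / l2)**k2)
    return w * p1 + (1 - w) * p2

# sampling: pick a component, then draw from that Weibull
comp = rng.random(10000) < w
v = np.where(comp, l1 * rng.weibull(k1, 10000), l2 * rng.weibull(k2, 10000))
```

The mixture captures bimodal wind regimes (e.g. calm vs. storm-driven) that a single Weibull cannot; in the paper's setup the Bayesian/ARIMA machinery sits on top, inferring these parameters rather than fixing them.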
Bayesian Statistics: Concepts and Applications in Animal Breeding – A Review
Directory of Open Access Journals (Sweden)
Laxmikant-Sambhaji Kokate
2011-07-01
Full Text Available Statistics uses two major approaches: the conventional (or frequentist) and the Bayesian. The Bayesian approach provides a complete paradigm for both statistical inference and decision making under uncertainty. Bayesian methods solve many of the difficulties faced by conventional statistical methods and extend their applicability, exploiting probabilistic models to formulate scientific problems. Using Bayesian statistics involves two difficulties: the computation, and the need to specify prior probability distributions. Markov Chain Monte Carlo (MCMC) methods were applied to overcome the computational difficulty, and interest in Bayesian methods was renewed. In Bayesian statistics, the Bayesian structural equation model (SEM) is used. It provides a powerful and flexible approach for studying quantitative traits across a wide spectrum of problems, and it presents no operational difficulties except in some complex cases. With this method, problems are solved with ease, and statisticians find it comfortable to express results in this particular way and to employ the available software to analyze a large variety of problems.
Probabilistic Space Weather Forecasting: a Bayesian Perspective
Camporeale, E.; Chandorkar, M.; Borovsky, J.; Carè, A.
2017-12-01
Most of the Space Weather forecasts, both at the operational and research level, are not probabilistic in nature. Unfortunately, a prediction that does not provide a confidence level is not very useful in a decision-making scenario. Nowadays, forecast models range from purely data-driven, machine learning algorithms, to physics-based approximations of first-principle equations (and everything that sits in between). Uncertainties pervade all such models, at every level: from the raw data to the finite-precision implementation of numerical methods. The most rigorous way of quantifying the propagation of uncertainties is by embracing a Bayesian probabilistic approach. One of the simplest and most robust machine learning techniques in the Bayesian framework is Gaussian Process regression and classification. Here, we present the application of Gaussian Processes to the problems of the Dst geomagnetic index forecast, the solar wind type classification, and the estimation of diffusion parameters in radiation belt modeling. In each of these very diverse problems, the GP approach rigorously provides forecasts in the form of predictive distributions. In turn, these distributions can be used as input for ensemble simulations in order to quantify the amplification of uncertainties. We show that we have achieved excellent results in all of the standard metrics used to evaluate our models, with very modest computational cost.
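Gaussian Process regression with an RBF kernel, returning a full predictive distribution rather than a point forecast, fits in a few lines of linear algebra. The kernel hyperparameters and the noisy-sine toy data are our illustrative choices, not anything from the Dst application:

```python
import numpy as np

rng = np.random.default_rng(4)

def rbf(a, b, ell=1.0, sf=1.0):
    """Squared-exponential kernel; ell and sf are illustrative choices."""
    d = a[:, None] - b[None, :]
    return sf**2 * np.exp(-0.5 * (d / ell)**2)

# toy data: noisy sine observations
X = np.linspace(0.0, 5.0, 20)
y = np.sin(X) + 0.1 * rng.standard_normal(20)
Xs = np.linspace(0.0, 5.0, 50)          # prediction points

K = rbf(X, X) + 0.01 * np.eye(20)       # noise variance 0.01 on the diagonal
Ks = rbf(X, Xs)
Kss = rbf(Xs, Xs)

mu = Ks.T @ np.linalg.solve(K, y)       # predictive mean
cov = Kss - Ks.T @ np.linalg.solve(K, Ks)
sd = np.sqrt(np.clip(np.diag(cov), 0.0, None))  # predictive std. deviation
```

The pair (mu, sd) is exactly the "forecast with a confidence level" the abstract argues for: every prediction comes with its own uncertainty, which can then seed ensemble simulations downstream.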
Bayesian NL interpretation and learning
Zeevat, H.
2011-01-01
Everyday natural language communication is normally successful, even though contemporary computational linguistics has shown that NL is characterised by a very high degree of ambiguity and that the results of stochastic methods are not good enough to explain the high success rate. Bayesian natural language
Bayesian image restoration, using configurations
DEFF Research Database (Denmark)
Thorarinsdottir, Thordis Linda
2006-01-01
configurations are expressed in terms of the mean normal measure of the random set. These probabilities are used as prior probabilities in a Bayesian image restoration approach. Estimation of the remaining parameters in the model is outlined for the salt and pepper noise. The inference in the model is discussed...
Differentiated Bayesian Conjoint Choice Designs
Z. Sándor (Zsolt); M. Wedel (Michel)
2003-01-01
Previous conjoint choice design construction procedures have produced a single design that is administered to all subjects. This paper proposes to construct a limited set of different designs. The designs are constructed in a Bayesian fashion, taking into account prior uncertainty about
Bayesian Networks and Influence Diagrams
DEFF Research Database (Denmark)
Kjærulff, Uffe Bro; Madsen, Anders Læsø
Bayesian Networks and Influence Diagrams: A Guide to Construction and Analysis, Second Edition, provides a comprehensive guide for practitioners who wish to understand, construct, and analyze intelligent systems for decision support based on probabilistic networks. This new edition contains six new...
Bayesian Sampling using Condition Indicators
DEFF Research Database (Denmark)
Faber, Michael H.; Sørensen, John Dalsgaard
2002-01-01
Based on the concept of condition indicators introduced by Benjamin and Cornell (1970), a Bayesian approach to quality control is formulated. The formulation is then extended to the case where the quality control is based on sampling of indirect information about the condition of the components, i.e. condition indicators...
Bayesian Classification of Image Structures
DEFF Research Database (Denmark)
Goswami, Dibyendu; Kalkan, Sinan; Krüger, Norbert
2009-01-01
In this paper, we describe work on Bayesian classifiers for distinguishing between homogeneous structures, textures, edges and junctions. We build semi-local classifiers from hand-labeled images to distinguish between these four different kinds of structures based on the concept of intrinsic dimensi...
Bayesian estimates of linkage disequilibrium
Directory of Open Access Journals (Sweden)
Abad-Grau María M
2007-06-01
Full Text Available Abstract Background The maximum likelihood estimator of D' – a standard measure of linkage disequilibrium – is biased toward disequilibrium, and the bias is particularly evident in small samples and rare haplotypes. Results This paper proposes a Bayesian estimation of D' to address this problem. The reduction of the bias is achieved by using a prior distribution on the pair-wise associations between single nucleotide polymorphisms (SNPs) that increases the likelihood of equilibrium with increasing physical distances between pairs of SNPs. We show how to compute the Bayesian estimate using a stochastic estimation based on MCMC methods, and also propose a numerical approximation to the Bayesian estimates that can be used to estimate patterns of LD in large datasets of SNPs. Conclusion Our Bayesian estimator of D' corrects the bias toward disequilibrium that affects the maximum likelihood estimator. A consequence of this feature is a more objective view about the extent of linkage disequilibrium in the human genome, and a more realistic number of tagging SNPs to fully exploit the power of genome-wide association studies.
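A Bayesian treatment of D' can be sketched by placing a Dirichlet posterior on the four haplotype frequencies and pushing posterior samples through the D' formula. The flat Dirichlet(1,1,1,1) prior used here is a simple stand-in for the paper's distance-dependent prior, and the counts are invented:

```python
import numpy as np

rng = np.random.default_rng(5)

# observed haplotype counts for alleles A/a and B/b (illustrative data)
counts = np.array([40, 10, 10, 40])   # AB, Ab, aB, ab

def d_prime(p):
    """D' from the four haplotype frequencies."""
    pAB, pAb, paB, pab = p
    pA, pB = pAB + pAb, pAB + paB
    D = pAB - pA * pB
    Dmax = min(pA * (1 - pB), (1 - pA) * pB) if D > 0 else \
           min(pA * pB, (1 - pA) * (1 - pB))
    return D / Dmax

mle = d_prime(counts / counts.sum())            # maximum likelihood estimate

# Dirichlet posterior on haplotype frequencies (flat prior), pushed
# through the D' formula; the posterior mean is the Bayesian estimate
samples = rng.dirichlet(counts + 1, 5000)
post = np.array([d_prime(p) for p in samples])
post_mean = post.mean()
```

With rare haplotypes or small samples the posterior mean sits noticeably below the MLE, which is the bias correction the abstract describes; a distance-aware prior strengthens that shrinkage for physically distant SNP pairs.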
3-D contextual Bayesian classifiers
DEFF Research Database (Denmark)
Larsen, Rasmus
In this paper we consider extensions of a series of Bayesian 2-D contextual classification procedures proposed by Owen (1984), Hjort & Mohn (1984), Welch & Salter (1971) and Haslett (1985) to 3 spatial dimensions. It is evident that compared to classical pixelwise classification further...
SU-F-R-44: Modeling Lung SBRT Tumor Response Using Bayesian Network Averaging
International Nuclear Information System (INIS)
Diamant, A; Ybarra, N; Seuntjens, J; El Naqa, I
2016-01-01
Purpose: The prediction of tumor control after a patient receives lung SBRT (stereotactic body radiation therapy) has proven to be challenging, due to the complex interactions between an individual’s biology and dose-volume metrics. Many of these variables have predictive power when combined, a feature that we exploit using a graph modeling approach based on Bayesian networks. This provides a probabilistic framework that allows for accurate and visually intuitive predictive modeling. The aim of this study is to uncover possible interactions between an individual patient’s characteristics and generate a robust model capable of predicting said patient’s treatment outcome. Methods: We investigated a cohort of 32 prospective patients from multiple institutions who had received curative SBRT to the lung. The number of patients exhibiting tumor failure was observed to be 7 (event rate of 22%). The serum concentration of 5 biomarkers previously associated with NSCLC (non-small cell lung cancer) was measured pre-treatment. A total of 21 variables were analyzed, including dose-volume metrics with BED (biologically effective dose) correction and clinical variables. A Markov Chain Monte Carlo technique estimated the posterior probability distribution of the potential graphical structures. The probability of tumor failure was then estimated by averaging the top 100 graphs and applying Bayes’ rule. Results: The optimal Bayesian model generated throughout this study incorporated the PTV volume, the serum concentration of the biomarker EGFR (epidermal growth factor receptor) and the prescription BED. This predictive model recorded an area under the receiver operating characteristic curve of 0.94(1), providing better performance compared to competing methods in other literature. Conclusion: The use of biomarkers in conjunction with dose-volume metrics allows for the generation of a robust predictive model. The preliminary results of this report demonstrate that it is possible
SU-F-R-44: Modeling Lung SBRT Tumor Response Using Bayesian Network Averaging
Energy Technology Data Exchange (ETDEWEB)
Diamant, A; Ybarra, N; Seuntjens, J [McGill University, Montreal, Quebec (Canada); El Naqa, I [University of Michigan, Ann Arbor, MI (United States)
2016-06-15
Purpose: The prediction of tumor control after a patient receives lung SBRT (stereotactic body radiation therapy) has proven challenging, due to the complex interactions between an individual's biology and dose-volume metrics. Many of these variables have predictive power when combined, a feature that we exploit using a graph modeling approach based on Bayesian networks. This provides a probabilistic framework that allows for accurate and visually intuitive predictive modeling. The aim of this study is to uncover possible interactions between an individual patient's characteristics and generate a robust model capable of predicting that patient's treatment outcome. Methods: We investigated a cohort of 32 prospective patients from multiple institutions who had received curative SBRT to the lung. Tumor failure was observed in 7 patients (event rate of 22%). The serum concentration of 5 biomarkers previously associated with NSCLC (non-small cell lung cancer) was measured pre-treatment. A total of 21 variables were analyzed, including dose-volume metrics with BED (biologically effective dose) correction and clinical variables. A Markov chain Monte Carlo technique estimated the posterior probability distribution of the potential graphical structures. The probability of tumor failure was then estimated by averaging the top 100 graphs and applying Bayes' rule. Results: The optimal Bayesian model generated throughout this study incorporated the PTV volume, the serum concentration of the biomarker EGFR (epidermal growth factor receptor) and the prescription BED. This predictive model recorded an area under the receiver operating characteristic curve of 0.94(1), providing better performance than competing methods in the literature. Conclusion: The use of biomarkers in conjunction with dose-volume metrics allows for the generation of a robust predictive model. The preliminary results of this report demonstrate that it is possible
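The graph-averaging step described above can be sketched numerically. This is a toy illustration with hypothetical posterior weights and per-graph predictions, not the paper's actual 100-graph model:

```python
# Sketch of Bayesian model averaging over candidate graph structures
# (hypothetical toy numbers; the paper averages its top 100 MCMC graphs).

def model_average(posteriors, predictions):
    """Average per-graph predictions P(failure | evidence, graph),
    weighted by each graph's (normalized) posterior probability."""
    z = sum(posteriors)
    return sum(p * q for p, q in zip(posteriors, predictions)) / z

# Three toy graphs with unnormalized posterior weights and their
# per-graph predicted probabilities of tumor failure for one patient:
weights = [0.5, 0.3, 0.2]
preds = [0.10, 0.30, 0.50]
p_failure = model_average(weights, preds)
print(round(p_failure, 3))  # 0.5*0.1 + 0.3*0.3 + 0.2*0.5 = 0.24
```

The averaged prediction is smoother than any single graph's output, which is what gives the ensemble its robustness on small cohorts.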
Bayesian Alternation During Tactile Augmentation
Directory of Open Access Journals (Sweden)
Caspar Mathias Goeke
2016-10-01
Full Text Available A large number of studies suggest that the integration of multisensory signals by humans is well described by Bayesian principles. However, there are very few reports on cue combination between a native and an augmented sense. In particular, we asked whether adult participants are able to integrate an augmented sensory cue with existing native sensory information. For the purpose of this study we built a tactile augmentation device and compared different hypotheses of how untrained adult participants combine information from a native and an augmented sense. In a two-interval forced choice (2IFC) task, while subjects were blindfolded and seated on a rotating platform, our sensory augmentation device translated information on whole-body yaw rotation into tactile stimulation. Three conditions were realized: tactile stimulation only (augmented condition), rotation only (native condition), and both augmented and native information (bimodal condition). Participants had to choose which of two consecutive rotations had the larger angular rotation. For the analysis, we fitted the participants' responses with a probit model and calculated the just noticeable difference (JND). We then compared several models for predicting bimodal from unimodal responses. An objective Bayesian alternation model yielded a better prediction (reduced χ² = 1.67) than the Bayesian integration model (reduced χ² = 4.34). A non-Bayesian winner-takes-all model (reduced χ² = 1.64), which used either only native or only augmented values per subject for prediction, showed slightly higher accuracy. However, the performance of the Bayesian alternation model could be substantially improved (reduced χ² = 1.09) by utilizing subjective weights obtained from a questionnaire. As a result, the subjective Bayesian alternation model predicted bimodal performance most accurately among all tested models. These results suggest that information from augmented and existing sensory modalities in
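The competing predictions can be made concrete. The sketch below computes the bimodal sigma predicted by Bayesian integration versus alternation from hypothetical unimodal values; the sigmas are illustrative, not the study's data:

```python
import math

def integrated_sigma(s_native, s_augmented):
    """Maximum-likelihood (Bayesian) integration: the bimodal variance is
    the harmonic combination of the unimodal variances."""
    return math.sqrt((s_native**2 * s_augmented**2) /
                     (s_native**2 + s_augmented**2))

def alternation_sigma(s_native, s_augmented, w=0.5):
    """Alternation: on each trial only one cue is consulted, chosen with
    probability w; the effective variance is a mixture of the two."""
    return math.sqrt(w * s_native**2 + (1 - w) * s_augmented**2)

s_nat, s_aug = 3.0, 6.0  # hypothetical unimodal JND-derived sigmas (deg)
print(integrated_sigma(s_nat, s_aug))   # ≈ 2.68, better than either cue
print(alternation_sigma(s_nat, s_aug))  # ≈ 4.74, between the two cues
```

The diagnostic difference is that integration predicts a bimodal threshold below both unimodal thresholds, while alternation predicts one in between, which is what the study's data discriminate.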
Nonlinear Bayesian filtering and learning: a neuronal dynamics for perception.
Kutschireiter, Anna; Surace, Simone Carlo; Sprekeler, Henning; Pfister, Jean-Pascal
2017-08-18
The robust estimation of dynamical hidden features, such as the position of prey, based on sensory inputs is one of the hallmarks of perception. This dynamical estimation can be rigorously formulated by nonlinear Bayesian filtering theory. Recent experimental and behavioral studies have shown that animals' performance in many tasks is consistent with such a Bayesian statistical interpretation. However, it is presently unclear how a nonlinear Bayesian filter can be efficiently implemented in a network of neurons that satisfies some minimum constraints of biological plausibility. Here, we propose the Neural Particle Filter (NPF), a sampling-based nonlinear Bayesian filter, which does not rely on importance weights. We show that this filter can be interpreted as the neuronal dynamics of a recurrently connected rate-based neural network receiving feed-forward input from sensory neurons. Further, it captures properties of temporal and multi-sensory integration that are crucial for perception, and it allows for online parameter learning with a maximum likelihood approach. The NPF holds the promise to avoid the 'curse of dimensionality', and we demonstrate numerically its capability to outperform weighted particle filters in higher dimensions and when the number of particles is limited.
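For contrast with the NPF's weight-free scheme, a minimal weighted (bootstrap) particle filter, the baseline the abstract compares against, might look as follows; the 1-D model and noise levels are illustrative assumptions:

```python
import random, math

def bootstrap_pf(observations, n_particles=500, f=lambda x: 0.9 * x,
                 h=lambda x: x, q=1.0, r=0.5):
    """Minimal weighted (bootstrap) particle filter for a 1-D model
    x_t = f(x_{t-1}) + N(0, q), y_t = h(x_t) + N(0, r). Importance
    weights and resampling are exactly what the NPF avoids."""
    xs = [random.gauss(0.0, 1.0) for _ in range(n_particles)]
    estimates = []
    for y in observations:
        # propagate through the dynamics, then weight by the likelihood
        xs = [f(x) + random.gauss(0.0, math.sqrt(q)) for x in xs]
        ws = [math.exp(-0.5 * (y - h(x)) ** 2 / r) for x in xs]
        z = sum(ws) or 1e-300
        ws = [w / z for w in ws]
        estimates.append(sum(w * x for w, x in zip(ws, xs)))
        # multinomial resampling to avoid weight degeneracy
        xs = random.choices(xs, weights=ws, k=n_particles)
    return estimates

random.seed(0)
est = bootstrap_pf([1.0, 1.2, 0.8])
print(len(est))  # 3 posterior-mean estimates, one per observation
```

The weight-normalization and resampling steps are what scale poorly with dimension, which motivates the weight-free dynamics the paper proposes.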
Bayesian Inference and Online Learning in Poisson Neuronal Networks.
Huang, Yanping; Rao, Rajesh P N
2016-08-01
Motivated by the growing evidence for Bayesian computation in the brain, we show how a two-layer recurrent network of Poisson neurons can perform both approximate Bayesian inference and learning for any hidden Markov model. The lower-layer sensory neurons receive noisy measurements of hidden world states. The higher-layer neurons infer a posterior distribution over world states via Bayesian inference from inputs generated by sensory neurons. We demonstrate how such a neuronal network with synaptic plasticity can implement a form of Bayesian inference similar to Monte Carlo methods such as particle filtering. Each spike in a higher-layer neuron represents a sample of a particular hidden world state. The spiking activity across the neural population approximates the posterior distribution over hidden states. In this model, variability in spiking is regarded not as a nuisance but as an integral feature that provides the variability necessary for sampling during inference. We demonstrate how the network can learn the likelihood model, as well as the transition probabilities underlying the dynamics, using a Hebbian learning rule. We present results illustrating the ability of the network to perform inference and learning for arbitrary hidden Markov models.
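The exact computation that such a spiking network approximates is the discrete Bayesian filtering recursion for a hidden Markov model. A minimal sketch, with a hypothetical two-state world and binary observations:

```python
def hmm_filter(obs, T, E, prior):
    """Exact discrete Bayesian filtering for a hidden Markov model,
    the computation the two-layer Poisson network approximates by
    sampling. T[i][j] = P(s_t=j | s_{t-1}=i); E[j][y] = P(y | s=j)."""
    belief = prior[:]
    for y in obs:
        # predict: push the belief through the transition model
        pred = [sum(belief[i] * T[i][j] for i in range(len(belief)))
                for j in range(len(belief))]
        # update: weight by the observation likelihood, then normalize
        post = [pred[j] * E[j][y] for j in range(len(pred))]
        z = sum(post)
        belief = [p / z for p in post]
    return belief

# Hypothetical 2-state world observed through a noisy binary channel:
T = [[0.9, 0.1], [0.1, 0.9]]
E = [[0.8, 0.2], [0.2, 0.8]]        # E[state][observation]
print(hmm_filter([1, 1, 1], T, E, [0.5, 0.5]))  # mass shifts to state 1
```

In the paper's network, higher-layer spikes are samples from this belief rather than the belief itself, so spiking variability directly encodes posterior uncertainty.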
Topics in Bayesian statistics and maximum entropy
International Nuclear Information System (INIS)
Mutihac, R.; Cicuttin, A.; Cerdeira, A.; Stanciulescu, C.
1998-12-01
Notions of Bayesian decision theory and maximum entropy methods are reviewed, with particular emphasis on probabilistic inference and Bayesian modeling. The axiomatic approach is considered the best justification of Bayesian analysis and the maximum entropy principle as applied in the natural sciences. Special attention is given to solving the inverse problem in digital image restoration and to Bayesian modeling of neural networks. Further topics addressed briefly include language modeling, neutron scattering, multiuser detection and channel equalization in digital communications, genetic information, and Bayesian court decision-making. (author)
Bayesian analysis of rare events
Energy Technology Data Exchange (ETDEWEB)
Straub, Daniel, E-mail: straub@tum.de; Papaioannou, Iason; Betz, Wolfgang
2016-06-01
In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
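The rejection-sampling reinterpretation at the heart of BUS can be sketched as follows. The Gaussian toy problem and the constant c are illustrative assumptions, not the paper's FORM/IS/SuS machinery:

```python
import random, math

def bus_rejection(prior_sample, likelihood, c, n=20000):
    """Rejection-sampling view of Bayesian updating that underlies BUS:
    draw theta from the prior and u ~ U(0,1); accept when
    u <= L(theta)/c, with c an upper bound on L. Accepted samples
    follow the posterior."""
    accepted = []
    for _ in range(n):
        theta = prior_sample()
        if random.random() <= likelihood(theta) / c:
            accepted.append(theta)
    return accepted

random.seed(1)
# Toy setup: N(0,1) prior, one observation y = 1.0 with unit noise.
y = 1.0
lik = lambda t: math.exp(-0.5 * (y - t) ** 2)
post = bus_rejection(lambda: random.gauss(0, 1), lik, c=1.0)
# Conjugate result: the posterior mean should be near y/2 = 0.5.
print(round(sum(post) / len(post), 2))
```

BUS's insight is that the acceptance event above is itself a rare-event probability, so structural reliability methods (FORM, importance sampling, Subset Simulation) can replace the naive loop when acceptance is rare.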
Directory of Open Access Journals (Sweden)
Hashem Salarzadeh Jenatabadi
2016-11-01
Full Text Available There are many factors which could influence the sustainability of airlines. The main purpose of this study is to introduce a framework for a financial sustainability index and model it based on structural equation modeling (SEM) with maximum likelihood and Bayesian predictors. The introduced framework includes economic performance, operational performance, cost performance, and financial performance. Based on both Bayesian SEM (Bayesian-SEM) and classical SEM (Classical-SEM), it was found that economic performance, together with operational performance and cost performance, is significantly related to the financial performance index. Four indices (root mean square error, coefficient of determination, mean absolute error, and mean absolute percentage error) were employed to compare the efficiency of Bayesian-SEM and Classical-SEM in predicting airline financial performance. The outputs confirmed that the framework with Bayesian prediction delivered a good fit to the data, whereas the framework predicted with a Classical-SEM approach did not yield a well-fitting model. The reasons for this discrepancy between classical and Bayesian predictions, as well as the potential advantages and caveats of applying the Bayesian approach in airline sustainability studies, are discussed.
Minimum mean square error estimation and approximation of the Bayesian update
Litvinenko, Alexander; Matthies, Hermann G.; Zander, Elmar
2015-01-01
Given: a physical system modeled by a PDE or ODE with uncertain coefficient q(w), and a measurement operator Y(u(q); q), where u(q; w) is the uncertain solution. Aim: to identify q(w). The mapping from parameters to observations is usually not invertible, hence this inverse identification problem is generally ill-posed. To identify q(w) we derived a non-linear Bayesian update from the variational problem associated with conditional expectation. To reduce the cost of the Bayesian update we offer a functional approximation, e.g. a polynomial chaos expansion (PCE). New: we derive linear, quadratic, etc., approximations of the full Bayesian update.
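In the linear special case, the MMSE approximation of the Bayesian update reduces to the familiar Kalman-type gain formula. A scalar sketch with toy numbers (the abstract's PDE/PCE setting is far more general):

```python
def linear_mmse_update(m, c, y, h, r):
    """Scalar linear MMSE (Kalman-type) approximation of the Bayesian
    update: prior q ~ (mean m, variance c), observation y = h*q + noise
    with variance r. Returns the posterior mean and variance."""
    k = c * h / (h * h * c + r)          # MMSE (Kalman) gain
    return m + k * (y - h * m), (1 - k * h) * c

# Toy numbers: a vague prior on q updated by one noisy measurement.
m_post, c_post = linear_mmse_update(m=0.0, c=4.0, y=2.0, h=1.0, r=1.0)
print(m_post, round(c_post, 3))  # 1.6 0.8
```

The quadratic and higher-order approximations the abstract announces generalize this gain to polynomial maps of the measurement, at correspondingly higher cost.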
Bayesian Networks as a Decision Tool for O&M of Offshore Wind Turbines
DEFF Research Database (Denmark)
Nielsen, Jannie Jessen; Sørensen, John Dalsgaard
2010-01-01
Costs of operation and maintenance (O&M) of offshore wind turbines are large. This paper presents how influence diagrams can be used to assist in rational decision making for O&M. An influence diagram is a graphical representation of a decision tree based on Bayesian Networks. Bayesian Networks offer efficient Bayesian updating of a damage model when imperfect information from inspections/monitoring is available. The extension to an influence diagram offers the calculation of expected utilities for decision alternatives, and can be used to find the optimal strategy among different alternatives.
A Probability-based Evolutionary Algorithm with Mutations to Learn Bayesian Networks
Directory of Open Access Journals (Sweden)
Sho Fukuda
2014-12-01
Full Text Available Bayesian networks are regarded as one of the essential tools to analyze causal relationships between events from data. Learning the structure of highly reliable Bayesian networks from data as quickly as possible is an important problem that several studies have tried to solve. In recent years, probability-based evolutionary algorithms have been proposed as a new, efficient approach to learning Bayesian networks. In this paper, we focus on one of these probability-based evolutionary algorithms, PBIL (Probability-Based Incremental Learning), and propose a new mutation operator. Through performance evaluation, we found that the proposed mutation operator performs well in learning Bayesian networks.
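A minimal sketch of PBIL with a mutation operator acting on the probability vector; the specific operator, constants, and toy objective are illustrative assumptions, not necessarily the one proposed in the paper:

```python
import random

def pbil(fitness, n_bits, pop=30, lr=0.1, mut_p=0.05, mut_shift=0.2,
         iters=200, seed=0):
    """Sketch of PBIL with mutation: each generation, the best sample
    pulls the probability vector toward itself (rate lr), then each
    component is randomly perturbed with probability mut_p."""
    rng = random.Random(seed)
    p = [0.5] * n_bits
    for _ in range(iters):
        samples = [[1 if rng.random() < pi else 0 for pi in p]
                   for _ in range(pop)]
        best = max(samples, key=fitness)
        for i in range(n_bits):
            p[i] = (1 - lr) * p[i] + lr * best[i]      # learn
            if rng.random() < mut_p:                   # mutate
                p[i] = (1 - mut_shift) * p[i] + mut_shift * rng.random()
    return p

# Toy objective (onemax) standing in for a network-structure score:
p = pbil(fitness=sum, n_bits=8)
print(round(sum(p) / len(p), 2))  # mean probability drifts toward 1
```

In structure learning, each bit would encode the presence of an edge and the fitness would be a network score such as BIC; mutation keeps the probability vector from collapsing prematurely.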
Learning Bayesian Dependence Model for Student Modelling
Directory of Open Access Journals (Sweden)
Adina COCU
2008-12-01
Full Text Available Learning a Bayesian network from a numeric data set is a challenging task because of the dual nature of the learning process: the initial need to learn the network structure, and then to find the conditional probability tables. In this paper, we propose a machine-learning algorithm based on hill climbing search combined with a Tabu list. The aim of the learning process is to discover the best network that represents the dependences between nodes. Another issue in the machine learning procedure is handling numeric attributes. In order to do that, we must perform an attribute discretization pre-process. This discretization operation can influence the results of learning the network structure. Therefore, we make a comparative study to find the most suitable combination of discretization method and learning algorithm for a specific data set.
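The search strategy described above can be sketched generically. The toy score and bit-vector state below stand in for a real network scoring function and structure space (assumptions, not the paper's implementation):

```python
def hill_climb_tabu(score, start, neighbors, tabu_size=10, iters=100):
    """Hill-climbing structure search with a Tabu list: always move to
    the best non-tabu neighbor, remembering recently visited states so
    the search can escape plateaus instead of cycling."""
    current = start
    best, best_score = current, score(current)
    tabu = [current]
    for _ in range(iters):
        cands = [n for n in neighbors(current) if n not in tabu]
        if not cands:
            break
        current = max(cands, key=score)
        tabu.append(current)
        tabu[:] = tabu[-tabu_size:]          # bounded memory
        if score(current) > best_score:
            best, best_score = current, score(current)
    return best, best_score

# Toy stand-in for a network score: count of set "edges" in a bit tuple.
flip = lambda s: [tuple(b ^ (i == j) for j, b in enumerate(s))
                  for i in range(len(s))]
best, val = hill_climb_tabu(score=sum, start=(0,) * 6, neighbors=flip)
print(best, val)  # reaches the all-ones structure, score 6
```

For Bayesian networks the neighbor move would be an edge addition, deletion, or reversal, and the score a decomposable criterion such as BDeu or BIC.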
Variational Bayesian Inference of Line Spectra
DEFF Research Database (Denmark)
Badiu, Mihai Alin; Hansen, Thomas Lundgaard; Fleury, Bernard Henri
2017-01-01
The coefficients are governed by a Bernoulli-Gaussian prior model, turning model order selection into binary sequence detection. Unlike earlier works which retain only point estimates of the frequencies, we undertake a more complete Bayesian treatment by estimating the posterior probability density functions (pdfs) of the frequencies and computing expectations over them. Thus, we additionally capture and operate with the uncertainty of the frequency estimates. Aiming to maximize the model evidence, variational optimization provides analytic approximations of the posterior pdfs and also gives estimates of the additional parameters. We propose an accurate representation of the pdfs of the frequencies by mixtures of von Mises pdfs, which yields closed-form expectations. We define the algorithm VALSE in which the estimates of the pdfs and parameters are iteratively updated. VALSE is a gridless, convergent method.
General and Local: Averaged k-Dependence Bayesian Classifiers
Directory of Open Access Journals (Sweden)
Limin Wang
2015-06-01
Full Text Available The inference of a general Bayesian network has been shown to be an NP-hard problem, even for approximate solutions. Although the k-dependence Bayesian (KDB) classifier can be constructed at arbitrary points (values of k) along the attribute dependence spectrum, it cannot identify the changes of interdependencies when attributes take different values. Local KDB, which learns in the framework of KDB, is proposed in this study to describe the local dependencies implicated in each test instance. Based on the analysis of functional dependencies, substitution-elimination resolution, a new type of semi-naive Bayesian operation, is proposed to substitute or eliminate generalization to achieve accurate estimation of the conditional probability distribution while reducing computational complexity. The final classifier, averaged k-dependence Bayesian (AKDB) classifiers, averages the output of KDB and local KDB. Experimental results on the repository of machine learning databases from the University of California Irvine (UCI) showed that AKDB has significant advantages in zero-one loss and bias relative to naive Bayes (NB), tree-augmented naive Bayes (TAN), averaged one-dependence estimators (AODE), and KDB. Moreover, KDB and local KDB show mutually complementary characteristics with respect to variance.
Bayesian logistic regression approaches to predict incorrect DRG assignment.
Suleiman, Mani; Demirhan, Haydar; Boyd, Leanne; Girosi, Federico; Aksakalli, Vural
2018-05-07
Episodes of care involving similar diagnoses and treatments and requiring similar levels of resource utilisation are grouped to the same Diagnosis-Related Group (DRG). In jurisdictions which implement DRG-based payment systems, DRGs are a major determinant of funding for inpatient care. Hence, service providers often dedicate auditing staff to the task of checking that episodes have been coded to the correct DRG. The use of statistical models to estimate an episode's probability of DRG error can significantly improve the efficiency of clinical coding audits. This study implements Bayesian logistic regression models with weakly informative prior distributions to estimate the likelihood that episodes require a DRG revision, comparing these models with each other and with classical maximum likelihood estimates. All Bayesian approaches had more stable model parameters than maximum likelihood. The best performing Bayesian model improved overall classification performance by 6% compared to maximum likelihood, and by 34% compared to random classification. We found that the original DRG, the coder, and the day of coding all have a significant effect on the likelihood of DRG error. Use of Bayesian approaches has improved model parameter stability and classification accuracy. This method has already led to improved audit efficiency in an operational capacity.
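A minimal sketch of Bayesian logistic regression as MAP estimation under an independent Gaussian (weakly informative) prior, fitted by plain gradient ascent; the toy audit data, features, and hyperparameters are illustrative assumptions:

```python
import math

def map_logistic(X, y, prior_sd=2.5, lr=0.1, iters=2000):
    """MAP estimate for logistic regression with an independent Gaussian
    prior on each weight, i.e. L2-penalized maximum likelihood. The
    prior shrinks unstable coefficients toward zero."""
    w = [0.0] * len(X[0])
    for _ in range(iters):
        grad = [-wj / prior_sd**2 for wj in w]     # prior gradient
        for xi, yi in zip(X, y):
            p = 1 / (1 + math.exp(-sum(wj * xj for wj, xj in zip(w, xi))))
            for j, xj in enumerate(xi):
                grad[j] += (yi - p) * xj           # likelihood gradient
        w = [wj + lr * g for wj, g in zip(w, grad)]
    return w

# Toy audit data: [intercept, feature]; label 1 = DRG revision needed.
X = [[1, 0.0], [1, 1.0], [1, 2.0], [1, 3.0]]
y = [0, 0, 1, 1]
w = map_logistic(X, y)
p = 1 / (1 + math.exp(-(w[0] + w[1] * 3.0)))
print(p > 0.5)  # True: high-feature episode predicted as a revision
```

The prior is what delivers the parameter stability the study reports: on separable or sparse audit data, plain maximum likelihood would let the weights diverge.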
Bayesian inference of chemical kinetic models from proposed reactions
Galagali, Nikhil
2015-02-01
© 2014 Elsevier Ltd. Bayesian inference provides a natural framework for combining experimental data with prior knowledge to develop chemical kinetic models and quantify the associated uncertainties, not only in parameter values but also in model structure. Most existing applications of Bayesian model selection methods to chemical kinetics have been limited to comparisons among a small set of models, however. The significant computational cost of evaluating posterior model probabilities renders traditional Bayesian methods infeasible when the model space becomes large. We present a new framework for tractable Bayesian model inference and uncertainty quantification using a large number of systematically generated model hypotheses. The approach involves imposing point-mass mixture priors over rate constants and exploring the resulting posterior distribution using an adaptive Markov chain Monte Carlo method. The posterior samples are used to identify plausible models, to quantify rate constant uncertainties, and to extract key diagnostic information about model structure-such as the reactions and operating pathways most strongly supported by the data. We provide numerical demonstrations of the proposed framework by inferring kinetic models for catalytic steam and dry reforming of methane using available experimental data.
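The posterior exploration rests on Markov chain Monte Carlo. Below is a minimal random-walk Metropolis sketch on a toy one-parameter posterior; the paper's adaptive sampler and point-mass mixture priors are considerably more elaborate:

```python
import random, math

def metropolis(log_post, x0, step=0.5, n=5000, seed=0):
    """Random-walk Metropolis sketch: the basic MCMC kernel that
    adaptive schemes for kinetic-model posteriors build upon."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    samples = []
    for _ in range(n):
        prop = x + rng.gauss(0, step)
        lp_prop = log_post(prop)
        # accept with probability min(1, post(prop)/post(x))
        if math.log(rng.random()) < lp_prop - lp:
            x, lp = prop, lp_prop
        samples.append(x)
    return samples

# Toy posterior for a log rate constant: standard normal.
s = metropolis(lambda x: -0.5 * x * x, x0=0.0)
m = sum(s) / len(s)
print(round(m, 2))
```

In the paper's setting the state would be a vector of rate constants, each with a point mass at zero so that a chain visiting zero effectively deletes the corresponding reaction from the model.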
Development of dynamic Bayesian models for web application test management
Azarnova, T. V.; Polukhin, P. V.; Bondarenko, Yu V.; Kashirina, I. L.
2018-03-01
The mathematical apparatus of dynamic Bayesian networks is an effective and technically proven tool that can be used to model complex stochastic dynamic processes. According to the results of the research, mathematical models and methods of dynamic Bayesian networks provide a high coverage of stochastic tasks associated with error testing in multiuser software products operated in a dynamically changing environment. Formalized representation of the discrete test process as a dynamic Bayesian model allows us to organize the logical connection between individual test assets for multiple time slices. This approach gives an opportunity to present testing as a discrete process with set structural components responsible for the generation of test assets. Dynamic Bayesian network-based models allow us to combine in one management area individual units and testing components with different functionalities and a direct influence on each other in the process of comprehensive testing of various groups of computer bugs. The application of the proposed models provides an opportunity to use a consistent approach to formalize test principles and procedures, methods used to treat situational error signs, and methods used to produce analytical conclusions based on test results.
Bayesian estimation methods in metrology
International Nuclear Information System (INIS)
Cox, M.G.; Forbes, A.B.; Harris, P.M.
2004-01-01
In metrology -- the science of measurement -- a measurement result must be accompanied by a statement of its associated uncertainty. The degree of validity of a measurement result is determined by the validity of the uncertainty statement. In recognition of the importance of uncertainty evaluation, the International Organization for Standardization in 1995 published the Guide to the Expression of Uncertainty in Measurement and the Guide has been widely adopted. The validity of uncertainty statements is tested in interlaboratory comparisons in which an artefact is measured by a number of laboratories and their measurement results compared. Since the introduction of the Mutual Recognition Arrangement, key comparisons are being undertaken to determine the degree of equivalence of laboratories for particular measurement tasks. In this paper, we discuss the possible development of the Guide to reflect Bayesian approaches and the evaluation of key comparison data using Bayesian estimation methods
Deep Learning and Bayesian Methods
Directory of Open Access Journals (Sweden)
Prosper Harrison B.
2017-01-01
Full Text Available A revolution is underway in which deep neural networks are routinely used to solve difficult problems such as face recognition and natural language understanding. Particle physicists have taken notice and have started to deploy these methods, achieving results that suggest a potentially significant shift in how data might be analyzed in the not too distant future. We discuss a few recent developments in the application of deep neural networks and then indulge in speculation about how such methods might be used to automate certain aspects of data analysis in particle physics. Next, the connection to Bayesian methods is discussed and the paper ends with thoughts on a significant practical issue, namely, how, from a Bayesian perspective, one might optimize the construction of deep neural networks.
Bayesian inference on proportional elections.
Directory of Open Access Journals (Sweden)
Gabriel Hideki Vatanabe Brunello
Full Text Available Polls for majoritarian voting systems usually show estimates of the percentage of votes for each candidate. However, proportional voting systems do not necessarily guarantee that the candidate with the highest percentage of votes will be elected. Thus, traditional methods used in majoritarian elections cannot be applied to proportional elections. In this context, the purpose of this paper was to perform Bayesian inference on proportional elections considering the Brazilian system of seat distribution. More specifically, a methodology was developed to estimate the probability that a given party will have representation in the Chamber of Deputies. Inferences were made in a Bayesian framework using the Monte Carlo simulation technique, and the developed methodology was applied to data from the 2010 Brazilian elections for Members of the Legislative Assembly and the Federal Chamber of Deputies. A performance rate was also presented to evaluate the efficiency of the methodology. Calculations and simulations were carried out using the free R statistical software.
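The Monte Carlo approach can be sketched with a generic highest-averages (D'Hondt) allocation, a rule used, with variations, in Brazilian legislative elections. The party shares, noise model, and seat count below are illustrative assumptions, not the paper's electoral data:

```python
import random

def dhondt(votes, seats):
    """Highest-averages (D'Hondt) seat allocation: repeatedly award the
    next seat to the party with the largest quotient votes/(seats+1)."""
    alloc = [0] * len(votes)
    for _ in range(seats):
        quotients = [v / (a + 1) for v, a in zip(votes, alloc)]
        alloc[quotients.index(max(quotients))] += 1
    return alloc

def p_representation(party, mean_shares, sd, seats, n_sims=2000, seed=42):
    """Monte Carlo estimate of the probability that `party` wins at
    least one seat, modeling poll uncertainty as Gaussian noise."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_sims):
        draw = [max(1e-9, rng.gauss(m, sd)) for m in mean_shares]
        if dhondt(draw, seats)[party] > 0:
            hits += 1
    return hits / n_sims

# Hypothetical 3-party poll: the small party hovers near the threshold.
prob = p_representation(party=2, mean_shares=[0.5, 0.4, 0.1],
                        sd=0.02, seats=10)
print(0.0 <= prob <= 1.0)  # True
```

This is the shape of the computation: a posterior (here crudely approximated by noisy shares) is pushed through the deterministic seat-allocation rule, and the event of interest is counted over simulations.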
BAYESIAN IMAGE RESTORATION, USING CONFIGURATIONS
Directory of Open Access Journals (Sweden)
Thordis Linda Thorarinsdottir
2011-05-01
Full Text Available In this paper, we develop a Bayesian procedure for removing noise from images that can be viewed as noisy realisations of random sets in the plane. The procedure utilises recent advances in configuration theory for noise-free random sets, where the probabilities of observing the different boundary configurations are expressed in terms of the mean normal measure of the random set. These probabilities are used as prior probabilities in a Bayesian image restoration approach. Estimation of the remaining parameters in the model is outlined for salt-and-pepper noise. The inference in the model is discussed in detail for 3 × 3 and 5 × 5 configurations and examples of the performance of the procedure are given.
Electronic warfare receivers and receiving systems
Poisel, Richard A
2014-01-01
Receiver systems are considered the core of electronic warfare (EW) intercept systems. Without them, the fundamental purpose of such systems is null and void. This book considers the major elements that make up receiver systems and the receivers that go in them. This resource provides system design engineers with techniques for the design and development of EW receivers for modern modulations (spread spectrum) in addition to receivers for older, common modulation formats. Each major module in these receivers is considered in detail. Design information is included as well as performance tradeoffs
Multiview Bayesian Correlated Component Analysis
DEFF Research Database (Denmark)
Kamronn, Simon Due; Poulsen, Andreas Trier; Hansen, Lars Kai
2015-01-01
are identical. Here we propose a hierarchical probabilistic model that can infer the level of universality in such multiview data, from completely unrelated representations, corresponding to canonical correlation analysis, to identical representations as in correlated component analysis. This new model, which we denote Bayesian correlated component analysis, evaluates favorably against three relevant algorithms in simulated data. A well-established benchmark EEG data set is used to further validate the new model and infer the variability of spatial representations across multiple subjects.
Target distribution in cooperative combat based on Bayesian optimization algorithm
Institute of Scientific and Technical Information of China (English)
Shi Zhifu; Zhang An; Wang Anli
2006-01-01
Target distribution in cooperative combat is a difficult and important problem. We build up the optimization model according to the rule of fire distribution and study the model with the Bayesian optimization algorithm (BOA). The BOA estimates the joint probability distribution of the variables with a Bayesian network, and new candidate solutions are generated from the joint distribution. A simulation example verified that the method can be used to solve this complex problem: the computation was fast and the solution was optimal.
Bayesian phylogeny analysis via stochastic approximation Monte Carlo
Cheon, Sooyoung
2009-11-01
Monte Carlo methods have received much attention in the recent literature on phylogeny analysis. However, conventional Markov chain Monte Carlo algorithms, such as the Metropolis-Hastings algorithm, tend to get trapped in a local mode when simulating from the posterior distribution of phylogenetic trees, rendering the inference ineffective. In this paper, we apply an advanced Monte Carlo algorithm, the stochastic approximation Monte Carlo (SAMC) algorithm, to Bayesian phylogeny analysis. Our method is compared with two popular Bayesian phylogeny software packages, BAMBE and MrBayes, on simulated and real datasets. The numerical results indicate that our method outperforms BAMBE and MrBayes. Among the three methods, SAMC produces the consensus trees with the highest similarity to the true trees and the model parameter estimates with the smallest mean square errors, yet costs the least CPU time. © 2009 Elsevier Inc. All rights reserved.
A Bayesian Classifier for X-Ray Pulsars Recognition
Directory of Open Access Journals (Sweden)
Hao Liang
2016-01-01
Full Text Available Recognition of X-ray pulsars is important for the problem of spacecraft attitude determination by X-ray Pulsar Navigation (XPNAV). By using the nonhomogeneous Poisson model of the received photons and the minimum recognition error criterion, a classifier based on the Bayesian theorem is proposed. For X-ray pulsar recognition with unknown Doppler frequency and initial phase, the features of every X-ray pulsar are extracted and the unknown parameters are estimated using the Maximum Likelihood (ML) method. In addition, a method to recognize unknown X-ray pulsars or X-ray disturbances is proposed. Simulation results confirm the validity of the proposed Bayesian classifier.
12th Brazilian Meeting on Bayesian Statistics
Louzada, Francisco; Rifo, Laura; Stern, Julio; Lauretto, Marcelo
2015-01-01
Through refereed papers, this volume focuses on the foundations of the Bayesian paradigm; their comparison to objectivistic or frequentist Statistics counterparts; and the appropriate application of Bayesian foundations. This research in Bayesian Statistics is applicable to data analysis in biostatistics, clinical trials, law, engineering, and the social sciences. EBEB, the Brazilian Meeting on Bayesian Statistics, is held every two years by the ISBrA, the International Society for Bayesian Analysis, one of the most active chapters of the ISBA. The 12th meeting took place March 10-14, 2014 in Atibaia. Interest in foundations of inductive Statistics has grown recently in accordance with the increasing availability of Bayesian methodological alternatives. Scientists need to deal with the ever more difficult choice of the optimal method to apply to their problem. This volume shows how Bayes can be the answer. The examination and discussion on the foundations work towards the goal of proper application of Bayesia...
A Bayesian model for binary Markov chains
Directory of Open Access Journals (Sweden)
Belkheir Essebbar
2004-02-01
Full Text Available This note is concerned with Bayesian estimation of the transition probabilities of a binary Markov chain observed from heterogeneous individuals. The model is founded on Jeffreys' prior, which allows the transition probabilities to be correlated. The Bayesian estimator is approximated by means of Markov chain Monte Carlo (MCMC) techniques. The performance of the Bayesian estimates is illustrated by analyzing a small simulated data set.
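As a rough illustration of the estimation problem above: for a single binary chain with independent Beta(1/2, 1/2) (Jeffreys) priors on each row of the transition matrix, the posterior is conjugate and available in closed form. This is a simplification of the paper's setting, whose correlated prior over heterogeneous individuals requires MCMC rather than this closed form.

```python
import numpy as np

def transition_counts(chain):
    """Count n[a][b] = number of observed transitions a -> b in a 0/1 chain."""
    n = np.zeros((2, 2), int)
    for a, b in zip(chain[:-1], chain[1:]):
        n[a, b] += 1
    return n

def posterior_mean(chain, a=0.5, b=0.5):
    """Posterior means of p01 = P(next=1 | now=0) and p10 = P(next=0 | now=1)
    under independent Beta(a, b) priors on each row (a = b = 1/2 is Jeffreys)."""
    n = transition_counts(chain)
    p01 = (a + n[0, 1]) / (a + b + n[0, 0] + n[0, 1])
    p10 = (a + n[1, 0]) / (a + b + n[1, 0] + n[1, 1])
    return p01, p10

p01, p10 = posterior_mean([0, 0, 1, 0, 1, 1, 0])
```

With counts n01 = 2 of 3 exits from state 0 and n10 = 2 of 3 exits from state 1, both posterior means come out to (0.5 + 2) / (1 + 3) = 0.625, slightly shrunk from the empirical 2/3 by the prior.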
Bayesian calibration of simultaneity in audiovisual temporal order judgments.
Directory of Open Access Journals (Sweden)
Shinya Yamamoto
Full Text Available After repeated exposures to two successive audiovisual stimuli presented in one frequent order, participants eventually perceive a pair separated by some lag time in the same order as occurring simultaneously (lag adaptation). In contrast, we previously found that perceptual changes occurred in the opposite direction in response to tactile stimuli, conforming to Bayesian integration theory (Bayesian calibration). We further showed, in theory, that the effect of Bayesian calibration cannot be observed when lag adaptation is fully operational. This led to the hypothesis that Bayesian calibration affects judgments regarding the order of audiovisual stimuli, but that this effect is concealed behind the lag adaptation mechanism. In the present study, we showed that lag adaptation is pitch-insensitive using two sounds at 1046 and 1480 Hz. This enabled us to cancel lag adaptation by associating one pitch with sound-first stimuli and the other with light-first stimuli. When we presented each type of stimulus (high- or low-tone) in a different block, the point of simultaneity shifted to "sound-first" for the pitch associated with sound-first stimuli, and to "light-first" for the pitch associated with light-first stimuli. These results are consistent with lag adaptation. In contrast, when we delivered each type of stimulus in a randomized order, the point of simultaneity shifted to "light-first" for the pitch associated with sound-first stimuli, and to "sound-first" for the pitch associated with light-first stimuli. The results clearly show that Bayesian calibration is pitch-specific and operates behind pitch-insensitive lag adaptation during temporal order judgment of audiovisual stimuli.
Barnabei, Luca; Marazìa, Stefania; De Caterina, Raffaele
2007-11-01
A common problem in diagnostic medicine, when performing a diagnostic test, is to obtain an accurate discrimination between 'normal' cases and cases with disease, owing to the overlapping distributions of these populations. In clinical practice, it is exceedingly rare that a chosen cut point will achieve perfect discrimination between normal cases and those with disease, and one has to select the best compromise between sensitivity and specificity by comparing the diagnostic performance of different tests or diagnostic criteria available. Receiver operating characteristic (or receiver operator characteristic, ROC) curves allow systematic and intuitively appealing descriptions of the diagnostic performance of a test and a comparison of the performance of different tests or diagnostic criteria. This review will analyse the basic principles underlying ROC curves and their specific application to the choice of optimal parameters on exercise electrocardiographic (ECG) stress testing. Part I will focus on theoretical description and analysis along with reviewing the common problems related to the diagnosis of myocardial ischaemia by means of exercise ECG stress testing. Part II will be devoted to applying ROC curves to available diagnostic criteria through the analysis of ECG stress test parameters.
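The sensitivity/specificity trade-off described above can be sketched directly: sweep every candidate cut point, compute sensitivity and specificity at each, and pick the "best compromise" by a common criterion such as Youden's J (one of several possible criteria, chosen here only for illustration; the scores and labels are invented toy data, not ECG stress-test results).

```python
import numpy as np

def roc_points(scores, labels):
    """Sensitivity and specificity at every candidate cut point.
    A case is called 'disease' when its score is >= the cut point."""
    scores = np.asarray(scores, float)
    labels = np.asarray(labels, int)
    pts = []
    for c in np.unique(scores):
        pred = scores >= c
        sens = np.mean(pred[labels == 1])    # true-positive rate
        spec = np.mean(~pred[labels == 0])   # true-negative rate
        pts.append((c, sens, spec))
    return pts

def best_youden(pts):
    # Youden's J = sensitivity + specificity - 1: one common compromise rule
    return max(pts, key=lambda t: t[1] + t[2] - 1)

scores = [0.5, 1.0, 1.5, 2.0, 0.5, 1.0, 2.5, 3.0]   # toy test measurements
labels = [0, 0, 0, 1, 0, 1, 1, 1]                   # 1 = disease present
cut, sens, spec = best_youden(roc_points(scores, labels))
```

Because the two populations overlap (one diseased case scores 1.0, inside the normal range), no cut point is perfect; the Youden-optimal cut of 2.0 trades a missed case (sensitivity 0.75) for no false positives (specificity 1.0).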
3rd Bayesian Young Statisticians Meeting
Lanzarone, Ettore; Villalobos, Isadora; Mattei, Alessandra
2017-01-01
This book is a selection of peer-reviewed contributions presented at the third Bayesian Young Statisticians Meeting, BAYSM 2016, Florence, Italy, June 19-21. The meeting provided a unique opportunity for young researchers, M.S. students, Ph.D. students, and postdocs dealing with Bayesian statistics to connect with the Bayesian community at large, to exchange ideas, and to network with others working in the same field. The contributions develop and apply Bayesian methods in a variety of fields, ranging from the traditional (e.g., biostatistics and reliability) to the most innovative ones (e.g., big data and networks).
International Nuclear Information System (INIS)
Montani, S.; Portinale, L.; Bobbio, A.; Codetta-Raiteri, D.
2008-01-01
In this paper, we present RADYBAN (Reliability Analysis with DYnamic BAyesian Networks), a software tool which analyzes a dynamic fault tree by converting it into a dynamic Bayesian network. The tool implements a modular algorithm for automatically translating a dynamic fault tree into the corresponding dynamic Bayesian network, and exploits classical algorithms for inference on dynamic Bayesian networks to compute reliability measures. After describing the basic features of the tool, we show how it operates on a real-world example, and we compare the unreliability results it generates with those returned by other methodologies in order to verify the correctness and consistency of the results obtained.
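The fault-tree-to-DBN idea above can be illustrated on the smallest possible case: an AND gate over two independent, non-repairable components becomes a two-slice DBN whose forward inference yields the system unreliability over time. This is a hand-rolled toy, not RADYBAN's translation algorithm; the failure probability and step count are arbitrary.

```python
import numpy as np

def unreliability(p_fail, steps):
    """Forward inference in a tiny two-slice DBN: components A and B each move
    working -> failed with probability p_fail per time slice and stay failed
    (no repair). The system (AND gate) is down once both have failed."""
    dist = np.zeros((2, 2))          # joint P(A_failed, B_failed)
    dist[0, 0] = 1.0                 # both working at time 0
    T = np.array([[1 - p_fail, p_fail],   # per-component transition T[old, new]
                  [0.0,        1.0]])
    out = []
    for _ in range(steps):
        dist = T.T @ dist @ T        # propagate A (rows) and B (cols) one slice
        out.append(dist[1, 1])       # P(both failed) = system unreliability
    return out

curve = unreliability(0.1, 3)
```

The DBN result can be checked against the static fault-tree formula for this gate, (1 - (1 - p)^n)^2, which it reproduces exactly; the value of the DBN encoding is that it extends to dependencies (spares, sequence enforcing) that the static formula cannot express.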
Bayesian phylogeography finds its roots.
Directory of Open Access Journals (Sweden)
Philippe Lemey
2009-09-01
Full Text Available As a key factor in endemic and epidemic dynamics, the geographical distribution of viruses has been frequently interpreted in the light of their genetic histories. Unfortunately, inference of historical dispersal or migration patterns of viruses has mainly been restricted to model-free heuristic approaches that provide little insight into the temporal setting of the spatial dynamics. The introduction of probabilistic models of evolution, however, offers unique opportunities to engage in this statistical endeavor. Here we introduce a Bayesian framework for inference, visualization and hypothesis testing of phylogeographic history. By implementing character mapping in Bayesian software that samples time-scaled phylogenies, we enable the reconstruction of timed viral dispersal patterns while accommodating phylogenetic uncertainty. Standard Markov model inference is extended with a stochastic search variable selection procedure that identifies parsimonious descriptions of the diffusion process. In addition, we propose priors that can incorporate geographical sampling distributions or characterize alternative hypotheses about the spatial dynamics. To visualize the spatial and temporal information, we summarize inferences using virtual globe software. We describe how Bayesian phylogeography compares with previous parsimony analysis in the investigation of the influenza A H5N1 origin and H5N1 epidemiological linkage among sampling localities. Analysis of rabies in West African dog populations reveals how virus diffusion may enable endemic maintenance through continuous epidemic cycles. From these analyses, we conclude that our phylogeographic framework will become an important asset in molecular epidemiology that can easily be generalized to infer biogeography from genetic data for many organisms.
Bayesian flood forecasting methods: A review
Han, Shasha; Coulibaly, Paulin
2017-08-01
Over the past few decades, floods have been among the most common and widely distributed natural disasters in the world. If floods could be accurately forecast in advance, their negative impacts could be greatly reduced. It is widely recognized that quantification and reduction of the uncertainty associated with hydrologic forecasts is of great importance for flood estimation and rational decision making. The Bayesian forecasting system (BFS) offers an ideal theoretical framework for uncertainty quantification that can be developed for probabilistic flood forecasting via any deterministic hydrologic model. It provides a suitable theoretical structure, empirically validated models and reasonable analytic-numerical computation methods, and can be developed into various Bayesian forecasting approaches. This paper presents a comprehensive review of Bayesian forecasting approaches applied to flood forecasting from 1999 to the present. The review starts with an overview of the fundamentals of the BFS and recent advances in it, follows with BFS applications in river stage forecasting and real-time flood forecasting, then moves to a critical analysis evaluating the advantages and limitations of Bayesian forecasting methods and other predictive uncertainty assessment approaches in flood forecasting, and finally discusses future research directions in Bayesian flood forecasting. Results show that the Bayesian flood forecasting approach is an effective and advanced way to perform flood estimation: it considers all sources of uncertainty and produces a predictive distribution of the river stage, river discharge or runoff, and thus gives more accurate and reliable flood forecasts. Some emerging Bayesian forecasting methods (e.g. the ensemble Bayesian forecasting system and Bayesian multi-model combination) were shown to overcome the limitations of a single model or fixed model weights and to effectively reduce predictive uncertainty. In recent years, various Bayesian flood forecasting approaches have been
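The multi-model combination idea mentioned above can be sketched in its simplest form: weight competing hydrologic models by their posterior probability given past forecast errors, then combine their new forecasts. This assumes Gaussian errors with a known, common standard deviation and equal prior model probabilities; the river-stage numbers are invented for illustration.

```python
import numpy as np

def normal_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def bma_weights(past_forecasts, past_obs, sigma=1.0):
    """Posterior model weights from past performance: weight proportional to
    the Gaussian likelihood of the observations under each model's forecasts
    (equal priors, known error sigma -- a bare-bones model combination)."""
    obs = np.asarray(past_obs, float)
    liks = np.array([np.prod(normal_pdf(obs, np.asarray(f, float), sigma))
                     for f in past_forecasts])
    return liks / liks.sum()

# Hypothetical past river-stage forecasts (m) from two hydrologic models
past_A = [10.0, 12.0, 11.0]
past_B = [14.0, 15.0, 9.0]
observed = [10.0, 12.0, 11.0]

w = bma_weights([past_A, past_B], observed)
combined = float(np.dot(w, [13.0, 16.0]))   # weighted combination of new forecasts
```

Model A matched the past observations exactly, so it receives nearly all the weight and the combined forecast sits essentially at A's new value; a full Bayesian combination would also propagate each model's predictive spread, yielding a distribution rather than a point forecast.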
Bayesian inference for Hawkes processes
DEFF Research Database (Denmark)
Rasmussen, Jakob Gulddahl
2013-01-01
The Hawkes process is a practically and theoretically important class of point processes, but parameter estimation for such a process can pose various problems. In this paper we explore and compare two approaches to Bayesian inference. The first approach is based on the so-called conditional intensity function, while the second approach is based on an underlying clustering and branching structure in the Hawkes process. For practical use, MCMC (Markov chain Monte Carlo) methods are employed. The two approaches are compared numerically using three examples of the Hawkes process.
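The conditional-intensity approach above can be sketched for the common exponential-kernel case, lambda(t) = mu + sum over past events of alpha * beta * exp(-beta * (t - t_i)). The sketch below uses a random-walk Metropolis sampler over (mu, alpha) with beta held fixed and flat priors, which is a toy stand-in for the paper's samplers, not a reproduction of them; the event times are synthetic.

```python
import numpy as np

def hawkes_loglik(times, T, mu, alpha, beta):
    """Exact log-likelihood on [0, T] from the conditional intensity, using
    the standard recursion for the exponentially decaying excitation term."""
    ll = -mu * T - alpha * np.sum(1.0 - np.exp(-beta * (T - times)))
    s = 0.0                                   # alpha*beta * sum of decayed kernels
    for i, t in enumerate(times):
        if i > 0:
            s = (s + alpha * beta) * np.exp(-beta * (t - times[i - 1]))
        ll += np.log(mu + s)
    return ll

def mh_posterior(times, T, beta, n_iter=2000, step=0.1, seed=0):
    """Random-walk Metropolis for (mu, alpha), beta fixed, flat priors on
    mu > 0 and on the branching ratio 0 < alpha < 1."""
    rng = np.random.default_rng(seed)
    mu, alpha = 1.0, 0.5
    ll = hawkes_loglik(times, T, mu, alpha, beta)
    out = np.empty((n_iter, 2))
    for i in range(n_iter):
        mu_p, al_p = mu + step * rng.normal(), alpha + step * rng.normal()
        if mu_p > 0 and 0 < al_p < 1:         # reject proposals outside the prior
            ll_p = hawkes_loglik(times, T, mu_p, al_p, beta)
            if np.log(rng.uniform()) < ll_p - ll:
                mu, alpha, ll = mu_p, al_p, ll_p
        out[i] = mu, alpha
    return out

events = np.sort(np.random.default_rng(1).uniform(0.0, 10.0, 30))
chain = mh_posterior(events, T=10.0, beta=1.0)
```

The alternative branching-structure approach would instead augment the sampler with latent parent assignments for each event, which the paper compares against this intensity-based route.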
Numeracy, frequency, and Bayesian reasoning
Directory of Open Access Journals (Sweden)
Gretchen B. Chapman
2009-02-01
Full Text Available Previous research has demonstrated that Bayesian reasoning performance is improved if uncertainty information is presented as natural frequencies rather than single-event probabilities. A questionnaire study of 342 college students replicated this effect but also found that the performance-boosting benefits of the natural frequency presentation occurred primarily for participants who scored high in numeracy. This finding suggests that even comprehension and manipulation of natural frequencies requires a certain threshold of numeracy abilities, and that the beneficial effects of natural frequency presentation may not be as general as previously believed.
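A worked example makes the two presentation formats above concrete. The numbers are hypothetical screening-test figures chosen for round counts, not data from the study: prevalence 1%, sensitivity 80%, false-positive rate 10%.

```python
# Probability format: Bayes' rule applied to single-event probabilities
prior, sens, fpr = 0.01, 0.80, 0.10
posterior = prior * sens / (prior * sens + (1 - prior) * fpr)

# Natural-frequency format: the same inference with whole-number counts.
# Out of 1,000 people: 10 are sick, of whom 8 test positive; of the 990
# healthy, 99 also test positive. The answer is just 8 / (8 + 99).
sick_pos = 8
healthy_pos = 99
posterior_nf = sick_pos / (sick_pos + healthy_pos)
```

Both routes give about 7.5%, but the frequency version replaces multiplication and normalization of probabilities with counting, which is the mechanism by which it is thought to ease Bayesian reasoning for people with sufficient numeracy.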
Customizable Digital Receivers for Radar
Moller, Delwyn; Heavey, Brandon; Sadowy, Gregory
2008-01-01
Compact, highly customizable digital receivers are being developed for the system described in 'Radar Interferometer for Topographic Mapping of Glaciers and Ice Sheets' (NPO-43962), NASA Tech Briefs, Vol. 31, No. 7 (August 2007), page 72. The receivers are required to operate in unison, sampling radar returns received by the antenna elements in a digital beam-forming (DBF) mode. The design of these receivers could also be adapted to commercial radar systems. At the time of reporting the information for this article, there were no commercially available digital receivers capable of satisfying all of the operational requirements and compact enough to be mounted directly on the antenna elements. A provided figure depicts the overall system of which the digital receivers are parts. Each digital receiver includes an analog-to-digital converter (ADC), a demultiplexer (DMUX), and a field-programmable gate array (FPGA). The ADC effects 10-bit band-pass sampling of input signals having frequencies up to 3.5 GHz. The input samples are demultiplexed at a user-selectable rate of 1:2 or 1:4, then buffered in part of the FPGA that functions as a first-in/first-out (FIFO) memory. Another part of the FPGA serves as a controller for the ADC, DMUX, and FIFO memory and as an interface between (1) the rest of the receiver and (2) a front-panel data port (FPDP) bus, which is an industry-standard parallel data bus that has a high data-rate capability and multichannel configuration suitable for DBF. Still other parts of the FPGA in each receiver perform signal-processing functions. The digital receivers can be configured to operate in a stand-alone mode, or in a multichannel mode as needed for DBF. The customizability of the receiver makes it applicable to a broad range of system architectures. The capability for operation of receivers in either a stand-alone or a DBF mode enables the use of the receivers in an unprecedentedly wide variety of radar systems.
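The 1:2 / 1:4 demultiplexing step described above is just round-robin distribution of the ADC sample stream onto parallel lanes, each running at a fraction of the ADC rate; a software sketch (illustrative, not the FPGA implementation):

```python
def demultiplex(samples, ratio):
    """Round-robin 1:N demultiplexing of an ADC sample stream: at slow-clock
    cycle k, lane i carries sample ratio*k + i, so each lane runs at
    1/ratio of the ADC rate (mirroring the user-selectable 1:2 / 1:4 DMUX)."""
    return [samples[i::ratio] for i in range(ratio)]

lanes = demultiplex(list(range(8)), 4)
# Regrouping the lanes gives the wide words, one per slow clock, that the
# FPGA buffers in its FIFO: (0, 1, 2, 3), then (4, 5, 6, 7)
words = list(zip(*lanes))
```

The pay-off is that downstream logic (the FIFO and FPDP interface) only has to run at the demultiplexed rate while still seeing every sample.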
Receiver-exciter controller design
Jansma, P. A.
1982-01-01
A description of the general design of both the block 3 and block 4 receiver-exciter controllers (RECs) for the Deep Space Network (DSN) Mark IV-A System is presented along with the design approach. The controllers are designed to enable the receiver-exciter subsystem (RCV) to be configured, calibrated, initialized and operated from a central location via high-level instructions. The RECs are designed to be operated under the control of the DMC subsystem. The instructions are in the form of standard subsystem blocks (SSBs) received via the local area network (LAN). The centralized control provided by the RECs and other DSCC controllers in Mark IV-A is intended to reduce DSN operations costs from the Mark III era.
Bayesian analysis of magnetic island dynamics
International Nuclear Information System (INIS)
Preuss, R.; Maraschek, M.; Zohm, H.; Dose, V.
2003-01-01
We examine a first-order differential equation with respect to time used to describe magnetic islands in magnetically confined plasmas. The free parameters of this equation are obtained by employing Bayesian probability theory. Additionally, a typical Bayesian change-point problem is solved in the process of obtaining the data.
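A Bayesian change-point problem of the kind mentioned above can be sketched in its most stripped-down form: a signal whose mean jumps at an unknown sample index, with known pre/post means and noise level and a flat prior on the change location. The numbers are synthetic; the paper's model and data are of course different.

```python
import numpy as np

def changepoint_posterior(y, mu1, mu2, sigma):
    """Posterior over the change-point index k (first k samples have mean mu1,
    the rest mu2), assuming known means, Gaussian noise of known sigma,
    and a flat prior on k."""
    y = np.asarray(y, float)
    n = len(y)
    logp = np.empty(n - 1)
    for k in range(1, n):
        resid = np.concatenate([y[:k] - mu1, y[k:] - mu2])
        logp[k - 1] = -0.5 * np.sum(resid ** 2) / sigma ** 2
    p = np.exp(logp - logp.max())            # normalize in a stable way
    return p / p.sum()

y = [0.1, -0.2, 0.0, 1.9, 2.1, 2.0]          # mean jumps from ~0 to ~2
post = changepoint_posterior(y, mu1=0.0, mu2=2.0, sigma=0.5)
```

The posterior concentrates almost entirely on a change after the third sample; with unknown means one would instead marginalize them out under conjugate priors, at the cost of a slightly longer derivation.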
Learning dynamic Bayesian networks with mixed variables
DEFF Research Database (Denmark)
Bøttcher, Susanne Gammelgaard
This paper considers dynamic Bayesian networks for discrete and continuous variables. We treat only the case where the distribution of the variables is conditional Gaussian. We show how to learn the parameters and structure of a dynamic Bayesian network and also how the Markov order can be learned...
Using Bayesian Networks to Improve Knowledge Assessment
Millan, Eva; Descalco, Luis; Castillo, Gladys; Oliveira, Paula; Diogo, Sandra
2013-01-01
In this paper, we describe the integration and evaluation of an existing generic Bayesian student model (GBSM) into an existing computerized testing system within the Mathematics Education Project (PmatE--Projecto Matematica Ensino) of the University of Aveiro. This generic Bayesian student model had been previously evaluated with simulated…
Using Bayesian belief networks in adaptive management.
J.B. Nyberg; B.G. Marcot; R. Sulyma
2006-01-01
Bayesian belief and decision networks are relatively new modeling methods that are especially well suited to adaptive-management applications, but they appear not to have been widely used in adaptive management to date. Bayesian belief networks (BBNs) can serve many purposes for practitioners of adaptive management, from illustrating system relations conceptually to...
Bayesian Decision Theoretical Framework for Clustering
Chen, Mo
2011-01-01
In this thesis, we establish a novel probabilistic framework for the data clustering problem from the perspective of Bayesian decision theory. The Bayesian decision theory view addresses the important questions of what a cluster is and what a clustering algorithm should optimize. We prove that the spectral clustering (to be specific, the…
Robust Bayesian detection of unmodelled bursts
International Nuclear Information System (INIS)
Searle, Antony C; Sutton, Patrick J; Tinto, Massimo; Woan, Graham
2008-01-01
We develop a Bayesian treatment of the problem of detecting unmodelled gravitational wave bursts using the new global network of interferometric detectors. We also compare this Bayesian treatment with existing coherent methods, and demonstrate that the existing methods make implicit assumptions about the distribution of signals that render them sub-optimal for realistic signal populations.
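The Bayesian detection logic above can be illustrated with a deliberately tiny stand-in: one datum per detector, x_d = h + n_d, with unit-variance Gaussian noise and a zero-mean Gaussian prior on a common burst amplitude h. Marginalizing h gives a closed-form Bayes factor between the signal and noise models; this replaces the full marginalization over burst waveforms with a single shared amplitude, purely for illustration.

```python
import numpy as np

def log_bayes_factor(x, sigma_h):
    """log B(signal vs noise) for two detectors with x_d = h + n_d,
    n_d ~ N(0, 1) independent, and a common amplitude h ~ N(0, sigma_h^2).
    Marginalizing h makes the signal model x ~ N(0, I + sigma_h^2 * 11^T)."""
    x = np.asarray(x, float)
    C1 = np.eye(2) + sigma_h ** 2 * np.ones((2, 2))   # signal-model covariance
    C0 = np.eye(2)                                    # noise-only covariance

    def log_gauss(x, C):
        return -0.5 * (x @ np.linalg.solve(C, x)
                       + np.log(np.linalg.det(C))
                       + len(x) * np.log(2 * np.pi))

    return log_gauss(x, C1) - log_gauss(x, C0)

coherent = log_bayes_factor([3.0, 3.0], sigma_h=2.0)      # same sign in both
incoherent = log_bayes_factor([3.0, -3.0], sigma_h=2.0)   # opposite signs
```

Equally loud but inconsistent data are *penalized* rather than rewarded, which is the sense in which the Bayesian treatment builds the signal-distribution assumption in explicitly instead of implicitly.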
Bayesian models: A statistical primer for ecologists
Hobbs, N. Thompson; Hooten, Mevin B.
2015-01-01
Bayesian modeling has become an indispensable tool for ecological research because it is uniquely suited to deal with complexity in a statistically coherent way. This textbook provides a comprehensive and accessible introduction to the latest Bayesian methods—in language ecologists can understand. Unlike other books on the subject, this one emphasizes the principles behind the computations, giving ecologists a big-picture understanding of how to implement this powerful statistical approach. Bayesian Models is an essential primer for non-statisticians. It begins with a definition of probability and develops a step-by-step sequence of connected ideas, including basic distribution theory, network diagrams, hierarchical models, Markov chain Monte Carlo, and inference from single and multiple models. This unique book places less emphasis on computer coding, favoring instead a concise presentation of the mathematical statistics needed to understand how and why Bayesian analysis works. It also explains how to write out properly formulated hierarchical Bayesian models and use them in computing, research papers, and proposals. This primer enables ecologists to understand the statistical principles behind Bayesian modeling and apply them to research, teaching, policy, and management. Presents the mathematical and statistical foundations of Bayesian modeling in language accessible to non-statisticians. Covers basic distribution theory, network diagrams, hierarchical models, Markov chain Monte Carlo, and more. Deemphasizes computer coding in favor of basic principles. Explains how to write out properly factored statistical expressions representing Bayesian models.
Particle identification in ALICE: a Bayesian approach
Adam, J.; Adamova, D.; Aggarwal, M. M.; Rinella, G. Aglieri; Agnello, M.; Agrawal, N.; Ahammed, Z.; Ahn, S. U.; Aiola, S.; Akindinov, A.; Alam, S. N.; Albuquerque, D. S. D.; Aleksandrov, D.; Alessandro, B.; Alexandre, D.; Alfaro Molina, R.; Alici, A.; Alkin, A.; Almaraz, J. R. M.; Alme, J.; Alt, T.; Altinpinar, S.; Altsybeev, I.; Alves Garcia Prado, C.; Andrei, C.; Andronic, A.; Anguelov, V.; Anticic, T.; Antinori, F.; Antonioli, P.; Aphecetche, L.; Appelshaeuser, H.; Arcelli, S.; Arnaldi, R.; Arnold, O. W.; Arsene, I. C.; Arslandok, M.; Audurier, B.; Augustinus, A.; Averbeck, R.; Azmi, M. D.; Badala, A.; Baek, Y. W.; Bagnasco, S.; Bailhache, R.; Bala, R.; Balasubramanian, S.; Baldisseri, A.; Baral, R. C.; Barbano, A. M.; Barbera, R.; Barile, F.; Barnafoeldi, G. G.; Barnby, L. S.; Barret, V.; Bartalini, P.; Barth, K.; Bartke, J.; Bartsch, E.; Basile, M.; Bastid, N.; Bathen, B.; Batigne, G.; Camejo, A. Batista; Batyunya, B.; Batzing, P. C.; Bearden, I. G.; Beck, H.; Bedda, C.; Behera, N. K.; Belikov, I.; Bellini, F.; Bello Martinez, H.; Bellwied, R.; Belmont, R.; Belmont-Moreno, E.; Belyaev, V.; Benacek, P.; Bencedi, G.; Beole, S.; Berceanu, I.; Bercuci, A.; Berdnikov, Y.; Berenyi, D.; Bertens, R. A.; Berzano, D.; Betev, L.; Bhasin, A.; Bhat, I. R.; Bhati, A. K.; Bhattacharjee, B.; Bhom, J.; Bianchi, L.; Bianchi, N.; Bianchin, C.; Bielcik, J.; Bielcikova, J.; Bilandzic, A.; Biro, G.; Biswas, R.; Biswas, S.; Bjelogrlic, S.; Blair, J. T.; Blau, D.; Blume, C.; Bock, F.; Bogdanov, A.; Boggild, H.; Boldizsar, L.; Bombara, M.; Book, J.; Borel, H.; Borissov, A.; Borri, M.; Bossu, F.; Botta, E.; Bourjau, C.; Braun-Munzinger, P.; Bregant, M.; Breitner, T.; Broker, T. A.; Browning, T. A.; Broz, M.; Brucken, E. J.; Bruna, E.; Bruno, G. E.; Budnikov, D.; Buesching, H.; Bufalino, S.; Buncic, P.; Busch, O.; Buthelezi, Z.; Butt, J. B.; Buxton, J. T.; Cabala, J.; Caffarri, D.; Cai, X.; Caines, H.; Diaz, L. 
Calero; Caliva, A.; Calvo Villar, E.; Camerini, P.; Carena, F.; Carena, W.; Carnesecchi, F.; Castellanos, J. Castillo; Castro, A. J.; Casula, E. A. R.; Sanchez, C. Ceballos; Cepila, J.; Cerello, P.; Cerkala, J.; Chang, B.; Chapeland, S.; Chartier, M.; Charvet, J. L.; Chattopadhyay, S.; Chattopadhyay, S.; Chauvin, A.; Chelnokov, V.; Cherney, M.; Cheshkov, C.; Cheynis, B.; Barroso, V. Chibante; Chinellato, D. D.; Cho, S.; Chochula, P.; Choi, K.; Chojnacki, M.; Choudhury, S.; Christakoglou, P.; Christensen, C. H.; Christiansen, P.; Chujo, T.; Cicalo, C.; Cifarelli, L.; Cindolo, F.; Cleymans, J.; Colamaria, F.; Colella, D.; Collu, A.; Colocci, M.; Balbastre, G. Conesa; del Valle, Z. Conesa; Connors, M. E.; Contreras, J. G.; Cormier, T. M.; Morales, Y. Corrales; Cortes Maldonado, I.; Cortese, P.; Cosentino, M. R.; Costa, F.; Crochet, P.; Cruz Albino, R.; Cuautle, E.; Cunqueiro, L.; Dahms, T.; Dainese, A.; Danisch, M. C.; Danu, A.; Das, I.; Das, S.; Dash, A.; Dash, S.; De, S.; De Caro, A.; de Cataldo, G.; de Conti, C.; de Cuveland, J.; De Falco, A.; De Gruttola, D.; De Marco, N.; De Pasquale, S.; Deisting, A.; Deloff, A.; Denes, E.; Deplano, C.; Dhankher, P.; Di Bari, D.; Di Mauro, A.; Di Nezza, P.; Corchero, M. A. Diaz; Dietel, T.; Dillenseger, P.; Divia, R.; Djuvsland, O.; Dobrin, A.; Gimenez, D. Domenicis; Doenigus, B.; Dordic, O.; Drozhzhova, T.; Dubey, A. K.; Dubla, A.; Ducroux, L.; Dupieux, P.; Ehlers, R. J.; Elia, D.; Endress, E.; Engel, H.; Epple, E.; Erazmus, B.; Erdemir, I.; Erhardt, F.; Espagnon, B.; Estienne, M.; Esumi, S.; Eum, J.; Evans, D.; Evdokimov, S.; Eyyubova, G.; Fabbietti, L.; Fabris, D.; Faivre, J.; Fantoni, A.; Fasel, M.; Feldkamp, L.; Feliciello, A.; Feofilov, G.; Ferencei, J.; Fernandez Tellez, A.; Ferreiro, E. G.; Ferretti, A.; Festanti, A.; Feuillard, V. J. G.; Figiel, J.; Figueredo, M. A. S.; Filchagin, S.; Finogeev, D.; Fionda, F. M.; Fiore, E. M.; Fleck, M. 
G.; Floris, M.; Foertsch, S.; Foka, P.; Fokin, S.; Fragiacomo, E.; Francescon, A.; Frankenfeld, U.; Fronze, G. G.; Fuchs, U.; Furget, C.; Furs, A.; Girard, M. Fusco; Gaardhoje, J. J.; Gagliardi, M.; Gago, A. M.; Gallio, M.; Gangadharan, D. R.; Ganoti, P.; Gao, C.; Garabatos, C.; Garcia-Solis, E.; Gargiulo, C.; Gasik, P.; Gauger, E. F.; Germain, M.; Gheata, A.; Gheata, M.; Gianotti, P.; Giubellino, P.; Giubilato, P.; Gladysz-Dziadus, E.; Glaessel, P.; Gomez Coral, D. M.; Ramirez, A. Gomez; Gonzalez, A. S.; Gonzalez, V.; Gonzalez-Zamora, P.; Gorbunov, S.; Goerlich, L.; Gotovac, S.; Grabski, V.; Grachov, O. A.; Graczykowski, L. K.; Graham, K. L.; Grelli, A.; Grigoras, A.; Grigoras, C.; Grigoriev, V.; Grigoryan, A.; Grigoryan, S.; Grinyov, B.; Grion, N.; Gronefeld, J. M.; Grosse-Oetringhaus, J. F.; Grosso, R.; Guber, F.; Guernane, R.; Guerzoni, B.; Gulbrandsen, K.; Gunji, T.; Gupta, A.; Haake, R.; Haaland, O.; Hadjidakis, C.; Haiduc, M.; Hamagaki, H.; Hamar, G.; Hamon, J. C.; Harris, J. W.; Harton, A.; Hatzifotiadou, D.; Hayashi, S.; Heckel, S. T.; Hellbaer, E.; Helstrup, H.; Herghelegiu, A.; Herrera Corral, G.; Hess, B. A.; Hetland, K. F.; Hillemanns, H.; Hippolyte, B.; Horak, D.; Hosokawa, R.; Hristov, P.; Humanic, T. J.; Hussain, N.; Hussain, T.; Hutter, D.; Hwang, D. S.; Ilkaev, R.; Inaba, M.; Incani, E.; Ippolitov, M.; Irfan, M.; Ivanov, M.; Ivanov, V.; Izucheev, V.; Jacazio, N.; Jadhav, M. B.; Jadlovska, S.; Jadlovsky, J.; Jahnke, C.; Jakubowska, M. J.; Jang, H. J.; Janik, M. A.; Jayarathna, P. H. S. Y.; Jena, C.; Jena, S.; Bustamante, R. T. Jimenez; Jones, P. G.; Jusko, A.; Kalinak, P.; Kalweit, A.; Kamin, J.; Kaplin, V.; Kar, S.; Uysal, A. Karasu; Karavichev, O.; Karavicheva, T.; Karayan, L.; Karpechev, E.; Kebschull, U.; Keidel, R.; Keijdener, D. L. D.; Keil, M.; Khan, M. Mohisin; Khan, P.; Khan, S. A.; Khanzadeev, A.; Kharlov, Y.; Kileng, B.; Kim, D. W.; Kim, D. J.; Kim, D.; Kim, J. 
S.; Kim, M.; Kim, T.; Kirsch, S.; Kisel, I.; Kiselev, S.; Kisiel, A.; Kiss, G.; Klay, J. L.; Klein, C.; Klein-Boesing, C.; Klewin, S.; Kluge, A.; Knichel, M. L.; Knospe, A. G.; Kobdaj, C.; Kofarago, M.; Kollegger, T.; Kolojvari, A.; Kondratiev, V.; Kondratyeva, N.; Kondratyuk, E.; Konevskikh, A.; Kopcik, M.; Kostarakis, P.; Kour, M.; Kouzinopoulos, C.; Kovalenko, O.; Kovalenko, V.; Kowalski, M.; Meethaleveedu, G. Koyithatta; Kralik, I.; Kravcakova, A.; Krivda, M.; Krizek, F.; Kryshen, E.; Krzewicki, M.; Kubera, A. M.; Kucera, V.; Kuijer, P. G.; Kumar, J.; Kumar, L.; Kumar, S.; Kurashvili, P.; Kurepin, A.; Kurepin, A. B.; Kuryakin, A.; Kweon, M. J.; Kwon, Y.; La Pointe, S. L.; La Rocca, P.; Ladron de Guevara, P.; Lagana Fernandes, C.; Lakomov, I.; Langoy, R.; Lara, C.; Lardeux, A.; Lattuca, A.; Laudi, E.; Lea, R.; Leardini, L.; Lee, G. R.; Lee, S.; Lehas, F.; Lemmon, R. C.; Lenti, V.; Leogrande, E.; Monzon, I. Leon; Leon Vargas, H.; Leoncino, M.; Levai, P.; Lien, J.; Lietava, R.; Lindal, S.; Lindenstruth, V.; Lippmann, C.; Lisa, M. A.; Ljunggren, H. M.; Lodato, D. F.; Loenne, P. I.; Loginov, V.; Loizides, C.; Lopez, X.; Torres, E. Lopez; Lowe, A.; Luettig, P.; Lunardon, M.; Luparello, G.; Lutz, T. H.; Maevskaya, A.; Mager, M.; Mahajan, S.; Mahmood, S. M.; Maire, A.; Majka, R. D.; Malaev, M.; Maldonado Cervantes, I.; Malinina, L.; Mal'Kevich, D.; Malzacher, P.; Mamonov, A.; Manko, V.; Manso, F.; Manzari, V.; Marchisone, M.; Mares, J.; Margagliotti, G. V.; Margotti, A.; Margutti, J.; Marin, A.; Markert, C.; Marquard, M.; Martin, N. A.; Blanco, J. Martin; Martinengo, P.; Martinez, M. I.; Garcia, G. Martinez; Pedreira, M. Martinez; Mas, A.; Masciocchi, S.; Masera, M.; Masoni, A.; Mastroserio, A.; Matyja, A.; Mayer, C.; Mazer, J.; Mazzoni, M. A.; Mcdonald, D.; Meddi, F.; Melikyan, Y.; Menchaca-Rocha, A.; Meninno, E.; Perez, J. Mercado; Meres, M.; Miake, Y.; Mieskolainen, M. M.; Mikhaylov, K.; Milano, L.; Milosevic, J.; Mischke, A.; Mishra, A. 
N.; Miskowiec, D.; Mitra, J.; Mitu, C. M.; Mohammadi, N.; Mohanty, B.; Molnar, L.; Montano Zetina, L.; Montes, E.; De Godoy, D. A. Moreira; Moreno, L. A. P.; Moretto, S.; Morreale, A.; Morsch, A.; Muccifora, V.; Mudnic, E.; Muehlheim, D.; Muhuri, S.; Mukherjee, M.; Mulligan, J. D.; Munhoz, M. G.; Munzer, R. H.; Murakami, H.; Murray, S.; Musa, L.; Musinsky, J.; Naik, B.; Nair, R.; Nandi, B. K.; Nania, R.; Nappi, E.; Naru, M. U.; Natal da Luz, H.; Nattrass, C.; Navarro, S. R.; Nayak, K.; Nayak, R.; Nayak, T. K.; Nazarenko, S.; Nedosekin, A.; Nellen, L.; Ng, F.; Nicassio, M.; Niculescu, M.; Niedziela, J.; Nielsen, B. S.; Nikolaev, S.; Nikulin, S.; Nikulin, V.; Noferini, F.; Nomokonov, P.; Nooren, G.; Noris, J. C. C.; Norman, J.; Nyanin, A.; Nystrand, J.; Oeschler, H.; Oh, S.; Oh, S. K.; Ohlson, A.; Okatan, A.; Okubo, T.; Olah, L.; Oleniacz, J.; Oliveira Da Silva, A. C.; Oliver, M. H.; Onderwaater, J.; Oppedisano, C.; Orava, R.; Oravec, M.; Ortiz Velasquez, A.; Oskarsson, A.; Otwinowski, J.; Oyama, K.; Ozdemir, M.; Pachmayer, Y.; Pagano, D.; Pagano, P.; Paic, G.; Pal, S. K.; Pan, J.; Papikyan, V.; Pappalardo, G. S.; Pareek, P.; Park, W. J.; Parmar, S.; Passfeld, A.; Paticchio, V.; Patra, R. N.; Paul, B.; Pei, H.; Peitzmann, T.; Da Costa, H. Pereira; Peresunko, D.; Lara, C. E. Perez; Lezama, E. Perez; Peskov, V.; Pestov, Y.; Petracek, V.; Petrov, V.; Petrovici, M.; Petta, C.; Piano, S.; Pikna, M.; Pillot, P.; Pimentel, L. O. D. L.; Pinazza, O.; Pinsky, L.; Piyarathna, D. B.; Ploskon, M.; Planinic, M.; Pluta, J.; Pochybova, S.; Podesta-Lerma, P. L. M.; Poghosyan, M. G.; Polichtchouk, B.; Poljak, N.; Poonsawat, W.; Pop, A.; Porteboeuf-Houssais, S.; Porter, J.; Pospisil, J.; Prasad, S. K.; Preghenella, R.; Prino, F.; Pruneau, C. A.; Pshenichnov, I.; Puccio, M.; Puddu, G.; Pujahari, P.; Punin, V.; Putschke, J.; Qvigstad, H.; Rachevski, A.; Raha, S.; Rajput, S.; Rak, J.; Rakotozafindrabe, A.; Ramello, L.; Rami, F.; Raniwala, R.; Raniwala, S.; Raesaenen, S. S.; Rascanu, B. 
T.; Rathee, D.; Read, K. F.; Redlich, K.; Reed, R. J.; Reichelt, P.; Reidt, F.; Ren, X.; Renfordt, R.; Reolon, A. R.; Reshetin, A.; Reygers, K.; Riabov, V.; Ricci, R. A.; Richert, T.; Richter, M.; Riedler, P.; Riegler, W.; Riggi, F.; Ristea, C.; Rocco, E.; Rodriguez Cahuantzi, M.; Manso, A. Rodriguez; Roed, K.; Rogochaya, E.; Rohr, D.; Roehrich, D.; Ronchetti, F.; Ronflette, L.; Rosnet, P.; Rossi, A.; Roukoutakis, F.; Roy, A.; Roy, C.; Roy, P.; Montero, A. J. Rubio; Rui, R.; Russo, R.; Ryabinkin, E.; Ryabov, Y.; Rybicki, A.; Saarinen, S.; Sadhu, S.; Sadovsky, S.; Safarik, K.; Sahlmuller, B.; Sahoo, P.; Sahoo, R.; Sahoo, S.; Sahu, P. K.; Saini, J.; Sakai, S.; Saleh, M. A.; Salzwedel, J.; Sambyal, S.; Samsonov, V.; Sandor, L.; Sandoval, A.; Sano, M.; Sarkar, D.; Sarkar, N.; Sarma, P.; Scapparone, E.; Scarlassara, F.; Schiaua, C.; Schicker, R.; Schmidt, C.; Schmidt, H. R.; Schuchmann, S.; Schukraft, J.; Schulc, M.; Schutz, Y.; Schwarz, K.; Schweda, K.; Scioli, G.; Scomparin, E.; Scott, R.; Sefcik, M.; Seger, J. E.; Sekiguchi, Y.; Sekihata, D.; Selyuzhenkov, I.; Senosi, K.; Senyukov, S.; Serradilla, E.; Sevcenco, A.; Shabanov, A.; Shabetai, A.; Shadura, O.; Shahoyan, R.; Shahzad, M. I.; Shangaraev, A.; Sharma, M.; Sharma, M.; Sharma, N.; Sheikh, A. I.; Shigaki, K.; Shou, Q.; Shtejer, K.; Sibiriak, Y.; Siddhanta, S.; Sielewicz, K. M.; Siemiarczuk, T.; Silvermyr, D.; Silvestre, C.; Simatovic, G.; Simonetti, G.; Singaraju, R.; Singh, R.; Singha, S.; Singhal, V.; Sinha, B. C.; Sinha, T.; Sitar, B.; Sitta, M.; Skaali, T. B.; Slupecki, M.; Smirnov, N.; Snellings, R. J. M.; Snellman, T. W.; Song, J.; Song, M.; Song, Z.; Soramel, F.; Sorensen, S.; de Souza, R. D.; Sozzi, F.; Spacek, M.; Spiriti, E.; Sputowska, I.; Spyropoulou-Stassinaki, M.; Stachel, J.; Stan, I.; Stankus, P.; Stenlund, E.; Steyn, G.; Stiller, J. H.; Stocco, D.; Strmen, P.; Suaide, A. A. 
P.; Sugitate, T.; Suire, C.; Suleymanov, M.; Suljic, M.; Sultanov, R.; Sumbera, M.; Sumowidagdo, S.; Szabo, A.; Szanto de Toledo, A.; Szarka, I.; Szczepankiewicz, A.; Szymanski, M.; Tabassam, U.; Takahashi, J.; Tambave, G. J.; Tanaka, N.; Tarhini, M.; Tariq, M.; Tarzila, M. G.; Tauro, A.; Tejeda Munoz, G.; Telesca, A.; Terasaki, K.; Terrevoli, C.; Teyssier, B.; Thaeder, J.; Thakur, D.; Thomas, D.; Tieulent, R.; Timmins, A. R.; Toia, A.; Trogolo, S.; Trombetta, G.; Trubnikov, V.; Trzaska, W. H.; Tsuji, T.; Tumkin, A.; Turrisi, R.; Tveter, T. S.; Ullaland, K.; Uras, A.; Usai, G. L.; Utrobicic, A.; Vala, M.; Palomo, L. Valencia; Vallero, S.; Van Der Maarel, J.; Van Hoorne, J. W.; van Leeuwen, M.; Vanat, T.; Vyvre, P. Vande; Varga, D.; Vargas, A.; Vargyas, M.; Varma, R.; Vasileiou, M.; Vasiliev, A.; Vauthier, A.; Vechernin, V.; Veen, A. M.; Veldhoen, M.; Velure, A.; Vercellin, E.; Vergara Limon, S.; Vernet, R.; Verweij, M.; Vickovic, L.; Viesti, G.; Viinikainen, J.; Vilakazi, Z.; Baillie, O. Villalobos; Villatoro Tello, A.; Vinogradov, A.; Vinogradov, L.; Vinogradov, Y.; Virgili, T.; Vislavicius, V.; Viyogi, Y. P.; Vodopyanov, A.; Voelkl, M. A.; Voloshin, K.; Voloshin, S. A.; Volpe, G.; von Haller, B.; Vorobyev, I.; Vranic, D.; Vrlakova, J.; Vulpescu, B.; Wagner, B.; Wagner, J.; Wang, H.; Watanabe, D.; Watanabe, Y.; Weiser, D. F.; Westerhoff, U.; Whitehead, A. M.; Wiechula, J.; Wikne, J.; Wilk, G.; Wilkinson, J.; Williams, M. C. S.; Windelband, B.; Winn, M.; Yang, H.; Yano, S.; Yasin, Z.; Yokoyama, H.; Yoo, I. -K.; Yoon, J. H.; Yurchenko, V.; Yushmanov, I.; Zaborowska, A.; Zaccolo, V.; Zaman, A.; Zampolli, C.; Zanoli, H. J. C.; Zaporozhets, S.; Zardoshti, N.; Zarochentsev, A.; Zavada, P.; Zaviyalov, N.; Zbroszczyk, H.; Zgura, I. S.; Zhalov, M.; Zhang, C.; Zhao, C.; Zhigareva, N.; Zhou, Y.; Zhou, Z.; Zhu, H.; Zichichi, A.; Zimmermann, A.; Zimmermann, M. B.; Zinovjev, G.; Zyzak, M.
2016-01-01
We present a Bayesian approach to particle identification (PID) within the ALICE experiment. The aim is to more effectively combine the particle identification capabilities of its various detectors. After a brief explanation of the adopted methodology and formalism, the performance of the Bayesian
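The core of such a combined-detector PID is Bayes' rule applied to per-detector likelihoods. A minimal sketch of that combination (the species priors, detector names, and likelihood values below are invented for illustration, not ALICE's actual numbers):

```python
import math

def bayes_pid(priors, detector_likelihoods):
    """Combine per-detector likelihoods P(signal | species) with
    species priors into posterior species probabilities.

    priors: dict species -> prior probability
    detector_likelihoods: list of dicts, one per detector,
        each mapping species -> likelihood of the observed signal.
    """
    post = {}
    for s, p in priors.items():
        like = 1.0
        for det in detector_likelihoods:
            like *= det[s]          # detectors assumed independent
        post[s] = p * like
    z = sum(post.values())          # normalise
    return {s: v / z for s, v in post.items()}

# Hypothetical likelihoods for one track seen by two detectors
priors = {"pion": 0.8, "kaon": 0.15, "proton": 0.05}
tpc = {"pion": 0.30, "kaon": 0.60, "proton": 0.10}   # e.g. TPC dE/dx
tof = {"pion": 0.20, "kaon": 0.70, "proton": 0.10}   # e.g. time of flight
posterior = bayes_pid(priors, [tpc, tof])
```

The independence assumption across detectors is a common simplification; the experiment's actual treatment of priors and detector responses is more involved.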
Advances in Bayesian Modeling in Educational Research
Levy, Roy
2016-01-01
In this article, I provide a conceptually oriented overview of Bayesian approaches to statistical inference and contrast them with frequentist approaches that currently dominate conventional practice in educational research. The features and advantages of Bayesian approaches are illustrated with examples spanning several statistical modeling…
Compiling Relational Bayesian Networks for Exact Inference
DEFF Research Database (Denmark)
Jaeger, Manfred; Darwiche, Adnan; Chavira, Mark
2006-01-01
We describe in this paper a system for exact inference with relational Bayesian networks as defined in the publicly available PRIMULA tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference...
Compiling Relational Bayesian Networks for Exact Inference
DEFF Research Database (Denmark)
Jaeger, Manfred; Chavira, Mark; Darwiche, Adnan
2004-01-01
We describe a system for exact inference with relational Bayesian networks as defined in the publicly available PRIMULA tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference by evaluating...
Marcin, Martin; Abramovici, Alexander
2008-01-01
The software of a commercially available digital radio receiver has been modified to make the receiver function as a two-channel low-noise phase meter. This phase meter is a prototype in the continuing development of a phase meter for a system in which radiofrequency (RF) signals in the two channels would be outputs of a spaceborne heterodyne laser interferometer for detecting gravitational waves. The frequencies of the signals could include a common Doppler-shift component of as much as 15 MHz. The phase meter is required to measure the relative phases of the signals in the two channels at a sampling rate of 10 Hz, with a noise level (expressed as a root power spectral density) low enough for measurements in laser metrology of moving bodies. To illustrate part of the principle of operation of the phase meter, the figure includes a simplified block diagram of a basic single-channel digital receiver. The input RF signal is first fed to the input terminal of an analog-to-digital converter (ADC). To prevent aliasing errors in the ADC, the sampling rate must be at least twice the input signal frequency. The sampling rate of the ADC is governed by a sampling clock, which also drives a digital local oscillator (DLO), which is a direct digital frequency synthesizer. The DLO produces samples of sine and cosine signals at a programmed tuning frequency. The sine and cosine samples are mixed with (that is, multiplied by) the samples from the ADC, then low-pass filtered to obtain in-phase (I) and quadrature (Q) signal components. A digital signal processor (DSP) computes the ratio between the Q and I components, computes the phase of the RF signal (relative to that of the DLO signal) as the arctangent of this ratio, and then averages successive such phase values over a time interval specified by the user.
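The mix, low-pass, arctangent chain described above can be sketched in a few lines; here the "low-pass filter" is simply an average over the sample record, and the signal parameters are invented for illustration:

```python
import math

def measure_phase(samples, f_lo, f_s):
    """Estimate the phase of a sampled tone relative to a digital
    local oscillator (DLO) at frequency f_lo: mix with DLO cosine
    and sine, low-pass (here: average over the record), then take
    the arctangent of the Q/I ratio."""
    n = len(samples)
    i_acc = q_acc = 0.0
    for k, x in enumerate(samples):
        t = k / f_s
        i_acc += x * math.cos(2 * math.pi * f_lo * t)
        q_acc -= x * math.sin(2 * math.pi * f_lo * t)
    return math.atan2(q_acc / n, i_acc / n)

# Synthetic RF tone with a known 0.7 rad phase offset (invented values)
f_s, f_rf, true_phi = 1.0e6, 50.0e3, 0.7
sig = [math.cos(2 * math.pi * f_rf * k / f_s + true_phi)
       for k in range(1000)]
est = measure_phase(sig, f_rf, f_s)
```

Averaging over an integer number of signal periods makes the double-frequency mixing product cancel exactly, so the recovered phase matches the injected offset.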
Receiver Test Selection Criteria
2015-03-12
The DOT requests that GPS manufacturers submit receivers for test in the following TWG categories: - Aviation (non-certified), cellular, general location/navigation, high precision, timing, networks, and space-based receivers - Each receiver should b...
BELM: Bayesian extreme learning machine.
Soria-Olivas, Emilio; Gómez-Sanchis, Juan; Martín, José D; Vila-Francés, Joan; Martínez, Marcelino; Magdalena, José R; Serrano, Antonio J
2011-03-01
The theory of extreme learning machine (ELM) has become very popular in the last few years. ELM is a new approach for learning the parameters of the hidden layers of a multilayer neural network (such as the multilayer perceptron or the radial basis function neural network). Its main advantage is the lower computational cost, which is especially relevant when dealing with many patterns defined in a high-dimensional space. This brief proposes a Bayesian approach to ELM, which presents some advantages over other approaches: it allows the introduction of a priori knowledge; obtains the confidence intervals (CIs) without the need of applying methods that are computationally intensive, e.g., bootstrap; and presents high generalization capabilities. Bayesian ELM is benchmarked against classical ELM in several artificial and real datasets that are widely used for the evaluation of machine learning algorithms. Achieved results show that the proposed approach produces a competitive accuracy with some additional advantages, namely, automatic production of CIs, reduction of probability of model overfitting, and use of a priori knowledge.
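One way to realize a Bayesian ELM of this kind is to keep the random hidden layer of a standard ELM and replace the least-squares output layer with Bayesian linear regression, which yields predictive variances (hence CIs) in closed form. A sketch under that assumption (hyperparameters alpha and beta are fixed by hand here, not learned as in the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

def elm_features(X, W, b):
    """Random hidden layer of an ELM: fixed random weights, tanh units."""
    return np.tanh(X @ W + b)

# Toy 1-D regression problem
X = np.linspace(-3, 3, 100)[:, None]
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(100)

n_hidden = 30
W = rng.standard_normal((1, n_hidden))
b = rng.standard_normal(n_hidden)
H = elm_features(X, W, b)

# Bayesian linear regression on the hidden features:
# weight prior N(0, alpha^-1 I), known noise precision beta (both assumed)
alpha, beta = 1.0, 100.0
A = alpha * np.eye(n_hidden) + beta * H.T @ H      # posterior precision
m_w = beta * np.linalg.solve(A, H.T @ y)           # posterior mean weights

# Predictive mean and variance at a new input -> confidence interval
H_new = elm_features(np.array([[0.0]]), W, b)
pred_mean = (H_new @ m_w).item()
pred_var = (1.0 / beta + H_new @ np.linalg.solve(A, H_new.T)).item()
ci = (pred_mean - 1.96 * pred_var ** 0.5, pred_mean + 1.96 * pred_var ** 0.5)
```

The CI comes out of the posterior algebra directly, with no bootstrap resampling, which is the computational advantage the abstract highlights.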
BAYESIAN BICLUSTERING FOR PATIENT STRATIFICATION.
Khakabimamaghani, Sahand; Ester, Martin
2016-01-01
The move from Empirical Medicine towards Personalized Medicine has attracted attention to Stratified Medicine (SM). Some methods are provided in the literature for patient stratification, the central task of SM; however, there are still significant open issues. First, it is still unclear if integrating different datatypes will help in detecting disease subtypes more accurately, and, if not, which datatype(s) are most useful for this task. Second, it is not clear how we can compare different methods of patient stratification. Third, as most of the proposed stratification methods are deterministic, there is a need for investigating the potential benefits of applying probabilistic methods. To address these issues, we introduce a novel integrative Bayesian biclustering method, called B2PS, for patient stratification and propose methods for evaluating the results. Our experimental results demonstrate the superiority of B2PS over a popular state-of-the-art method and the benefits of Bayesian approaches. Our results agree with the intuition that transcriptomic data forms a better basis for patient stratification than genomic data.
Bayesian Nonparametric Longitudinal Data Analysis.
Quintana, Fernando A; Johnson, Wesley O; Waetjen, Elaine; Gold, Ellen
2016-01-01
Practical Bayesian nonparametric methods have been developed across a wide variety of contexts. Here, we develop a novel statistical model that generalizes standard mixed models for longitudinal data that include flexible mean functions as well as combined compound symmetry (CS) and autoregressive (AR) covariance structures. AR structure is often specified through the use of a Gaussian process (GP) with covariance functions that allow longitudinal data to be more correlated if they are observed closer in time than if they are observed farther apart. We allow for AR structure by considering a broader class of models that incorporates a Dirichlet Process Mixture (DPM) over the covariance parameters of the GP. We are able to take advantage of modern Bayesian statistical methods in making full predictive inferences about characteristics of longitudinal profiles and their differences across covariate combinations. We also take advantage of the generality of our model, which provides for estimation of a variety of covariance structures. We observe that models that fail to incorporate CS or AR structure can result in very poor estimation of a covariance or correlation matrix. In our illustration using hormone data observed on women through the menopausal transition, biology dictates the use of a generalized family of sigmoid functions as a model for time trends across subpopulation categories.
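The combined CS + AR covariance structure mentioned above can be illustrated directly; this sketch uses an AR(1)-style decay for the GP part and invented parameter values:

```python
import numpy as np

def cs_ar_cov(times, sigma2_cs, sigma2_ar, phi, sigma2_e):
    """Combined compound-symmetry (CS) + autoregressive (AR) covariance
    for longitudinal observations at the given times: CS contributes a
    constant between-time covariance, the AR (Gaussian-process style)
    part decays with the time gap, and sigma2_e is measurement noise."""
    t = np.asarray(times, dtype=float)
    gap = np.abs(t[:, None] - t[None, :])
    cov = sigma2_cs + sigma2_ar * phi ** gap   # CS + AR(1)-type decay
    cov += sigma2_e * np.eye(len(t))           # nugget / noise term
    return cov

# Four irregularly spaced visits, made-up variance components
C = cs_ar_cov([0.0, 1.0, 2.0, 5.0],
              sigma2_cs=0.5, sigma2_ar=1.0, phi=0.6, sigma2_e=0.1)
```

Observations close in time (gap 1) end up more correlated than distant ones (gap 5), while the CS floor keeps all within-subject pairs correlated, which is the behavior the abstract describes.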
Ensemble Bayesian forecasting system Part I: Theory and algorithms
Herr, Henry D.; Krzysztofowicz, Roman
2015-05-01
The ensemble Bayesian forecasting system (EBFS), whose theory was published in 2001, is developed for the purpose of quantifying the total uncertainty about a discrete-time, continuous-state, non-stationary stochastic process such as a time series of stages, discharges, or volumes at a river gauge. The EBFS is built of three components: an input ensemble forecaster (IEF), which simulates the uncertainty associated with random inputs; a deterministic hydrologic model (of any complexity), which simulates physical processes within a river basin; and a hydrologic uncertainty processor (HUP), which simulates the hydrologic uncertainty (an aggregate of all uncertainties except input). It works as a Monte Carlo simulator: an ensemble of time series of inputs (e.g., precipitation amounts) generated by the IEF is transformed deterministically through a hydrologic model into an ensemble of time series of outputs, which is next transformed stochastically by the HUP into an ensemble of time series of predictands (e.g., river stages). Previous research indicated that in order to attain an acceptable sampling error, the ensemble size must be on the order of hundreds (for probabilistic river stage forecasts and probabilistic flood forecasts) or even thousands (for probabilistic stage transition forecasts). The computing time needed to run the hydrologic model this many times renders the straightforward simulations operationally infeasible. This motivates the development of the ensemble Bayesian forecasting system with randomization (EBFSR), which takes full advantage of the analytic meta-Gaussian HUP and generates multiple ensemble members after each run of the hydrologic model; this auxiliary randomization reduces the required size of the meteorological input ensemble and makes it operationally feasible to generate a Bayesian ensemble forecast of large size. Such a forecast quantifies the total uncertainty, is well calibrated against the prior (climatic) distribution of
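The Monte Carlo chain described (input ensemble, deterministic hydrologic model, stochastic HUP) can be sketched with stand-in components; the model, noise level, and flood threshold below are invented, not a real hydrologic model:

```python
import random

random.seed(7)

def hydro_model(precip):
    """Stand-in deterministic hydrologic model: maps a precipitation
    amount (mm) to a river stage (purely illustrative)."""
    return 1.0 + 0.8 * precip ** 0.9

def hup(stage):
    """Stand-in hydrologic uncertainty processor: perturbs the model
    output to represent all non-input uncertainty."""
    return stage + random.gauss(0.0, 0.2)

# IEF: ensemble of input scenarios (precipitation amounts, mm)
inputs = [max(0.0, random.gauss(10.0, 3.0)) for _ in range(500)]

# Deterministic transform, then stochastic HUP transform
stages = [hup(hydro_model(p)) for p in inputs]

# The ensemble of predictands approximates the predictive distribution
prob_flood = sum(s > 8.0 for s in stages) / len(stages)
```

The abstract's point about ensemble size shows up here directly: the sampling error of `prob_flood` shrinks only as the square root of the ensemble size, which is what motivates the randomization (EBFSR) variant.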
ANUBIS: artificial neuromodulation using a Bayesian inference system.
Smith, Benjamin J H; Saaj, Chakravarthini M; Allouis, Elie
2013-01-01
Gain tuning is a crucial part of controller design and depends not only on an accurate understanding of the system in question, but also on the designer's ability to predict what disturbances and other perturbations the system will encounter throughout its operation. This letter presents ANUBIS (artificial neuromodulation using a Bayesian inference system), a novel biologically inspired technique for automatically tuning controller parameters in real time. ANUBIS is based on the Bayesian brain concept and modifies it by incorporating a model of the neuromodulatory system comprising four artificial neuromodulators. It has been applied to the controller of EchinoBot, a prototype walking rover for Martian exploration. ANUBIS has been implemented at three levels of the controller: gait generation, foot trajectory planning using Bézier curves, and foot trajectory tracking using a terminal sliding mode controller. We compare the results to a similar system that has been tuned using a multilayer perceptron (MLP). The use of Bayesian inference means that the system retains mathematical interpretability, unlike other intelligent tuning techniques, which use neural networks, fuzzy logic, or evolutionary algorithms. The simulation results show that ANUBIS provides significant improvements in efficiency and adaptability of the three controller components; it allows the robot to react to obstacles and uncertainties faster than the system tuned with the MLP, while maintaining stability and accuracy. As well as advancing rover autonomy, ANUBIS could also be applied to other situations where operating conditions are likely to change or cannot be accurately modeled in advance, such as process control. In addition, it demonstrates one way in which neuromodulation could fit into the Bayesian brain framework.
2nd Bayesian Young Statisticians Meeting
Bitto, Angela; Kastner, Gregor; Posekany, Alexandra
2015-01-01
The Second Bayesian Young Statisticians Meeting (BAYSM 2014) and the research presented here facilitate connections among researchers using Bayesian Statistics by providing a forum for the development and exchange of ideas. WU Vienna University of Business and Economics hosted BAYSM 2014 from September 18th to 19th. The guidance of renowned plenary lecturers and senior discussants is a critical part of the meeting and this volume, which follows publication of contributions from BAYSM 2013. The meeting's scientific program reflected the variety of fields in which Bayesian methods are currently employed or could be introduced in the future. Three brilliant keynote lectures by Chris Holmes (University of Oxford), Christian Robert (Université Paris-Dauphine), and Mike West (Duke University), were complemented by 24 plenary talks covering the major topics Dynamic Models, Applications, Bayesian Nonparametrics, Biostatistics, Bayesian Methods in Economics, and Models and Methods, as well as a lively poster session ...
Bayesian natural language semantics and pragmatics
Zeevat, Henk
2015-01-01
The contributions in this volume focus on the Bayesian interpretation of natural languages, which is widely used in areas of artificial intelligence, cognitive science, and computational linguistics. This is the first volume to take up topics in Bayesian Natural Language Interpretation and make proposals based on information theory, probability theory, and related fields. The methodologies offered here extend to the target semantic and pragmatic analyses of computational natural language interpretation. Bayesian approaches to natural language semantics and pragmatics are based on methods from signal processing and the causal Bayesian models pioneered by especially Pearl. In signal processing, the Bayesian method finds the most probable interpretation by finding the one that maximizes the product of the prior probability and the likelihood of the interpretation. It thus stresses the importance of a production model for interpretation as in Grice's contributions to pragmatics or in interpretation by abduction.
Tada, Toshifumi; Kumada, Takashi; Toyoda, Hidenori; Tsuji, Kunihiko; Hiraoka, Atsushi; Tanaka, Junko
2017-02-01
Nucleos(t)ide analogue (NA) therapy has been reported to reduce the risk of hepatocellular carcinoma (HCC) development in patients with chronic hepatitis B (CHB). However, even during NA therapy, development of HCC has been observed in patients with CHB. Therefore, we clarified the predictive power of clinical factors for HCC incidence using receiver operating characteristic (ROC) analysis that takes time dependence into account. A total of 539 patients with CHB treated with NAs were enrolled. Univariate, multivariate, and time-dependent ROC curves for clinical factors associated with the development of HCC were analyzed. Eighty-one patients developed HCC during the follow-up period (median duration, 5.9 years). α-fetoprotein (AFP) and FIB-4 index at 24 weeks from the initiation of treatment and sex were significantly associated with HCC incidence according to the log-rank test. Cox proportional hazards models including the covariates of sex, hepatitis B genotype, basal core promoter mutations, AFP at 24 weeks, and FIB-4 index at 24 weeks showed that a FIB-4 index >2.65 (HR, 5.03; 95% CI, 3.06-8.26) was significantly associated with HCC development; an elevated FIB-4 index in patients with CHB receiving NA therapy is therefore a risk factor for developing HCC. The FIB-4 index is an excellent predictor of HCC development. © 2016 Journal of Gastroenterology and Hepatology Foundation and John Wiley & Sons Australia, Ltd.
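For reference, the FIB-4 index combines age, aminotransferases, and platelet count. A sketch using the standard formula (the patient values below are hypothetical, and the 2.65 cut-off is the one reported in this study):

```python
import math

def fib4_index(age_years, ast_u_l, alt_u_l, platelets_10e9_l):
    """FIB-4 fibrosis index:
    FIB-4 = (age [y] * AST [U/L]) / (platelets [10^9/L] * sqrt(ALT [U/L]))
    In this study, values above 2.65 flagged elevated HCC risk."""
    return (age_years * ast_u_l) / (platelets_10e9_l * math.sqrt(alt_u_l))

# Hypothetical patient values (illustrative only)
fib4 = fib4_index(age_years=60, ast_u_l=48, alt_u_l=36,
                  platelets_10e9_l=140)
high_risk = fib4 > 2.65
```

Note the units: platelets in 10^9/L (equivalently 10^3/µL); using raw counts per µL would change the result by orders of magnitude.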
Bayesian methods for chromosome dosimetry following a criticality accident
International Nuclear Information System (INIS)
Brame, R.S.; Groer, P.G.
2003-01-01
Radiation doses received during a criticality accident will be from a combination of fission spectrum neutrons and gamma rays. It is desirable to estimate the total dose, as well as the neutron and gamma doses. Present methods for dose estimation with chromosome aberrations after a criticality accident use point estimates of the neutron to gamma dose ratio obtained from personnel dosemeters and/or accident reconstruction calculations. In this paper a Bayesian approach to dose estimation with chromosome aberrations is developed that allows the uncertainty of the dose ratio to be considered. Posterior probability densities for the total and the neutron and gamma doses were derived. (author)
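A grid-based version of such a posterior can be sketched: Poisson-distributed aberration counts whose expected yield depends on the neutron and gamma doses, with a prior on the neutron fraction standing in for the uncertain dose ratio. All coefficients and counts below are invented, not calibrated dose-response values:

```python
import math

# Hedged illustration: dicentric yield per cell
#   y = a_n*Dn + a_g*Dg + b_g*Dg**2   (made-up coefficients)
a_n, a_g, b_g = 0.6, 0.03, 0.06
cells, observed = 500, 420          # scored cells, dicentrics seen

def log_poisson(k, lam):
    """Log pmf of a Poisson(lam) distribution at k."""
    return k * math.log(lam) - lam - math.lgamma(k + 1)

# Grid over total dose D (Gy) and neutron fraction f = Dn / D, with an
# informative Gaussian prior on f (from dosemeters / reconstruction)
doses = [0.05 * i for i in range(1, 121)]
fracs = [0.02 * j for j in range(1, 50)]
f_mean, f_sd = 0.7, 0.1             # assumed prior on neutron fraction

post = {}
for d in doses:
    tot = 0.0
    for f in fracs:
        dn, dg = f * d, (1 - f) * d
        lam = cells * (a_n * dn + a_g * dg + b_g * dg ** 2)
        lp = log_poisson(observed, lam)
        lp += -0.5 * ((f - f_mean) / f_sd) ** 2   # prior on f
        tot += math.exp(lp)
    post[d] = tot                    # marginal (unnormalised) over f

z = sum(post.values())
mean_dose = sum(d * p for d, p in post.items()) / z
```

Marginalising over the fraction grid is what lets the dose-ratio uncertainty propagate into the total-dose posterior, instead of fixing the ratio at a point estimate.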
Refinement of Bayesian Network Structures upon New Data
DEFF Research Database (Denmark)
Zeng, Yifeng; Xiang, Yanping; Pacekajus, Saulius
2010-01-01
Refinement of Bayesian network (BN) structures using new data becomes more and more relevant. Some work has been done there; however, one problem has not been considered yet – what to do when new data have fewer or more attributes than the existing model. In both cases, data contain important...... knowledge and every effort must be made in order to extract it. In this paper, we propose a general merging algorithm to deal with situations when new data have different set of attributes. The merging algorithm updates sufficient statistics when new data are received. It expands the flexibility of BN...
Bayesian hierarchical modelling of North Atlantic windiness
Vanem, E.; Breivik, O. N.
2013-03-01
Extreme weather conditions represent serious natural hazards to ship operations and may be the direct cause or contributing factor to maritime accidents. Such severe environmental conditions can be taken into account in ship design and operational windows can be defined that limits hazardous operations to less extreme conditions. Nevertheless, possible changes in the statistics of extreme weather conditions, possibly due to anthropogenic climate change, represent an additional hazard to ship operations that is less straightforward to account for in a consistent way. Obviously, there are large uncertainties as to how future climate change will affect the extreme weather conditions at sea and there is a need for stochastic models that can describe the variability in both space and time at various scales of the environmental conditions. Previously, Bayesian hierarchical space-time models have been developed to describe the variability and complex dependence structures of significant wave height in space and time. These models were found to perform reasonably well and provided some interesting results, in particular, pertaining to long-term trends in the wave climate. In this paper, a similar framework is applied to oceanic windiness and the spatial and temporal variability of the 10-m wind speed over an area in the North Atlantic ocean is investigated. When the results from the model for North Atlantic windiness is compared to the results for significant wave height over the same area, it is interesting to observe that whereas an increasing trend in significant wave height was identified, no statistically significant long-term trend was estimated in windiness. This may indicate that the increase in significant wave height is not due to an increase in locally generated wind waves, but rather to increased swell. This observation is also consistent with studies that have suggested a poleward shift of the main storm tracks.
Bayesian hierarchical modelling of North Atlantic windiness
Directory of Open Access Journals (Sweden)
E. Vanem
2013-03-01
Full Text Available Extreme weather conditions represent serious natural hazards to ship operations and may be the direct cause or contributing factor to maritime accidents. Such severe environmental conditions can be taken into account in ship design and operational windows can be defined that limits hazardous operations to less extreme conditions. Nevertheless, possible changes in the statistics of extreme weather conditions, possibly due to anthropogenic climate change, represent an additional hazard to ship operations that is less straightforward to account for in a consistent way. Obviously, there are large uncertainties as to how future climate change will affect the extreme weather conditions at sea and there is a need for stochastic models that can describe the variability in both space and time at various scales of the environmental conditions. Previously, Bayesian hierarchical space-time models have been developed to describe the variability and complex dependence structures of significant wave height in space and time. These models were found to perform reasonably well and provided some interesting results, in particular, pertaining to long-term trends in the wave climate. In this paper, a similar framework is applied to oceanic windiness and the spatial and temporal variability of the 10-m wind speed over an area in the North Atlantic ocean is investigated. When the results from the model for North Atlantic windiness is compared to the results for significant wave height over the same area, it is interesting to observe that whereas an increasing trend in significant wave height was identified, no statistically significant long-term trend was estimated in windiness. This may indicate that the increase in significant wave height is not due to an increase in locally generated wind waves, but rather to increased swell. This observation is also consistent with studies that have suggested a poleward shift of the main storm tracks.
Bayesian Ranging for Radio Localization with and without Line-of-Sight Detection
DEFF Research Database (Denmark)
Jing, Lishuai; Pedersen, Troels; Fleury, Bernard Henri
2015-01-01
We consider Bayesian ranging methods for localization in wireless communication systems. Based on a channel model and given priors for the range and the line-of-sight (LOS) condition, we propose range estimators with and without LOS detection. Since the pdf of the received frequency...
Sparse-grid, reduced-basis Bayesian inversion: Nonaffine-parametric nonlinear equations
Energy Technology Data Exchange (ETDEWEB)
Chen, Peng, E-mail: peng@ices.utexas.edu [The Institute for Computational Engineering and Sciences, The University of Texas at Austin, 201 East 24th Street, Stop C0200, Austin, TX 78712-1229 (United States); Schwab, Christoph, E-mail: christoph.schwab@sam.math.ethz.ch [Seminar für Angewandte Mathematik, Eidgenössische Technische Hochschule, Römistrasse 101, CH-8092 Zürich (Switzerland)
2016-07-01
We extend the reduced basis (RB) accelerated Bayesian inversion methods for affine-parametric, linear operator equations which are considered in [16,17] to non-affine, nonlinear parametric operator equations. We generalize the analysis of sparsity of parametric forward solution maps in [20] and of Bayesian inversion in [48,49] to the fully discrete setting, including Petrov–Galerkin high-fidelity (“HiFi”) discretization of the forward maps. We develop adaptive, stochastic collocation based reduction methods for the efficient computation of reduced bases on the parametric solution manifold. The nonaffinity and nonlinearity with respect to (w.r.t.) the distributed, uncertain parameters and the unknown solution is collocated; specifically, by the so-called Empirical Interpolation Method (EIM). For the corresponding Bayesian inversion problems, computational efficiency is enhanced in two ways: first, expectations w.r.t. the posterior are computed by adaptive quadratures with dimension-independent convergence rates proposed in [49]; the present work generalizes [49] to account for the impact of the PG discretization in the forward maps on the convergence rates of the Quantities of Interest (QoI for short). Second, we propose to perform the Bayesian estimation only w.r.t. a parsimonious, RB approximation of the posterior density. Based on the approximation results in [49], the infinite-dimensional parametric, deterministic forward map and operator admit N-term RB and EIM approximations which converge at rates which depend only on the sparsity of the parametric forward map. In several numerical experiments, the proposed algorithms exhibit dimension-independent convergence rates which equal, at least, the currently known rate estimates for N-term approximation. We propose to accelerate Bayesian estimation by first offline construction of reduced basis surrogates of the Bayesian posterior density. The parsimonious surrogates can then be employed for online data
Directory of Open Access Journals (Sweden)
Rubens Angulo Filho
2002-01-01
Full Text Available To evaluate the planimetric positioning accuracy of a GPS receiver (Trimble/Pro-XL operating under different conditions of vegetation cover (pasture, rubber trees, eucalyptus and pine trees), 6 control points were located randomly in the study areas. For comparison, their coordinates were first obtained by a conventional surveying method, according to NBR 13133 of the Brazilian Surveying Standards. Afterwards, the GPS receiver was positioned alternately on those control points, with occupation times of 1, 5 and 10 min at a constant data acquisition rate of 1 s, and post-processed differential correction (DGPS was applied to the data. According to the methodology applied and the results obtained, it was possible to separate planimetric positioning accuracy, by type of vegetation cover, into two groups: without and with tree cover, confirming the interference of the canopy on the reception of the signals emitted by the GPS satellites. Longer occupation times improved planimetric positioning accuracy, which confirms that the choice of survey methodology is fundamental to obtaining good positioning results.
A Bayesian Reflection on Surfaces
Directory of Open Access Journals (Sweden)
David R. Wolf
1999-10-01
Full Text Available Abstract: The topic of this paper is a novel Bayesian continuous-basis field representation and inference framework. Within this paper several problems are solved: the maximally informative inference of continuous-basis fields, that is, where the basis for the field is itself a continuous object and not representable in a finite manner; the tradeoff between accuracy of representation in terms of information learned, and memory or storage capacity in bits; the approximation of probability distributions so that a maximal amount of information about the object being inferred is preserved; and an information theoretic justification for multigrid methodology. The maximally informative field inference framework is described in full generality and denoted the Generalized Kalman Filter. The Generalized Kalman Filter allows the update of field knowledge from previous knowledge at any scale, and new data, to new knowledge at any other scale. An application example instance, the inference of continuous surfaces from measurements (for example, camera image data), is presented.
Attention in a bayesian framework
DEFF Research Database (Denmark)
Whiteley, Louise Emma; Sahani, Maneesh
2012-01-01
, and include both selective phenomena, where attention is invoked by cues that point to particular stimuli, and integrative phenomena, where attention is invoked dynamically by endogenous processing. However, most previous Bayesian accounts of attention have focused on describing relatively simple experimental...... selective and integrative roles, and thus cannot be easily extended to complex environments. We suggest that the resource bottleneck stems from the computational intractability of exact perceptual inference in complex settings, and that attention reflects an evolved mechanism for approximate inference which...... can be shaped to refine the local accuracy of perception. We show that this approach extends the simple picture of attention as prior, so as to provide a unified and computationally driven account of both selective and integrative attentional phenomena....
Nonparametric Bayesian inference in biostatistics
Müller, Peter
2015-01-01
Nonparametric Bayesian approaches (BNP) play an ever expanding role in biostatistical inference, from use in proteomics to clinical trials. As chapters in this book demonstrate, BNP has important uses in clinical sciences and inference for issues like unknown partitions in genomics. Many research problems involve an abundance of data and require flexible and complex probability models beyond the traditional parametric approaches. As this book's expert contributors show, BNP approaches can be the answer. Survival Analysis, in particular survival regression, has traditionally used BNP, but BNP's potential is now very broad. This applies to important tasks like arrangement of patients into clinically meaningful subpopulations and segmenting the genome into functionally distinct regions. This book is designed to both review and introduce application areas for BNP. While existing books provide theoretical foundations, this book connects theory to practice through engaging examples and research questions. Chapters c...
Bayesian estimation in homodyne interferometry
International Nuclear Information System (INIS)
Olivares, Stefano; Paris, Matteo G A
2009-01-01
We address phase-shift estimation by means of a squeezed vacuum probe and homodyne detection. We analyse the Bayesian estimator, which is known to asymptotically saturate the classical Cramer-Rao bound to the variance, and discuss convergence by looking at the a posteriori distribution as the number of measurements increases. We also suggest two feasible adaptive methods, acting on the squeezing parameter and/or the homodyne local oscillator phase, which allow us to optimize homodyne detection and approach the ultimate bound to precision imposed by the quantum Cramer-Rao theorem. The performances of our two-step methods are investigated by means of Monte Carlo simulated experiments with a small number of homodyne data, thus giving a quantitative meaning to the notion of asymptotic optimality.
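A toy version of the Bayesian phase estimator: the homodyne quadrature of squeezed vacuum has a phase-dependent variance, so the posterior over the phase can be computed on a grid from simulated homodyne data. The squeezing value, sample size, and flat prior below are arbitrary choices, not the paper's settings:

```python
import math, random

random.seed(1)
r = 1.0                              # squeezing parameter (assumed)

def quad_var(phase):
    """Homodyne quadrature variance of phase-shifted squeezed vacuum
    (vacuum variance normalised to 1)."""
    return (math.exp(-2 * r) * math.cos(phase) ** 2
            + math.exp(2 * r) * math.sin(phase) ** 2)

true_phase = 0.3
data = [random.gauss(0.0, math.sqrt(quad_var(true_phase)))
        for _ in range(2000)]

# Posterior on a grid with a flat prior over (0, pi/2)
grid = [i * (math.pi / 2) / 400 for i in range(1, 400)]
s2 = sum(x * x for x in data)        # sufficient statistic for a
n = len(data)                        # zero-mean Gaussian likelihood
lls = [-0.5 * s2 / quad_var(p) - 0.5 * n * math.log(quad_var(p))
       for p in grid]
mx = max(lls)                        # stabilise the exponentials
w = [math.exp(l - mx) for l in lls]
phase_est = sum(p * wi for p, wi in zip(grid, w)) / sum(w)
```

With the posterior in hand, one could also inspect its width as the number of measurements grows, which is the convergence question the abstract discusses.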
Bayesian Kernel Mixtures for Counts.
Canale, Antonio; Dunson, David B
2011-12-01
Although Bayesian nonparametric mixture models for continuous data are well developed, there is a limited literature on related approaches for count data. A common strategy is to use a mixture of Poissons, which unfortunately is quite restrictive in not accounting for distributions having variance less than the mean. Other approaches include mixing multinomials, which requires finite support, and using a Dirichlet process prior with a Poisson base measure, which does not allow smooth deviations from the Poisson. As a broad class of alternative models, we propose to use nonparametric mixtures of rounded continuous kernels. An efficient Gibbs sampler is developed for posterior computation, and a simulation study is performed to assess performance. Focusing on the rounded Gaussian case, we generalize the modeling framework to account for multivariate count data, joint modeling with continuous and categorical variables, and other complications. The methods are illustrated through applications to a developmental toxicity study and marketing data. This article has supplementary material online.
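The rounded-Gaussian idea can be illustrated with a single kernel: a latent Gaussian is thresholded into bins, giving a count distribution that, unlike the Poisson, can have variance below its mean. The thresholding rule below is one simple choice for illustration, not necessarily the paper's exact scheme:

```python
import math

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def rounded_gauss_pmf(j, mu, sigma):
    """P(Y = j) for a rounded Gaussian count: Y = j iff the latent
    X ~ N(mu, sigma^2) falls in [j, j+1), with all mass below 0
    assigned to j = 0."""
    upper = phi((j + 1 - mu) / sigma)
    lower = 0.0 if j == 0 else phi((j - mu) / sigma)
    return upper - lower

# Narrow latent Gaussian -> underdispersed counts (invented parameters)
pmf = [rounded_gauss_pmf(j, mu=4.5, sigma=0.5) for j in range(20)]
mean = sum(j * p for j, p in enumerate(pmf))
var = sum(j * j * p for j, p in enumerate(pmf)) - mean ** 2
```

Here the count variance is far below the mean, exactly the regime the abstract notes a Poisson mixture cannot represent.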
Bayesian networks in educational assessment
Almond, Russell G; Steinberg, Linda S; Yan, Duanli; Williamson, David M
2015-01-01
Bayesian inference networks, a synthesis of statistics and expert systems, have advanced reasoning under uncertainty in medicine, business, and social sciences. This innovative volume is the first comprehensive treatment exploring how they can be applied to design and analyze innovative educational assessments. Part I develops Bayes nets’ foundations in assessment, statistics, and graph theory, and works through the real-time updating algorithm. Part II addresses parametric forms for use with assessment, model-checking techniques, and estimation with the EM algorithm and Markov chain Monte Carlo (MCMC). A unique feature is the volume’s grounding in Evidence-Centered Design (ECD) framework for assessment design. This “design forward” approach enables designers to take full advantage of Bayes nets’ modularity and ability to model complex evidentiary relationships that arise from performance in interactive, technology-rich assessments such as simulations. Part III describes ECD, situates Bayes nets as ...
Bayesian Action&Perception: Representing the World in the Brain
Directory of Open Access Journals (Sweden)
Gerald E. Loeb
2014-10-01
Full Text Available Theories of perception seek to explain how sensory data are processed to identify previously experienced objects, but they usually do not consider the decisions and effort that go into acquiring the sensory data. Identification of objects according to their tactile properties requires active exploratory movements. The sensory data thereby obtained depend on the details of those movements, which human subjects change rapidly and seemingly capriciously. Bayesian Exploration is an algorithm that uses prior experience to decide which next exploratory movement should provide the most useful data to disambiguate the most likely possibilities. In previous studies, a simple robot equipped with a biomimetic tactile sensor and operated according to Bayesian Exploration performed a texture identification task in a manner similar to, and actually better than, humans. Expanding on this, Bayesian Action&Perception refers to the construction and querying of an associative memory of previously experienced entities containing both sensory data and the motor programs that elicited them. We hypothesize that this memory can be queried (i) to identify useful next exploratory movements during identification of an unknown entity (action for perception), (ii) to characterize whether an unknown entity is fit for purpose (perception for action), or (iii) to recall what actions might be feasible for a known entity (Gibsonian affordance). The biomimetic design of this mechatronic system may provide insights into the neuronal basis of biological action and perception.
Robust bayesian analysis of an autoregressive model with ...
African Journals Online (AJOL)
In this work, a robust Bayesian analysis of the Bayesian estimation of an autoregressive model with exponential innovations is performed. Using a Bayesian robustness methodology, we show that, using a suitable generalized quadratic loss, we obtain optimal Bayesian estimators of the parameters corresponding to the ...
International Nuclear Information System (INIS)
Gudur, Madhu Sudhan Reddy; Hara, Wendy; Le, Quynh-Thu; Wang, Lei; Xing, Lei; Li, Ruijiang
2014-01-01
MRI significantly improves the accuracy and reliability of target delineation in radiation therapy for certain tumors due to its superior soft tissue contrast compared to CT. A treatment planning process with MRI as the sole imaging modality will eliminate systematic CT/MRI co-registration errors, reduce cost and radiation exposure, and simplify clinical workflow. However, MRI lacks the key electron density information necessary for accurate dose calculation and generating reference images for patient setup. The purpose of this work is to develop a unifying method to derive electron density from standard T1-weighted MRI. We propose to combine both intensity and geometry information into a unifying probabilistic Bayesian framework for electron density mapping. For each voxel, we compute two conditional probability density functions (PDFs) of electron density given its: (1) T1-weighted MRI intensity, and (2) geometry in a reference anatomy, obtained by deformable image registration between the MRI of the atlas and test patient. The two conditional PDFs containing intensity and geometry information are combined into a unifying posterior PDF, whose mean value corresponds to the optimal electron density value under the mean-square error criterion. We evaluated the algorithm’s accuracy of electron density mapping and its ability to detect bone in the head for eight patients, using an additional patient as the atlas or template. Mean absolute HU error between the estimated and true CT, as well as receiver operating characteristics for bone detection (HU > 200) were calculated. The performance was compared with a global intensity approach based on T1 and no density correction (set whole head to water). The proposed technique significantly reduced the errors in electron density estimation, with a mean absolute HU error of 126, compared with 139 for deformable registration (p = 2 × 10^−4), 283 for the intensity approach (p = 2 × 10^−6) and 282
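The fusion step described above can be sketched numerically. The Gaussian conditionals and all numbers below are assumptions for illustration (the paper estimates these PDFs from an atlas): the intensity-based and geometry-based conditional PDFs are multiplied into a posterior, whose mean is the MSE-optimal density estimate.

```python
import math

def gauss_pdf(x, mu, sigma):
    # normal density, used as a stand-in for the estimated conditional PDFs
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

grid = [i * 0.01 for i in range(301)]                    # candidate relative densities
p_intensity = [gauss_pdf(x, 1.10, 0.20) for x in grid]   # PDF given T1 intensity (assumed)
p_geometry  = [gauss_pdf(x, 1.30, 0.10) for x in grid]   # PDF given atlas geometry (assumed)

post = [a * b for a, b in zip(p_intensity, p_geometry)]  # unnormalized posterior
z = sum(post)
post = [p / z for p in post]
rho_hat = sum(x * p for x, p in zip(grid, post))         # posterior mean = MSE-optimal estimate
print(f"fused density estimate: {rho_hat:.3f}")
```

Note that the fused estimate is pulled toward the tighter (geometry) PDF, which is exactly the behavior that lets geometry rescue voxels whose intensity is ambiguous.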
Bayesian models a statistical primer for ecologists
Hobbs, N Thompson
2015-01-01
Bayesian modeling has become an indispensable tool for ecological research because it is uniquely suited to deal with complexity in a statistically coherent way. This textbook provides a comprehensive and accessible introduction to the latest Bayesian methods-in language ecologists can understand. Unlike other books on the subject, this one emphasizes the principles behind the computations, giving ecologists a big-picture understanding of how to implement this powerful statistical approach. Bayesian Models is an essential primer for non-statisticians. It begins with a definition of probabili
Alehosseini, Ali; A. Hejazi, Maryam; Mokhtari, Ghassem; B. Gharehpetian, Gevork; Mohammadi, Mohammad
2015-06-01
In this paper, the Bayesian classifier is used to detect and classify the radial deformation and axial displacement of transformer windings. The proposed method is tested on a model of a transformer for different volumes of radial deformation and axial displacement. In this method, an ultra-wideband (UWB) signal is sent to a simplified model of the transformer winding. The received signal from the winding model is recorded and used for training and testing of the Bayesian classifier in different axial displacement and radial deformation states of the winding. It is shown that the proposed method detects and classifies the axial displacement and radial deformation of the winding with good accuracy.
MACROECONOMIC FORECASTING USING BAYESIAN VECTOR AUTOREGRESSIVE APPROACH
Directory of Open Access Journals (Sweden)
D. Tutberidze
2017-04-01
Full Text Available There are many arguments that can be advanced to support the forecasting activities of business entities. The underlying argument in favor of forecasting is that managerial decisions depend significantly on a proper evaluation of future trends, as market conditions are constantly changing and require a detailed analysis of future dynamics. The article discusses the importance of using a sound macro-econometric tool, suggesting the idea of conditional forecasting through a Vector Autoregressive (VAR) modeling framework. Under this framework, a macroeconomic model for the Georgian economy is constructed with a few variables believed to shape the business environment. Based on the model, forecasts of macroeconomic variables are produced, and three types of scenarios are analyzed - a baseline and two alternative ones. The results of the study provide confirmatory evidence that the suggested methodology adequately addresses the research phenomenon and can be used widely by business entities in addressing their strategic and operational planning challenges. Given this set-up, it is shown empirically that the Bayesian Vector Autoregressive approach provides reasonable forecasts for the variables of interest.
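The VAR machinery underlying such forecasts can be sketched in miniature (simulated series, not Georgian data; the Bayesian version in the article shrinks these estimates with a prior, which is omitted here): a two-variable VAR(1), x_t = A x_{t-1} + e_t, is fit by least squares and used for a one-step forecast.

```python
import random

random.seed(5)
A = [[0.7, 0.1],
     [0.2, 0.5]]                       # true coefficient matrix (assumed)
xs = [[1.0, 1.0]]
for _ in range(500):                   # simulate the bivariate series
    prev = xs[-1]
    xs.append([sum(A[i][j] * prev[j] for j in range(2)) + random.gauss(0, 0.1)
               for i in range(2)])

# OLS: solve the 2x2 normal equations (X'X) beta = X'y for each equation
Y, X = xs[1:], xs[:-1]
G = [[sum(x[i] * x[j] for x in X) for j in range(2)] for i in range(2)]
det = G[0][0] * G[1][1] - G[0][1] * G[1][0]
Ginv = [[G[1][1] / det, -G[0][1] / det],
        [-G[1][0] / det, G[0][0] / det]]
A_hat = []
for i in range(2):
    b = [sum(X[t][j] * Y[t][i] for t in range(len(X))) for j in range(2)]
    A_hat.append([sum(Ginv[j][k] * b[k] for k in range(2)) for j in range(2)])

forecast = [sum(A_hat[i][j] * xs[-1][j] for j in range(2)) for i in range(2)]
print([[round(v, 2) for v in row] for row in A_hat])
```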
Bayesian Networks for enterprise risk assessment
Bonafede, C. E.; Giudici, P.
2007-08-01
According to different typologies of activity and priority, risks can assume diverse meanings and can be assessed in different ways. Risk, in general, is measured as a probability combination of an event (frequency) and its consequence (impact). To estimate the frequency and the impact (severity), historical data or expert opinions (either qualitative or quantitative data) are used. Moreover, qualitative data must be converted into numerical values or bounds to be used in the model. In the case of enterprise risk assessment, the considered risks are, for instance, strategic, operational, legal and image risks, which are often difficult to quantify. So in most cases only expert data, gathered by scorecard approaches, are available for risk analysis. Bayesian Networks (BNs) are a useful tool to integrate different information and, in particular, to study the risk's joint distribution by using data collected from experts. In this paper we show a possible approach for building a BN in the particular case in which only prior probabilities of node states and marginal correlations between nodes are available, and when the variables have only two states.
29 CFR 1917.155 - Air receivers.
2010-07-01
.... This section applies to compressed air receivers and equipment used for operations such as cleaning... transportation applications as railways, vehicles or cranes. (b) Gauges and valves. (1) Air receivers shall be... 29 Labor 7 2010-07-01 2010-07-01 false Air receivers. 1917.155 Section 1917.155 Labor Regulations...
49 CFR 393.88 - Television receivers.
2010-10-01
... 49 Transportation 5 2010-10-01 2010-10-01 false Television receivers. 393.88 Section 393.88... NECESSARY FOR SAFE OPERATION Miscellaneous Parts and Accessories § 393.88 Television receivers. Any motor vehicle equipped with a television viewer, screen or other means of visually receiving a television...
Uncertainty Management for Diagnostics and Prognostics of Batteries using Bayesian Techniques
Saha, Bhaskar; Goebel, Kai
2007-01-01
Uncertainty management has always been the key hurdle faced by diagnostics and prognostics algorithms. A Bayesian treatment of this problem provides an elegant and theoretically sound approach to the modern Condition-Based Maintenance (CBM)/Prognostic Health Management (PHM) paradigm. The application of Bayesian techniques to regression and classification in the form of the Relevance Vector Machine (RVM), and to state estimation as in Particle Filters (PF), provides a powerful tool to integrate the diagnosis and prognosis of battery health. The RVM, which is a Bayesian treatment of the Support Vector Machine (SVM), is used for model identification, while the PF framework uses the learnt model, statistical estimates of noise and anticipated operational conditions to provide estimates of remaining useful life (RUL) in the form of a probability density function (PDF). This type of prognostics adds significant value to the management of any operation involving electrical systems.
Aggregated Residential Load Modeling Using Dynamic Bayesian Networks
Energy Technology Data Exchange (ETDEWEB)
Vlachopoulou, Maria; Chin, George; Fuller, Jason C.; Lu, Shuai
2014-09-28
Abstract—It is already obvious that the future power grid will have to address higher demand for power and energy, and to incorporate renewable resources with different energy generation patterns. Demand response (DR) schemes could successfully be used to manage and balance power supply and demand under the operating conditions of the future power grid. To achieve that, more advanced tools for DR operations management and planning are necessary, tools that can estimate the available capacity from DR resources. In this research, a Dynamic Bayesian Network (DBN) is derived, trained, and tested that can model the aggregated load of Heating, Ventilation, and Air Conditioning (HVAC) systems. DBNs can provide flexible and powerful tools for both operations and planning, due to their unique analytical capabilities. The DBN model's accuracy and flexibility of use are demonstrated by testing the model under different operational scenarios.
Bayesian Travel Time Inversion adopting Gaussian Process Regression
Mauerberger, S.; Holschneider, M.
2017-12-01
A major application in seismology is the determination of seismic velocity models. Travel time measurements put an integral constraint on the velocity between source and receiver. We provide insight into travel time inversion from a correlation-based Bayesian point of view. To this end, the concept of Gaussian process regression is adopted to estimate a velocity model. The non-linear travel time integral is approximated by a 1st-order Taylor expansion. A heuristic covariance describes correlations amongst observations and the a priori model. That approach enables us to assess a proxy of the Bayesian posterior distribution at ordinary computational cost. No multi-dimensional numerical integration or excessive sampling is necessary. Instead of stacking the data, we suggest progressively building the posterior distribution. Incorporating only a single evidence at a time accounts for the deficit of linearization. As a result, the most probable model is given by the posterior mean, whereas uncertainties are described by the posterior covariance. As a proof of concept, a synthetic, purely 1-D model is addressed: a single source accompanied by multiple receivers is considered on top of a model comprising a discontinuity. We consider travel times of both phases - direct and reflected wave - corrupted by noise. Left and right of the interface are assumed independent, where the squared exponential kernel serves as covariance.
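The core regression step can be sketched in one dimension (offsets, observations, and hyperparameters below are invented; the real method regresses linearized travel-time residuals): Gaussian process regression with a squared exponential kernel, where the posterior mean at a new point is a kernel-weighted combination of the observations.

```python
import math

def sqexp(a, b, ell=1.0, sig=1.0):
    # squared exponential covariance, as named in the abstract
    return sig ** 2 * math.exp(-0.5 * ((a - b) / ell) ** 2)

def solve(A, rhs):
    # naive Gaussian elimination with partial pivoting; fine for tiny systems
    n = len(A)
    M = [row[:] + [rhs[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

xs = [0.0, 1.0, 2.0, 3.0]     # receiver offsets (assumed units)
ys = [0.0, 0.8, 0.9, 0.1]     # observed residuals (invented)
noise = 1e-4                  # observation-noise nugget on the diagonal
K = [[sqexp(a, b) + (noise if i == j else 0.0) for j, b in enumerate(xs)]
     for i, a in enumerate(xs)]
alpha = solve(K, ys)          # K^{-1} y, reused for every prediction

def predict(x):
    # GP posterior mean at a new offset
    return sum(sqexp(x, xi) * ai for xi, ai in zip(xs, alpha))

print(round(predict(1.5), 3))
```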
Bayesian adaptive methods for clinical trials
National Research Council Canada - National Science Library
Berry, Scott M
2011-01-01
.... One is that Bayesian approaches implemented with the majority of their informative content coming from the current data, and not any external prior information, typically have good frequentist properties (e.g...
A Bayesian approach to model uncertainty
International Nuclear Information System (INIS)
Buslik, A.
1994-01-01
A Bayesian approach to model uncertainty is taken. For the case of a finite number of alternative models, the model uncertainty is equivalent to parameter uncertainty. A derivation based on Savage's partition problem is given
Structure-based bayesian sparse reconstruction
Quadeer, Ahmed Abdul; Al-Naffouri, Tareq Y.
2012-01-01
Sparse signal reconstruction algorithms have attracted research attention due to their wide applications in various fields. In this paper, we present a simple Bayesian approach that utilizes the sparsity constraint and a priori statistical
An Intuitive Dashboard for Bayesian Network Inference
International Nuclear Information System (INIS)
Reddy, Vikas; Farr, Anna Charisse; Wu, Paul; Mengersen, Kerrie; Yarlagadda, Prasad K D V
2014-01-01
Current Bayesian network software packages provide good graphical interface for users who design and develop Bayesian networks for various applications. However, the intended end-users of these networks may not necessarily find such an interface appealing and at times it could be overwhelming, particularly when the number of nodes in the network is large. To circumvent this problem, this paper presents an intuitive dashboard, which provides an additional layer of abstraction, enabling the end-users to easily perform inferences over the Bayesian networks. Unlike most software packages, which display the nodes and arcs of the network, the developed tool organises the nodes based on the cause-and-effect relationship, making the user-interaction more intuitive and friendly. In addition to performing various types of inferences, the users can conveniently use the tool to verify the behaviour of the developed Bayesian network. The tool has been developed using QT and SMILE libraries in C++
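Behind any such dashboard sits an inference routine. A minimal stand-in (a hypothetical two-node cause-and-effect network, not the SMILE API the tool actually uses) is exact inference by enumeration:

```python
p_cause = 0.2                      # P(Cause = true), assumed prior
p_eff = {True: 0.9, False: 0.1}    # P(Effect = true | Cause), assumed CPT

def posterior_cause(effect_observed=True):
    # Bayes' rule by enumerating the two joint configurations of Cause
    joint_t = p_cause * (p_eff[True] if effect_observed else 1 - p_eff[True])
    joint_f = (1 - p_cause) * (p_eff[False] if effect_observed else 1 - p_eff[False])
    return joint_t / (joint_t + joint_f)

print(round(posterior_cause(True), 3))   # belief in Cause after seeing Effect
```

A dashboard's value is in hiding this arithmetic: the end-user toggles "Effect observed" and reads off the updated belief, without ever seeing nodes or arcs.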
Bayesian optimization for computationally extensive probability distributions.
Tamura, Ryo; Hukushima, Koji
2018-01-01
An efficient method for finding a better maximizer of computationally extensive probability distributions is proposed on the basis of a Bayesian optimization technique. A key idea of the proposed method is to use extreme values of acquisition functions by Gaussian processes for the next training phase, which should be located near a local maximum or a global maximum of the probability distribution. Our Bayesian optimization technique is applied to the posterior distribution in the effective physical model estimation, which is a computationally extensive probability distribution. Even when the number of sampling points on the posterior distributions is fixed to be small, the Bayesian optimization provides a better maximizer of the posterior distributions in comparison to those by the random search method, the steepest descent method, or the Monte Carlo method. Furthermore, the Bayesian optimization improves the results efficiently by combining the steepest descent method and thus it is a powerful tool to search for a better maximizer of computationally extensive probability distributions.
Correct Bayesian and frequentist intervals are similar
International Nuclear Information System (INIS)
Atwood, C.L.
1986-01-01
This paper argues that Bayesians and frequentists will normally reach numerically similar conclusions, when dealing with vague data or sparse data. It is shown that both statistical methodologies can deal reasonably with vague data. With sparse data, in many important practical cases Bayesian interval estimates and frequentist confidence intervals are approximately equal, although with discrete data the frequentist intervals are somewhat longer. This is not to say that the two methodologies are equally easy to use: The construction of a frequentist confidence interval may require new theoretical development. Bayesian methods typically require numerical integration, perhaps over many variables. Also, Bayesians can easily fall into the trap of over-optimism about their amount of prior knowledge. But in cases where both intervals are found correctly, the two intervals are usually not very different. (orig.)
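The claim is easy to check numerically in a toy setting (normal mean with known sigma and a flat prior, assumptions of this sketch): the frequentist 95% confidence interval and the 95% Bayesian credible interval, here obtained by sampling the posterior, nearly coincide.

```python
import math
import random
import statistics

random.seed(1)
sigma, n = 2.0, 50
data = [random.gauss(10.0, sigma) for _ in range(n)]
xbar = statistics.mean(data)

half = 1.96 * sigma / math.sqrt(n)
freq_ci = (xbar - half, xbar + half)        # frequentist CI for the mean

# flat prior => posterior for the mean is N(xbar, sigma^2 / n);
# take empirical 2.5% / 97.5% posterior quantiles
draws = sorted(random.gauss(xbar, sigma / math.sqrt(n)) for _ in range(100000))
bayes_ci = (draws[2499], draws[97499])

print(tuple(round(v, 2) for v in freq_ci), tuple(round(v, 2) for v in bayes_ci))
```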
Implementing the Bayesian paradigm in risk analysis
International Nuclear Information System (INIS)
Aven, T.; Kvaloey, J.T.
2002-01-01
The Bayesian paradigm comprises a unified and consistent framework for analyzing and expressing risk. Yet, we see rather few examples of applications where the full Bayesian setting has been adopted with specifications of priors of unknown parameters. In this paper, we discuss some of the practical challenges of implementing Bayesian thinking and methods in risk analysis, emphasizing the introduction of probability models and parameters and associated uncertainty assessments. We conclude that there is a need for a pragmatic view in order to 'successfully' apply the Bayesian approach, such that we can do the assignments of some of the probabilities without adopting the somewhat sophisticated procedure of specifying prior distributions of parameters. A simple risk analysis example is presented to illustrate ideas.
An overview on Approximate Bayesian computation*
Directory of Open Access Journals (Sweden)
Baragatti Meïli
2014-01-01
Full Text Available Approximate Bayesian computation techniques, also called likelihood-free methods, are among the most satisfactory approaches to intractable likelihood problems. This overview presents recent results since their introduction about ten years ago in population genetics.
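Rejection ABC in its simplest form can be sketched as follows (a toy normal-mean example, not from the overview itself): draw a parameter from the prior, simulate data, and keep the draw whenever a summary statistic falls close to the observed one. No likelihood is ever evaluated.

```python
import random
import statistics

random.seed(2)
n = 30
observed = [random.gauss(1.5, 1.0) for _ in range(n)]   # "real" data (simulated here)
s_obs = statistics.mean(observed)                       # observed summary statistic

accepted = []
for _ in range(20000):
    theta = random.uniform(-5, 5)                       # draw from the prior
    sim = [random.gauss(theta, 1.0) for _ in range(n)]  # simulate under theta
    if abs(statistics.mean(sim) - s_obs) < 0.1:         # tolerance on the summary
        accepted.append(theta)

print(len(accepted), round(statistics.mean(accepted), 2))
```

Shrinking the tolerance sharpens the approximation to the true posterior at the cost of a lower acceptance rate, the central trade-off of the method.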
Schiphorst, Roelof; Hoeksema, F.W.; Slump, Cornelis H.
2003-01-01
Flexible radio receivers are also called Software Defined Radios (SDRs) [1], [2]. The focus of our SDR project [3] is on designing the front end, from antenna to demodulation in bits, of a flexible, multi-standard WLAN receiver. We try to combine an instance of a (G)FSK receiver (Bluetooth) with an
Survival Bayesian Estimation of Exponential-Gamma Under Linex Loss Function
Rizki, S. W.; Mara, M. N.; Sulistianingsih, E.
2017-06-01
This paper presents a study of cancer patients after treatment, with censored data, using Bayesian estimation under the Linex loss function for a survival model assumed to follow an exponential distribution. With a gamma prior, the likelihood yields a gamma posterior distribution. The posterior distribution is used to find the estimator λ̂_BL via the Linex approximation. From λ̂_BL, the estimators of the hazard function ĥ_BL and survival function Ŝ_BL can be found. Finally, we compare Maximum Likelihood Estimation (MLE) and the Linex approximation to find the better method for this observation, in terms of the smaller MSE. The results show that the MSEs of the hazard and survival estimates under MLE are 2.91728E-07 and 0.000309004, while under the Bayesian Linex approach they are 2.8727E-07 and 0.000304131, respectively. It is concluded that the Bayesian Linex estimator is better than MLE.
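The Linex estimator for this model has a closed form that can be sketched directly (prior parameters and data below are invented, not the paper's): with a Gamma(a, b) prior (shape a, rate b), an exponential likelihood with d events in total time at risk T gives a Gamma(a + d, b + T) posterior, and under Linex loss L(D) = exp(cD) - cD - 1 the Bayes estimator is -(1/c) ln E[exp(-c λ)].

```python
import math

a, b = 2.0, 4.0            # gamma prior: shape, rate (assumed)
d, T = 15, 120.0           # observed events and total follow-up time (assumed)
c = 0.5                    # Linex asymmetry parameter

alpha, beta = a + d, b + T # gamma posterior parameters

# E[exp(-c*lambda)] = (beta / (beta + c))^alpha for a gamma posterior,
# so the Linex Bayes estimator has the closed form below
lam_linex = (alpha / c) * math.log(1.0 + c / beta)

# posterior mean = Bayes estimator under squared error loss, for comparison
lam_sel = alpha / beta

print(round(lam_linex, 4), round(lam_sel, 4))
```

With c > 0 the Linex loss penalizes overestimation more than underestimation, which is why the Linex estimator sits below the posterior mean.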
Bayesian probability theory and inverse problems
International Nuclear Information System (INIS)
Kopec, S.
1994-01-01
Bayesian probability theory is applied to approximate solving of inverse problems. In order to solve the moment problem with noisy data, the entropic prior is used. The expressions for the solution and its error bounds are presented. When the noise level tends to zero, the Bayesian solution tends to the classic maximum entropy solution in the L2 norm. The way of using a spline prior is also shown. (author)
A Bayesian classifier for symbol recognition
Barrat , Sabine; Tabbone , Salvatore; Nourrissier , Patrick
2007-01-01
URL : http://www.buyans.com/POL/UploadedFile/134_9977.pdf; International audience; We present in this paper an original adaptation of Bayesian networks to the symbol recognition problem. More precisely, a descriptor combination method is presented, which significantly improves the recognition rate compared to the rates obtained by each descriptor alone. In this perspective, we use a simple Bayesian classifier, called naive Bayes. In fact, probabilistic graphical models, more spec...
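The descriptor-combination idea behind naive Bayes can be sketched as follows (classes, descriptor values, and parameters are invented for illustration): each descriptor contributes a class-conditional Gaussian likelihood, and the class posterior is the normalized product, so agreeing descriptors sharpen the decision beyond what either achieves alone.

```python
import math

def gauss(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# per-class (mu, sigma) for two hypothetical shape descriptors
params = {
    "resistor":  [(0.2, 0.1), (0.8, 0.2)],
    "capacitor": [(0.6, 0.1), (0.3, 0.2)],
}
prior = {"resistor": 0.5, "capacitor": 0.5}

def classify(features):
    scores = {}
    for cls, ps in params.items():
        like = prior[cls]
        for x, (mu, sigma) in zip(features, ps):
            like *= gauss(x, mu, sigma)   # the "naive" independence assumption
        scores[cls] = like
    z = sum(scores.values())
    return {cls: s / z for cls, s in scores.items()}

print(classify([0.25, 0.7]))
```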
Bayesian Modeling of a Human MMORPG Player
Synnaeve, Gabriel; Bessière, Pierre
2011-03-01
This paper describes an application of Bayesian programming to the control of an autonomous avatar in a multiplayer role-playing game (the example is based on World of Warcraft). We model a particular task, which consists of choosing what to do and which target to select in a situation where allies and foes are present. We explain the model in Bayesian programming and show how we could learn the conditional probabilities from data gathered during human-played sessions.
Variations on Bayesian Prediction and Inference
2016-05-09
There are a number of statistical inference problems that are not generally formulated via a full probability model. In the problem of inference about an unknown parameter, the Bayesian approach requires a full probability model/likelihood, which can be an obstacle.
Bayesian target tracking based on particle filter
Institute of Scientific and Technical Information of China (English)
Anonymous
2005-01-01
To deal with nonlinear or non-Gaussian problems, particle filters have been studied by many researchers. Based on the particle filter, an extended Kalman filter (EKF) proposal function is applied to Bayesian target tracking. Novel techniques such as the Markov chain Monte Carlo (MCMC) move step and resampling are also introduced into Bayesian target tracking, and the simulation results confirm that the improved particle filter with these techniques outperforms the basic one.
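The baseline that those techniques improve upon can be sketched as a bootstrap particle filter for 1-D tracking (the EKF proposal and MCMC move step are omitted; model and noise levels are invented): propagate particles through the motion model, weight them by the observation likelihood, estimate, and resample.

```python
import math
import random

random.seed(3)
Q, R, N = 0.1, 0.5, 1000                 # process noise, obs noise, particle count

truth, obs, x = [], [], 0.0              # simulate a random-walk target
for _ in range(50):
    x += random.gauss(0.0, Q)
    truth.append(x)
    obs.append(x + random.gauss(0.0, R))

particles = [0.0] * N
estimates = []
for z in obs:
    particles = [p + random.gauss(0.0, Q) for p in particles]      # propagate
    w = [math.exp(-0.5 * ((z - p) / R) ** 2) for p in particles]   # likelihood weights
    s = sum(w)
    w = [wi / s for wi in w]
    estimates.append(sum(p * wi for p, wi in zip(particles, w)))   # posterior mean

    cum, c = [], 0.0                     # systematic resampling
    for wi in w:
        c += wi
        cum.append(c)
    u0, j, resampled = random.random() / N, 0, []
    for i in range(N):
        while j < N - 1 and cum[j] < u0 + i / N:
            j += 1
        resampled.append(particles[j])
    particles = resampled

rmse = math.sqrt(sum((e - t) ** 2 for e, t in zip(estimates, truth)) / len(truth))
print(round(rmse, 3))
```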
MCMC for parameters estimation by bayesian approach
International Nuclear Information System (INIS)
Ait Saadi, H.; Ykhlef, F.; Guessoum, A.
2011-01-01
This article discusses parameter estimation for dynamic systems by a Bayesian approach associated with Markov Chain Monte Carlo (MCMC) methods. MCMC methods are powerful for approximating complex integrals, simulating joint distributions, and estimating marginal posterior distributions or posterior means. The Metropolis-Hastings algorithm has been widely used in Bayesian inference to approximate posterior densities. Calibrating the proposal distribution is one of the main issues of MCMC simulation in order to accelerate convergence.
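A textbook random-walk Metropolis-Hastings sampler illustrates the calibration issue the abstract mentions (the standard-normal target below is an illustrative stand-in for a dynamic-system posterior): the proposal scale is the tuning knob, and too small or too large a scale slows convergence.

```python
import math
import random
import statistics

random.seed(4)

def log_target(x):
    return -0.5 * x * x          # log of N(0,1) density, up to a constant

x, scale, chain = 0.0, 2.4, []   # scale ~ 2.4 * sd is a common 1-D tuning rule
for _ in range(20000):
    prop = x + random.gauss(0.0, scale)          # symmetric random-walk proposal
    if math.log(random.random()) < log_target(prop) - log_target(x):
        x = prop                                 # accept; otherwise keep current state
    chain.append(x)

post = chain[2000:]              # discard burn-in
print(round(statistics.mean(post), 2), round(statistics.pstdev(post), 2))
```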
Bayesian Networks for Modeling Dredging Decisions
2011-10-01
years, that algorithms have been developed to solve these problems efficiently. Most modern Bayesian network software uses junction tree (a.k.a. join... software was used to develop the network. This is by no means an exhaustive list of Bayesian network applications, but it is representative of recent... characteristic node (SCN), state-defining node (SDN), effect node (EFN), or value node. The five types of nodes can be described as follows:
A Bayesian Method for Weighted Sampling
Lo, Albert Y.
1993-01-01
Bayesian statistical inference for sampling from weighted distribution models is studied. Small-sample Bayesian bootstrap clone (BBC) approximations to the posterior distribution are discussed. A second-order property for the BBC in unweighted i.i.d. sampling is given. A consequence is that BBC approximations to a posterior distribution of the mean and to the sampling distribution of the sample average, can be made asymptotically accurate by a proper choice of the random variables that genera...
Directory of Open Access Journals (Sweden)
Hyunseok Jee
2018-01-01
Full Text Available We aimed to investigate the characteristics of patients with osteoarthritis (OA), using the data of all Koreans registered in the National Health Insurance Sharing Service Database (NHISS DB), and to provide ideal alternative cutoff thresholds for alleviating OA symptoms. Patients with OA (codes M17 and M17.1–M17.9 in the Korean Standard Classification of Disease and Causes of Death) were analyzed using SAS software. Optimal cutoff thresholds were determined using receiver operating characteristic curve analysis. The 50-year age group was the group most affected by OA (among 40~70 years, n=2088). All exercise types affected the change of body mass index (p<0.05) and the sex difference in blood pressure (BP) (p<0.01). All types of exercise positively affected the loss of waist circumference and the balance test (standing time on one leg in seconds) (p<0.01). The cutoff threshold for the time in seconds from standing up from a chair, walking 3 m, and returning to the same chair was 8.25 (80% sensitivity and 100% specificity). By using the exercise modalities, categorized multiple variables, and the cutoff threshold, an optimal alternative exercise program can be designed for alleviating OA symptoms in the 50-year age group.
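Choosing such a cutoff from a ROC analysis is commonly done by maximizing the Youden index (sensitivity + specificity - 1); the scores and labels below are invented for illustration, not the study's chair-walk data.

```python
scores = [5.1, 6.0, 6.8, 7.2, 7.9, 8.1, 8.6, 9.0, 9.4, 10.2]  # test times in seconds
labels = [0,   0,   0,   0,   0,   0,   1,   1,   1,   1]     # 1 = case, 0 = control

def youden(cut):
    # classify "positive" when the score exceeds the cutoff
    tp = sum(1 for s, y in zip(scores, labels) if y == 1 and s > cut)
    fn = sum(1 for s, y in zip(scores, labels) if y == 1 and s <= cut)
    tn = sum(1 for s, y in zip(scores, labels) if y == 0 and s <= cut)
    fp = sum(1 for s, y in zip(scores, labels) if y == 0 and s > cut)
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    return sens + spec - 1.0

best = max(scores, key=youden)   # candidate cutoffs are the observed scores
print(best, round(youden(best), 2))
```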
Highly Sensitive Optical Receivers
Schneider, Kerstin
2006-01-01
Highly Sensitive Optical Receivers primarily treats the circuit design of optical receivers with external photodiodes. Continuous-mode and burst-mode receivers are compared. The monograph first summarizes the basics of III/V photodetectors, transistor and noise models, bit-error rate, sensitivity and analog circuit design, thus enabling readers to understand the circuits described in the main part of the book. In order to cover the topic comprehensively, detailed descriptions of receivers for optical data communication in general and, in particular, optical burst-mode receivers in deep-sub-µm CMOS are presented. Numerous detailed and elaborate illustrations facilitate better understanding.
Philosophy and the practice of Bayesian statistics.
Gelman, Andrew; Shalizi, Cosma Rohilla
2013-02-01
A substantial school in the philosophy of science identifies Bayesian inference with inductive inference and even rationality as such, and seems to be strengthened by the rise and practical success of Bayesian statistics. We argue that the most successful forms of Bayesian statistics do not actually support that particular philosophy but rather accord much better with sophisticated forms of hypothetico-deductivism. We examine the actual role played by prior distributions in Bayesian models, and the crucial aspects of model checking and model revision, which fall outside the scope of Bayesian confirmation theory. We draw on the literature on the consistency of Bayesian updating and also on our experience of applied work in social science. Clarity about these matters should benefit not just philosophy of science, but also statistical practice. At best, the inductivist view has encouraged researchers to fit and compare models without checking them; at worst, theorists have actively discouraged practitioners from performing model checking because it does not fit into their framework. © 2012 The British Psychological Society.
Bayesian inversion of refraction seismic traveltime data
Ryberg, T.; Haberland, Ch
2018-03-01
We apply a Bayesian Markov chain Monte Carlo (McMC) formalism to the inversion of refraction seismic traveltime data sets to derive 2-D velocity models below linear arrays (i.e. profiles) of sources and seismic receivers. Typical refraction data sets, especially when using the far-offset observations, are known to have experimental geometries which are very poor, highly ill-posed and far from ideal. As a consequence, the structural resolution quickly degrades with depth. Conventional inversion techniques, based on regularization, potentially suffer from the choice of inversion parameters (i.e. number and distribution of cells, starting velocity models, damping and smoothing constraints, data noise level, etc.) and explore the model space only locally. McMC techniques are used for exhaustive sampling of the model space without the need of prior knowledge (or assumptions) of inversion parameters, resulting in a large number of models fitting the observations. Statistical analysis of these models allows deriving an average (reference) solution and its standard deviation, thus providing uncertainty estimates of the inversion result. The highly non-linear character of the inversion problem, mainly caused by the experiment geometry, does not allow deriving a reference solution and error map by a simple averaging procedure. We present a modified averaging technique, which excludes parts of the prior distribution in the posterior values due to poor ray coverage, thus providing reliable estimates of inversion model properties even in those parts of the models. The model is discretized by a set of Voronoi polygons (with constant slowness cells) or a triangulated mesh (with interpolation within the triangles). Forward traveltime calculations are performed by a fast, finite-difference-based eikonal solver. The method is applied to a data set from a refraction seismic survey in Northern Namibia and compared to conventional tomography. An inversion test
EXONEST: The Bayesian Exoplanetary Explorer
Directory of Open Access Journals (Sweden)
Kevin H. Knuth
2017-10-01
Full Text Available The fields of astronomy and astrophysics are currently engaged in an unprecedented era of discovery as recent missions have revealed thousands of exoplanets orbiting other stars. While the Kepler Space Telescope mission has enabled most of these exoplanets to be detected by identifying transiting events, exoplanets often exhibit additional photometric effects that can be used to improve the characterization of exoplanets. The EXONEST Exoplanetary Explorer is a Bayesian exoplanet inference engine based on nested sampling and originally designed to analyze archived Kepler Space Telescope and CoRoT (Convection Rotation et Transits planétaires exoplanet mission data. We discuss the EXONEST software package and describe how it accommodates plug-and-play models of exoplanet-associated photometric effects for the purpose of exoplanet detection, characterization and scientific hypothesis testing. The current suite of models allows for both circular and eccentric orbits in conjunction with photometric effects, such as the primary transit and secondary eclipse, reflected light, thermal emissions, ellipsoidal variations, Doppler beaming and superrotation. We discuss our new efforts to expand the capabilities of the software to include more subtle photometric effects involving reflected and refracted light. We discuss the EXONEST inference engine design and introduce our plans to port the current MATLAB-based EXONEST software package over to the next generation Exoplanetary Explorer, which will be a Python-based open source project with the capability to employ third-party plug-and-play models of exoplanet-related photometric effects.
Maximum entropy and Bayesian methods
International Nuclear Information System (INIS)
Smith, C.R.; Erickson, G.J.; Neudorfer, P.O.
1992-01-01
Bayesian probability theory and maximum entropy methods are at the core of a new view of scientific inference. These 'new' ideas, along with the revolution in computational methods afforded by modern computers, allow astronomers, electrical engineers, image processors of any type, NMR chemists and physicists, and anyone at all who has to deal with incomplete and noisy data, to take advantage of methods that, in the past, have been applied only in some areas of theoretical physics. The title workshops have been the focus of a group of researchers from many different fields, and this diversity is evident in this book. There are tutorial and theoretical papers, and applications in a very wide variety of fields. Almost any instance of dealing with incomplete and noisy data can be usefully treated by these methods, and many areas of theoretical research are being enhanced by the thoughtful application of Bayes' theorem. The contributions contained in this volume present a state-of-the-art overview that will be influential and useful for many years to come.
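A worked instance of the maximum entropy principle helps make it concrete. In Jaynes's classic illustration, the maximum-entropy distribution over die faces 1..6, given only that the mean is 4.5, has the exponential form p_k proportional to exp(lam*k); the sketch below solves for the Lagrange multiplier lam by bisection.

```python
import math

faces = [1, 2, 3, 4, 5, 6]
target = 4.5                       # the only constraint: the observed mean

def mean_for(lam):
    # Mean of the distribution p_k ∝ exp(lam * k)
    w = [math.exp(lam * k) for k in faces]
    z = sum(w)
    return sum(k * wk for k, wk in zip(faces, w)) / z

# mean_for is increasing in lam, so bisect on the multiplier
lo, hi = -5.0, 5.0
for _ in range(100):
    mid = 0.5 * (lo + hi)
    if mean_for(mid) < target:
        lo = mid
    else:
        hi = mid
lam = 0.5 * (lo + hi)

w = [math.exp(lam * k) for k in faces]
z = sum(w)
p = [wk / z for wk in w]
entropy = -sum(pk * math.log(pk) for pk in p)
print([round(pk, 4) for pk in p], round(lam, 4), round(entropy, 4))
```

The resulting distribution tilts toward the high faces just enough to meet the constraint while staying as uninformative as possible.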
Inverse problems in the Bayesian framework
International Nuclear Information System (INIS)
Calvetti, Daniela; Somersalo, Erkki; Kaipio, Jari P
2014-01-01
The history of Bayesian methods dates back to the original works of Reverend Thomas Bayes and Pierre-Simon Laplace: the former laid down some of the basic principles of inverse probability in his classic article ‘An essay towards solving a problem in the doctrine of chances’, read posthumously to the Royal Society in 1763. Laplace, on the other hand, in his ‘Memoirs on inverse probability’ of 1774, developed the idea of updating beliefs and wrote down the celebrated Bayes’ formula in the form we know today. Although it had not yet been identified as a framework for investigating inverse problems, Laplace used the formalism very much in the spirit in which it is used today in the context of inverse problems, e.g., in his study of the distribution of comets. With the evolution of computational tools, Bayesian methods have become increasingly popular in all fields of human knowledge in which conclusions need to be drawn based on incomplete and noisy data. Needless to say, inverse problems, almost by definition, fall into this category. Systematic work on developing a Bayesian inverse problem framework can arguably be traced back to the 1980s (the original first edition being published by Elsevier in 1987), although articles on Bayesian methodology applied to inverse problems, in particular in geophysics, had appeared much earlier. Today, as testified by the articles in this special issue, the Bayesian methodology as a framework for considering inverse problems has gained a great deal of popularity, and it has integrated very successfully with many traditional inverse-problem ideas and techniques, providing novel ways to interpret and implement traditional procedures in numerical analysis, computational statistics, signal analysis and data assimilation. The range of applications where the Bayesian framework has been fundamental goes from geophysics, engineering and imaging to astronomy, life sciences and economics, and continues to grow. There is no question that Bayesian
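The Bayesian treatment of the simplest inverse problem can be written in a few lines. The sketch below (an invented scalar example, not from the article) recovers an unknown x from noisy indirect observations y_i = a_i * x + e_i under a Gaussian prior; with Gaussian noise the posterior is again Gaussian, in closed form.

```python
import math
import random

random.seed(0)

# Hypothetical linear inverse problem: y_i = a_i * x + noise
x_true, sigma = 2.0, 0.5
a = [0.5, 1.0, 1.5, 2.0]                  # forward-model coefficients
y = [ai * x_true + random.gauss(0.0, sigma) for ai in a]

m0, s0 = 0.0, 10.0                        # vague Gaussian prior N(m0, s0^2)

# Conjugate update: posterior precision is the sum of prior precision
# and data precision; the mean is the precision-weighted combination.
prec = 1.0 / s0 ** 2 + sum(ai * ai for ai in a) / sigma ** 2
mean = (m0 / s0 ** 2 + sum(ai * yi for ai, yi in zip(a, y)) / sigma ** 2) / prec
std = 1.0 / math.sqrt(prec)
print(f"posterior: x = {mean:.3f} +/- {std:.3f}")
```

In realistic (ill-posed, high-dimensional) inverse problems the same structure survives, but the posterior must be explored numerically rather than written down.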
Czech Academy of Sciences Publication Activity Database
Fernandes, R.; Eley, Y.; Brabec, Marek; Lucquin, A.; Millard, A.; Craig, O.E.
2018-01-01
Roč. 117, March (2018), s. 31-42 ISSN 0146-6380 Institutional support: RVO:67985807 Keywords : Fatty acids * carbon isotopes * pottery use * Bayesian mixing models * FRUITS Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 3.081, year: 2016
Czech Academy of Sciences Publication Activity Database
Fernandes, R.; Millard, A.R.; Brabec, Marek; Nadeau, M.J.; Grootes, P.
2014-01-01
Roč. 9, č. 2 (2014), Art. no. e87436 E-ISSN 1932-6203 Institutional support: RVO:67985807 Keywords : ancient diet reconstruction * stable isotope measurements * mixture model * Bayesian estimation * Dirichlet prior Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 3.234, year: 2014
Delphi Accounts Receivable Module -
Department of Transportation — The Delphi accounts receivable module contains data elements including, but not limited to, customer information, cash receipts, line of accounting details, bill...
Constitution and application of reactor make-up system's fault diagnostic Bayesian networks
International Nuclear Information System (INIS)
Liang Jie; Cai Qi; Chu Zhuli; Wang Haiping
2013-01-01
A fault diagnostic Bayesian network for the reactor make-up system was constructed. The system's structural characteristics, operation rules and expert experience were combined to build an initial network. The fault data sets were then learned with a particle swarm optimization based Bayesian network structure learning method, completing the structure of the diagnostic network, which was used for case inference. The resulting network can compute the diagnostic probability of every node and provide decision support for fault diagnosis. (authors)
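The kind of inference such a diagnostic network performs can be sketched with a minimal two-symptom example. The structure and all probabilities below are invented for illustration (they are not from the paper): a single fault node with two conditionally independent symptom nodes, queried by exact enumeration.

```python
# Hypothetical diagnostic net: Fault -> {Alarm, LowFlow}, made-up numbers.
p_fault = 0.02
p_alarm = {True: 0.95, False: 0.05}   # P(alarm | fault state)
p_flow = {True: 0.80, False: 0.10}    # P(low_flow | fault state)

def posterior(alarm, low_flow):
    """P(fault | observed symptoms) by enumeration over the fault node."""
    def lik(f):
        pa = p_alarm[f] if alarm else 1 - p_alarm[f]
        pf = p_flow[f] if low_flow else 1 - p_flow[f]
        return pa * pf
    num = p_fault * lik(True)
    den = num + (1 - p_fault) * lik(False)
    return num / den

print(round(posterior(True, True), 4))    # both symptoms observed
print(round(posterior(False, False), 6))  # no symptoms observed
```

Even with a 2% prior fault rate, observing both symptoms pushes the fault probability to roughly 76%, which is the sort of assistant output the diagnostic network provides to operators.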
Moreno, Isabel María; Herrador, M Ángeles; Atencio, Loyda; Puerto, María; González, A Gustavo; Cameán, Ana María
2011-02-01
The aim of this study was to evaluate whether the enzyme-linked immunosorbent assay (ELISA) anti-Adda technique could be used to monitor free microcystins (MCs) in biological samples from fish naturally exposed to toxic cyanobacteria, by using receiver operating characteristic (ROC) curve software to establish an optimal cut-off value for MCs. The cut-off value determined by ROC curve analysis in tench (Tinca tinca) exposed to MCs under laboratory conditions was 5.90 μg MCs/kg tissue dry weight (d.w.), with a sensitivity of 93.3%. This value was applied to fish samples from natural ponds (Extremadura, Spain) in order to assess potential MC bioaccumulation by classifying samples as true positive (TP), false positive (FP), true negative (TN), or false negative (FN). In this work, it was demonstrated that toxic cyanobacteria, mainly Microcystis aeruginosa, Aphanizomenon issatchenkoi, and Anabaena spiroides, were present in two of these ponds, Barruecos de Abajo (BDown) and Barruecos de Arriba (BUp). MC levels were detected in waters from both ponds with an anti-MC-LR ELISA immunoassay and were of similar magnitude (between 3.8 and 6.5 μg MC-LR equivalent/L in the BDown pond and 4.8 and 6.0 μg MC-LR equivalent/L in BUp). The MC cut-off value was applied to livers from fish collected from these two ponds using the ELISA anti-Adda technique. A total of 83% of samples from the BDown pond and only 42% from BUp were TP, with values of free MCs higher than 8.8 μg MCs/kg tissue (d.w.). Copyright © 2009 Wiley Periodicals, Inc.
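One common way an optimal ROC cut-off is chosen from labelled assay data is to scan candidate thresholds and maximise Youden's J = sensitivity + specificity - 1. The values below are synthetic, not the study's measurements; the sketch only shows the mechanics of the selection.

```python
# Synthetic assay readings (µg MCs/kg d.w.); invented for illustration
exposed = [6.2, 7.8, 5.1, 9.4, 8.8, 6.6, 4.9, 10.3]   # lab-exposed fish
controls = [1.2, 2.5, 3.1, 0.8, 2.2, 4.0, 1.9, 3.4]   # unexposed fish

best_j, best_cut = -1.0, None
for cut in sorted(exposed + controls):     # every observed value is a candidate
    sens = sum(v >= cut for v in exposed) / len(exposed)
    spec = sum(v < cut for v in controls) / len(controls)
    j = sens + spec - 1.0
    if j > best_j:
        best_j, best_cut = j, cut
print(best_cut, round(best_j, 3))
```

On this cleanly separated synthetic data the optimal cut-off sits at the smallest exposed value; on real, overlapping data the scan trades sensitivity against specificity, as in the study's 93.3%-sensitivity threshold.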
Bayesian calibration of power plant models for accurate performance prediction
International Nuclear Information System (INIS)
Boksteen, Sowande Z.; Buijtenen, Jos P. van; Pecnik, Rene; Vecht, Dick van der
2014-01-01
Highlights: • Bayesian calibration is applied to power plant performance prediction. • Measurements from a plant in operation are used for model calibration. • A gas turbine performance model and steam cycle model are calibrated. • An integrated plant model is derived. • Part load efficiency is accurately predicted as a function of ambient conditions. - Abstract: Gas turbine combined cycles are expected to play an increasingly important role in the balancing of supply and demand in future energy markets. Thermodynamic modeling of these energy systems is frequently applied to assist in decision-making processes related to the management of plant operation and maintenance. In most cases, model inputs, parameters and outputs are treated as deterministic quantities, and plant operators make decisions with limited or no regard for uncertainties. As the steady integration of wind and solar energy into the energy market induces extra uncertainties, part load operation and reliability are becoming increasingly important. In the current study, methods are proposed not only to quantify various types of uncertainties in measurements and plant model parameters using measured data, but also to assess their effect on various aspects of performance prediction. The authors aim to account for model parameter and measurement uncertainty, and for systematic discrepancy of models with respect to reality. For this purpose, the Bayesian calibration framework of Kennedy and O’Hagan is used, which is especially suitable for high-dimensional industrial problems. The article derives a calibrated model of the plant efficiency as a function of ambient conditions and operational parameters, which is also accurate in part load. The article shows that complete statistical modeling of power plants not only enhances process models, but can also increase confidence in operational decisions.
HIGH-EFFICIENCY INFRARED RECEIVER
Directory of Open Access Journals (Sweden)
A. K. Esman
2016-01-01
Full Text Available Recent research and development show promising use of high-performance solid-state receivers of electromagnetic radiation based on low-barrier Schottky diodes. The approach of designing receivers around delta-doped low-barrier Schottky diodes with beam leads, operated without bias, is developing especially actively, because for uncooled receivers of microwave radiation these diodes have virtually no competition. The purpose of this work is to improve the main parameters and characteristics that determine the practical relevance of receivers of mid-infrared electromagnetic radiation at room operating temperature, by modifying the electrode configuration of the diode and optimizing the distance between the electrodes. The proposed original design of the integrated mid-infrared receiver based on low-barrier Schottky diodes with beam leads allows its main parameters and characteristics to be tuned effectively. Simulation of the electromagnetic characteristics of the proposed receiver using the HFSS software package, whose finite-element algorithm computes the behavior of electromagnetic fields on an arbitrary geometry with predetermined material properties, has shown that when the inner parts of the electrodes of the low-barrier Schottky diode are given a concentric elliptical convex-concave shape, the reflection losses can be reduced to -57.75 dB and the standing wave ratio to 1.003, while the directivity increases up to 23 at a wavelength of 6.09 μm. In this case, the rounding radii of the inner parts of the anode and cathode electrodes are 212 nm and 318 nm, respectively, and the gap between them is 106 nm. These parameters will improve the efficiency of prospective infrared optical and electronic equipment for various purposes intended for work in the mid-infrared wavelength range.
Objective Bayesianism and the Maximum Entropy Principle
Directory of Open Access Journals (Sweden)
Jon Williamson
2013-09-01
Full Text Available Objective Bayesian epistemology invokes three norms: the strengths of our beliefs should be probabilities; they should be calibrated to our evidence of physical probabilities; and they should otherwise equivocate sufficiently between the basic propositions that we can express. The three norms are sometimes explicated by appealing to the maximum entropy principle, which says that a belief function should be a probability function, from all those that are calibrated to evidence, that has maximum entropy. However, the three norms of objective Bayesianism are usually justified in different ways. In this paper, we show that the three norms can all be subsumed under a single justification in terms of minimising worst-case expected loss. This, in turn, is equivalent to maximising a generalised notion of entropy. We suggest that requiring language invariance, in addition to minimising worst-case expected loss, motivates maximisation of standard entropy as opposed to maximisation of other instances of generalised entropy. Our argument also provides a qualified justification for updating degrees of belief by Bayesian conditionalisation. However, conditional probabilities play a less central part in the objective Bayesian account than they do under the subjective view of Bayesianism, leading to a reduced role for Bayes’ Theorem.
A default Bayesian hypothesis test for mediation.
Nuijten, Michèle B; Wetzels, Ruud; Matzke, Dora; Dolan, Conor V; Wagenmakers, Eric-Jan
2015-03-01
In order to quantify the relationship between multiple variables, researchers often carry out a mediation analysis. In such an analysis, a mediator (e.g., knowledge of a healthy diet) transmits the effect from an independent variable (e.g., classroom instruction on a healthy diet) to a dependent variable (e.g., consumption of fruits and vegetables). Almost all mediation analyses in psychology use frequentist estimation and hypothesis-testing techniques. A recent exception is Yuan and MacKinnon (Psychological Methods, 14, 301-322, 2009), who outlined a Bayesian parameter estimation procedure for mediation analysis. Here we complete the Bayesian alternative to frequentist mediation analysis by specifying a default Bayesian hypothesis test based on the Jeffreys-Zellner-Siow approach. We further extend this default Bayesian test by allowing a comparison to directional or one-sided alternatives, using Markov chain Monte Carlo techniques implemented in JAGS. All Bayesian tests are implemented in the R package BayesMed (Nuijten, Wetzels, Matzke, Dolan, & Wagenmakers, 2014).
Classifying emotion in Twitter using Bayesian network
Surya Asriadie, Muhammad; Syahrul Mubarok, Mohamad; Adiwijaya
2018-03-01
Language is used to express not only facts but also emotions. Emotions are noticeable in behavior and in the social media statuses a person writes. Analysis of emotions in text is done across a variety of media, such as Twitter. This paper studies classification of emotions on Twitter using Bayesian networks because of their ability to model uncertainty and relationships between features. The result is two Bayesian network models: the Full Bayesian Network (FBN) and the Bayesian Network with Mood Indicator (BNM). FBN is a massive Bayesian network in which each word is treated as a node. The study shows that the method used to train FBN is not very effective at creating the best model, and it performs worse than Naive Bayes: the F1-score for FBN is 53.71%, versus 54.07% for Naive Bayes. BNM is proposed as an alternative method based on an improvement of multinomial Naive Bayes, with much lower computational complexity than FBN. Although it does not outperform FBN, the resulting model successfully improves on multinomial Naive Bayes: the F1-score for the multinomial Naive Bayes model is 51.49%, versus 52.14% for BNM.
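The multinomial Naive Bayes baseline that BNM builds on fits in a few lines. The sketch below uses an invented four-tweet corpus and Laplace smoothing; it is only the baseline classifier, not the paper's mood-indicator extension.

```python
import math
from collections import Counter

# Invented toy corpus of labelled "tweets"
train = [
    ("happy", "i love this sunny day"),
    ("happy", "great game so much fun"),
    ("angry", "i hate this traffic"),
    ("angry", "this delay makes me furious"),
]

docs = {}
for label, text in train:
    docs.setdefault(label, []).extend(text.split())

vocab = {w for ws in docs.values() for w in ws}
priors = {lab: sum(l == lab for l, _ in train) / len(train) for lab in docs}
counts = {lab: Counter(ws) for lab, ws in docs.items()}

def predict(text):
    def score(lab):
        total = sum(counts[lab].values())
        s = math.log(priors[lab])
        for w in text.split():
            if w in vocab:   # Laplace-smoothed word likelihood
                s += math.log((counts[lab][w] + 1) / (total + len(vocab)))
        return s
    return max(priors, key=score)

print(predict("i love fun"), predict("i hate delay"))
```

Each word contributes an independent log-likelihood term; BNM departs from this by adding a mood-indicator dependency on top of the word model.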
A Bayesian Approach for Sensor Optimisation in Impact Identification
Directory of Open Access Journals (Sweden)
Vincenzo Mallardo
2016-11-01
Full Text Available This paper presents a Bayesian approach for optimizing the position of sensors aimed at impact identification in composite structures under operational conditions. The uncertainty in the sensor data is represented by statistical distributions of the recorded signals. An optimisation strategy based on a genetic algorithm is proposed to find the best sensor combination for locating impacts on composite structures. A Bayesian-based objective function is adopted in the optimisation procedure as an indicator of the performance of meta-models developed for different sensor combinations to locate various impact events. To represent a real structure under operational load and to increase the reliability of the Structural Health Monitoring (SHM) system, the probability of malfunctioning sensors is included in the optimisation. The reliability and robustness of the procedure are tested with experimental and numerical examples. Finally, the proposed optimisation algorithm is applied to a composite stiffened panel for both uniform and non-uniform probabilities of impact occurrence.
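The combinatorial search over sensor combinations can be illustrated with a stripped-down genetic algorithm. Everything below is invented for illustration: the objective is a simple worst-case distance from impact sites to the nearest sensor, standing in for the paper's Bayesian meta-model indicator, and the candidate grid is arbitrary.

```python
import random

random.seed(3)

# Choose 3 of 10 candidate sensor positions on a 5x2 grid so that the
# worst-case squared distance from any impact site to its nearest sensor
# is minimised (a geometric stand-in for the Bayesian objective).
positions = [(i % 5, i // 5) for i in range(10)]
impacts = [(random.uniform(0, 4), random.uniform(0, 1)) for _ in range(50)]

def fitness(combo):                   # lower is better
    return max(min((x - positions[i][0]) ** 2 + (y - positions[i][1]) ** 2
                   for i in combo) for x, y in impacts)

def random_combo():
    return tuple(sorted(random.sample(range(10), 3)))

pop = [random_combo() for _ in range(20)]
for _ in range(30):
    pop.sort(key=fitness)
    elite = pop[:10]                  # keep the best half
    children = []
    for _ in range(10):
        a, b = random.sample(elite, 2)          # crossover: pool parents' genes
        genes = sorted(set(a) | set(b))
        child = tuple(sorted(random.sample(genes, 3)))
        if random.random() < 0.2:               # mutation: random restart
            child = random_combo()
        children.append(child)
    pop = elite + children

best = min(pop, key=fitness)
print(best, round(fitness(best), 3))
```

The paper's version replaces this toy fitness with the Bayesian objective function and additionally penalises combinations that degrade badly when a sensor malfunctions.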
Tang, Xiao-Wei; Bai, Xu; Hu, Ji-Lei; Qiu, Jiang-Nan
2018-05-01
Liquefaction-induced hazards such as sand boils, ground cracks, settlement, and lateral spreading are responsible for considerable damage to engineering structures during major earthquakes. Presently, there is no effective empirical approach that can assess different liquefaction-induced hazards in one model. This is because of the uncertainties and complexity of the factors related to seismic liquefaction and liquefaction-induced hazards. In this study, Bayesian networks (BNs) are used to integrate multiple factors related to seismic liquefaction, sand boils, ground cracks, settlement, and lateral spreading into a model based on standard penetration test data. The constructed BN model can assess four different liquefaction-induced hazards together. In a case study, the BN method outperforms an artificial neural network and Ishihara and Yoshimine's simplified method in terms of accuracy, Brier score, recall, precision, and area under the curve (AUC) of the receiver operating characteristic (ROC). This demonstrates that the BN method is a good alternative tool for the risk assessment of liquefaction-induced hazards. Furthermore, the performance of the BN model in estimating liquefaction-induced hazards in Japan's 2011 Tōhoku earthquake confirms its correctness and reliability compared with the liquefaction potential index approach. The proposed BN model can also predict whether the soil becomes liquefied after an earthquake and can deduce the chain reaction process of liquefaction-induced hazards and perform backward reasoning. The assessment results from the proposed model provide informative guidelines for decision-makers to detect the damage state of a field following liquefaction.
Graziani, Rebecca; Guindani, Michele; Thall, Peter F.
2015-01-01
Summary The effect of a targeted agent on a cancer patient's clinical outcome putatively is mediated through the agent's effect on one or more early biological events. This is motivated by pre-clinical experiments with cells or animals that identify such events, represented by binary or quantitative biomarkers. When evaluating targeted agents in humans, central questions are whether the distribution of a targeted biomarker changes following treatment, the nature and magnitude of this change, and whether it is associated with clinical outcome. Major difficulties in estimating these effects are that a biomarker's distribution may be complex, vary substantially between patients, and have complicated relationships with clinical outcomes. We present a probabilistically coherent framework for modeling and estimation in this setting, including a hierarchical Bayesian nonparametric mixture model for biomarkers that we use to define a functional profile of pre-versus-post treatment biomarker distribution change. The functional is similar to the receiver operating characteristic used in diagnostic testing. The hierarchical model yields clusters of individual patient biomarker profile functionals, and we use the profile as a covariate in a regression model for clinical outcome. The methodology is illustrated by analysis of a dataset from a clinical trial in prostate cancer using imatinib to target platelet-derived growth factor, with the clinical aim to improve progression-free survival time. PMID:25319212
Empirical Bayesian inference and model uncertainty
International Nuclear Information System (INIS)
Poern, K.
1994-01-01
This paper presents a hierarchical, or multistage, empirical Bayesian approach for the estimation of uncertainty concerning the intensity of a homogeneous Poisson process. A class of contaminated gamma distributions is considered to describe the uncertainty concerning the intensity. These distributions in turn are defined through a set of secondary parameters, knowledge of which is also described and updated via Bayes' formula. This two-stage Bayesian approach is an example in which modeling uncertainty is treated in a comprehensive way. Each contaminated gamma distribution, represented by a point in the 3D space of secondary parameters, can be considered a specific model of the uncertainty about the Poisson intensity. Then, by the empirical Bayesian method, each individual model is assigned a posterior probability.
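The first stage of such a hierarchy is the standard conjugate gamma-Poisson update, which can be written down directly. The numbers below are illustrative, not from the paper: a Gamma(a, b) prior on a Poisson intensity lambda updates, after observing n events in exposure time t, to Gamma(a + n, b + t).

```python
import math

# Illustrative prior and data (not from the paper)
a, b = 0.5, 10.0          # prior shape and rate (events per unit time)
n, t = 3, 100.0           # 3 events observed over 100 time units

# Conjugate gamma-Poisson update
a_post, b_post = a + n, b + t
mean = a_post / b_post
sd = math.sqrt(a_post) / b_post
print(f"posterior intensity: mean = {mean:.4f}, sd = {sd:.4f}")
```

The paper's contribution sits above this stage: the hyperparameters themselves (here a and b, plus a contamination parameter) carry a distribution that is updated empirically across models.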
Bayesian Inference Methods for Sparse Channel Estimation
DEFF Research Database (Denmark)
Pedersen, Niels Lovmand
2013-01-01
This thesis deals with sparse Bayesian learning (SBL) with application to radio channel estimation. As opposed to the classical approach for sparse signal representation, we focus on the problem of inferring complex signals. Our investigations within SBL constitute the basis for the development...... of Bayesian inference algorithms for sparse channel estimation. Sparse inference methods aim at finding the sparse representation of a signal given in some overcomplete dictionary of basis vectors. Within this context, one of our main contributions to the field of SBL is a hierarchical representation...... analysis of the complex prior representation, where we show that the ability to induce sparse estimates of a given prior heavily depends on the inference method used and, interestingly, whether real or complex variables are inferred. We also show that the Bayesian estimators derived from the proposed...
Bayesian Methods for Radiation Detection and Dosimetry
Groer, Peter G
2002-01-01
We performed work in three areas: radiation detection, and external and internal radiation dosimetry. In radiation detection we developed Bayesian techniques to estimate the net activity of high- and low-activity radioactive samples. These techniques have the advantage that the remaining uncertainty about the net activity is described by probability densities. Graphs of the densities show the uncertainty in pictorial form. Figure 1 below demonstrates this point. We applied stochastic processes in a method to obtain Bayesian estimates of 222Rn-daughter products from observed counting rates. In external radiation dosimetry we studied and developed Bayesian methods to estimate radiation doses to an individual with radiation-induced chromosome aberrations. We analyzed chromosome aberrations after exposure to gammas and neutrons and developed a method for dose estimation after criticality accidents. The research in internal radiation dosimetry focused on parameter estimation for compartmental models from observed comp...
Bayesian estimation of dose rate effectiveness
International Nuclear Information System (INIS)
Arnish, J.J.; Groer, P.G.
2000-01-01
A Bayesian statistical method was used to quantify the effectiveness of high dose rate ¹³⁷Cs gamma radiation at inducing fatal mammary tumours and increasing the overall mortality rate in BALB/c female mice. The Bayesian approach considers both the temporal and dose dependence of radiation carcinogenesis and total mortality. This paper provides the first direct estimation of dose rate effectiveness using Bayesian statistics. This statistical approach provides a quantitative description of the uncertainty of the factor characterising the dose rate in terms of a probability density function. The results show that a fixed dose from ¹³⁷Cs gamma radiation delivered at a high dose rate is more effective at inducing fatal mammary tumours and increasing the overall mortality rate in BALB/c female mice than the same dose delivered at a low dose rate. (author)
Empirical verification for application of Bayesian inference in situation awareness evaluations
International Nuclear Information System (INIS)
Kang, Seongkeun; Kim, Ar Ryum; Seong, Poong Hyun
2017-01-01
Highlights: • Situation awareness (SA) of human operators is significantly important for safe operation in nuclear power plants (NPPs). • SA of human operators was empirically estimated using Bayesian inference. • In this empirical study, the effect of attention and working memory on SA was considered. • Complexity of the given task and design of the human machine interface (HMI) considerably affect SA of human operators. - Abstract: Bayesian methodology has been widely used in various research fields. According to current research, malfunctions of nuclear power plants can be detected using Bayesian inference, which consistently accumulates newly incoming data and updates the estimation. However, these studies have been based on the assumption that people work perfectly, like computers, a supposition that may cause problems in real-world applications. Studies in cognitive psychology indicate that when the amount of information to be processed becomes larger, people cannot retain the whole set of data in their heads due to limited attention and limited memory capacity, also known as working memory. The purpose of the current research is to consider how actual human situation awareness contrasts with our expectations, and how such disparity affects the results of conventional Bayesian inference, if at all. We compared the situation awareness (SA) of ideal operators with the SA of human operators, and for the human operators we used both a text-based human machine interface (HMI) and an infographic-based HMI to further compare the two. In addition, two different scenarios were selected to examine how scenario complexity affects the SA of human operators. As a result, when a malfunction occurred, the ideal operator found the malfunction with nearly 100% probability using Bayesian inference. In contrast, out of forty-six human operators, only 69.57% found the correct malfunction in the simple scenario and 58.70% in the complex scenario with the text-based HMI. In
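The contrast between an ideal Bayesian operator and a memory-limited one can be sketched numerically. Everything below is invented for illustration: an ideal operator updates P(malfunction) with every cue in the evidence stream, while a working-memory-limited operator only uses the most recent few cues.

```python
# Invented diagnostic model: each cue is abnormal with probability 0.8
# under a malfunction and 0.1 under normal operation.
p_prior = 0.01
p_cue = {True: 0.8, False: 0.1}      # P(abnormal cue | malfunction state)

def update(prior, cues):
    """Sequential Bayesian update of P(malfunction) over a cue stream."""
    post = prior
    for c in cues:
        lik1 = p_cue[True] if c else 1 - p_cue[True]
        lik0 = p_cue[False] if c else 1 - p_cue[False]
        num = post * lik1
        post = num / (num + (1 - post) * lik0)
    return post

cues = [True, True, True, True, True]   # five abnormal readings
ideal = update(p_prior, cues)           # uses the whole stream
limited = update(p_prior, cues[-2:])    # only remembers the last two cues
print(round(ideal, 4), round(limited, 4))
```

With all five cues the malfunction is near certain, while the memory-limited update stays below 50%, which is the qualitative gap between ideal and human operators that the study measures.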
A nonparametric Bayesian approach for genetic evaluation in ...
African Journals Online (AJOL)
South African Journal of Animal Science ... the Bayesian and Classical models, a Bayesian procedure is provided which allows these random ... data from the Elsenburg Dormer sheep stud and data from a simulation experiment are utilized. >
Bayesian disease mapping: hierarchical modeling in spatial epidemiology
National Research Council Canada - National Science Library
Lawson, Andrew
2013-01-01
.... Exploring these new developments, Bayesian Disease Mapping: Hierarchical Modeling in Spatial Epidemiology, Second Edition provides an up-to-date, cohesive account of the full range of Bayesian disease mapping methods and applications...
Sparse reconstruction using distribution agnostic bayesian matching pursuit
Masood, Mudassir; Al-Naffouri, Tareq Y.
2013-01-01
A fast matching pursuit method using a Bayesian approach is introduced for sparse signal recovery. This method performs Bayesian estimates of sparse signals even when the signal prior is non-Gaussian or unknown. It is agnostic on signal statistics
The Bayesian Approach to Association
Arora, N. S.
2017-12-01
The Bayesian approach to association focuses mainly on quantifying the physics of the domain. In the case of seismic association, for instance, let X be the set of all significant events (above some threshold) and their attributes, such as location, time, and magnitude; let Y1 be the set of detections that are caused by significant events and their attributes, such as seismic phase, arrival time, amplitude, etc.; let Y2 be the set of detections that are not caused by significant events; and finally let Y be the set of observed detections. We then define the joint distribution P(X, Y1, Y2, Y) = P(X) P(Y1 | X) P(Y2) I(Y = Y1 + Y2), where the last term simply states that Y1 and Y2 are a partitioning of Y. Given the above joint distribution, the inference problem is simply to find the X, Y1, and Y2 that maximize the posterior probability P(X, Y1, Y2 | Y), which reduces to maximizing P(X) P(Y1 | X) P(Y2) I(Y = Y1 + Y2). In this expression, P(X) captures our prior belief about event locations; P(Y1 | X) captures notions of travel time and residual error distributions as well as detection and mis-detection probabilities; and P(Y2) captures the false detection rate of our seismic network. The elegance of this approach is that all of the assumptions are stated clearly in the models for P(X), P(Y1 | X) and P(Y2); the implementation of the inference is merely a by-product of this model. In contrast, some other methods, such as GA, hide a number of assumptions in the implementation details of the inference, such as the so-called "driver cells." The other important aspect of this approach is that all seismic knowledge, including knowledge from other domains such as infrasound and hydroacoustics, can be included in the same model, so we don't need to separately account for misdetections or merge seismic and infrasound events as a separate step. Finally, it should be noted that the objective of automatic association is to simplify the job of the humans who publish seismic bulletins based on this
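The maximization of P(X) P(Y1 | X) P(Y2) can be sketched on a toy case. Everything below is invented: one candidate event at a known location, two detections at one station, and made-up rates and residual widths; the code enumerates association patterns and scores each with a Gaussian arrival-time term for associated detections and a false-alarm density for the rest.

```python
import math
from itertools import product

# Invented toy parameters
false_rate = 0.02        # false detections per second (for P(Y2))
p_detect = 0.9           # P(station detects a real event) (for P(Y1 | X))
res_sd = 1.0             # arrival-time residual standard deviation (s)

event_time, travel = 10.0, 5.0     # predicted arrival at t = 15.0 s
detections = [15.2, 40.0]          # two observed arrival times (one station)

def log_score(assoc):
    """log of P(Y1 | X) * P(Y2) for one association pattern."""
    n_assoc = sum(assoc)
    if n_assoc > 1:
        return -math.inf           # one station: at most one true arrival
    s = math.log(p_detect if n_assoc == 1 else 1 - p_detect)
    for t, a in zip(detections, assoc):
        if a:                      # Gaussian travel-time residual
            r = t - (event_time + travel)
            s += -0.5 * (r / res_sd) ** 2 \
                 - math.log(res_sd * math.sqrt(2 * math.pi))
        else:                      # false-alarm density
            s += math.log(false_rate)
    return s

best = max(product([False, True], repeat=2), key=log_score)
print(best)
```

The 15.2 s detection is assigned to the event (0.2 s residual) while the 40.0 s detection is far cheaper to explain as a false alarm; a full associator does the same bookkeeping jointly over many stations, phases, and candidate events.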
Risk-Based Operation and Maintenance Using Bayesian Networks
DEFF Research Database (Denmark)
Nielsen, Jannie Jessen; Sørensen, John Dalsgaard
2011-01-01
This paper describes how risk-based decision making can be used for maintenance planning of components exposed to degradation such as fatigue in offshore wind turbines. In fatigue models, large epistemic uncertainties are usually present. These can be reduced if monitoring results are used to upd...
Schwartz, Jacob
1978-01-01
An improved long-life design for solar energy receivers provides for greatly reduced thermally induced stress and permits the utilization of less expensive heat exchanger materials while maintaining receiver efficiencies in excess of 85% without undue expenditure of energy to circulate the working fluid. In one embodiment, the flow index for the receiver is first set as close as practical to a value such that the Graetz number yields the optimal heat transfer coefficient per unit of pumping energy, in this case, 6. The convective index for the receiver is then set as closely as practical to two times the flow index so as to obtain optimal efficiency per unit mass of material.
Cryogenic microwave channelized receiver
International Nuclear Information System (INIS)
Rauscher, C.; Pond, J.M.; Tait, G.B.
1996-01-01
The channelized receiver being presented demonstrates the use of high temperature superconductor technology in a microwave system setting where superconductor, microwave-monolithic-integrated-circuit, and hybrid-integrated-circuit components are united in one package and cooled to liquid-nitrogen temperatures. The receiver consists of a superconducting X-band four-channel demultiplexer with 100-MHz-wide channels, four commercial monolithically integrated mixers, and four custom-designed hybrid-circuit detectors containing heterostructure ramp diodes. The composite receiver unit has been integrated into the payload of the second-phase NRL high temperature superconductor space experiment (HTSSE-II). Prior to payload assembly, the response characteristics of the receiver were measured as functions of frequency, temperature, and drive levels. The article describes the circuitry, discusses the key issues related to design and implementation, and summarizes the experimental results
National Research Council Canada - National Science Library
Wilkerson, Thomas
2000-01-01
...". The chosen vendor, Orca Photonics, Inc. (Redmond, WA), in close collaboration with USU personnel, built a portable, computerized lidar system that not only is suitable as a receiver for a near-IR alexandrite laser, but also contains an independent Nd...
Receiver Gain Modulation Circuit
Jones, Hollis; Racette, Paul; Walker, David; Gu, Dazhen
2011-01-01
A receiver gain modulation circuit (RGMC) was developed that modulates the power gain of the output of a radiometer receiver with a test signal. As the radiometer receiver switches between calibration noise references, the test signal is mixed with the calibrated noise and thus produces an ensemble set of measurements from which ensemble statistical analysis can be used to extract statistical information about the test signal. The RGMC is an enabling technology of the ensemble detector. As a key component for achieving ensemble detection and analysis, the RGMC has broad aeronautical and space applications. The RGMC can be used to test and develop new calibration algorithms, for example to detect gain anomalies and/or correct for slow drifts that affect climate-quality measurements over an accelerated time scale. A generalized approach to analyzing radiometer system designs yields a mathematical treatment of noise reference measurements in calibration algorithms. By treating the measurements from the different noise references as ensemble samples of the receiver state, i.e. receiver gain, a quantitative description of the non-stationary properties of the underlying receiver fluctuations can be derived. Excellent agreement has been obtained between model calculations and radiometric measurements. The mathematical formulation is equivalent to modulating the gain of a stable receiver with an externally generated signal and is the basis for ensemble detection and analysis (EDA). The concept of generating ensemble data sets using an ensemble detector is similar to the ensemble data sets generated as part of ensemble empirical mode decomposition (EEMD), with the exception of a key distinguishing factor: EEMD adds noise to the signal under study, whereas EDA mixes the signal with calibrated noise. It is mixing with calibrated noise that permits the measurement of temporal-functional variability of uncertainty in the underlying process. The RGMC permits the evaluation of EDA by
Bayesian uncertainty analyses of probabilistic risk models
International Nuclear Information System (INIS)
Pulkkinen, U.
1989-01-01
Applications of Bayesian principles to uncertainty analyses are discussed in the paper. A short review of the most important uncertainties and their causes is provided. An application of the principle of maximum entropy to the determination of Bayesian prior distributions is described. An approach based on so-called probabilistic structures is presented in order to develop a method for the quantitative evaluation of modelling uncertainties. The method is applied to a small example case. Ideas for application areas for the proposed method are discussed.
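The maximum-entropy construction of a prior mentioned in this abstract can be sketched concretely: among all distributions satisfying given moment constraints, pick the one with largest entropy. A minimal sketch for a discrete support with a mean constraint (the classic "loaded die" example; the function name and bisection approach are ours, not the paper's):

```python
import math

def maxent_prior(support, target_mean, tol=1e-10):
    """Maximum-entropy distribution on `support` with a fixed mean.
    The solution has the exponential-family form p_i ∝ exp(lam * x_i);
    bisection finds the Lagrange multiplier lam matching the mean."""
    support = list(support)

    def mean(lam):
        w = [math.exp(lam * x) for x in support]
        z = sum(w)
        return sum(x * wi for x, wi in zip(support, w)) / z

    lo, hi = -50.0, 50.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean(mid) < target_mean:   # mean is increasing in lam
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [math.exp(lam * x) for x in support]
    z = sum(w)
    return [wi / z for wi in w]

# die with faces 1..6 constrained to mean 4.5: higher faces get more mass
p = maxent_prior(range(1, 7), 4.5)
```

The same machinery extends to several simultaneous moment constraints, with one multiplier per constraint.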
Justifying Objective Bayesianism on Predicate Languages
Directory of Open Access Journals (Sweden)
Jürgen Landes
2015-04-01
Full Text Available Objective Bayesianism says that the strengths of one’s beliefs ought to be probabilities, calibrated to physical probabilities insofar as one has evidence of them, and otherwise sufficiently equivocal. These norms of belief are often explicated using the maximum entropy principle. In this paper we investigate the extent to which one can provide a unified justification of the objective Bayesian norms in the case in which the background language is a first-order predicate language, with a view to applying the resulting formalism to inductive logic. We show that the maximum entropy principle can be motivated largely in terms of minimising worst-case expected loss.
Motion Learning Based on Bayesian Program Learning
Directory of Open Access Journals (Sweden)
Cheng Meng-Zhen
2017-01-01
Full Text Available The concept of the virtual human has been highly anticipated since the 1980s. Using computer technology, human motion simulation can generate authentic visual effects that can fool the human eye. Bayesian Program Learning trains on one or a few motion sequences and generates new motion data by decomposing and recombining them, and the generated motion is more realistic and natural than that of traditional approaches. In this paper, motion learning based on Bayesian Program Learning allows us to quickly generate new motion data, reduce workload, improve work efficiency, reduce the cost of motion capture, and improve the reusability of data.
Nonparametric Bayesian Modeling of Complex Networks
DEFF Research Database (Denmark)
Schmidt, Mikkel Nørgaard; Mørup, Morten
2013-01-01
Modeling structure in complex networks using Bayesian nonparametrics makes it possible to specify flexible model structures and infer the adequate model complexity from the observed data. This article provides a gentle introduction to nonparametric Bayesian modeling of complex networks: using an infinite mixture model as a running example, we go through the steps of deriving the model as an infinite limit of a finite parametric model, inferring the model parameters by Markov chain Monte Carlo, and checking the model's fit and predictive performance. We explain how advanced nonparametric models...
Length Scales in Bayesian Automatic Adaptive Quadrature
Directory of Open Access Journals (Sweden)
Adam Gh.
2016-01-01
Full Text Available Two conceptual developments in the Bayesian automatic adaptive quadrature approach to the numerical solution of one-dimensional Riemann integrals [Gh. Adam, S. Adam, Springer LNCS 7125, 1–16 (2012)] are reported. First, it is shown that a numerical quadrature which avoids overcomputing and minimizes the hidden floating point loss of precision asks for the consideration of three classes of integration domain lengths endowed with specific quadrature sums: microscopic (trapezoidal rule), mesoscopic (Simpson rule), and macroscopic (quadrature sums of high algebraic degrees of precision). Second, sensitive diagnostic tools for the Bayesian inference on macroscopic ranges, coming from the use of Clenshaw-Curtis quadrature, are derived.
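The three length classes can be illustrated with a dispatcher that picks a quadrature sum from the interval length. This is only a sketch of the idea: the thresholds and the use of composite Simpson as a stand-in for a high-degree rule are our assumptions, not the authors' algorithm:

```python
def quad_by_length(f, a, b, micro=1e-3, meso=1e-1):
    """Choose a quadrature sum from the integration-domain length
    (thresholds hypothetical): microscopic -> trapezoidal rule,
    mesoscopic -> Simpson rule, macroscopic -> composite Simpson
    as a stand-in for a quadrature sum of high algebraic degree."""
    h = b - a
    if h <= micro:                       # microscopic: trapezoidal rule
        return 0.5 * h * (f(a) + f(b))
    if h <= meso:                        # mesoscopic: Simpson rule
        m = 0.5 * (a + b)
        return h / 6.0 * (f(a) + 4.0 * f(m) + f(b))
    n = 64                               # macroscopic: composite rule, n even
    s = f(a) + f(b)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(a + i * h / n)
    return s * h / (3.0 * n)

approx = quad_by_length(lambda x: x * x, 0.0, 1.0)   # exact value is 1/3
tiny = quad_by_length(lambda x: x, 0.0, 1e-4)        # trapezoid, exact here
```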
Bayesian parameter estimation in probabilistic risk assessment
International Nuclear Information System (INIS)
Siu, Nathan O.; Kelly, Dana L.
1998-01-01
Bayesian statistical methods are widely used in probabilistic risk assessment (PRA) because of their ability to provide useful estimates of model parameters when data are sparse and because the subjective probability framework, from which these methods are derived, is a natural framework to address the decision problems motivating PRA. This paper presents a tutorial on Bayesian parameter estimation especially relevant to PRA. It summarizes the philosophy behind these methods, approaches for constructing likelihood functions and prior distributions, some simple but realistic examples, and a variety of cautions and lessons regarding practical applications. References are also provided for more in-depth coverage of various topics.
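A staple of the kind of PRA estimation this tutorial covers is the conjugate gamma-Poisson update of a failure rate from sparse event counts. A minimal sketch with invented plant data (the prior choice and numbers are illustrative, not from the paper):

```python
def gamma_poisson_update(alpha, beta, failures, exposure_time):
    """Conjugate Bayesian update for a Poisson failure rate:
    a Gamma(alpha, beta) prior combined with `failures` events observed
    over `exposure_time` gives a Gamma(alpha + failures,
    beta + exposure_time) posterior."""
    a_post = alpha + failures
    b_post = beta + exposure_time
    mean_rate = a_post / b_post       # posterior mean failure rate
    return a_post, b_post, mean_rate

# hypothetical data: a vague Jeffreys-style Gamma(0.5, 0) prior and
# 2 failures observed over 10,000 component-hours
a, b, rate = gamma_poisson_update(0.5, 0.0, 2, 10000.0)
```

With more data the posterior concentrates, and the prior's influence fades, which is exactly the sparse-data behavior that motivates Bayesian methods in PRA.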
Bayesian estimation and tracking a practical guide
Haug, Anton J
2012-01-01
A practical approach to estimating and tracking dynamic systems in real-world applications. Much of the literature on performing estimation for non-Gaussian systems is short on practical methodology, while Gaussian methods often lack a cohesive derivation. Bayesian Estimation and Tracking addresses the gap in the field on both accounts, providing readers with a comprehensive overview of methods for estimating both linear and nonlinear dynamic systems driven by Gaussian and non-Gaussian noises. Featuring a unified approach to Bayesian estimation and tracking, the book emphasizes the derivation
Zador, Zsolt; Huang, Wendy; Sperrin, Matthew; Lawton, Michael T
2018-06-01
Following the International Subarachnoid Aneurysm Trial (ISAT), evolving treatment modalities for acute aneurysmal subarachnoid hemorrhage (aSAH) have changed the case mix of patients undergoing urgent surgical clipping. To update our knowledge on outcome predictors, we analyzed admission parameters in a pure surgical series using variable importance ranking and machine learning. We reviewed a single surgeon's case series of 226 patients suffering from aSAH treated with urgent surgical clipping. Predictions were made using logistic regression models, and predictive performance was assessed using areas under the receiver operating characteristic curve (AUC). We established variable importance ranking using partial Nagelkerke R2 scores. Probabilistic associations between variables were depicted using Bayesian networks, a method of machine learning. Importance ranking showed that World Federation of Neurosurgical Societies (WFNS) grade and age were the most influential outcome prognosticators. Inclusion of only these 2 predictors was sufficient to maintain model performance compared to when all variables were considered (AUC = 0.8222, 95% confidence interval (CI): 0.7646-0.88 vs 0.8218, 95% CI: 0.7616-0.8821, respectively, DeLong's P = .992). Bayesian networks showed that age and WFNS grade were associated with several variables such as laboratory results and cardiorespiratory parameters. Our study is the first to report early outcomes and formal predictor importance ranking following aSAH in a post-ISAT surgical case series. Models showed good predictive power with fewer relevant predictors than in similar size series. Bayesian networks proved to be a powerful tool in visualizing the widespread association of the 2 key predictors with admission variables, explaining their importance and demonstrating the potential for hypothesis generation.
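The AUC reported above has a simple library-free implementation via the Mann-Whitney rank identity; a small sketch on toy scores (not the study's data):

```python
def auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney identity:
    the probability that a randomly chosen positive case outscores a
    randomly chosen negative case, counting ties as one half."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

perfect = auc([0.9, 0.8, 0.3, 0.2], [1, 1, 0, 0])  # fully separated classes
tied = auc([0.5, 0.5], [1, 0])                     # pure ties: chance level
```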
Bayesian fault detection and isolation using Field Kalman Filter
Baranowski, Jerzy; Bania, Piotr; Prasad, Indrajeet; Cong, Tian
2017-12-01
Fault detection and isolation is crucial for the efficient operation and safety of any industrial process. A variety of methods from all areas of data analysis are employed to solve this kind of task, such as Bayesian reasoning and Kalman filtering. In this paper, the authors use a discrete Field Kalman Filter (FKF) to detect and recognize faulty conditions in a system. The proposed approach, devised for stochastic linear systems, allows for the analysis of faults that can be expressed as both parameter and disturbance variations. The approach is formulated for situations when the fault catalog is known, resulting in an algorithm that allows the estimation of probability values. Additionally, a variant of the algorithm with greater numerical robustness is presented, based on the computation of logarithmic odds. The operation of the proposed algorithm is illustrated with numerical examples, and both its merits and limitations are critically discussed and compared with those of the traditional extended Kalman filter (EKF).
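The log-space variant mentioned in the abstract can be sketched generically: update posterior probabilities over a known fault catalog using log-likelihoods and log-sum-exp normalization for numerical robustness (catalog and likelihood values below are invented, not the paper's FKF equations):

```python
import math

def update_log_posteriors(log_post, log_likelihoods):
    """One Bayesian step over a known fault catalog, done in log space
    (the numerically robust log-odds style the abstract mentions)."""
    unnorm = [lp + ll for lp, ll in zip(log_post, log_likelihoods)]
    z = max(unnorm)
    z += math.log(sum(math.exp(u - z) for u in unnorm))  # log-sum-exp
    return [u - z for u in unnorm]

# hypothetical catalog {nominal, fault A, fault B} with a uniform prior,
# and one residual observation that fault A explains best
log_post = [math.log(1.0 / 3.0)] * 3
log_post = update_log_posteriors(log_post, [math.log(0.1),
                                            math.log(0.7),
                                            math.log(0.2)])
post = [math.exp(v) for v in log_post]
```

Repeating the update over a stream of residuals concentrates the posterior on the fault that consistently explains the data.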
Forecasting the 2012 and 2014 Elections Using Bayesian Prediction and Optimization
Directory of Open Access Journals (Sweden)
Steven E. Rigdon
2015-04-01
Full Text Available This article presents a data-driven Bayesian model used to predict the state-by-state winners in the Senate and presidential elections in 2012 and 2014. The Bayesian model takes into account the proportions of polled subjects who favor each candidate and the proportion who are undecided, and produces a posterior probability that each candidate will win each state. From this, a dynamic programming algorithm is used to compute the probability mass functions for the number of electoral votes that each presidential candidate receives and the number of Senate seats that each party receives. On the final day before the 2012 election, the model gave a probability of (essentially) one that President Obama would be reelected, and that the Democrats would retain control of the U.S. Senate. In 2014, the model gave a final probability of .99 that the Republicans would take control of the Senate.
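The dynamic program the article describes is a convolution of independent per-state Bernoulli wins into a PMF over total electoral votes. A minimal sketch with an invented three-state race (states, vote counts, and win probabilities are hypothetical):

```python
def electoral_vote_pmf(states):
    """Dynamic program: convolve independent per-state win probabilities
    into the PMF of a candidate's total electoral votes.
    `states` is a list of (electoral_votes, posterior_win_probability)."""
    pmf = {0: 1.0}
    for votes, p_win in states:
        nxt = {}
        for total, prob in pmf.items():
            nxt[total + votes] = nxt.get(total + votes, 0.0) + prob * p_win
            nxt[total] = nxt.get(total, 0.0) + prob * (1.0 - p_win)
        pmf = nxt
    return pmf

# hypothetical race over 35 total electoral votes; 18 needed to win
pmf = electoral_vote_pmf([(10, 0.9), (20, 0.5), (5, 0.2)])
p_win_majority = sum(p for ev, p in pmf.items() if ev >= 18)
```

With 50 states the same loop runs in milliseconds, since the PMF support is bounded by the total number of electoral votes.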
A Fast Iterative Bayesian Inference Algorithm for Sparse Channel Estimation
DEFF Research Database (Denmark)
Pedersen, Niels Lovmand; Manchón, Carles Navarro; Fleury, Bernard Henri
2013-01-01
representation of the Bessel K probability density function; a highly efficient, fast iterative Bayesian inference method is then applied to the proposed model. The resulting estimator outperforms other state-of-the-art Bayesian and non-Bayesian estimators, either by yielding lower mean squared estimation error...
A Gentle Introduction to Bayesian Analysis : Applications to Developmental Research
Van de Schoot, Rens; Kaplan, David; Denissen, Jaap; Asendorpf, Jens B.; Neyer, Franz J.; van Aken, Marcel A G
2014-01-01
Bayesian statistical methods are becoming ever more popular in applied and fundamental research. In this study a gentle introduction to Bayesian analysis is provided. It is shown under what circumstances it is attractive to use Bayesian estimation, and how to interpret properly the results. First,
A default Bayesian hypothesis test for ANOVA designs
Wetzels, R.; Grasman, R.P.P.P.; Wagenmakers, E.J.
2012-01-01
This article presents a Bayesian hypothesis test for analysis of variance (ANOVA) designs. The test is an application of standard Bayesian methods for variable selection in regression models. We illustrate the effect of various g-priors on the ANOVA hypothesis test. The Bayesian test for ANOVA
Prior approval: the growth of Bayesian methods in psychology.
Andrews, Mark; Baguley, Thom
2013-02-01
Within the last few years, Bayesian methods of data analysis in psychology have proliferated. In this paper, we briefly review the history of the Bayesian approach to statistics, and consider the implications that Bayesian methods have for the theory and practice of data analysis in psychology.
Strong quantum solutions in conflicting-interest Bayesian games
Rai, Ashutosh; Paul, Goutam
2017-10-01
Quantum entanglement has been recently demonstrated as a useful resource in conflicting-interest games of incomplete information between two players, Alice and Bob [Pappa et al., Phys. Rev. Lett. 114, 020401 (2015), 10.1103/PhysRevLett.114.020401]. The general setting for such games is that of correlated strategies where the correlation between competing players is established through a trusted common adviser; however, players need not reveal their input to the adviser. So far, the quantum advantage in such games has been revealed in a restricted sense. Given a quantum correlated equilibrium strategy, one of the players can still receive a higher than quantum average payoff with some classically correlated equilibrium strategy. In this work, by considering a class of asymmetric Bayesian games, we show the existence of games with quantum correlated equilibrium where the average payoff of both the players exceeds the respective individual maximum for each player over all classically correlated equilibriums.
'Chaos' in superregenerative receivers
International Nuclear Information System (INIS)
Commercon, Jean-Claude; Badard, Robert
2005-01-01
The superregenerative principle has been known since the early 1920s. The circuit is extremely simple and extremely sensitive. Today, superheterodyne receivers generally supplant superregenerative receivers in most applications because the latter have several undesirable characteristics: poor selectivity, reradiation, etc. Superregenerative receivers have nevertheless undergone a revival in recent papers on wireless systems where low cost and very low power consumption are relevant: house/building meters (such as water, energy, and gas meters), the personal computer environment (keyboard, mouse), etc. Another drawback is the noise level, which is higher than that of a well-designed superheterodyne receiver; without an antenna input signal, the output of the receiver sounds in an earphone like waterfall noise; this sound is principally the inherent input noise amplified and detected by the circuit. However, when the input noise is negligible with respect to the antenna input signal, we are faced with another source of 'noise' self-generated by the superregenerative operation. The main objective of this paper concerns this self-generated noise, which arises from exponential growth followed by a re-injection process for which the final state is a function of the phase of the input signal
Inferring on the Intentions of Others by Hierarchical Bayesian Learning
Diaconescu, Andreea O.; Mathys, Christoph; Weber, Lilian A. E.; Daunizeau, Jean; Kasper, Lars; Lomakina, Ekaterina I.; Fehr, Ernst; Stephan, Klaas E.
2014-01-01
Inferring on others' (potentially time-varying) intentions is a fundamental problem during many social transactions. To investigate the underlying mechanisms, we applied computational modeling to behavioral data from an economic game in which 16 pairs of volunteers (randomly assigned to “player” or “adviser” roles) interacted. The player performed a probabilistic reinforcement learning task, receiving information about a binary lottery from a visual pie chart. The adviser, who received more predictive information, issued an additional recommendation. Critically, the game was structured such that the adviser's incentives to provide helpful or misleading information varied in time. Using a meta-Bayesian modeling framework, we found that the players' behavior was best explained by the deployment of hierarchical learning: they inferred upon the volatility of the advisers' intentions in order to optimize their predictions about the validity of their advice. Beyond learning, volatility estimates also affected the trial-by-trial variability of decisions: participants were more likely to rely on their estimates of advice accuracy for making choices when they believed that the adviser's intentions were presently stable. Finally, our model of the players' inference predicted the players' interpersonal reactivity index (IRI) scores, explicit ratings of the advisers' helpfulness and the advisers' self-reports on their chosen strategy. Overall, our results suggest that humans (i) employ hierarchical generative models to infer on the changing intentions of others, (ii) use volatility estimates to inform decision-making in social interactions, and (iii) integrate estimates of advice accuracy with non-social sources of information. The Bayesian framework presented here can quantify individual differences in these mechanisms from simple behavioral readouts and may prove useful in future clinical studies of maladaptive social cognition. PMID:25187943
Posterior consistency for Bayesian inverse problems through stability and regression results
International Nuclear Information System (INIS)
Vollmer, Sebastian J
2013-01-01
In the Bayesian approach, the a priori knowledge about the input of a mathematical model is described via a probability measure. The joint distribution of the unknown input and the data is then conditioned, using Bayes’ formula, giving rise to the posterior distribution on the unknown input. In this setting we prove posterior consistency for nonlinear inverse problems: a sequence of data is considered, with diminishing fluctuations around a single truth and it is then of interest to show that the resulting sequence of posterior measures arising from this sequence of data concentrates around the truth used to generate the data. Posterior consistency justifies the use of the Bayesian approach very much in the same way as error bounds and convergence results for regularization techniques do. As a guiding example, we consider the inverse problem of reconstructing the diffusion coefficient from noisy observations of the solution to an elliptic PDE in divergence form. This problem is approached by splitting the forward operator into the underlying continuum model and a simpler observation operator based on the output of the model. In general, these splittings allow us to conclude posterior consistency provided a deterministic stability result for the underlying inverse problem and a posterior consistency result for the Bayesian regression problem with the push-forward prior. Moreover, we prove posterior consistency for the Bayesian regression problem based on the regularity, the tail behaviour and the small ball probabilities of the prior. (paper)
Statistical Bayesian method for reliability evaluation based on ADT data
Lu, Dawei; Wang, Lizhi; Sun, Yusheng; Wang, Xiaohong
2018-05-01
Accelerated degradation testing (ADT) is frequently conducted in the laboratory to predict a product's reliability under normal operating conditions. Two kinds of methods, degradation path models and stochastic process models, are utilized to analyze degradation data, and the latter is the more popular. However, limitations remain, such as an imprecise solution process and imprecise estimates of the degradation rate, which may affect the accuracy of the acceleration model and the extrapolated values. Moreover, the usual solution to this problem, the Bayesian method, loses key information when unifying the degradation data. In this paper, a new data processing and parameter inference method based on the Bayesian method is proposed to handle degradation data and solve the problems above. First, a Wiener process and an acceleration model are chosen; second, the initial values of the degradation model and the parameters of the prior and posterior distributions at each stress level are calculated by updating and iterating the estimates; third, the lifetime and reliability values are estimated on the basis of the estimated parameters; finally, a case study is provided to demonstrate the validity of the proposed method. The results illustrate that the proposed method is quite effective and accurate in estimating the lifetime and reliability of a product.
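The Wiener-process degradation model underlying the abstract is X(t) = drift·t + sigma·B(t), where B is standard Brownian motion; the drift's maximum-likelihood estimate is simply the total increment over the total time. A minimal simulation sketch (parameters invented; the paper's full method additionally links drift to stress level via an acceleration model):

```python
import random

def simulate_wiener(drift, sigma, dt, steps, seed=0):
    """Simulate one Wiener degradation path X(t) = drift*t + sigma*B(t)
    on a regular grid of spacing dt."""
    rng = random.Random(seed)
    x, path = 0.0, [0.0]
    for _ in range(steps):
        x += drift * dt + sigma * (dt ** 0.5) * rng.gauss(0.0, 1.0)
        path.append(x)
    return path

def estimate_drift(path, dt):
    """MLE of the drift: total observed increment over total time."""
    return (path[-1] - path[0]) / (dt * (len(path) - 1))

path = simulate_wiener(drift=0.5, sigma=0.05, dt=0.1, steps=1000)
drift_hat = estimate_drift(path, 0.1)
```

A Bayesian treatment would place a prior on the drift and update it with each path, which is the step the paper refines.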
DATMAN: A reliability data analysis program using Bayesian updating
International Nuclear Information System (INIS)
Becker, M.; Feltus, M.A.
1996-01-01
Preventive maintenance (PM) techniques focus on the prevention of failures, in particular, system components that are important to plant functions. Reliability-centered maintenance (RCM) improves on the PM techniques by introducing a set of guidelines by which to evaluate the system functions. It also minimizes intrusive maintenance, labor, and equipment downtime without sacrificing system performance when its function is essential for plant safety. Both the PM and RCM approaches require that system reliability data be updated as more component failures and operation time are acquired. Systems reliability and the likelihood of component failures can be calculated by Bayesian statistical methods, which can update these data. The DATMAN computer code has been developed at Penn State to simplify the Bayesian analysis by performing tedious calculations needed for RCM reliability analysis. DATMAN reads data for updating, fits a distribution that best fits the data, and calculates component reliability. DATMAN provides a user-friendly interface menu that allows the user to choose from several common prior and posterior distributions, insert new failure data, and visually select the distribution that matches the data most accurately
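The Bayesian updating step DATMAN automates can be illustrated with the simplest conjugate case, a Beta prior on a failure-on-demand probability (the prior choice and counts below are invented; DATMAN itself offers several prior/posterior families):

```python
def beta_binomial_update(a, b, failures, demands):
    """Conjugate update for a failure-on-demand probability:
    a Beta(a, b) prior plus `failures` failures in `demands` trials
    gives a Beta(a + failures, b + demands - failures) posterior."""
    a_post = a + failures
    b_post = b + demands - failures
    return a_post, b_post, a_post / (a_post + b_post)

# hypothetical component record: Jeffreys Beta(0.5, 0.5) prior and
# 1 failure observed in 50 demands
a, b, p_fail = beta_binomial_update(0.5, 0.5, 1, 50)
```

As new failure data arrive, the posterior from one update becomes the prior for the next, which is exactly the RCM data-refresh cycle the abstract describes.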
MODELING INFORMATION SYSTEM AVAILABILITY BY USING BAYESIAN BELIEF NETWORK APPROACH
Directory of Open Access Journals (Sweden)
Semir Ibrahimović
2016-03-01
Full Text Available Modern information systems are expected to be always-on, providing services to end-users regardless of time and location. This is particularly important for organizations and industries where information systems support real-time operations and mission-critical applications that need to be available on a 24/7/365 basis. Examples of such entities include process industries, telecommunications, healthcare, energy, banking, electronic commerce and a variety of cloud services. This article presents a modified Bayesian Belief Network model for predicting information system availability, introduced initially by Franke, U. and Johnson, P. (in the article “Availability of enterprise IT systems – an expert based Bayesian model”, Software Quality Journal 20(2), 369-394, 2012). Based on a thorough review of several dimensions of information system availability, we propose a modified set of determinants. The model is parameterized by using a probability elicitation process with the participation of experts from the financial sector of Bosnia and Herzegovina. The model validation was performed using Monte Carlo simulation.
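The core computation in such a belief-network availability model is marginalizing a conditional probability table over the states of the parent determinants. A deliberately tiny sketch with two invented parents and invented CPT entries (the real model has many more determinants, elicited from experts):

```python
def availability(p_power_ok, p_net_ok, p_sys_given):
    """Marginalize system availability over two independent parent nodes
    (power, network). `p_sys_given` maps (power_ok, net_ok) -> P(system up).
    All numbers here are hypothetical, for illustration only."""
    total = 0.0
    for pw, p_pw in ((True, p_power_ok), (False, 1.0 - p_power_ok)):
        for nt, p_nt in ((True, p_net_ok), (False, 1.0 - p_net_ok)):
            total += p_pw * p_nt * p_sys_given[(pw, nt)]
    return total

cpt = {(True, True): 0.999, (True, False): 0.3,
       (False, True): 0.1, (False, False): 0.0}
avail = availability(0.99, 0.95, cpt)
```

Expert elicitation fills in exactly these CPT entries; Monte Carlo validation then propagates uncertainty about them through the same marginalization.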
Bayesian Meta-Analysis of Coefficient Alpha
Brannick, Michael T.; Zhang, Nanhua
2013-01-01
The current paper describes and illustrates a Bayesian approach to the meta-analysis of coefficient alpha. Alpha is the most commonly used estimate of the reliability or consistency (freedom from measurement error) for educational and psychological measures. The conventional approach to meta-analysis uses inverse variance weights to combine…
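The conventional inverse-variance weighting mentioned above is a one-liner worth seeing explicitly; a fixed-effect sketch with invented study alphas and sampling variances (the paper's Bayesian approach replaces this point estimate with a full posterior):

```python
def inverse_variance_pool(estimates, variances):
    """Fixed-effect meta-analytic pool: weight each study's estimate by
    the inverse of its sampling variance; the pooled variance is the
    reciprocal of the summed weights."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, estimates)) / sum(weights)
    pooled_var = 1.0 / sum(weights)
    return pooled, pooled_var

# hypothetical coefficient-alpha estimates from three studies
pooled, var = inverse_variance_pool([0.80, 0.85, 0.75],
                                    [0.002, 0.001, 0.004])
```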
Bayesian decision theory : A simple toy problem
van Erp, H.R.N.; Linger, R.O.; van Gelder, P.H.A.J.M.
2016-01-01
We give here a comparison of the expected outcome theory, the expected utility theory, and the Bayesian decision theory, by way of a simple numerical toy problem in which we look at the investment willingness to avert a high impact low probability event. It will be found that for this toy problem
Optimal Detection under the Restricted Bayesian Criterion
Directory of Open Access Journals (Sweden)
Shujun Liu
2017-07-01
Full Text Available This paper aims to find a suitable decision rule for a binary composite hypothesis-testing problem with a partial or coarse prior distribution. To alleviate the negative impact of the information uncertainty, a constraint is considered that the maximum conditional risk cannot be greater than a predefined value. Therefore, the objective of this paper becomes to find the optimal decision rule to minimize the Bayes risk under the constraint. By applying the Lagrange duality, the constrained optimization problem is transformed to an unconstrained optimization problem. In doing so, the restricted Bayesian decision rule is obtained as a classical Bayesian decision rule corresponding to a modified prior distribution. Based on this transformation, the optimal restricted Bayesian decision rule is analyzed and the corresponding algorithm is developed. Furthermore, the relation between the Bayes risk and the predefined value of the constraint is also discussed. The Bayes risk obtained via the restricted Bayesian decision rule is a strictly decreasing and convex function of the constraint on the maximum conditional risk. Finally, the numerical results including a detection example are presented and agree with the theoretical results.
Heuristics as Bayesian inference under extreme priors.
Parpart, Paula; Jones, Matt; Love, Bradley C
2018-05-01
Simple heuristics are often regarded as tractable decision strategies because they ignore a great deal of information in the input data. One puzzle is why heuristics can outperform full-information models, such as linear regression, which make full use of the available information. These "less-is-more" effects, in which a relatively simpler model outperforms a more complex model, are prevalent throughout cognitive science, and are frequently argued to demonstrate an inherent advantage of simplifying computation or ignoring information. In contrast, we show at the computational level (where algorithmic restrictions are set aside) that it is never optimal to discard information. Through a formal Bayesian analysis, we prove that popular heuristics, such as tallying and take-the-best, are formally equivalent to Bayesian inference under the limit of infinitely strong priors. Varying the strength of the prior yields a continuum of Bayesian models with the heuristics at one end and ordinary regression at the other. Critically, intermediate models perform better across all our simulations, suggesting that down-weighting information with the appropriate prior is preferable to entirely ignoring it. Rather than because of their simplicity, our analyses suggest heuristics perform well because they implement strong priors that approximate the actual structure of the environment. We end by considering how new heuristics could be derived by infinitely strengthening the priors of other Bayesian models. These formal results have implications for work in psychology, machine learning and economics. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
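The prior-strength continuum described above can be sketched with ridge regression, i.e. Bayesian linear regression under a zero-mean Gaussian prior of strength lam on the weights. This is our own illustrative construction, not the authors' exact formalization of tallying or take-the-best; the data are synthetic:

```python
import numpy as np

def ridge_weights(X, y, lam):
    """Ridge solution w = (X^T X + lam I)^{-1} X^T y: lam = 0 recovers
    ordinary regression; as lam grows, w approaches X^T y / lam, i.e.
    weights proportional to raw cue-criterion covariances, ignoring
    cue inter-correlations -- a heuristic-like limit."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# hypothetical cue-criterion data (two cues, 100 cases)
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
y = 2.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.1, size=100)

w_ols = ridge_weights(X, y, 0.0)       # weak prior: full-information fit
w_strong = ridge_weights(X, y, 1e6)    # strong prior: shrunken weights
```

Varying lam between these extremes traces out the continuum of intermediate models that the paper finds to perform best.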
A strongly quasiconvex PAC-Bayesian bound
DEFF Research Database (Denmark)
Thiemann, Niklas; Igel, Christian; Wintenberger, Olivier
2017-01-01
We propose a new PAC-Bayesian bound and a way of constructing a hypothesis space, so that the bound is convex in the posterior distribution and also convex in a trade-off parameter between empirical performance of the posterior distribution and its complexity. The complexity is measured by the Ku...
Multisnapshot Sparse Bayesian Learning for DOA
DEFF Research Database (Denmark)
Gerstoft, Peter; Mecklenbrauker, Christoph F.; Xenaki, Angeliki
2016-01-01
The directions of arrival (DOA) of plane waves are estimated from multisnapshot sensor array data using sparse Bayesian learning (SBL). The prior for the source amplitudes is assumed independent zero-mean complex Gaussian distributed with hyperparameters, the unknown variances (i.e., the source...
Approximate Bayesian evaluations of measurement uncertainty
Possolo, Antonio; Bodnar, Olha
2018-04-01
The Guide to the Expression of Uncertainty in Measurement (GUM) includes formulas that produce an estimate of a scalar output quantity that is a function of several input quantities, and an approximate evaluation of the associated standard uncertainty. This contribution presents approximate, Bayesian counterparts of those formulas for the case where the output quantity is a parameter of the joint probability distribution of the input quantities, also taking into account any information about the value of the output quantity available prior to measurement expressed in the form of a probability distribution on the set of possible values for the measurand. The approximate Bayesian estimates and uncertainty evaluations that we present have a long history and illustrious pedigree, and provide sufficiently accurate approximations in many applications, yet are very easy to implement in practice. Differently from exact Bayesian estimates, which involve either (analytical or numerical) integrations, or Markov Chain Monte Carlo sampling, the approximations that we describe involve only numerical optimization and simple algebra. Therefore, they make Bayesian methods widely accessible to metrologists. We illustrate the application of the proposed techniques in several instances of measurement: isotopic ratio of silver in a commercial silver nitrate; odds of cryptosporidiosis in AIDS patients; height of a manometer column; mass fraction of chromium in a reference material; and potential-difference in a Zener voltage standard.
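The GUM formula underlying this discussion is first-order propagation of uncertainty, u_y^2 = sum_i (df/dx_i)^2 u_i^2 for independent inputs. A minimal sketch with numerical sensitivity coefficients and an invented measurement (resistance from voltage and current; not one of the paper's case studies):

```python
def propagate(f, x, u, h=1e-6):
    """First-order (GUM-style) propagation for independent inputs:
    u_y^2 = sum_i (df/dx_i)^2 * u_i^2, with forward-difference
    sensitivity coefficients."""
    y = f(x)
    var = 0.0
    for i, ui in enumerate(u):
        xp = list(x)
        xp[i] += h
        c_i = (f(xp) - y) / h        # sensitivity coefficient df/dx_i
        var += (c_i * ui) ** 2
    return y, var ** 0.5

# hypothetical measurement: R = V / I with standard uncertainties u(V), u(I)
R, uR = propagate(lambda z: z[0] / z[1], [10.0, 2.0], [0.05, 0.01])
```

The approximate Bayesian counterparts the paper presents modify these same ingredients, so they stay equally easy to implement.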
Inverse Problems in a Bayesian Setting
Matthies, Hermann G.
2016-02-13
In a Bayesian setting, inverse problems and uncertainty quantification (UQ)—the propagation of uncertainty through a computational (forward) model—are strongly connected. In the form of conditional expectation the Bayesian update becomes computationally attractive. We give a detailed account of this approach via conditional approximation, various approximations, and the construction of filters. Together with a functional or spectral approach for the forward UQ there is no need for time-consuming and slowly convergent Monte Carlo sampling. The developed sampling-free non-linear Bayesian update in form of a filter is derived from the variational problem associated with conditional expectation. This formulation in general calls for further discretisation to make the computation possible, and we choose a polynomial approximation. After giving details on the actual computation in the framework of functional or spectral approximations, we demonstrate the workings of the algorithm on a number of examples of increasing complexity. At last, we compare the linear and nonlinear Bayesian update in form of a filter on some examples.
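In the linear-Gaussian case, the conditional-expectation update the abstract describes reduces to the familiar Kalman form. A scalar sketch (notation ours, not the paper's functional-approximation machinery):

```python
def linear_bayes_update(m, P, y, H, R):
    """Scalar linear Bayesian update via conditional expectation
    (Kalman form): prior N(m, P), observation y = H*x + noise with
    noise variance R."""
    S = H * P * H + R          # innovation (predicted observation) variance
    K = P * H / S              # gain: prior-to-data weighting
    m_post = m + K * (y - H * m)
    P_post = (1.0 - K * H) * P
    return m_post, P_post

# prior N(0, 4); observe y = 1.2 with unit noise variance and H = 1
m, P = linear_bayes_update(0.0, 4.0, 1.2, 1.0, 1.0)
```

The paper's sampling-free nonlinear filter generalizes exactly this update by replacing the linear gain with a polynomial approximation of the conditional expectation.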
Error probabilities in default Bayesian hypothesis testing
Gu, Xin; Hoijtink, Herbert; Mulder, J,
2016-01-01
This paper investigates the classical type I and type II error probabilities of default Bayes factors for a Bayesian t test. Default Bayes factors quantify the relative evidence between the null hypothesis and the unrestricted alternative hypothesis without needing to specify prior distributions for
Bayesian Averaging is Well-Temperated
DEFF Research Database (Denmark)
Hansen, Lars Kai
2000-01-01
Bayesian predictions are stochastic just like predictions of any other inference scheme that generalize from a finite sample. While a simple variational argument shows that Bayes averaging is generalization optimal given that the prior matches the teacher parameter distribution the situation is l...
Robust bayesian inference of generalized Pareto distribution ...
African Journals Online (AJOL)
Using an exhaustive Monte Carlo study, we show that, with a suitable generalized loss function, a robust Bayesian estimator of the model can be constructed. Key words: Bayesian estimation; Extreme value; Generalized Fisher information; Generalized Pareto distribution; Monte Carlo; ...
Evidence Estimation for Bayesian Partially Observed MRFs
Chen, Y.; Welling, M.
2013-01-01
Bayesian estimation in Markov random fields is very hard due to the intractability of the partition function. The introduction of hidden units makes the situation even worse due to the presence of potentially very many modes in the posterior distribution. For the first time we propose a
Inverse Problems in a Bayesian Setting
Matthies, Hermann G.; Zander, Elmar; Rosić, Bojana V.; Litvinenko, Alexander; Pajonk, Oliver
2016-01-01
In a Bayesian setting, inverse problems and uncertainty quantification (UQ)—the propagation of uncertainty through a computational (forward) model—are strongly connected. In the form of conditional expectation the Bayesian update becomes computationally attractive. We give a detailed account of this approach via conditional expectation, its various approximations, and the construction of filters. Together with a functional or spectral approach for the forward UQ, there is no need for time-consuming and slowly convergent Monte Carlo sampling. The developed sampling-free non-linear Bayesian update in the form of a filter is derived from the variational problem associated with conditional expectation. This formulation in general calls for further discretisation to make the computation possible, and we choose a polynomial approximation. After giving details on the actual computation in the framework of functional or spectral approximations, we demonstrate the workings of the algorithm on a number of examples of increasing complexity. Finally, we compare the linear and non-linear Bayesian update in the form of a filter on some examples.
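When the update map is restricted to be linear, the conditional-expectation filter described above reduces to the familiar Kalman measurement update. A minimal numerical sketch (the numbers are invented; the paper's polynomial chaos machinery is not reproduced here):

```python
import numpy as np

def kalman_update(m, P, y, H, R):
    """Condition a Gaussian prior N(m, P) on an observation y = H x + noise,
    noise ~ N(0, R). Returns the posterior mean and covariance."""
    S = H @ P @ H.T + R                      # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
    m_post = m + K @ (y - H @ m)             # updated mean
    P_post = (np.eye(len(m)) - K @ H) @ P    # updated covariance
    return m_post, P_post

# Two-dimensional state, one noisy scalar observation of its first component.
m = np.array([0.0, 0.0])
P = np.eye(2)
H = np.array([[1.0, 0.0]])
R = np.array([[0.5]])
y = np.array([1.2])

m_post, P_post = kalman_update(m, P, y, H, R)
print(m_post)        # mean pulled toward the observation
print(P_post[0, 0])  # variance of the observed component shrinks below 1.0
```

The non-linear update of the abstract generalizes this by allowing higher-order polynomial maps from the observation to the state.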
A Bayesian perspective on some replacement strategies
International Nuclear Information System (INIS)
Mazzuchi, Thomas A.; Soyer, Refik
1996-01-01
In this paper we present a Bayesian decision theoretic approach for determining optimal replacement strategies. This approach enables us to formally incorporate, express, and update our uncertainty when determining optimal replacement strategies. We develop relevant expressions for both the block replacement protocol with minimal repair and the age replacement protocol and illustrate the use of our approach with real data
Comparison between Fisherian and Bayesian approach to ...
African Journals Online (AJOL)
... of its simplicity and optimality properties is normally used for two-group cases. However, the Bayesian approach is found to be better than Fisher's approach because of its low misclassification error rate. Keywords: variance-covariance matrices, centroids, prior probability, Mahalanobis distance, probability of misclassification ...
BAYESIAN ESTIMATION OF THERMONUCLEAR REACTION RATES
Energy Technology Data Exchange (ETDEWEB)
Iliadis, C.; Anderson, K. S. [Department of Physics and Astronomy, University of North Carolina at Chapel Hill, Chapel Hill, NC 27599-3255 (United States); Coc, A. [Centre de Sciences Nucléaires et de Sciences de la Matière (CSNSM), CNRS/IN2P3, Univ. Paris-Sud, Université Paris–Saclay, Bâtiment 104, F-91405 Orsay Campus (France); Timmes, F. X.; Starrfield, S., E-mail: iliadis@unc.edu [School of Earth and Space Exploration, Arizona State University, Tempe, AZ 85287-1504 (United States)
2016-11-01
The problem of estimating non-resonant astrophysical S-factors and thermonuclear reaction rates, based on measured nuclear cross sections, is of major interest for nuclear energy generation, neutrino physics, and element synthesis. Many different methods have been applied to this problem in the past, almost all of them based on traditional statistics. Bayesian methods, on the other hand, are now in widespread use in the physical sciences. In astronomy, for example, Bayesian statistics is applied to the observation of extrasolar planets, gravitational waves, and Type Ia supernovae. However, nuclear physics, in particular, has been slow to adopt Bayesian methods. We present astrophysical S-factors and reaction rates based on Bayesian statistics. We develop a framework that incorporates robust parameter estimation, systematic effects, and non-Gaussian uncertainties in a consistent manner. The method is applied to the reactions d(p,γ)³He, ³He(³He,2p)⁴He, and ³He(α,γ)⁷Be, important for deuterium burning, solar neutrinos, and Big Bang nucleosynthesis.
Non-Linear Approximation of Bayesian Update
Litvinenko, Alexander
2016-01-01
We develop a non-linear approximation of the expensive Bayesian update formula. This non-linear approximation is applied directly to the polynomial chaos coefficients. In this way, we avoid Monte Carlo sampling and sampling error. We can show that the famous Kalman update formula is a particular case of this update.
Comprehension and computation in Bayesian problem solving
Directory of Open Access Journals (Sweden)
Eric D. Johnson
2015-07-01
Full Text Available Humans have long been characterized as poor probabilistic reasoners when presented with explicit numerical information. Bayesian word problems provide a well-known example of this, where even highly educated and cognitively skilled individuals fail to adhere to mathematical norms. It is widely agreed that natural frequencies can facilitate Bayesian reasoning relative to normalized formats (e.g., probabilities, percentages), both by clarifying logical set-subset relations and by simplifying numerical calculations. Nevertheless, between-study performance on transparent Bayesian problems varies widely, and generally remains rather unimpressive. We suggest there has been an over-focus on this representational facilitator (i.e., transparent problem structures) at the expense of the specific logical and numerical processing requirements and the corresponding individual abilities and skills necessary for providing Bayesian-like output given specific verbal and numerical input. We further suggest that understanding this task-individual pair could benefit from considerations from the literature on mathematical cognition, which emphasizes text comprehension and problem solving, along with contributions of online executive working memory, metacognitive regulation, and relevant stored knowledge and skills. We conclude by offering avenues for future research aimed at identifying the stages in problem solving at which correct versus incorrect reasoners depart, and how individual differences might influence this time point.
A Bayesian Approach to Interactive Retrieval
Tague, Jean M.
1973-01-01
A probabilistic model for interactive retrieval is presented. Bayesian statistical decision theory principles (use of prior and sample information about the relationship of document descriptions to query relevance, and maximization of the expected value of a utility function) are applied to the problem of optimally restructuring search strategies in an…
Encoding dependence in Bayesian causal networks
Bayesian networks (BNs) represent complex, uncertain spatio-temporal dynamics by propagation of conditional probabilities between identifiable states with a testable causal interaction model. Typically, they assume random variables are discrete in time and space with a static network structure that ...
Forecasting nuclear power supply with Bayesian autoregression
International Nuclear Information System (INIS)
Beck, R.; Solow, J.L.
1994-01-01
We explore the possibility of forecasting the quarterly US generation of electricity from nuclear power using a Bayesian autoregression model. In terms of forecasting accuracy, this approach compares favorably with both the Department of Energy's current forecasting methodology and their more recent efforts using ARIMA models, and it is extremely easy and inexpensive to implement. (author)
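The idea can be sketched with a conjugate Bayesian AR(1) regression on simulated data. The weakly informative Gaussian prior below is a stand-in, not the authors' specification, and the series is synthetic rather than the nuclear generation data of the paper.

```python
import numpy as np

rng = np.random.default_rng(5)

# Simulate a stable AR(1): y_t = c + phi * y_{t-1} + eps, c = 1, phi = 0.8.
y = [5.0]
for _ in range(120):
    y.append(1.0 + 0.8 * y[-1] + rng.normal(0, 0.5))
y = np.array(y)

X = np.column_stack([np.ones(len(y) - 1), y[:-1]])   # regressors [1, y_{t-1}]
t = y[1:]
sigma2, prior_var = 0.25, 10.0                        # known noise var, prior var

# Gaussian posterior for (c, phi): ridge-like closed form.
prec = X.T @ X / sigma2 + np.eye(2) / prior_var       # posterior precision
post_mean = np.linalg.solve(prec, X.T @ t / sigma2)   # posterior mean
forecast = post_mean @ np.array([1.0, y[-1]])         # one-step-ahead mean
print(np.round(post_mean, 2), round(forecast, 2))
```

The posterior mean recovers the generating coefficients, and forecasting is a single inner product, which is what makes the approach "easy and inexpensive to implement."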
A Bayesian Nonparametric Approach to Factor Analysis
DEFF Research Database (Denmark)
Piatek, Rémi; Papaspiliopoulos, Omiros
2018-01-01
This paper introduces a new approach for the inference of non-Gaussian factor models based on Bayesian nonparametric methods. It relaxes the usual normality assumption on the latent factors, widely used in practice, which is too restrictive in many settings. Our approach, on the contrary, does no...
Hierarchical Bayesian Models of Subtask Learning
Anglim, Jeromy; Wynton, Sarah K. A.
2015-01-01
The current study used Bayesian hierarchical methods to challenge and extend previous work on subtask learning consistency. A general model of individual-level subtask learning was proposed focusing on power and exponential functions with constraints to test for inconsistency. To study subtask learning, we developed a novel computer-based booking…
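The power-versus-exponential comparison at the heart of such models can be illustrated by a non-hierarchical, non-Bayesian shortcut: fit both curve families to one simulated learner and compare fit. All functional forms and numbers below are illustrative.

```python
import numpy as np
from scipy.optimize import curve_fit

def power_fn(t, a, b, c):
    return a + b * t ** (-c)       # power-law learning curve

def expo_fn(t, a, b, c):
    return a + b * np.exp(-c * t)  # exponential learning curve

rng = np.random.default_rng(7)
t = np.arange(1, 31, dtype=float)                          # 30 practice trials
y = 2.0 + 5.0 * t ** (-0.7) + rng.normal(0, 0.1, t.size)   # power-law learner

p_pow, _ = curve_fit(power_fn, t, y, p0=(1.0, 4.0, 0.5), maxfev=10000)
p_exp, _ = curve_fit(expo_fn, t, y, p0=(1.0, 4.0, 0.5), maxfev=10000)
rss_pow = np.sum((y - power_fn(t, *p_pow)) ** 2)
rss_exp = np.sum((y - expo_fn(t, *p_exp)) ** 2)
print(round(rss_pow, 3), round(rss_exp, 3))  # the generating family fits better
```

The hierarchical Bayesian treatment of the paper instead shares information across subtasks and individuals and imposes consistency constraints on the parameters.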
Bayesian noninferiority test for 2 binomial probabilities as the extension of Fisher exact test.
Doi, Masaaki; Takahashi, Fumihiro; Kawasaki, Yohei
2017-12-30
Noninferiority trials have recently gained importance for the clinical trials of drugs and medical devices. In these trials, most statistical methods have been used from a frequentist perspective, and historical data have been used only for the specification of the noninferiority margin Δ>0. In contrast, Bayesian methods, which have been studied recently, are advantageous in that they can use historical data to specify prior distributions and are expected to enable more efficient decision making than frequentist methods by borrowing information from historical trials. In the case of noninferiority trials for response probabilities π₁, π₂, Bayesian methods evaluate the posterior probability of H₁: π₁ > π₂ − Δ being true. To numerically calculate such posterior probability, the complicated Appell hypergeometric function or approximation methods are used. Further, the theoretical relationship between Bayesian and frequentist methods is unclear. In this work, we give the exact expression of the posterior probability of noninferiority under some mild conditions and propose a Bayesian noninferiority test framework which can flexibly incorporate historical data by using the conditional power prior. Further, we show the relationship between the Bayesian posterior probability and the P value of the Fisher exact test. From this relationship, our method can be interpreted as the Bayesian noninferiority extension of the Fisher exact test, and we can treat superiority and noninferiority in the same framework. Our method is illustrated through Monte Carlo simulations to evaluate the operating characteristics, an application to real HIV clinical trial data, and a sample size calculation using historical data. Copyright © 2017 John Wiley & Sons, Ltd.
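The posterior probability at the core of the method, Pr(π₁ > π₂ − Δ | data), is easy to approximate by sampling from independent beta posteriors. The sketch below uses uniform Beta(1, 1) priors and invented counts; the paper instead gives an exact expression and incorporates historical data through power priors.

```python
import numpy as np

rng = np.random.default_rng(1)

def pr_noninferior(x1, n1, x2, n2, delta, n_draws=200_000):
    """Monte Carlo estimate of Pr(p1 > p2 - delta | data) under Beta(1,1) priors."""
    p1 = rng.beta(1 + x1, 1 + n1 - x1, n_draws)  # posterior draws, test arm
    p2 = rng.beta(1 + x2, 1 + n2 - x2, n_draws)  # posterior draws, control arm
    return np.mean(p1 > p2 - delta)

# Illustrative data: 78/100 responders vs 80/100, margin delta = 0.10.
p_ni = pr_noninferior(78, 100, 80, 100, 0.10)
print(round(p_ni, 3))  # high posterior probability of noninferiority
```

Declaring noninferiority when this probability exceeds a prespecified threshold gives the decision rule whose frequentist behaviour the paper connects to the Fisher exact test.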
Solar thermal central receivers
International Nuclear Information System (INIS)
Vant-Hull, L.L.
1993-01-01
Market issues, environmental impact, and technology issues related to the Solar Central Receiver concept are addressed. The rationale for selection of the preferred configuration and working fluid are presented as the result of a joint utility-industry analysis. A $30 million conversion of Solar One to an external molten salt receiver would provide the intermediate step to a commercial demonstration plant. The first plant in this series could produce electricity at 11.2 cents/kWhr and the seventh at 8.2 cents/kWhr, completely competitive with projected costs of new utility plants in 1992
Bayesian Correlation Analysis for Sequence Count Data.
Directory of Open Access Journals (Sweden)
Daniel Sánchez-Taltavull
Full Text Available Evaluating the similarity of different measured variables is a fundamental task of statistics, and a key part of many bioinformatics algorithms. Here we propose a Bayesian scheme for estimating the correlation between different entities' measurements based on high-throughput sequencing data. These entities could be different genes or miRNAs whose expression is measured by RNA-seq, different transcription factors or histone marks whose expression is measured by ChIP-seq, or even combinations of different types of entities. Our Bayesian formulation accounts for both measured signal levels and uncertainty in those levels, due to varying sequencing depth in different experiments and to varying absolute levels of individual entities, both of which affect the precision of the measurements. In comparison with a traditional Pearson correlation analysis, we show that our Bayesian correlation analysis retains high correlations when measurement confidence is high, but suppresses correlations when measurement confidence is low, especially for entities with low signal levels. In addition, we consider the influence of priors on the Bayesian correlation estimate. Perhaps surprisingly, we show that naive, uniform priors on entities' signal levels can lead to highly biased correlation estimates, particularly when different experiments have widely varying sequencing depths. However, we propose two alternative priors that provably mitigate this problem. We also prove that, like traditional Pearson correlation, our Bayesian correlation calculation constitutes a kernel in the machine learning sense, and thus can be used as a similarity measure in any kernel-based machine learning algorithm. We demonstrate our approach on two RNA-seq datasets and one miRNA-seq dataset.
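One way to realise this idea, sketched below under a gamma-Poisson model of my own choosing (not necessarily the paper's exact formulation): give each per-sample rate a Gamma(a, b) prior and a depth-scaled Poisson likelihood, so the posterior rate is Gamma(a + count, b + depth), then average the Pearson correlation over posterior rate draws. Low counts produce wide posteriors, which automatically shrinks the correlation.

```python
import numpy as np

rng = np.random.default_rng(2)

def bayesian_corr(counts_x, counts_y, depths, a=1.0, b=1.0, n_draws=4000):
    # Gamma(a + count, 1/(b + depth)) posterior for each per-sample rate.
    rx = rng.gamma(a + counts_x, 1.0 / (b + depths), size=(n_draws, len(depths)))
    ry = rng.gamma(a + counts_y, 1.0 / (b + depths), size=(n_draws, len(depths)))
    # Pearson correlation of the latent rates, averaged over posterior draws.
    return np.mean([np.corrcoef(px, py)[0, 1] for px, py in zip(rx, ry)])

depths = np.array([1.0, 1.0, 2.0, 2.0, 4.0, 4.0])   # relative sequencing depths
x = np.array([1, 2, 6, 8, 20, 24])                   # entity 1 counts
y = np.array([2, 4, 8, 6, 24, 20])                   # entity 2 counts

v = bayesian_corr(x, y, depths)
print(round(v, 2))  # positive, but shrunk relative to a naive Pearson on raw counts
```

A naive Pearson on the raw counts here is inflated simply because both entities scale with depth; working on posterior rates removes that artifact.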
Receiver gain function: the actual NMR receiver gain
Mo, Huaping; Harwood, John S.; Raftery, Daniel
2010-01-01
The observed NMR signal size depends on the receiver gain parameter. We propose a receiver gain function to characterize how much the raw FID is amplified by the receiver as a function of the receiver gain setting. Although the receiver is linear for a fixed gain setting, the actual gain of the receiver may differ from what the gain setting suggests. Nevertheless, for a given receiver, we demonstrate that the receiver gain function can be calibrated. Such a calibration enables accurate compar...
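The calibration idea can be sketched as follows: acquire the same sample at several gain settings, then express the actual amplification relative to a reference setting. All amplitudes below are synthetic, chosen only to show a gain function deviating from its nominal values.

```python
import numpy as np

nominal_gain = np.array([1, 2, 4, 8, 16, 32], dtype=float)  # settings requested
# Synthetic observed FID amplitudes for the same sample: the actual gain
# deviates slightly from the nominal setting.
amplitude = np.array([1.00, 1.95, 4.10, 7.60, 16.4, 30.1])

gain_function = amplitude / amplitude[0]   # actual gain relative to setting 1
correction = nominal_gain / gain_function  # factor to compare spectra fairly
print(np.round(gain_function, 2))
print(np.round(correction, 3))
```

Dividing each spectrum's intensities by its entry in `gain_function` puts acquisitions at different gain settings on a common scale, which is the point of the calibration described in the abstract.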
Campus Projects Receiving "Earmarks."
Schonberger, Benjamin
1991-01-01
Specific campus projects that Congress has directed federal agencies to support this year at over 120 colleges and universities are listed. The agencies neither requested support nor sponsored merit-based competitions for the awards. In some cases, the institutions have a history of receiving special federal treatment. (MSE)
Radiation Source Mapping with Bayesian Inverse Methods
Hykes, Joshua Michael
We present a method to map the spectral and spatial distributions of radioactive sources using a small number of detectors. Locating and identifying radioactive materials is important for border monitoring, accounting for special nuclear material in processing facilities, and in clean-up operations. Most methods to analyze these problems make restrictive assumptions about the distribution of the source. In contrast, the source-mapping method presented here allows an arbitrary three-dimensional distribution in space and a flexible group and gamma peak distribution in energy. To apply the method, the system's geometry and materials must be known. A probabilistic Bayesian approach is used to solve the resulting inverse problem (IP) since the system of equations is ill-posed. The probabilistic approach also provides estimates of the confidence in the final source map prediction. A set of adjoint flux, discrete ordinates solutions, obtained in this work by the Denovo code, are required to efficiently compute detector responses from a candidate source distribution. These adjoint fluxes are then used to form the linear model to map the state space to the response space. The test for the method is simultaneously locating a set of ¹³⁷Cs and ⁶⁰Co gamma sources in an empty room. This test problem is solved using synthetic measurements generated by a Monte Carlo (MCNP) model and using experimental measurements that we collected for this purpose. With the synthetic data, the predicted source distributions identified the locations of the sources to within tens of centimeters, in a room with an approximately four-by-four meter floor plan. Most of the predicted source intensities were within a factor of ten of their true value. The chi-square value of the predicted source was within a factor of five from the expected value based on the number of measurements employed. With a favorable uniform initial guess, the predicted source map was nearly identical to the true distribution
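Once the adjoint-flux responses are assembled into a linear model, the Bayesian step is a standard Gaussian linear inverse problem. A toy sketch with an invented geometry (a random sensitivity matrix standing in for the adjoint-flux rows):

```python
import numpy as np

def posterior_source(A, d, tau=10.0, sigma=0.1):
    """Gaussian posterior for d = A s + noise, with prior s ~ N(0, tau^2 I)
    and noise ~ N(0, sigma^2 I). Returns the posterior mean and covariance."""
    n = A.shape[1]
    prec = A.T @ A / sigma**2 + np.eye(n) / tau**2   # posterior precision
    cov = np.linalg.inv(prec)                        # posterior covariance
    mean = cov @ A.T @ d / sigma**2                  # posterior mean
    return mean, cov

rng = np.random.default_rng(3)
A = rng.uniform(0.0, 1.0, size=(12, 4))   # 12 detectors, 4 source voxels
s_true = np.array([0.0, 5.0, 0.0, 2.0])   # two active sources
d = A @ s_true + rng.normal(0, 0.1, 12)   # noisy detector responses

mean, cov = posterior_source(A, d)
print(np.round(mean, 2))                  # close to s_true
print(np.round(np.sqrt(np.diag(cov)), 3)) # per-voxel uncertainty estimates
```

The posterior covariance is what supplies the confidence estimates mentioned in the abstract; the real problem differs mainly in scale and in the physics encoded in A.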
Development of a Bayesian model to estimate health care outcomes in the severely wounded
Directory of Open Access Journals (Sweden)
Alexander Stojadinovic
2010-08-01
Full Text Available Alexander Stojadinovic¹, John Eberhardt², Trevor S Brown³, Jason S Hawksworth⁴, Frederick Gage³, Douglas K Tadaki³, Jonathan A Forsberg⁵, Thomas A Davis³, Benjamin K Potter⁵, James R Dunne⁶, E A Elster³. ¹Combat Wound Initiative Program and ⁴Department of Surgery, Walter Reed Army Medical Center, Washington, DC, USA; ²DecisionQ Corporation, Washington, DC, USA; ³Regenerative Medicine Department, Combat Casualty Care, Naval Medical Research Center, Silver Spring, MD, USA; ⁵Integrated Department of Orthopaedics and Rehabilitation and ⁶Department of Surgery, National Naval Medical Center, Bethesda, MD, USA. Background: Graphical probabilistic models have the ability to provide insights as to how clinical factors are conditionally related. These models can be used to help us understand factors influencing health care outcomes and resource utilization, and to estimate morbidity and clinical outcomes in trauma patient populations. Study design: Thirty-two combat casualties with severe extremity injuries enrolled in a prospective observational study were analyzed using a step-wise machine-learned Bayesian belief network (BBN) and step-wise logistic regression (LR). Models were evaluated using 10-fold cross-validation to calculate area under the curve (AUC) from receiver operating characteristic (ROC) curves. Results: Our BBN showed important associations between various factors in our data set that could not be developed using standard regression methods. Cross-validated ROC curve analysis showed that our BBN model was a robust representation of our data domain and that LR models trained on these findings were also robust: hospital-acquired infection (AUC: LR, 0.81; BBN, 0.79), intensive care unit length of stay (AUC: LR, 0.97; BBN, 0.81), and wound healing (AUC: LR, 0.91; BBN, 0.72) showed strong AUC. Conclusions: A BBN model can effectively represent clinical outcomes and biomarkers in patients hospitalized after severe wounding, and is confirmed by 10-fold
Can natural selection encode Bayesian priors?
Ramírez, Juan Camilo; Marshall, James A R
2017-08-07
The evolutionary success of many organisms depends on their ability to make decisions based on estimates of the state of their environment (e.g., predation risk) from uncertain information. These decision problems have optimal solutions and individuals in nature are expected to evolve the behavioural mechanisms to make decisions as if using the optimal solutions. Bayesian inference is the optimal method to produce estimates from uncertain data, thus natural selection is expected to favour individuals with the behavioural mechanisms to make decisions as if they were computing Bayesian estimates in typically-experienced environments, although this does not necessarily imply that favoured decision-makers do perform Bayesian computations exactly. Each individual should evolve to behave as if updating a prior estimate of the unknown environment variable to a posterior estimate as it collects evidence. The prior estimate represents the decision-maker's default belief regarding the environment variable, i.e., the individual's default 'worldview' of the environment. This default belief has been hypothesised to be shaped by natural selection and represent the environment experienced by the individual's ancestors. We present an evolutionary model to explore how accurately Bayesian prior estimates can be encoded genetically and shaped by natural selection when decision-makers learn from uncertain information. The model simulates the evolution of a population of individuals that are required to estimate the probability of an event. Every individual has a prior estimate of this probability and collects noisy cues from the environment in order to update its prior belief to a Bayesian posterior estimate with the evidence gained. The prior is inherited and passed on to offspring. Fitness increases with the accuracy of the posterior estimates produced. Simulations show that prior estimates become accurate over evolutionary time. In addition to these 'Bayesian' individuals, we also
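A stripped-down version of such a simulation, with all settings invented rather than taken from the authors' model: each individual inherits a Beta(α, β) prior over an event probability (true value 0.2 here), observes a few Bernoulli cues, and is fitter the closer its posterior mean lands to the truth; truncated selection plus mutation acts on the inherited prior.

```python
import numpy as np

rng = np.random.default_rng(4)
P_TRUE, POP, GENS, CUES = 0.2, 200, 300, 5

def run():
    alpha = rng.uniform(0.5, 10.0, POP)   # inherited prior pseudo-counts
    beta = rng.uniform(0.5, 10.0, POP)
    for _ in range(GENS):
        k = rng.binomial(CUES, P_TRUE, POP)              # cues observed
        post_mean = (alpha + k) / (alpha + beta + CUES)  # Bayesian update
        fitness = -np.abs(post_mean - P_TRUE)            # accuracy-based fitness
        top = np.argsort(fitness)[-POP // 2:]            # keep the best half
        parents = rng.choice(top, POP)                   # resample parents
        alpha = np.clip(alpha[parents] + rng.normal(0, 0.1, POP), 0.1, None)
        beta = np.clip(beta[parents] + rng.normal(0, 0.1, POP), 0.1, None)
    return alpha, beta

alpha, beta = run()
prior_mean = float(np.mean(alpha / (alpha + beta)))
print(round(prior_mean, 3))  # evolves toward the true probability 0.2
```

With only five cues per lifetime the prior dominates the posterior, so selection on posterior accuracy pushes the inherited prior mean toward the environmental truth, which is the paper's central question in miniature.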
Bayesian network learning for natural hazard assessments
Vogel, Kristin
2016-04-01
Even though quite different in occurrence and consequences, from a modelling perspective many natural hazards share similar properties and challenges. Their complex nature as well as lacking knowledge about their driving forces and potential effects make their analysis demanding. On top of the uncertainty about the modelling framework, inaccurate or incomplete event observations and the intrinsic randomness of the natural phenomenon add up to different interacting layers of uncertainty, which require careful handling. Thus, for reliable natural hazard assessments it is crucial not only to capture and quantify involved uncertainties, but also to express and communicate uncertainties in an intuitive way. Decision-makers, who often find it difficult to deal with uncertainties, might otherwise fall back on familiar (mostly deterministic) procedures. In the scope of the DFG research training group "NatRiskChange", we apply the probabilistic framework of Bayesian networks for diverse natural hazard and vulnerability studies. The great potential of Bayesian networks was already shown in previous natural hazard assessments. Treating each model component as a random variable, Bayesian networks aim at capturing the joint distribution of all considered variables. Hence, each conditional distribution of interest (e.g. the effect of precautionary measures on damage reduction) can be inferred. The (in-)dependencies between the considered variables can be learned purely data driven or be given by experts. Even a combination of both is possible. By translating the (in-)dependences into a graph structure, Bayesian networks provide direct insights into the workings of the system and allow us to learn about the underlying processes. Besides numerous studies on the topic, learning Bayesian networks from real-world data remains challenging. In previous studies, e.g. on earthquake induced ground motion and flood damage assessments, we tackled the problems arising with continuous variables
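A toy discrete network in the spirit of the studies described (hazard, precaution, damage), with hand-set conditional probability tables and inference by brute-force enumeration; all probabilities are invented for illustration.

```python
from itertools import product

# Structure: Hazard -> Damage <- Precaution
P_hazard = {True: 0.1, False: 0.9}
P_precaution = {True: 0.4, False: 0.6}
# P(damage = True | hazard, precaution)
P_damage = {(True, True): 0.3, (True, False): 0.8,
            (False, True): 0.01, (False, False): 0.05}

def joint(h, c, d):
    """Joint probability of one full assignment of the three variables."""
    pd = P_damage[(h, c)]
    return P_hazard[h] * P_precaution[c] * (pd if d else 1 - pd)

def pr_damage_given(precaution):
    """P(damage = True | precaution) by enumerating the joint distribution."""
    num = sum(joint(h, precaution, True) for h in (True, False))
    den = sum(joint(h, precaution, d) for h, d in product((True, False), repeat=2))
    return num / den

print(pr_damage_given(True))   # damage risk with the precautionary measure
print(pr_damage_given(False))  # damage risk without it
```

This is exactly the kind of query ("the effect of precautionary measures on damage reduction") the abstract mentions; real applications replace enumeration with efficient inference and learn the tables from data.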
A Bayesian analysis of the nucleon QCD sum rules
International Nuclear Information System (INIS)
Ohtani, Keisuke; Gubler, Philipp; Oka, Makoto
2011-01-01
QCD sum rules of the nucleon channel are reanalyzed, using the maximum-entropy method (MEM). This new approach, based on the Bayesian probability theory, does not restrict the spectral function to the usual "pole + continuum" form, allowing a more flexible investigation of the nucleon spectral function. Making use of this flexibility, we are able to investigate the spectral functions of various interpolating fields, finding that the nucleon ground state mainly couples to an operator containing a scalar diquark. Moreover, we formulate the Gaussian sum rule for the nucleon channel and find that it is more suitable for the MEM analysis to extract the nucleon pole in the region of its experimental value, while the Borel sum rule does not contain enough information to clearly separate the nucleon pole from the continuum. (orig.)
Bayesian statistics for the calibration of the LISA Pathfinder experiment
International Nuclear Information System (INIS)
Armano, M; Freschi, M; Audley, H; Born, M; Danzmann, K; Diepholz, I; Auger, G; Binetruy, P; Bortoluzzi, D; Brandt, N; Fitzsimons, E; Bursi, A; Caleno, M; Cavalleri, A; Cesarini, A; Dolesi, R; Ferroni, V; Cruise, M; Dunbar, N; Ferraioli, L
2015-01-01
The main goal of LISA Pathfinder (LPF) mission is to estimate the acceleration noise models of the overall LISA Technology Package (LTP) experiment on-board. This will be of crucial importance for the future space-based Gravitational-Wave (GW) detectors, like eLISA. Here, we present the Bayesian analysis framework to process the planned system identification experiments designed for that purpose. In particular, we focus on the analysis strategies to predict the accuracy of the parameters that describe the system in all degrees of freedom. The data sets were generated during the latest operational simulations organised by the data analysis team and this work is part of the LTPDA Matlab toolbox. (paper)
International Nuclear Information System (INIS)
Peng, Weiwen; Li, Yan-Feng; Mi, Jinhua; Yu, Le; Huang, Hong-Zhong
2016-01-01
Degradation analysis is critical to reliability assessment and operational management of complex systems. Two types of assumptions are often adopted for degradation analysis: (1) a single degradation indicator and (2) constant external factors. However, modern complex systems are generally multi-functional and suffer from multiple failure modes due to dynamic operating conditions. In this paper, Bayesian degradation analysis of complex systems with multiple degradation indicators under dynamic conditions is investigated. Three practical engineering-driven issues are addressed: (1) to model various combinations of degradation indicators, a generalized multivariate hybrid degradation process model is proposed, which subsumes both monotonic and non-monotonic degradation process models as special cases; (2) to study the effects of external factors, two types of dynamic covariates are incorporated jointly, which include both environmental conditions and operating profiles; and (3) to facilitate degradation-based reliability analysis, a series of Bayesian strategies is constructed, which covers parameter estimation, factor-related degradation prediction, and unit-specific remaining useful life assessment. Finally, degradation analysis of a type of heavy machine tools is presented to demonstrate the application and performance of the proposed method. A comparison of the proposed model with a traditional model is studied as well in the example. - Highlights: • A generalized multivariate hybrid degradation process model is introduced. • Various types of dependent degradation processes can be modeled coherently. • The effects of environmental conditions and operating profiles are investigated. • Unit-specific RUL assessment is implemented through a two-step Bayesian method.
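A single-indicator sketch of the kind of Bayesian degradation analysis described: a Wiener degradation path X(t) = μt + σB(t), a conjugate normal prior on the drift μ, and a crude remaining-useful-life estimate from the posterior drift once a unit's own increments are observed. The multivariate hybrid model and dynamic covariates of the paper are beyond this sketch, and every number below is invented.

```python
import numpy as np

rng = np.random.default_rng(6)

dt, sigma, mu_true, threshold = 1.0, 0.3, 0.5, 30.0
# Observed degradation increments of one unit over 20 inspection intervals.
increments = rng.normal(mu_true * dt, sigma * np.sqrt(dt), 20)
level = np.sum(increments)                      # current degradation level

# Conjugate normal update of the drift: prior mu ~ N(0.4, 0.2^2).
prior_mean, prior_var = 0.4, 0.2**2
lik_var = sigma**2 / len(increments)            # variance of the sample mean
post_var = 1 / (1 / prior_var + 1 / lik_var)
post_mean = post_var * (prior_mean / prior_var + np.mean(increments) / lik_var)

rul_mean = (threshold - level) / post_mean      # crude plug-in RUL estimate
print(round(post_mean, 3), round(rul_mean, 1))
```

Because the unit's own data dominate the prior here, the estimate is unit-specific, which is the feature the abstract highlights for the full two-step Bayesian method.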
Directory of Open Access Journals (Sweden)
Ata Khan
2013-04-01
Full Text Available Intelligent transportation systems (ITS) are gaining acceptance around the world, and the connected vehicle component of ITS is recognized as a high-priority research and development area in many technologically advanced countries. Connected vehicles are expected to have the capability of safe, efficient and eco-driving operations whether these are under human control or in the adaptive machine control mode of operation. The race is on to design the capability to operate in a connected traffic environment. The operational requirements can be met with cognitive vehicle design features made possible by advances in artificial intelligence-supported methodology, improved understanding of human factors, and advances in communication technology. This paper describes cognitive features and their information system requirements. The architecture of an information system is presented that supports the features of the cognitive connected vehicle. For better focus, information processing capabilities are specified and the role of Bayesian artificial intelligence is defined for data fusion. Example applications illustrate the role of information systems in integrating intelligent technology, Bayesian artificial intelligence, and abstracted human factors. Concluding remarks highlight the role of the information system and Bayesian artificial intelligence in the design of a new generation of cognitive connected vehicle.
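The data-fusion role assigned to Bayesian artificial intelligence can be illustrated with a minimal naive-Bayes fusion of two sensor reports about one binary state (an obstacle ahead); the sensor likelihoods are invented.

```python
prior = 0.1  # prior probability of an obstacle ahead
# P(sensor reports "detected" | obstacle present), P(... | obstacle absent)
likelihood = {"radar": (0.9, 0.2), "camera": (0.8, 0.1)}

# Both sensors independently report a detection: multiply likelihood ratios.
odds = prior / (1 - prior)
for sensor, (p_hit, p_false) in likelihood.items():
    odds *= p_hit / p_false
posterior = odds / (1 + odds)
print(round(posterior, 3))
```

Two individually fallible detections combine into a much stronger belief than either alone, which is the essential benefit of probabilistic fusion in a connected-vehicle information system.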
Prior elicitation and Bayesian analysis of the Steroids for Corneal Ulcers Trial.
See, Craig W; Srinivasan, Muthiah; Saravanan, Somu; Oldenburg, Catherine E; Esterberg, Elizabeth J; Ray, Kathryn J; Glaser, Tanya S; Tu, Elmer Y; Zegans, Michael E; McLeod, Stephen D; Acharya, Nisha R; Lietman, Thomas M
2012-12-01
To elicit expert opinion on the use of adjunctive corticosteroid therapy in bacterial corneal ulcers. To perform a Bayesian analysis of the Steroids for Corneal Ulcers Trial (SCUT), using expert opinion as a prior probability. The SCUT was a placebo-controlled trial assessing visual outcomes in patients receiving topical corticosteroids or placebo as adjunctive therapy for bacterial keratitis. Questionnaires were conducted at scientific meetings in India and North America to gauge expert consensus on the perceived benefit of corticosteroids as adjunct treatment. Bayesian analysis, using the questionnaire data as a prior probability and the primary outcome of SCUT as a likelihood, was performed. For comparison, an additional Bayesian analysis was performed using the results of the SCUT pilot study as a prior distribution. Indian respondents believed there to be a 1.21 Snellen line improvement, and North American respondents believed there to be a 1.24 line improvement with corticosteroid therapy. The SCUT primary outcome found a non-significant 0.09 Snellen line benefit with corticosteroid treatment. The results of the Bayesian analysis estimated a slightly greater benefit than did the SCUT primary analysis (0.19 lines versus 0.09 lines). Indian and North American experts had similar expectations on the effectiveness of corticosteroids in bacterial corneal ulcers: that corticosteroids would markedly improve visual outcomes. Bayesian analysis produced results very similar to those produced by the SCUT primary analysis. The similarity in result is likely due to the large sample size of SCUT and helps validate the results of SCUT.
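The logic of combining an elicited prior with a trial estimate can be sketched with a conjugate normal-normal update. The prior mean (about 1.2 Snellen lines) and trial estimate (0.09 lines) echo the abstract, but the standard deviations below are invented, so the posterior only qualitatively mirrors the reported 0.19-line result.

```python
import math

def normal_update(prior_mean, prior_sd, est, est_se):
    """Conjugate normal-normal update: precision-weighted average."""
    w_prior, w_data = 1 / prior_sd**2, 1 / est_se**2   # precisions
    post_var = 1 / (w_prior + w_data)
    post_mean = post_var * (w_prior * prior_mean + w_data * est)
    return post_mean, math.sqrt(post_var)

# Large trial => small standard error => the posterior hugs the trial estimate.
post_mean, post_sd = normal_update(prior_mean=1.2, prior_sd=1.0,
                                   est=0.09, est_se=0.25)
print(round(post_mean, 3), round(post_sd, 3))
```

Because the trial's precision dwarfs the prior's, the posterior lands near the data, which is exactly why the abstract attributes the similarity of Bayesian and primary analyses to SCUT's large sample size.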
Online variational Bayesian filtering-based mobile target tracking in wireless sensor networks.
Zhou, Bingpeng; Chen, Qingchun; Li, Tiffany Jing; Xiao, Pei
2014-11-11
The received signal strength (RSS)-based online tracking for a mobile node in wireless sensor networks (WSNs) is investigated in this paper. Firstly, a multi-layer dynamic Bayesian network (MDBN) is introduced to characterize the target mobility with either directional or undirected movement. In particular, it is proposed to employ the Wishart distribution to approximate the time-varying RSS measurement precision's randomness due to the target movement. It is shown that the proposed MDBN offers a more general analysis model via incorporating the underlying statistical information of both the target movement and observations, which can be utilized to improve the online tracking capability by exploiting the Bayesian statistics. Secondly, based on the MDBN model, a mean-field variational Bayesian filtering (VBF) algorithm is developed to realize the online tracking of a mobile target in the presence of nonlinear observations and time-varying RSS precision, wherein the traditional Bayesian filtering scheme cannot be directly employed. Thirdly, a joint optimization between the real-time velocity and its prior expectation is proposed to enable online velocity tracking in the proposed online tracking scheme. Finally, the associated Bayesian Cramér-Rao Lower Bound (BCRLB) analysis and numerical simulations are conducted. Our analysis unveils that, by exploiting the potential state information via the general MDBN model, the proposed VBF algorithm provides a promising solution to the online tracking of a mobile node in WSNs. In addition, it is shown that the final tracking accuracy linearly scales with its expectation when the RSS measurement precision is time-varying.
Message-Passing Receiver for OFDM Systems over Highly Delay-Dispersive Channels
DEFF Research Database (Denmark)
Barbu, Oana-Elena; Manchón, Carles Navarro; Rom, Christian
2017-01-01
Propagation channels with maximum excess delay exceeding the duration of the cyclic prefix (CP) in OFDM systems cause intercarrier and intersymbol interference which, unless accounted for, degrade the receiver performance. Using tools from Bayesian inference and sparse signal reconstruction, we derive an iterative algorithm that estimates an approximate representation of the channel impulse response and the noise variance, estimates and cancels the intrinsic interference, and decodes the data over a block of symbols. Simulation results show that the receiver employing our algorithm outperforms...
BOP2: Bayesian optimal design for phase II clinical trials with simple and complex endpoints.
Zhou, Heng; Lee, J Jack; Yuan, Ying
2017-09-20
We propose a flexible Bayesian optimal phase II (BOP2) design that is capable of handling simple (e.g., binary) and complicated (e.g., ordinal, nested, and co-primary) endpoints under a unified framework. We use a Dirichlet-multinomial model to accommodate different types of endpoints. At each interim, the go/no-go decision is made by evaluating a set of posterior probabilities of the events of interest, which is optimized to maximize power or minimize the number of patients under the null hypothesis. Unlike other existing Bayesian designs, the BOP2 design explicitly controls the type I error rate, thereby bridging the gap between Bayesian designs and frequentist designs. In addition, the stopping boundary of the BOP2 design can be enumerated prior to the onset of the trial. These features make the BOP2 design accessible to a wide range of users and regulatory agencies and particularly easy to implement in practice. Simulation studies show that the BOP2 design has favorable operating characteristics with higher power and lower risk of incorrectly terminating the trial than some existing Bayesian phase II designs. The software to implement the BOP2 design is freely available at www.trialdesign.org. Copyright © 2017 John Wiley & Sons, Ltd.
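The interim go/no-go rule described above can be illustrated for the simplest (binary) endpoint: with a Beta prior on the response rate, "go" is declared when the posterior probability that the rate exceeds a null value passes a threshold. The sketch below illustrates that rule only, not the BOP2 software; the function names, the Beta(1, 1) prior, and the 0.2/0.9 cut-offs are assumptions chosen for the example.

```python
import math

def beta_posterior_tail(x, n, p0, a=1.0, b=1.0, grid=20000):
    """Pr(rate > p0 | x responses in n patients) under a Beta(a, b) prior.
    The posterior is Beta(a + x, b + n - x); integrate it numerically."""
    ap, bp = a + x, b + n - x
    log_norm = math.lgamma(ap + bp) - math.lgamma(ap) - math.lgamma(bp)
    width = (1.0 - p0) / grid
    total = 0.0
    for i in range(grid):
        p = p0 + (i + 0.5) * width  # midpoint rule keeps p strictly inside (p0, 1)
        total += math.exp(log_norm + (ap - 1) * math.log(p)
                          + (bp - 1) * math.log(1 - p)) * width
    return total

def go_decision(x, n, p0=0.2, threshold=0.9):
    """Interim 'go' if Pr(rate > p0 | data) exceeds the threshold."""
    return beta_posterior_tail(x, n, p0) > threshold
```

In an actual BOP2 design the threshold itself is tuned across interims to control the type I error rate; here it is a fixed illustrative constant.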
Bayesianism and inference to the best explanation
Directory of Open Access Journals (Sweden)
Valeriano IRANZO
2008-01-01
Full Text Available Bayesianism and Inference to the best explanation (IBE) are two different models of inference. Recently there has been some debate about the possibility of “bayesianizing” IBE. Firstly I explore several alternatives to include explanatory considerations in Bayes’ Theorem. Then I distinguish two different interpretations of prior probabilities: “IBE-Bayesianism” (IBE-Bay) and “frequentist-Bayesianism” (Freq-Bay). After detailing the content of the latter, I propose a rule for assessing the priors. I also argue that Freq-Bay: (i) endorses a role for explanatory value in the assessment of scientific hypotheses; (ii) avoids a purely subjectivist reading of prior probabilities; and (iii) fits better than IBE-Bayesianism with two basic facts about science, i.e., the prominent role played by empirical testing and the existence of many scientific theories in the past that failed to fulfil their promises and were subsequently abandoned.
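The debate summarized above turns on where explanatory virtues could enter Bayes' theorem. A minimal numerical sketch (all names and weights purely illustrative): plain conditionalization, plus one toy variant that multiplies each likelihood by an explanatory-virtue weight before normalizing, which is only one of several places such considerations could be inserted.

```python
def posterior(priors, likelihoods):
    """Plain Bayes' theorem: P(H_i | E) is proportional to P(H_i) * P(E | H_i)."""
    joint = [p * l for p, l in zip(priors, likelihoods)]
    z = sum(joint)
    return [j / z for j in joint]

def posterior_with_explanatory_bonus(priors, likelihoods, bonus):
    """Toy 'bayesianized IBE' (illustrative only): boost each hypothesis'
    likelihood by an explanatory-virtue weight, then conditionalize."""
    weighted = [l * w for l, w in zip(likelihoods, bonus)]
    return posterior(priors, weighted)
```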
Modelling dependable systems using hybrid Bayesian networks
International Nuclear Information System (INIS)
Neil, Martin; Tailor, Manesh; Marquez, David; Fenton, Norman; Hearty, Peter
2008-01-01
A hybrid Bayesian network (BN) is one that incorporates both discrete and continuous nodes. In our extensive applications of BNs for system dependability assessment, the models are invariably hybrid and the need for efficient and accurate computation is paramount. We apply a new iterative algorithm that efficiently combines dynamic discretisation with robust propagation algorithms on junction tree structures to perform inference in hybrid BNs. We illustrate its use in the field of dependability with two examples of reliability estimation. First we estimate the reliability of a simple single system, and next we implement a hierarchical Bayesian model. In the hierarchical model we compute the reliability of two unknown subsystems from data collected on historically similar subsystems and then input the result into a reliability block model to compute system level reliability. We conclude that dynamic discretisation can be used as an alternative to analytical or Monte Carlo methods with high precision and can be applied to a wide range of dependability problems.
Bayesian Peak Picking for NMR Spectra
Cheng, Yichen
2014-02-01
Protein structure determination is a very important topic in structural genomics, which helps researchers understand a variety of biological functions such as protein-protein interactions, protein-DNA interactions and so on. Nowadays, nuclear magnetic resonance (NMR) has often been used to determine the three-dimensional structures of proteins in vivo. This study aims to automate the peak picking step, the most important and tricky step in NMR structure determination. We propose to model the NMR spectrum by a mixture of bivariate Gaussian densities and use the stochastic approximation Monte Carlo algorithm as the computational tool to solve the problem. Under the Bayesian framework, the peak picking problem is cast as a variable selection problem. The proposed method can automatically distinguish true peaks from false ones without preprocessing the data. To the best of our knowledge, this is the first effort in the literature that tackles the peak picking problem for NMR spectrum data using a Bayesian method.
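To make the model concrete: the spectrum is treated as a weighted sum of bivariate Gaussian bumps, one per candidate peak, and peak picking then asks which components to keep. The snippet below only evaluates such a mixture density (with diagonal covariances for brevity); the stochastic approximation Monte Carlo variable-selection machinery of the paper is not reproduced.

```python
import math

def bvn_pdf(x, y, mx, my, sx, sy):
    """Bivariate normal density with diagonal covariance (illustrative)."""
    zx, zy = (x - mx) / sx, (y - my) / sy
    return math.exp(-0.5 * (zx * zx + zy * zy)) / (2 * math.pi * sx * sy)

def mixture_density(x, y, components):
    """components: list of (weight, mx, my, sx, sy) tuples; weights sum to 1.
    Each component plays the role of one candidate peak in the spectrum."""
    return sum(w * bvn_pdf(x, y, mx, my, sx, sy)
               for w, mx, my, sx, sy in components)
```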
Bayesian component separation: The Planck experience
Wehus, Ingunn Kathrine; Eriksen, Hans Kristian
2018-05-01
Bayesian component separation techniques have played a central role in the data reduction process of Planck. The most important strength of this approach is its global nature, in which a parametric and physical model is fitted to the data. Such physical modeling allows the user to constrain very general data models, and jointly probe cosmological, astrophysical and instrumental parameters. This approach also supports statistically robust goodness-of-fit tests in terms of data-minus-model residual maps, which are essential for identifying residual systematic effects in the data. The main challenges are high code complexity and computational cost. Whether or not these costs are justified for a given experiment depends on its final uncertainty budget. We therefore predict that the importance of Bayesian component separation techniques is likely to increase with time for intensity mapping experiments, similar to what has happened in the CMB field, as observational techniques mature, and their overall sensitivity improves.
Bayesian Modelling of Functional Whole Brain Connectivity
DEFF Research Database (Denmark)
Røge, Rasmus
This thesis deals with parcellation of whole-brain functional magnetic resonance imaging (fMRI) using Bayesian inference with mixture models tailored to the fMRI data. In the three included papers and manuscripts, we analyze two different approaches to modeling the fMRI signal: either we accept the prevalent strategy of standardizing the fMRI time series and model the data using directional statistics, or we model the variability in the signal across the brain and across multiple subjects. In either case, we use Bayesian nonparametric modeling to automatically learn from the fMRI data the number of functional units, i.e. parcels. We benchmark the proposed mixture models against state-of-the-art methods of brain parcellation, both probabilistic and non-probabilistic. The time series of each voxel are most often standardized using z-scoring, which projects the time series data onto a hypersphere...
Machine learning a Bayesian and optimization perspective
Theodoridis, Sergios
2015-01-01
This tutorial text gives a unifying perspective on machine learning by covering both probabilistic and deterministic approaches, which rely on optimization techniques, as well as Bayesian inference, which is based on a hierarchy of probabilistic models. The book presents the major machine learning methods as they have been developed in different disciplines, such as statistics, statistical and adaptive signal processing and computer science. Focusing on the physical reasoning behind the mathematics, all the various methods and techniques are explained in depth, supported by examples and problems, giving an invaluable resource to the student and researcher for understanding and applying machine learning concepts. The book builds carefully from the basic classical methods to the most recent trends, with chapters written to be as self-contained as possible, making the text suitable for different courses: pattern recognition, statistical/adaptive signal processing, statistical/Bayesian learning, as well as shor...
Bayesian calibration : past achievements and future challenges
International Nuclear Information System (INIS)
Christen, J.A.
2001-01-01
Due to variations of the radiocarbon content in the biosphere over time, radiocarbon determinations need to be calibrated to obtain calendar years. Over the past decade a series of researchers have investigated the possibility of using Bayesian statistics to calibrate radiocarbon determinations, the main feature being the inclusion of contextual information into the calibration process. This allows for a coherent calibration of groups of determinations arising from related contexts (stratigraphical layers, peat cores, cultural events, etc.). Moreover, the 'related contexts' themselves are thereby dated, and not only the radiocarbon-dated material. We review Bayesian calibration and state some of its current challenges, such as software development, prior specification, and robustness. (author). 14 refs., 4 figs
Disentangling Complexity in Bayesian Automatic Adaptive Quadrature
Adam, Gheorghe; Adam, Sanda
2018-02-01
The paper describes a Bayesian automatic adaptive quadrature (BAAQ) solution for numerical integration which is simultaneously robust, reliable, and efficient. Detailed discussion is provided of three main factors which contribute to the enhancement of these features: (1) refinement of the m-panel automatic adaptive scheme through the use of integration-domain-length-scale-adapted quadrature sums; (2) fast early problem complexity assessment - enables the non-transitive choice among three execution paths: (i) immediate termination (exceptional cases); (ii) pessimistic - involves time and resource consuming Bayesian inference resulting in radical reformulation of the problem to be solved; (iii) optimistic - asks exclusively for subrange subdivision by bisection; (3) use of the weaker accuracy target from the two possible ones (the input accuracy specifications and the intrinsic integrand properties respectively) - results in maximum possible solution accuracy under minimum possible computing time.
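The "optimistic" execution path (iii), subrange subdivision by bisection, is the classical adaptive-quadrature core. Below is a minimal version of that idea, standard adaptive Simpson with a Richardson-style acceptance test, not the BAAQ code itself; function and tolerance names are illustrative.

```python
def _simpson(f, a, b):
    """Simpson's rule on one panel [a, b]."""
    c = (a + b) / 2
    return (b - a) / 6 * (f(a) + 4 * f(c) + f(b))

def adaptive_quad(f, a, b, tol=1e-9):
    """Recursive bisection: accept a panel when Simpson on the two halves
    agrees with Simpson on the whole panel to within the tolerance."""
    whole = _simpson(f, a, b)
    c = (a + b) / 2
    left, right = _simpson(f, a, c), _simpson(f, c, b)
    if abs(left + right - whole) < 15 * tol:
        # Richardson extrapolation of the two estimates
        return left + right + (left + right - whole) / 15
    return adaptive_quad(f, a, c, tol / 2) + adaptive_quad(f, c, b, tol / 2)
```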
Bayesian network modelling of upper gastrointestinal bleeding
Aisha, Nazziwa; Shohaimi, Shamarina; Adam, Mohd Bakri
2013-09-01
Bayesian networks are graphical probabilistic models that represent causal and other relationships between domain variables. In the context of medical decision making, these models have been explored to help in medical diagnosis and prognosis. In this paper, we discuss the Bayesian network formalism in building medical support systems and we learn a tree augmented naive Bayes Network (TAN) from gastrointestinal bleeding data. The accuracy of the TAN in classifying the source of gastrointestinal bleeding into upper or lower source is obtained. The TAN achieves a high classification accuracy of 86% and an area under curve of 92%. A sensitivity analysis of the model shows relatively high levels of entropy reduction for color of the stool, history of gastrointestinal bleeding, consistency and the ratio of blood urea nitrogen to creatinine. The TAN facilitates the identification of the source of GIB and requires further validation.
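For orientation, the sketch below implements plain naive Bayes with Laplace smoothing, a simpler relative of the TAN model above (TAN additionally links the features in a tree). The toy stool-color/history features and labels are hypothetical stand-ins, not the paper's data.

```python
import math
from collections import defaultdict

def train_naive_bayes(rows, labels, alpha=1.0):
    """Train a naive Bayes classifier over discrete features with
    Laplace (add-alpha) smoothing; returns a predict(row) function."""
    classes = sorted(set(labels))
    n_feats = len(rows[0])
    value_sets = [set(r[j] for r in rows) for j in range(n_feats)]
    counts = {c: [defaultdict(float) for _ in range(n_feats)] for c in classes}
    class_counts = defaultdict(float)
    for row, y in zip(rows, labels):
        class_counts[y] += 1
        for j, v in enumerate(row):
            counts[y][j][v] += 1

    def predict(row):
        best, best_lp = None, -math.inf
        for c in classes:
            lp = math.log(class_counts[c] / len(labels))
            for j, v in enumerate(row):
                num = counts[c][j][v] + alpha
                den = class_counts[c] + alpha * len(value_sets[j])
                lp += math.log(num / den)
            if lp > best_lp:
                best, best_lp = c, lp
        return best

    return predict
```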
Bayesian Prior Probability Distributions for Internal Dosimetry
Energy Technology Data Exchange (ETDEWEB)
Miller, G.; Inkret, W.C.; Little, T.T.; Martz, H.F.; Schillaci, M.E.
2001-07-01
The problem of choosing a prior distribution for the Bayesian interpretation of measurements (specifically internal dosimetry measurements) is considered using a theoretical analysis and by examining historical tritium and plutonium urine bioassay data from Los Alamos. Two models for the prior probability distribution are proposed: (1) the log-normal distribution, when there is some additional information to determine the scale of the true result, and (2) the 'alpha' distribution (a simplified variant of the gamma distribution) when there is not. These models have been incorporated into version 3 of the Bayesian internal dosimetric code in use at Los Alamos (downloadable from our web site). Plutonium internal dosimetry at Los Alamos is now being done using prior probability distribution parameters determined self-consistently from population averages of Los Alamos data. (author)
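The log-normal case can be sketched with a one-dimensional grid integration: a log-normal prior on the true result, combined with a Gaussian measurement-error model, yields a posterior whose mean is shrunk from the raw measurement toward the prior. All parameter values below are illustrative assumptions, not the Los Alamos code or its fitted population parameters.

```python
import math

def lognormal_pdf(t, mu, sigma):
    """Log-normal prior density for the true (non-negative) result t."""
    if t <= 0:
        return 0.0
    return math.exp(-((math.log(t) - mu) ** 2) / (2 * sigma ** 2)) \
        / (t * sigma * math.sqrt(2 * math.pi))

def normal_pdf(x, mu, sigma):
    """Gaussian measurement-error model."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) \
        / (sigma * math.sqrt(2 * math.pi))

def posterior_mean_true_result(measured, meas_sigma, prior_mu, prior_sigma,
                               hi=20.0, grid=20000):
    """Posterior mean of the true result t given one measurement,
    by midpoint-rule integration over t in (0, hi)."""
    num = den = 0.0
    step = hi / grid
    for i in range(grid):
        t = (i + 0.5) * step
        w = lognormal_pdf(t, prior_mu, prior_sigma) * normal_pdf(measured, t, meas_sigma)
        num += t * w
        den += w
    return num / den
```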
Probabilistic forecasting and Bayesian data assimilation
Reich, Sebastian
2015-01-01
In this book the authors describe the principles and methods behind probabilistic forecasting and Bayesian data assimilation. Instead of focusing on particular application areas, the authors adopt a general dynamical systems approach, with a profusion of low-dimensional, discrete-time numerical examples designed to build intuition about the subject. Part I explains the mathematical framework of ensemble-based probabilistic forecasting and uncertainty quantification. Part II is devoted to Bayesian filtering algorithms, from classical data assimilation algorithms such as the Kalman filter, variational techniques, and sequential Monte Carlo methods, through to more recent developments such as the ensemble Kalman filter and ensemble transform filters. The McKean approach to sequential filtering in combination with coupling of measures serves as a unifying mathematical framework throughout Part II. Assuming only some basic familiarity with probability, this book is an ideal introduction for graduate students in ap...
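Of the classical algorithms the book covers, the Kalman filter is the simplest to write down. Below is one scalar predict/update cycle; the model parameters (A, Q, H, R) are arbitrary illustrative values, not taken from the book.

```python
def kalman_step(m, P, y, A=1.0, Q=0.01, H=1.0, R=0.25):
    """One predict/update cycle of a scalar Kalman filter for the model
    x_{k+1} = A x_k + w,  w ~ N(0, Q);   y_k = H x_k + v,  v ~ N(0, R).
    (m, P) is the current posterior mean and variance of the state."""
    # predict
    m_pred = A * m
    P_pred = A * P * A + Q
    # update with observation y
    S = H * P_pred * H + R            # innovation variance
    K = P_pred * H / S                # Kalman gain
    m_new = m_pred + K * (y - H * m_pred)
    P_new = (1 - K * H) * P_pred
    return m_new, P_new
```

Feeding a constant observation repeatedly drives the mean toward that value while the posterior variance settles at its steady state, the behaviour ensemble methods then emulate with samples.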
Bayesian networks and boundedly rational expectations
Ran Spiegler
2014-01-01
I present a framework for analyzing decision makers with an imperfect understanding of their environment's correlation structure. The framework borrows the tool of "Bayesian networks", which is ubiquitous in statistics and artificial intelligence. In the model, a decision maker faces an objective multivariate probability distribution (his own action is one of the random variables). He is characterized by a directed acyclic graph over the set of random variables. His subjective belief filters ...
Approximation of Bayesian Inverse Problems for PDEs
Cotter, S. L.; Dashti, M.; Stuart, A. M.
2010-01-01
Inverse problems are often ill posed, with solutions that depend sensitively on data. In any numerical approach to the solution of such problems, regularization of some form is needed to counteract the resulting instability. This paper is based on an approach to regularization, employing a Bayesian formulation of the problem, which leads to a notion of well posedness for inverse problems, at the level of probability measures. The stability which results from this well posedness may be used as t...
Multilevel Monte Carlo in Approximate Bayesian Computation
Jasra, Ajay
2017-02-13
In the following article we consider approximate Bayesian computation (ABC) inference. We introduce a method for numerically approximating ABC posteriors using multilevel Monte Carlo (MLMC). A sequential Monte Carlo version of the approach is developed, and it is shown, under some assumptions, that for a given level of mean square error this method for ABC has a lower cost than i.i.d. sampling from the most accurate ABC approximation. Several numerical examples are given.
Centralized Bayesian reliability modelling with sensor networks
Czech Academy of Sciences Publication Activity Database
Dedecius, Kamil; Sečkárová, Vladimíra
2013-01-01
Roč. 19, č. 5 (2013), s. 471-482 ISSN 1387-3954 R&D Projects: GA MŠk 7D12004 Grant - others:GA MŠk(CZ) SVV-265315 Keywords : Bayesian modelling * Sensor network * Reliability Subject RIV: BD - Theory of Information Impact factor: 0.984, year: 2013 http://library.utia.cas.cz/separaty/2013/AS/dedecius-0392551.pdf
Essays on portfolio choice with Bayesian methods
Kebabci, Deniz
2007-01-01
How investors should allocate assets to their portfolios in the presence of predictable components in asset returns is a question of great importance in finance. While early studies took the return generating process as given, recent studies have addressed issues such as parameter estimation and model uncertainty. My dissertation develops Bayesian methods for portfolio choice - and industry allocation in particular - under parameter and model uncertainty. The first chapter of my dissertation,...
A theory of Bayesian decision making
Karni, Edi
2009-01-01
This paper presents a complete, choice-based, axiomatic Bayesian decision theory. It introduces a new choice set consisting of information-contingent plans for choosing actions and bets and subjective expected utility model with effect-dependent utility functions and action-dependent subjective probabilities which, in conjunction with the updating of the probabilities using Bayes’ rule, gives rise to a unique prior and a set of action-dependent posterior probabilities representing the decisio...
Constrained bayesian inference of project performance models
Sunmola, Funlade
2013-01-01
Project performance models play an important role in the management of project success. When used for monitoring projects, they can offer predictive ability such as indications of possible delivery problems. Approaches for monitoring project performance rely on available project information including restrictions imposed on the project, particularly the constraints of cost, quality, scope and time. We study in this paper a Bayesian inference methodology for project performance modelling in ...
Bayesian Estimation and Inference using Stochastic Hardware
Directory of Open Access Journals (Sweden)
Chetan Singh Thakur
2016-03-01
Full Text Available In this paper, we present the implementation of two types of Bayesian inference problems to demonstrate the potential of building probabilistic algorithms in hardware using a single set of building blocks with the ability to perform these computations in real time. The first implementation, referred to as the BEAST (Bayesian Estimation and Stochastic Tracker), demonstrates a simple problem where an observer uses an underlying Hidden Markov Model (HMM) to track a target in one dimension. In this implementation, sensors make noisy observations of the target position at discrete time steps. The tracker learns the transition model for target movement, and the observation model for the noisy sensors, and uses these to estimate the target position by solving the Bayesian recursive equation online. We show the tracking performance of the system and demonstrate how it can learn the observation model, the transition model, and the external distractor (noise) probability interfering with the observations. In the second implementation, referred to as the Bayesian INference in DAG (BIND), we show how inference can be performed in a Directed Acyclic Graph (DAG) using stochastic circuits. We show how these building blocks can be easily implemented using simple digital logic gates. An advantage of the stochastic electronic implementation is that it is robust to certain types of noise, which may become an issue in integrated circuit (IC) technology with feature sizes in the order of tens of nanometers due to their low noise margin, the effect of high-energy cosmic rays and the low supply voltage. In our framework, the flipping of random individual bits would not affect the system performance because information is encoded in a bit stream.
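The bit-stream encoding underlying both implementations is easy to emulate in software: a probability becomes the density of 1s in a random bit stream, an AND gate on two independent streams multiplies the encoded probabilities, and flipping a few individual bits perturbs the decoded value only negligibly, which is the robustness property claimed above. Stream length and probability values below are arbitrary illustrative choices.

```python
import random

def bernoulli_stream(p, n, rng):
    """Encode probability p as a length-n random bit stream."""
    return [1 if rng.random() < p else 0 for _ in range(n)]

def stream_and(a, b):
    """Bitwise AND of two independent streams: decodes to the product of
    the two encoded probabilities (the stochastic-computing multiplier)."""
    return [x & y for x, y in zip(a, b)]

def decode(stream):
    """Recover the encoded probability as the density of 1s."""
    return sum(stream) / len(stream)
```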
Bayesian Estimation and Inference Using Stochastic Electronics.
Thakur, Chetan Singh; Afshar, Saeed; Wang, Runchun M; Hamilton, Tara J; Tapson, Jonathan; van Schaik, André
2016-01-01
In this paper, we present the implementation of two types of Bayesian inference problems to demonstrate the potential of building probabilistic algorithms in hardware using a single set of building blocks with the ability to perform these computations in real time. The first implementation, referred to as the BEAST (Bayesian Estimation and Stochastic Tracker), demonstrates a simple problem where an observer uses an underlying Hidden Markov Model (HMM) to track a target in one dimension. In this implementation, sensors make noisy observations of the target position at discrete time steps. The tracker learns the transition model for target movement, and the observation model for the noisy sensors, and uses these to estimate the target position by solving the Bayesian recursive equation online. We show the tracking performance of the system and demonstrate how it can learn the observation model, the transition model, and the external distractor (noise) probability interfering with the observations. In the second implementation, referred to as the Bayesian INference in DAG (BIND), we show how inference can be performed in a Directed Acyclic Graph (DAG) using stochastic circuits. We show how these building blocks can be easily implemented using simple digital logic gates. An advantage of the stochastic electronic implementation is that it is robust to certain types of noise, which may become an issue in integrated circuit (IC) technology with feature sizes in the order of tens of nanometers due to their low noise margin, the effect of high-energy cosmic rays and the low supply voltage. In our framework, the flipping of random individual bits would not affect the system performance because information is encoded in a bit stream.
Virtual Vector Machine for Bayesian Online Classification
Minka, Thomas P.; Xiang, Rongjing; Yuan; Qi
2012-01-01
In a typical online learning scenario, a learner is required to process a large data stream using a small memory buffer. Such a requirement is usually in conflict with a learner's primary pursuit of prediction accuracy. To address this dilemma, we introduce a novel Bayesian online classification algorithm, called the Virtual Vector Machine. The virtual vector machine allows one to smoothly trade off prediction accuracy against memory size. The virtual vector machine summarizes the information con...
Characteristic imsets for learning Bayesian network structure
Czech Academy of Sciences Publication Activity Database
Hemmecke, R.; Lindner, S.; Studený, Milan
2012-01-01
Roč. 53, č. 9 (2012), s. 1336-1349 ISSN 0888-613X R&D Projects: GA MŠk(CZ) 1M0572; GA ČR GA201/08/0539 Institutional support: RVO:67985556 Keywords : learning Bayesian network structure * essential graph * standard imset * characteristic imset * LP relaxation of a polytope Subject RIV: BA - General Mathematics Impact factor: 1.729, year: 2012 http://library.utia.cas.cz/separaty/2012/MTR/studeny-0382596.pdf
Bayesian analysis of Markov point processes
DEFF Research Database (Denmark)
Berthelsen, Kasper Klitgaard; Møller, Jesper
2006-01-01
Recently Møller, Pettitt, Berthelsen and Reeves introduced a new MCMC methodology for drawing samples from a posterior distribution when the likelihood function is only specified up to a normalising constant. We illustrate the method in the setting of Bayesian inference for Markov point processes...... a partially ordered Markov point process as the auxiliary variable. As the method requires simulation from the "unknown" likelihood, perfect simulation algorithms for spatial point processes become useful....
Decisions under uncertainty using Bayesian analysis
Directory of Open Access Journals (Sweden)
Stelian STANCU
2006-01-01
Full Text Available The present paper gives a short presentation of the Bayesian decision method, where extra information brings great support to the decision-making process but also attracts new costs. In this situation, obtaining new information, generally experimentally based, contributes to diminishing the degree of uncertainty that influences the decision-making process. As a conclusion, in a large number of decision problems, there is the possibility that decision makers will revisit some decisions already taken because of the facilities offered by obtaining extra information.
Bayesian evidence and predictivity of the inflationary paradigm
Energy Technology Data Exchange (ETDEWEB)
Gubitosi, Giulia; Lagos, Macarena; Magueijo, João [Theoretical Physics, Blackett Laboratory, Imperial College, London, SW7 2BZ (United Kingdom); Allison, Rupert, E-mail: g.gubitosi@imperial.ac.uk, E-mail: m.lagos13@imperial.ac.uk, E-mail: j.magueijo@imperial.ac.uk, E-mail: rupert.allison@astro.ox.ac.uk [Astrophysics, University of Oxford, DWB, Keble Road, Oxford OX1 3RH (United Kingdom)
2016-06-01
In this paper we consider the issue of paradigm evaluation by applying Bayes' theorem along the following nested hierarchy of progressively more complex structures: i) parameter estimation (within a model), ii) model selection and comparison (within a paradigm), iii) paradigm evaluation. In such a hierarchy the Bayesian evidence works both as the posterior's normalization at a given level and as the likelihood function at the next level up. Whilst raising no objections to the standard application of the procedure at the two lowest levels, we argue that it should receive a considerable modification when evaluating paradigms, when testability and fitting data are equally important. By considering toy models we illustrate how models and paradigms that are difficult to falsify are always favoured by the Bayes factor. We argue that the evidence for a paradigm should not only be high for a given dataset, but exceptional with respect to what it would have been, had the data been different. With this motivation we propose a measure which we term predictivity, as well as a prior to be incorporated into the Bayesian framework, penalising unpredictivity as much as not fitting data. We apply this measure to inflation seen as a whole, and to a scenario where a specific inflationary model is hypothetically deemed the only one viable as a result of information alien to cosmology (e.g. Solar System gravity experiments, or particle physics input). We conclude that cosmic inflation is currently hard to falsify, but that this could change were external/additional information to cosmology to select one of its many models. We also compare this state of affairs to bimetric varying speed of light cosmology.
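The machinery at the lowest levels of the hierarchy, marginal likelihoods and Bayes factors, can be sketched in a few lines: a sharp model concentrates its evidence on the outcomes it predicts, a flexible model spreads its evidence over all outcomes, and the Bayes factor compares the two. Toy beta-binomial illustration (all numbers arbitrary, unrelated to the cosmological application):

```python
from math import comb, exp, lgamma

def log_beta(a, b):
    """log of the Beta function B(a, b)."""
    return lgamma(a) + lgamma(b) - lgamma(a + b)

def evidence_point(x, n, p):
    """Marginal likelihood of a sharp model that fixes the success rate at p."""
    return comb(n, x) * p ** x * (1 - p) ** (n - x)

def evidence_vague(x, n, a=1.0, b=1.0):
    """Marginal likelihood of a flexible model with rate ~ Beta(a, b):
    the beta-binomial evidence, integrating the likelihood over the prior."""
    return comb(n, x) * exp(log_beta(a + x, b + n - x) - log_beta(a, b))

def bayes_factor(x, n, p):
    """Evidence ratio of the sharp model against the flexible one."""
    return evidence_point(x, n, p) / evidence_vague(x, n)
```

With a uniform prior the flexible model assigns evidence 1/(n+1) to every possible count, so the sharp model wins when the data match its prediction and loses badly when they do not, the trade-off the paper's predictivity measure is designed to sharpen.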
Network structure exploration via Bayesian nonparametric models
International Nuclear Information System (INIS)
Chen, Y; Wang, X L; Xiang, X; Tang, B Z; Bu, J Z
2015-01-01
Complex networks provide a powerful mathematical representation of complex systems in nature and society. To understand complex networks, it is crucial to explore their internal structures, also called structural regularities. The task of network structure exploration is to determine how many groups there are in a complex network and how to group the nodes of the network. Most existing structure exploration methods need to specify either a group number or a certain type of structure when they are applied to a network. In the real world, however, the group number and also the certain type of structure that a network has are usually unknown in advance. To explore structural regularities in complex networks automatically, without any prior knowledge of the group number or the certain type of structure, we extend, using Bayesian nonparametric theory, a probabilistic mixture model that can handle networks with any type of structure but needs a specified group number. We also propose a novel Bayesian nonparametric model, called the Bayesian nonparametric mixture (BNPM) model. Experiments conducted on a large number of networks with different structures show that the BNPM model is able to explore structural regularities in networks automatically with a stable, state-of-the-art performance. (paper)
Bayesian Recurrent Neural Network for Language Modeling.
Chien, Jen-Tzung; Ku, Yuan-Chu
2016-02-01
A language model (LM) is calculated as the probability of a word sequence that provides the solution to word prediction for a variety of information systems. A recurrent neural network (RNN) is powerful to learn the large-span dynamics of a word sequence in the continuous space. However, the training of the RNN-LM is an ill-posed problem because of too many parameters from a large dictionary size and a high-dimensional hidden layer. This paper presents a Bayesian approach to regularize the RNN-LM and apply it for continuous speech recognition. We aim to penalize the too complicated RNN-LM by compensating for the uncertainty of the estimated model parameters, which is represented by a Gaussian prior. The objective function in a Bayesian classification network is formed as the regularized cross-entropy error function. The regularized model is constructed not only by calculating the regularized parameters according to the maximum a posteriori criterion but also by estimating the Gaussian hyperparameter by maximizing the marginal likelihood. A rapid approximation to a Hessian matrix is developed to implement the Bayesian RNN-LM (BRNN-LM) by selecting a small set of salient outer-products. The proposed BRNN-LM achieves a sparser model than the RNN-LM. Experiments on different corpora show the robustness of system performance by applying the rapid BRNN-LM under different conditions.
Bayesian Analysis of Individual Level Personality Dynamics
Directory of Open Access Journals (Sweden)
Edward Cripps
2016-07-01
Full Text Available A Bayesian technique with analyses of within-person processes at the level of the individual is presented. The approach is used to examine whether the patterns of within-person responses on a 12-trial simulation task are consistent with the predictions of ITA theory (Dweck, 1999). ITA theory states that the performance of an individual with an entity theory of ability is more likely to spiral down following a failure experience than the performance of an individual with an incremental theory of ability. This is because entity theorists interpret failure experiences as evidence of a lack of ability, which they believe is largely innate and therefore relatively fixed; whilst incremental theorists believe in the malleability of abilities and interpret failure experiences as evidence of more controllable factors such as poor strategy or lack of effort. The results of our analyses support ITA theory at both the within- and between-person levels of analyses and demonstrate the benefits of Bayesian techniques for the analysis of within-person processes. These include more formal specification of the theory and the ability to draw inferences about each individual, which allows for more nuanced interpretations of individuals within a personality category, such as differences in the individual probabilities of spiralling. While Bayesian techniques have many potential advantages for the analyses of within-person processes at the individual level, ease of use is not one of them for psychologists trained in traditional frequentist statistical techniques.
Particle identification in ALICE: a Bayesian approach
Adam, Jaroslav; Aggarwal, Madan Mohan; Aglieri Rinella, Gianluca; Agnello, Michelangelo; Agrawal, Neelima; Ahammed, Zubayer; Ahmad, Shakeel; Ahn, Sang Un; Aiola, Salvatore; Akindinov, Alexander; Alam, Sk Noor; Silva De Albuquerque, Danilo; Aleksandrov, Dmitry; Alessandro, Bruno; Alexandre, Didier; Alfaro Molina, Jose Ruben; Alici, Andrea; Alkin, Anton; Millan Almaraz, Jesus Roberto; Alme, Johan; Alt, Torsten; Altinpinar, Sedat; Altsybeev, Igor; Alves Garcia Prado, Caio; Andrei, Cristian; Andronic, Anton; Anguelov, Venelin; Anticic, Tome; Antinori, Federico; Antonioli, Pietro; Aphecetche, Laurent Bernard; Appelshaeuser, Harald; Arcelli, Silvia; Arnaldi, Roberta; Arnold, Oliver Werner; Arsene, Ionut Cristian; Arslandok, Mesut; Audurier, Benjamin; Augustinus, Andre; Averbeck, Ralf Peter; Azmi, Mohd Danish; Badala, Angela; Baek, Yong Wook; Bagnasco, Stefano; Bailhache, Raphaelle Marie; Bala, Renu; Balasubramanian, Supraja; Baldisseri, Alberto; Baral, Rama Chandra; Barbano, Anastasia Maria; Barbera, Roberto; Barile, Francesco; Barnafoldi, Gergely Gabor; Barnby, Lee Stuart; Ramillien Barret, Valerie; Bartalini, Paolo; Barth, Klaus; Bartke, Jerzy Gustaw; Bartsch, Esther; Basile, Maurizio; Bastid, Nicole; Basu, Sumit; Bathen, Bastian; Batigne, Guillaume; Batista Camejo, Arianna; Batyunya, Boris; Batzing, Paul Christoph; Bearden, Ian Gardner; Beck, Hans; Bedda, Cristina; Behera, Nirbhay Kumar; Belikov, Iouri; Bellini, Francesca; Bello Martinez, Hector; Bellwied, Rene; Belmont Iii, Ronald John; Belmont Moreno, Ernesto; Belyaev, Vladimir; Benacek, Pavel; Bencedi, Gyula; Beole, Stefania; Berceanu, Ionela; Bercuci, Alexandru; Berdnikov, Yaroslav; Berenyi, Daniel; Bertens, Redmer Alexander; Berzano, Dario; Betev, Latchezar; Bhasin, Anju; Bhat, Inayat Rasool; Bhati, Ashok Kumar; Bhattacharjee, Buddhadeb; Bhom, Jihyun; Bianchi, Livio; Bianchi, Nicola; Bianchin, Chiara; Bielcik, Jaroslav; Bielcikova, Jana; Bilandzic, Ante; Biro, Gabor; Biswas, Rathijit; Biswas, Saikat; Bjelogrlic, 
Sandro; Blair, Justin Thomas; Blau, Dmitry; Blume, Christoph; Bock, Friederike; Bogdanov, Alexey; Boggild, Hans; Boldizsar, Laszlo; Bombara, Marek; Book, Julian Heinz; Borel, Herve; Borissov, Alexander; Borri, Marcello; Bossu, Francesco; Botta, Elena; Bourjau, Christian; Braun-Munzinger, Peter; Bregant, Marco; Breitner, Timo Gunther; Broker, Theo Alexander; Browning, Tyler Allen; Broz, Michal; Brucken, Erik Jens; Bruna, Elena; Bruno, Giuseppe Eugenio; Budnikov, Dmitry; Buesching, Henner; Bufalino, Stefania; Buncic, Predrag; Busch, Oliver; Buthelezi, Edith Zinhle; Bashir Butt, Jamila; Buxton, Jesse Thomas; Cabala, Jan; Caffarri, Davide; Cai, Xu; Caines, Helen Louise; Calero Diaz, Liliet; Caliva, Alberto; Calvo Villar, Ernesto; Camerini, Paolo; Carena, Francesco; Carena, Wisla; Carnesecchi, Francesca; Castillo Castellanos, Javier Ernesto; Castro, Andrew John; Casula, Ester Anna Rita; Ceballos Sanchez, Cesar; Cepila, Jan; Cerello, Piergiorgio; Cerkala, Jakub; Chang, Beomsu; Chapeland, Sylvain; Chartier, Marielle; Charvet, Jean-Luc Fernand; Chattopadhyay, Subhasis; Chattopadhyay, Sukalyan; Chauvin, Alex; Chelnokov, Volodymyr; Cherney, Michael Gerard; Cheshkov, Cvetan Valeriev; Cheynis, Brigitte; Chibante Barroso, Vasco Miguel; Dobrigkeit Chinellato, David; Cho, Soyeon; Chochula, Peter; Choi, Kyungeon; Chojnacki, Marek; Choudhury, Subikash; Christakoglou, Panagiotis; Christensen, Christian Holm; Christiansen, Peter; Chujo, Tatsuya; Chung, Suh-Urk; Cicalo, Corrado; Cifarelli, Luisa; Cindolo, Federico; Cleymans, Jean Willy Andre; Colamaria, Fabio Filippo; Colella, Domenico; Collu, Alberto; Colocci, Manuel; Conesa Balbastre, Gustavo; Conesa Del Valle, Zaida; Connors, Megan Elizabeth; Contreras Nuno, Jesus Guillermo; Cormier, Thomas Michael; Corrales Morales, Yasser; Cortes Maldonado, Ismael; Cortese, Pietro; Cosentino, Mauro Rogerio; Costa, Filippo; Crochet, Philippe; Cruz Albino, Rigoberto; Cuautle Flores, Eleazar; Cunqueiro Mendez, Leticia; Dahms, Torsten; Dainese, 
Andrea; Danisch, Meike Charlotte; Danu, Andrea; Das, Debasish; Das, Indranil; Das, Supriya; Dash, Ajay Kumar; Dash, Sadhana; De, Sudipan; De Caro, Annalisa; De Cataldo, Giacinto; De Conti, Camila; De Cuveland, Jan; De Falco, Alessandro; De Gruttola, Daniele; De Marco, Nora; De Pasquale, Salvatore; Deisting, Alexander; Deloff, Andrzej; Denes, Ervin Sandor; Deplano, Caterina; Dhankher, Preeti; Di Bari, Domenico; Di Mauro, Antonio; Di Nezza, Pasquale; Diaz Corchero, Miguel Angel; Dietel, Thomas; Dillenseger, Pascal; Divia, Roberto; Djuvsland, Oeystein; Dobrin, Alexandru Florin; Domenicis Gimenez, Diogenes; Donigus, Benjamin; Dordic, Olja; Drozhzhova, Tatiana; Dubey, Anand Kumar; Dubla, Andrea; Ducroux, Laurent; Dupieux, Pascal; Ehlers Iii, Raymond James; Elia, Domenico; Endress, Eric; Engel, Heiko; Epple, Eliane; Erazmus, Barbara Ewa; Erdemir, Irem; Erhardt, Filip; Espagnon, Bruno; Estienne, Magali Danielle; Esumi, Shinichi; Eum, Jongsik; Evans, David; Evdokimov, Sergey; Eyyubova, Gyulnara; Fabbietti, Laura; Fabris, Daniela; Faivre, Julien; Fantoni, Alessandra; Fasel, Markus; Feldkamp, Linus; Feliciello, Alessandro; Feofilov, Grigorii; Ferencei, Jozef; Fernandez Tellez, Arturo; Gonzalez Ferreiro, Elena; Ferretti, Alessandro; Festanti, Andrea; Feuillard, Victor Jose Gaston; Figiel, Jan; Araujo Silva Figueredo, Marcel; Filchagin, Sergey; Finogeev, Dmitry; Fionda, Fiorella; Fiore, Enrichetta Maria; Fleck, Martin Gabriel; Floris, Michele; Foertsch, Siegfried Valentin; Foka, Panagiota; Fokin, Sergey; Fragiacomo, Enrico; Francescon, Andrea; Frankenfeld, Ulrich Michael; Fronze, Gabriele Gaetano; Fuchs, Ulrich; Furget, Christophe; Furs, Artur; Fusco Girard, Mario; Gaardhoeje, Jens Joergen; Gagliardi, Martino; Gago Medina, Alberto Martin; Gallio, Mauro; Gangadharan, Dhevan Raja; Ganoti, Paraskevi; Gao, Chaosong; Garabatos Cuadrado, Jose; Garcia-Solis, Edmundo Javier; Gargiulo, Corrado; Gasik, Piotr Jan; Gauger, Erin Frances; Germain, Marie; Gheata, Andrei George; Gheata, 
Mihaela; Ghosh, Premomoy; Ghosh, Sanjay Kumar; Gianotti, Paola; Giubellino, Paolo; Giubilato, Piero; Gladysz-Dziadus, Ewa; Glassel, Peter; Gomez Coral, Diego Mauricio; Gomez Ramirez, Andres; Sanchez Gonzalez, Andres; Gonzalez, Victor; Gonzalez Zamora, Pedro; Gorbunov, Sergey; Gorlich, Lidia Maria; Gotovac, Sven; Grabski, Varlen; Grachov, Oleg Anatolievich; Graczykowski, Lukasz Kamil; Graham, Katie Leanne; Grelli, Alessandro; Grigoras, Alina Gabriela; Grigoras, Costin; Grigoryev, Vladislav; Grigoryan, Ara; Grigoryan, Smbat; Grynyov, Borys; Grion, Nevio; Gronefeld, Julius Maximilian; Grosse-Oetringhaus, Jan Fiete; Grosso, Raffaele; Guber, Fedor; Guernane, Rachid; Guerzoni, Barbara; Gulbrandsen, Kristjan Herlache; Gunji, Taku; Gupta, Anik; Gupta, Ramni; Haake, Rudiger; Haaland, Oystein Senneset; Hadjidakis, Cynthia Marie; Haiduc, Maria; Hamagaki, Hideki; Hamar, Gergoe; Hamon, Julien Charles; Harris, John William; Harton, Austin Vincent; Hatzifotiadou, Despina; Hayashi, Shinichi; Heckel, Stefan Thomas; Hellbar, Ernst; Helstrup, Haavard; Herghelegiu, Andrei Ionut; Herrera Corral, Gerardo Antonio; Hess, Benjamin Andreas; Hetland, Kristin Fanebust; Hillemanns, Hartmut; Hippolyte, Boris; Horak, David; Hosokawa, Ritsuya; Hristov, Peter Zahariev; Humanic, Thomas; Hussain, Nur; Hussain, Tahir; Hutter, Dirk; Hwang, Dae Sung; Ilkaev, Radiy; Inaba, Motoi; Incani, Elisa; Ippolitov, Mikhail; Irfan, Muhammad; Ivanov, Marian; Ivanov, Vladimir; Izucheev, Vladimir; Jacazio, Nicolo; Jacobs, Peter Martin; Jadhav, Manoj Bhanudas; Jadlovska, Slavka; Jadlovsky, Jan; Jahnke, Cristiane; Jakubowska, Monika Joanna; Jang, Haeng Jin; Janik, Malgorzata Anna; Pahula Hewage, Sandun; Jena, Chitrasen; Jena, Satyajit; Jimenez Bustamante, Raul Tonatiuh; Jones, Peter Graham; Jusko, Anton; Kalinak, Peter; Kalweit, Alexander Philipp; Kamin, Jason Adrian; Kang, Ju Hwan; Kaplin, Vladimir; Kar, Somnath; Karasu Uysal, Ayben; Karavichev, Oleg; Karavicheva, Tatiana; Karayan, Lilit; Karpechev, Evgeny; Kebschull, 
Udo Wolfgang; Keidel, Ralf; Keijdener, Darius Laurens; Keil, Markus; Khan, Mohammed Mohisin; Khan, Palash; Khan, Shuaib Ahmad; Khanzadeev, Alexei; Kharlov, Yury; Kileng, Bjarte; Kim, Do Won; Kim, Dong Jo; Kim, Daehyeok; Kim, Hyeonjoong; Kim, Jinsook; Kim, Minwoo; Kim, Se Yong; Kim, Taesoo; Kirsch, Stefan; Kisel, Ivan; Kiselev, Sergey; Kisiel, Adam Ryszard; Kiss, Gabor; Klay, Jennifer Lynn; Klein, Carsten; Klein, Jochen; Klein-Boesing, Christian; Klewin, Sebastian; Kluge, Alexander; Knichel, Michael Linus; Knospe, Anders Garritt; Kobdaj, Chinorat; Kofarago, Monika; Kollegger, Thorsten; Kolozhvari, Anatoly; Kondratev, Valerii; Kondratyeva, Natalia; Kondratyuk, Evgeny; Konevskikh, Artem; Kopcik, Michal; Kostarakis, Panagiotis; Kour, Mandeep; Kouzinopoulos, Charalampos; Kovalenko, Oleksandr; Kovalenko, Vladimir; Kowalski, Marek; Koyithatta Meethaleveedu, Greeshma; Kralik, Ivan; Kravcakova, Adela; Krivda, Marian; Krizek, Filip; Kryshen, Evgeny; Krzewicki, Mikolaj; Kubera, Andrew Michael; Kucera, Vit; Kuhn, Christian Claude; Kuijer, Paulus Gerardus; Kumar, Ajay; Kumar, Jitendra; Kumar, Lokesh; Kumar, Shyam; Kurashvili, Podist; Kurepin, Alexander; Kurepin, Alexey; Kuryakin, Alexey; Kweon, Min Jung; Kwon, Youngil; La Pointe, Sarah Louise; La Rocca, Paola; Ladron De Guevara, Pedro; Lagana Fernandes, Caio; Lakomov, Igor; Langoy, Rune; Lara Martinez, Camilo Ernesto; Lardeux, Antoine Xavier; Lattuca, Alessandra; Laudi, Elisa; Lea, Ramona; Leardini, Lucia; Lee, Graham Richard; Lee, Seongjoo; Lehas, Fatiha; Lemmon, Roy Crawford; Lenti, Vito; Leogrande, Emilia; Leon Monzon, Ildefonso; Leon Vargas, Hermes; Leoncino, Marco; Levai, Peter; Li, Shuang; Li, Xiaomei; Lien, Jorgen Andre; Lietava, Roman; Lindal, Svein; Lindenstruth, Volker; Lippmann, Christian; Lisa, Michael Annan; Ljunggren, Hans Martin; Lodato, Davide Francesco; Lonne, Per-Ivar; Loginov, Vitaly; Loizides, Constantinos; Lopez, Xavier Bernard; Lopez Torres, Ernesto; Lowe, Andrew John; Luettig, Philipp Johannes; Lunardon, 
Marcello; Luparello, Grazia; Lutz, Tyler Harrison; Maevskaya, Alla; Mager, Magnus; Mahajan, Sanjay; Mahmood, Sohail Musa; Maire, Antonin; Majka, Richard Daniel; Malaev, Mikhail; Maldonado Cervantes, Ivonne Alicia; Malinina, Liudmila; Mal'Kevich, Dmitry; Malzacher, Peter; Mamonov, Alexander; Manko, Vladislav; Manso, Franck; Manzari, Vito; Marchisone, Massimiliano; Mares, Jiri; Margagliotti, Giacomo Vito; Margotti, Anselmo; Margutti, Jacopo; Marin, Ana Maria; Markert, Christina; Marquard, Marco; Martin, Nicole Alice; Martin Blanco, Javier; Martinengo, Paolo; Martinez Hernandez, Mario Ivan; Martinez-Garcia, Gines; Martinez Pedreira, Miguel; Mas, Alexis Jean-Michel; Masciocchi, Silvia; Masera, Massimo; Masoni, Alberto; Mastroserio, Annalisa; Matyja, Adam Tomasz; Mayer, Christoph; Mazer, Joel Anthony; Mazzoni, Alessandra Maria; Mcdonald, Daniel; Meddi, Franco; Melikyan, Yuri; Menchaca-Rocha, Arturo Alejandro; Meninno, Elisa; Mercado-Perez, Jorge; Meres, Michal; Miake, Yasuo; Mieskolainen, Matti Mikael; Mikhaylov, Konstantin; Milano, Leonardo; Milosevic, Jovan; Mischke, Andre; Mishra, Aditya Nath; Miskowiec, Dariusz Czeslaw; Mitra, Jubin; Mitu, Ciprian Mihai; Mohammadi, Naghmeh; Mohanty, Bedangadas; Molnar, Levente; Montano Zetina, Luis Manuel; Montes Prado, Esther; Moreira De Godoy, Denise Aparecida; Perez Moreno, Luis Alberto; Moretto, Sandra; Morreale, Astrid; Morsch, Andreas; Muccifora, Valeria; Mudnic, Eugen; Muhlheim, Daniel Michael; Muhuri, Sanjib; Mukherjee, Maitreyee; Mulligan, James Declan; Gameiro Munhoz, Marcelo; Munzer, Robert Helmut; Murakami, Hikari; Murray, Sean; Musa, Luciano; Musinsky, Jan; Naik, Bharati; Nair, Rahul; Nandi, Basanta Kumar; Nania, Rosario; Nappi, Eugenio; Naru, Muhammad Umair; Ferreira Natal Da Luz, Pedro Hugo; Nattrass, Christine; Rosado Navarro, Sebastian; Nayak, Kishora; Nayak, Ranjit; Nayak, Tapan Kumar; Nazarenko, Sergey; Nedosekin, Alexander; Nellen, Lukas; Ng, Fabian; Nicassio, Maria; Niculescu, Mihai; Niedziela, Jeremi; Nielsen, 
Borge Svane; Nikolaev, Sergey; Nikulin, Sergey; Nikulin, Vladimir; Noferini, Francesco; Nomokonov, Petr; Nooren, Gerardus; Cabanillas Noris, Juan Carlos; Norman, Jaime; Nyanin, Alexander; Nystrand, Joakim Ingemar; Oeschler, Helmut Oskar; Oh, Saehanseul; Oh, Sun Kun; Ohlson, Alice Elisabeth; Okatan, Ali; Okubo, Tsubasa; Olah, Laszlo; Oleniacz, Janusz; Oliveira Da Silva, Antonio Carlos; Oliver, Michael Henry; Onderwaater, Jacobus; Oppedisano, Chiara; Orava, Risto; Oravec, Matej; Ortiz Velasquez, Antonio; Oskarsson, Anders Nils Erik; Otwinowski, Jacek Tomasz; Oyama, Ken; Ozdemir, Mahmut; Pachmayer, Yvonne Chiara; Pagano, Davide; Pagano, Paola; Paic, Guy; Pal, Susanta Kumar; Pan, Jinjin; Pandey, Ashutosh Kumar; Papikyan, Vardanush; Pappalardo, Giuseppe; Pareek, Pooja; Park, Woojin; Parmar, Sonia; Passfeld, Annika; Paticchio, Vincenzo; Patra, Rajendra Nath; Paul, Biswarup; Pei, Hua; Peitzmann, Thomas; Pereira Da Costa, Hugo Denis Antonio; Peresunko, Dmitry Yurevich; Perez Lara, Carlos Eugenio; Perez Lezama, Edgar; Peskov, Vladimir; Pestov, Yury; Petracek, Vojtech; Petrov, Viacheslav; Petrovici, Mihai; Petta, Catia; Piano, Stefano; Pikna, Miroslav; Pillot, Philippe; Ozelin De Lima Pimentel, Lais; Pinazza, Ombretta; Pinsky, Lawrence; Piyarathna, Danthasinghe; Ploskon, Mateusz Andrzej; Planinic, Mirko; Pluta, Jan Marian; Pochybova, Sona; Podesta Lerma, Pedro Luis Manuel; Poghosyan, Martin; Polishchuk, Boris; Poljak, Nikola; Poonsawat, Wanchaloem; Pop, Amalia; Porteboeuf, Sarah Julie; Porter, R Jefferson; Pospisil, Jan; Prasad, Sidharth Kumar; Preghenella, Roberto; Prino, Francesco; Pruneau, Claude Andre; Pshenichnov, Igor; Puccio, Maximiliano; Puddu, Giovanna; Pujahari, Prabhat Ranjan; Punin, Valery; Putschke, Jorn Henning; Qvigstad, Henrik; Rachevski, Alexandre; Raha, Sibaji; Rajput, Sonia; Rak, Jan; Rakotozafindrabe, Andry Malala; Ramello, Luciano; Rami, Fouad; Raniwala, Rashmi; Raniwala, Sudhir; Rasanen, Sami Sakari; Rascanu, Bogdan Theodor; Rathee, Deepika; Read, 
Kenneth Francis; Redlich, Krzysztof; Reed, Rosi Jan; Rehman, Attiq Ur; Reichelt, Patrick Simon; Reidt, Felix; Ren, Xiaowen; Renfordt, Rainer Arno Ernst; Reolon, Anna Rita; Reshetin, Andrey; Reygers, Klaus Johannes; Riabov, Viktor; Ricci, Renato Angelo; Richert, Tuva Ora Herenui; Richter, Matthias Rudolph; Riedler, Petra; Riegler, Werner; Riggi, Francesco; Ristea, Catalin-Lucian; Rocco, Elena; Rodriguez Cahuantzi, Mario; Rodriguez Manso, Alis; Roeed, Ketil; Rogochaya, Elena; Rohr, David Michael; Roehrich, Dieter; Ronchetti, Federico; Ronflette, Lucile; Rosnet, Philippe; Rossi, Andrea; Roukoutakis, Filimon; Roy, Ankhi; Roy, Christelle Sophie; Roy, Pradip Kumar; Rubio Montero, Antonio Juan; Rui, Rinaldo; Russo, Riccardo; Ryabinkin, Evgeny; Ryabov, Yury; Rybicki, Andrzej; Saarinen, Sampo; Sadhu, Samrangy; Sadovskiy, Sergey; Safarik, Karel; Sahlmuller, Baldo; Sahoo, Pragati; Sahoo, Raghunath; Sahoo, Sarita; Sahu, Pradip Kumar; Saini, Jogender; Sakai, Shingo; Saleh, Mohammad Ahmad; Salzwedel, Jai Samuel Nielsen; Sambyal, Sanjeev Singh; Samsonov, Vladimir; Sandor, Ladislav; Sandoval, Andres; Sano, Masato; Sarkar, Debojit; Sarkar, Nachiketa; Sarma, Pranjal; Scapparone, Eugenio; Scarlassara, Fernando; Schiaua, Claudiu Cornel; Schicker, Rainer Martin; Schmidt, Christian Joachim; Schmidt, Hans Rudolf; Schuchmann, Simone; Schukraft, Jurgen; Schulc, Martin; Schutz, Yves Roland; Schwarz, Kilian Eberhard; Schweda, Kai Oliver; Scioli, Gilda; Scomparin, Enrico; Scott, Rebecca Michelle; Sefcik, Michal; Seger, Janet Elizabeth; Sekiguchi, Yuko; Sekihata, Daiki; Selyuzhenkov, Ilya; Senosi, Kgotlaesele; Senyukov, Serhiy; Serradilla Rodriguez, Eulogio; Sevcenco, Adrian; Shabanov, Arseniy; Shabetai, Alexandre; Shadura, Oksana; Shahoyan, Ruben; Shahzad, Muhammed Ikram; Shangaraev, Artem; Sharma, Ankita; Sharma, Mona; Sharma, Monika; Sharma, Natasha; Sheikh, Ashik Ikbal; Shigaki, Kenta; Shou, Qiye; Shtejer Diaz, Katherin; Sibiryak, Yury; Siddhanta, Sabyasachi; Sielewicz, Krzysztof Marek; 
Siemiarczuk, Teodor; Silvermyr, David Olle Rickard; Silvestre, Catherine Micaela; Simatovic, Goran; Simonetti, Giuseppe; Singaraju, Rama Narayana; Singh, Ranbir; Singha, Subhash; Singhal, Vikas; Sinha, Bikash; Sarkar - Sinha, Tinku; Sitar, Branislav; Sitta, Mario; Skaali, Bernhard; Slupecki, Maciej; Smirnov, Nikolai; Snellings, Raimond; Snellman, Tomas Wilhelm; Song, Jihye; Song, Myunggeun; Song, Zixuan; Soramel, Francesca; Sorensen, Soren Pontoppidan; Derradi De Souza, Rafael; Sozzi, Federica; Spacek, Michal; Spiriti, Eleuterio; Sputowska, Iwona Anna; Spyropoulou-Stassinaki, Martha; Stachel, Johanna; Stan, Ionel; Stankus, Paul; Stenlund, Evert Anders; Steyn, Gideon Francois; Stiller, Johannes Hendrik; Stocco, Diego; Strmen, Peter; Alarcon Do Passo Suaide, Alexandre; Sugitate, Toru; Suire, Christophe Pierre; Suleymanov, Mais Kazim Oglu; Suljic, Miljenko; Sultanov, Rishat; Sumbera, Michal; Sumowidagdo, Suharyo; Szabo, Alexander; Szanto De Toledo, Alejandro; Szarka, Imrich; Szczepankiewicz, Adam; Szymanski, Maciej Pawel; Tabassam, Uzma; Takahashi, Jun; Tambave, Ganesh Jagannath; Tanaka, Naoto; Tarhini, Mohamad; Tariq, Mohammad; Tarzila, Madalina-Gabriela; Tauro, Arturo; Tejeda Munoz, Guillermo; Telesca, Adriana; Terasaki, Kohei; Terrevoli, Cristina; Teyssier, Boris; Thaeder, Jochen Mathias; Thakur, Dhananjaya; Thomas, Deepa; Tieulent, Raphael Noel; Timmins, Anthony Robert; Toia, Alberica; Trogolo, Stefano; Trombetta, Giuseppe; Trubnikov, Victor; Trzaska, Wladyslaw Henryk; Tsuji, Tomoya; Tumkin, Alexandr; Turrisi, Rosario; Tveter, Trine Spedstad; Ullaland, Kjetil; Uras, Antonio; Usai, Gianluca; Utrobicic, Antonija; Vala, Martin; Valencia Palomo, Lizardo; Vallero, Sara; Van Der Maarel, Jasper; Van Hoorne, Jacobus Willem; Van Leeuwen, Marco; Vanat, Tomas; Vande Vyvre, Pierre; Varga, Dezso; Diozcora Vargas Trevino, Aurora; Vargyas, Marton; Varma, Raghava; Vasileiou, Maria; Vasiliev, Andrey; Vauthier, Astrid; Vechernin, Vladimir; Veen, Annelies Marianne; Veldhoen, Misha; 
Velure, Arild; Vercellin, Ermanno; Vergara Limon, Sergio; Vernet, Renaud; Verweij, Marta; Vickovic, Linda; Viesti, Giuseppe; Viinikainen, Jussi Samuli; Vilakazi, Zabulon; Villalobos Baillie, Orlando; Villatoro Tello, Abraham; Vinogradov, Alexander; Vinogradov, Leonid; Vinogradov, Yury; Virgili, Tiziano; Vislavicius, Vytautas; Viyogi, Yogendra; Vodopyanov, Alexander; Volkl, Martin Andreas; Voloshin, Kirill; Voloshin, Sergey; Volpe, Giacomo; Von Haller, Barthelemy; Vorobyev, Ivan; Vranic, Danilo; Vrlakova, Janka; Vulpescu, Bogdan; Wagner, Boris; Wagner, Jan; Wang, Hongkai; Wang, Mengliang; Watanabe, Daisuke; Watanabe, Yosuke; Weber, Michael; Weber, Steffen Georg; Weiser, Dennis Franz; Wessels, Johannes Peter; Westerhoff, Uwe; Whitehead, Andile Mothegi; Wiechula, Jens; Wikne, Jon; Wilk, Grzegorz Andrzej; Wilkinson, Jeremy John; Williams, Crispin; Windelband, Bernd Stefan; Winn, Michael Andreas; Yang, Hongyan; Yang, Ping; Yano, Satoshi; Yasin, Zafar; Yin, Zhongbao; Yokoyama, Hiroki; Yoo, In-Kwon; Yoon, Jin Hee; Yurchenko, Volodymyr; Yushmanov, Igor; Zaborowska, Anna; Zaccolo, Valentina; Zaman, Ali; Zampolli, Chiara; Correia Zanoli, Henrique Jose; Zaporozhets, Sergey; Zardoshti, Nima; Zarochentsev, Andrey; Zavada, Petr; Zavyalov, Nikolay; Zbroszczyk, Hanna Paulina; Zgura, Sorin Ion; Zhalov, Mikhail; Zhang, Haitao; Zhang, Xiaoming; Zhang, Yonghong; Chunhui, Zhang; Zhang, Zuman; Zhao, Chengxin; Zhigareva, Natalia; Zhou, Daicui; Zhou, You; Zhou, Zhuo; Zhu, Hongsheng; Zhu, Jianhui; Zichichi, Antonino; Zimmermann, Alice; Zimmermann, Markus Bernhard; Zinovjev, Gennady; Zyzak, Maksym
2016-05-25
We present a Bayesian approach to particle identification (PID) within the ALICE experiment. The aim is to more effectively combine the particle identification capabilities of its various detectors. After a brief explanation of the adopted methodology and formalism, the performance of the Bayesian PID approach for charged pions, kaons and protons in the central barrel of ALICE is studied. PID is performed via measurements of specific energy loss (dE/dx) and time-of-flight. PID efficiencies and misidentification probabilities are extracted and compared with Monte Carlo simulations using high-purity samples of identified particles in the decay channels ${\rm K}_{\rm S}^{0}\rightarrow \pi^+\pi^-$, $\phi\rightarrow {\rm K}^-{\rm K}^+$ and $\Lambda\rightarrow{\rm p}\pi^-$ in p–Pb collisions at $\sqrt{s_{\rm NN}} = 5.02$ TeV. In order to thoroughly assess the validity of the Bayesian approach, this methodology was used to obtain corrected $p_{\rm T}$ spectra of pions, kaons, protons, and D$^0$ mesons in pp coll...
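The combination the abstract describes (a prior over particle species multiplied by per-detector likelihoods, then normalized) can be sketched as below. This is a generic illustration only: the detector names, response means, and widths are invented placeholders, not ALICE's actual parameterizations.

```python
import math

def gaussian(x, mu, sigma):
    """Gaussian likelihood of a detector signal x given expected (mu, sigma)."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def bayesian_pid(signals, expected, priors):
    """Posterior species probabilities: prior times product of per-detector
    likelihoods, normalized over the species hypotheses."""
    post = {}
    for species, prior in priors.items():
        like = 1.0
        for det, x in signals.items():
            mu, sigma = expected[det][species]
            like *= gaussian(x, mu, sigma)
        post[species] = prior * like
    total = sum(post.values())
    return {s: p / total for s, p in post.items()}

# Illustrative numbers only: dE/dx and time-of-flight responses for pi/K/p.
expected = {
    "dEdx": {"pi": (50.0, 5.0), "K": (60.0, 5.0), "p": (75.0, 5.0)},
    "tof":  {"pi": (1.00, 0.05), "K": (1.10, 0.05), "p": (1.25, 0.05)},
}
priors = {"pi": 0.7, "K": 0.2, "p": 0.1}
posterior = bayesian_pid({"dEdx": 58.0, "tof": 1.08}, expected, priors)
```

Even with a prior favouring pions, two detector signals near the kaon hypothesis pull the posterior toward the kaon.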
Discriminative Bayesian Dictionary Learning for Classification.
Akhtar, Naveed; Shafait, Faisal; Mian, Ajmal
2016-12-01
We propose a Bayesian approach to learn discriminative dictionaries for sparse representation of data. The proposed approach infers probability distributions over the atoms of a discriminative dictionary using a finite approximation of the Beta Process. It also computes sets of Bernoulli distributions that associate class labels to the learned dictionary atoms. This association signifies the selection probabilities of the dictionary atoms in the expansion of class-specific data. Furthermore, the non-parametric character of the proposed approach allows it to infer the correct size of the dictionary. We exploit the aforementioned Bernoulli distributions in separately learning a linear classifier. The classifier uses the same hierarchical Bayesian model as the dictionary, which we present along with the analytical inference solution for Gibbs sampling. For classification, a test instance is first sparsely encoded over the learned dictionary and the codes are fed to the classifier. We performed experiments for face and action recognition, and for object and scene-category classification, using five public datasets, and compared the results with state-of-the-art discriminative sparse representation approaches. Experiments show that the proposed Bayesian approach consistently outperforms the existing approaches.
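The encode-then-classify pipeline in the last steps of the abstract can be sketched with a simple greedy matching-pursuit encoder. This is a deliberate stand-in, not the paper's beta-process Gibbs sampler; the dictionary and signal here are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

def matching_pursuit(x, D, k):
    """Greedy sparse encoding over unit-norm dictionary atoms.
    (A stand-in for the paper's beta-process-sampled codes.)"""
    code = np.zeros(D.shape[1])
    r = x.copy()
    for _ in range(k):
        j = int(np.argmax(np.abs(D.T @ r)))   # most correlated atom
        a = float(D[:, j] @ r)                # projection coefficient
        code[j] += a
        r = r - a * D[:, j]                   # shrink the residual
    return code, r

# Random unit-norm dictionary; the signal is built from two of its atoms.
D = rng.normal(size=(20, 50))
D /= np.linalg.norm(D, axis=0)
x = 2.0 * D[:, 3] - 1.5 * D[:, 17]
code, resid = matching_pursuit(x, D, k=5)
# The sparse code would then be fed to a separately learned linear classifier.
```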
Bayesian Inference of a Multivariate Regression Model
Directory of Open Access Journals (Sweden)
Marick S. Sinay
2014-01-01
Full Text Available We explore Bayesian inference of a multivariate linear regression model with use of a flexible prior for the covariance structure. The commonly adopted Bayesian setup involves the conjugate prior: a multivariate normal distribution for the regression coefficients and an inverse Wishart specification for the covariance matrix. Here we depart from this approach and propose a novel Bayesian estimator for the covariance. A multivariate normal prior for the unique elements of the matrix logarithm of the covariance matrix is considered. Such a structure allows for a richer class of prior distributions for the covariance, with respect to the strength of beliefs in the prior location hyperparameters, as well as the added ability to model potential correlation within the covariance structure. The posterior moments of all relevant parameters of interest are calculated numerically via a Markov chain Monte Carlo procedure. The Metropolis-Hastings-within-Gibbs algorithm is invoked, with a proposal density constructed to closely match the shape of the target posterior distribution. As an application of the proposed technique, we investigate a multiple regression based upon the 1980 High School and Beyond Survey.
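The matrix-logarithm parameterization the abstract relies on can be illustrated with a round trip between a covariance matrix and its matrix log: any symmetric matrix maps back to a valid (symmetric positive-definite) covariance, so an unconstrained multivariate normal prior on the log's unique elements induces a prior on the covariance. This sketch shows only the reparameterization, not the paper's MCMC scheme.

```python
import numpy as np

def cov_to_log(S):
    """Matrix logarithm of a symmetric positive-definite covariance matrix,
    via its eigendecomposition."""
    w, V = np.linalg.eigh(S)
    return V @ np.diag(np.log(w)) @ V.T

def log_to_cov(A):
    """Inverse map: the matrix exponential of any symmetric matrix is SPD,
    hence always a valid covariance."""
    w, V = np.linalg.eigh(A)
    return V @ np.diag(np.exp(w)) @ V.T

S = np.array([[2.0, 0.5],
              [0.5, 1.0]])
A = cov_to_log(S)       # unconstrained symmetric matrix (prior lives here)
S_back = log_to_cov(A)  # recovers the covariance
```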
Bayesian posterior distributions without Markov chains.
Cole, Stephen R; Chu, Haitao; Greenland, Sander; Hamra, Ghassan; Richardson, David B
2012-03-01
Bayesian posterior parameter distributions are often simulated using Markov chain Monte Carlo (MCMC) methods. However, MCMC methods are not always necessary and do not help the uninitiated understand Bayesian inference. As a bridge to understanding Bayesian inference, the authors illustrate a transparent rejection sampling method. In example 1, they illustrate rejection sampling using 36 cases and 198 controls from a case-control study (1976-1983) assessing the relation between residential exposure to magnetic fields and the development of childhood cancer. Results from rejection sampling (odds ratio (OR) = 1.69, 95% posterior interval (PI): 0.57, 5.00) were similar to MCMC results (OR = 1.69, 95% PI: 0.58, 4.95) and approximations from data-augmentation priors (OR = 1.74, 95% PI: 0.60, 5.06). In example 2, the authors apply rejection sampling to a cohort study of 315 human immunodeficiency virus seroconverters (1984-1998) to assess the relation between viral load after infection and 5-year incidence of acquired immunodeficiency syndrome, adjusting for (continuous) age at seroconversion and race. In this more complex example, rejection sampling required a notably longer run time than MCMC sampling but remained feasible and again yielded similar results. The transparency of the proposed approach comes at the price of being less broadly applicable than MCMC.
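The transparent scheme the abstract advocates (draw from the prior, keep each draw with probability proportional to its likelihood) can be sketched on a toy binomial model; the counts below are illustrative, not the study's data.

```python
import math
import random

random.seed(1)

def rejection_sample(log_lik, sample_prior, log_lik_max, n_draws):
    """Draw from the posterior by accepting prior draws with
    probability L(theta) / L_max."""
    draws = []
    while len(draws) < n_draws:
        theta = sample_prior()
        if random.random() < math.exp(log_lik(theta) - log_lik_max):
            draws.append(theta)
    return draws

# Toy example: y exposed among n cases, uniform prior on exposure probability p.
y, n = 9, 36
log_lik = lambda p: y * math.log(p) + (n - y) * math.log(1.0 - p)
draws = rejection_sample(log_lik, random.random, log_lik(y / n), 2000)
post_mean = sum(draws) / len(draws)   # analytic posterior is Beta(10, 28)
```

The accepted draws match the closed-form Beta(10, 28) posterior (mean 10/38), which makes the method easy to verify before using it on models without closed forms.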
Bayesian methodology for reliability model acceptance
International Nuclear Information System (INIS)
Zhang Ruoxue; Mahadevan, Sankaran
2003-01-01
This paper develops a methodology to assess the validity of a reliability computation model using the concept of Bayesian hypothesis testing, by comparing model predictions with experimental observations when only one computational model is available to evaluate system behavior. Time-independent and time-dependent problems are investigated, both with and without statistical uncertainty in the model. The case of time-independent failure probability prediction with no statistical uncertainty is a straightforward application of Bayesian hypothesis testing. For the life prediction (time-dependent reliability) problem, however, a new methodology is developed in this paper to make the same Bayesian hypothesis testing concept applicable. When statistical uncertainty is present in the model, in addition to applying a predictor estimator of the Bayes factor, the uncertainty in the Bayes factor is explicitly quantified by treating it as a random variable and calculating the probability that it exceeds a specified value. The developed method provides decision-makers with a rational criterion for acceptance or rejection of the computational model.
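The Bayes-factor comparison at the heart of such a methodology can be illustrated with a closed-form toy: one Gaussian observation, a point null (the model's prediction is exact) against a Gaussian alternative. This is a generic sketch under invented names and parameters, not the paper's reliability setting.

```python
import math

def norm_pdf(z, m, s):
    """Density of a normal distribution with mean m and std s at z."""
    return math.exp(-0.5 * ((z - m) / s) ** 2) / (s * math.sqrt(2 * math.pi))

def bayes_factor_point_vs_alt(x, sigma, mu0, prior_sd):
    """Bayes factor for H0: mu = mu0 versus H1: mu ~ N(mu0, prior_sd^2),
    given one observation x ~ N(mu, sigma^2).  Under H1 the marginal
    likelihood is N(mu0, sigma^2 + prior_sd^2) in closed form."""
    return norm_pdf(x, mu0, sigma) / norm_pdf(x, mu0, math.hypot(sigma, prior_sd))

# Observation on top of the prediction supports H0 (BF > 1);
# an observation several sigma away supports the alternative (BF < 1).
bf_match = bayes_factor_point_vs_alt(0.0, 1.0, 0.0, 2.0)
bf_far = bayes_factor_point_vs_alt(4.0, 1.0, 0.0, 2.0)
```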
Risk-sensitivity in Bayesian sensorimotor integration.
Directory of Open Access Journals (Sweden)
Jordi Grau-Moya
Full Text Available Information processing in the nervous system during sensorimotor tasks with inherent uncertainty has been shown to be consistent with Bayesian integration. Bayes-optimal decision-makers are, however, risk-neutral in the sense that they weigh all possibilities based on prior expectation and sensory evidence and choose the action with the highest expected value. In contrast, risk-sensitive decision-makers are sensitive to model uncertainty and bias their decision-making processes when they do inference over unobserved variables. In particular, they allow deviations from their probabilistic model in cases where this model makes imprecise predictions. Here we test for risk-sensitivity in a sensorimotor integration task in which subjects exhibit Bayesian information integration when they infer the position of a target from noisy sensory feedback. When a cost was associated with subjects' responses, we found a characteristic bias towards low-cost responses when uncertainty was high. This result is in accordance with risk-sensitive decision-making processes that allow for deviations from Bayes-optimal decision-making in the face of uncertainty. Our results suggest that both Bayesian integration and risk-sensitivity are important factors for understanding sensorimotor integration in a quantitative fashion.
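The two ingredients the abstract contrasts can be sketched in a few lines: Bayes-optimal fusion of a Gaussian prior with a noisy Gaussian cue, plus a crude stand-in for risk-sensitivity that shifts the response toward the low-cost position by an amount growing with posterior uncertainty. The functional form and gain `k` are invented for illustration, not the paper's model.

```python
def posterior(prior_mu, prior_var, obs, obs_var):
    """Bayes-optimal (precision-weighted) fusion of a Gaussian prior
    with a noisy Gaussian observation."""
    prec = 1.0 / prior_var + 1.0 / obs_var
    mu = (prior_mu / prior_var + obs / obs_var) / prec
    return mu, 1.0 / prec

def risk_sensitive_response(post_mu, post_var, low_cost_pos, k):
    """Illustrative risk-sensitive bias: pull the response toward the
    low-cost position, more strongly when the posterior is uncertain."""
    return post_mu + k * post_var * (low_cost_pos - post_mu)

# Same cue at two noise levels: the relative bias toward the cheap
# response grows with sensory uncertainty.
mu1, v1 = posterior(0.0, 1.0, 2.0, 1.0)   # reliable feedback
mu2, v2 = posterior(0.0, 1.0, 2.0, 4.0)   # noisy feedback
r1 = risk_sensitive_response(mu1, v1, 0.0, 0.5)
r2 = risk_sensitive_response(mu2, v2, 0.0, 0.5)
```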
Bayesian outcome-based strategy classification.
Lee, Michael D
2016-03-01
Hilbig and Moshagen (Psychonomic Bulletin & Review, 21, 1431-1443, 2014) recently developed a method for making inferences about the decision processes people use in multi-attribute forced choice tasks. Their paper makes a number of worthwhile theoretical and methodological contributions. Theoretically, they provide an insightful psychological motivation for a probabilistic extension of the widely used "weighted additive" (WADD) model, and show how this model, as well as other important models like "take-the-best" (TTB), can and should be expressed in terms of meaningful priors. Methodologically, they develop an inference approach based on the Minimum Description Length (MDL) principle that balances both the goodness-of-fit and the complexity of the decision models they consider. This paper aims to preserve these useful contributions, but provides a complementary Bayesian approach with some theoretical and methodological advantages. We develop a simple graphical model, implemented in JAGS, that allows for fully Bayesian inferences about which models people use to make decisions. To demonstrate the Bayesian approach, we apply it to the models and data considered by Hilbig and Moshagen (Psychonomic Bulletin & Review, 21, 1431-1443, 2014), showing how a prior predictive analysis of the models, and posterior inferences about which models people use and the parameter settings at which they use them, can contribute to our understanding of human decision making.
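Bayesian strategy classification of this kind reduces, in its simplest form, to a posterior over candidate strategies given a sequence of binary choices, where each strategy predicts a choice per trial and executes it with some error rate. The sketch below uses invented predictions and an invented error rate, not the JAGS model or data from the paper.

```python
import math

def strategy_posterior(choices, preds_by_strategy, eps, prior):
    """Posterior over strategies: each strategy's prediction is followed
    with probability 1 - eps on every trial."""
    post = {}
    for s, preds in preds_by_strategy.items():
        ll = sum(math.log(1 - eps) if c == p else math.log(eps)
                 for c, p in zip(choices, preds))
        post[s] = prior[s] * math.exp(ll)
    z = sum(post.values())
    return {s: v / z for s, v in post.items()}

# Hypothetical per-trial predictions for TTB and WADD, and observed choices.
preds = {"TTB": [1, 1, 0, 1, 0, 1], "WADD": [1, 0, 0, 0, 1, 1]}
choices = [1, 1, 0, 1, 0, 1]          # matches TTB on every trial
post = strategy_posterior(choices, preds, eps=0.1,
                          prior={"TTB": 0.5, "WADD": 0.5})
```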
Bayesian Methods for Radiation Detection and Dosimetry
International Nuclear Information System (INIS)
Peter G. Groer
2002-01-01
We performed work in three areas: radiation detection, external radiation dosimetry, and internal radiation dosimetry. In radiation detection we developed Bayesian techniques to estimate the net activity of high and low activity radioactive samples. These techniques have the advantage that the remaining uncertainty about the net activity is described by probability densities. Graphs of the densities show the uncertainty in pictorial form. Figure 1 below demonstrates this point. We applied the theory of stochastic processes to obtain Bayesian estimates of 222Rn daughter products from observed counting rates. In external radiation dosimetry we studied and developed Bayesian methods to estimate radiation doses to an individual from radiation-induced chromosome aberrations. We analyzed chromosome aberrations after exposure to gammas and neutrons and developed a method for dose estimation after criticality accidents. The research in internal radiation dosimetry focused on parameter estimation for compartmental models from observed compartmental activities. From the estimated probability densities of the model parameters we were able to derive the densities of compartmental activities for a two-compartment catenary model at different times. We also calculated the average activities and their standard deviation for a simple two-compartment model.
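A Bayesian net-activity estimate of the kind described can be sketched as a grid posterior over the net count rate: gross counts are Poisson in the net-plus-background rate, with a flat prior restricted to non-negative net rates. The counts and times below are invented, and the background rate is treated as known for simplicity.

```python
import math

def net_rate_posterior(n_gross, t_gross, b_rate, grid):
    """Normalized posterior of the net count rate s over a uniform grid,
    with gross counts ~ Poisson((s + b) * t) and a flat prior on s >= 0."""
    logs = [n_gross * math.log((s + b_rate) * t_gross) - (s + b_rate) * t_gross
            for s in grid]
    m = max(logs)                       # stabilize before exponentiating
    w = [math.exp(v - m) for v in logs]
    z = sum(w)
    return [x / z for x in w]

# Toy measurement: 25 gross counts in 10 s against a 1.0 cps background.
grid = [i * 0.01 for i in range(501)]   # net rate from 0 to 5 cps
post = net_rate_posterior(25, 10.0, 1.0, grid)
post_mean = sum(s * p for s, p in zip(grid, post))
```

The resulting density is exactly the kind of graph the abstract mentions: the whole remaining uncertainty about the net rate in pictorial form, rather than a single point estimate with an error bar.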
International Nuclear Information System (INIS)
Kunkler, I.H.; Price, A.; Dixon, M.; Canney, P.; Prescott, R.; Sainsbury, R.; Aird, E.
2003-01-01
Danish and Canadian randomised trials of postmastectomy radiotherapy (PMRT) have shown the importance of loco-regional control to survival in 'high risk' pre- and postmenopausal women receiving adjuvant systemic therapy. The effects of radiotherapy (RT) in terms of improving survival are similar to those of systemic therapy. International consensus now supports the use of postmastectomy chest wall irradiation in women with 4 or more involved axillary nodes or a primary tumour size ≥ 5 cm. The role of PMRT in women at 'intermediate risk' with 1-3 involved nodes, or node negative with other risk factors, is controversial. The absolute reduction in risk of loco-regional recurrence varies widely (3-23%) in trials of PMRT in women with 1-3 involved nodes receiving systemic therapy. A UK survey of clinical oncologists (Kunkler et al, The Breast 1999;8:235) showed wide variations in opinion on the use of radiotherapy in these subgroups. It is possible that while RT may confer most benefit in loco-regional control, a greater survival benefit might accrue in patients with smaller tumours and fewer involved nodes. The 2000 Oxford overview of randomised trials of postoperative RT identifies non-breast-cancer deaths from RT-related vascular morbidity as counterbalancing the benefits of RT in reducing breast cancer mortality. With the more extensive use of potentially cardiotoxic anthracycline-containing adjuvant systemic therapy, there are concerns about greater cardiac morbidity in patients receiving PMRT in addition. A large randomised international trial (SUPREMO) is proposed to recruit 3500 patients with (a) 1-3 involved axillary nodes or (b) node negative disease with other risk factors (grade 3 or lymphovascular invasion), treated by mastectomy, axillary clearance and appropriate systemic therapy for T0-3, N0-1, M0 breast cancer. The primary endpoint is overall survival. Secondary endpoints are disease-free survival, quality of life, morbidity (including cardiac), and cost per life year saved.
Non-parametric Bayesian models of response function in dynamic image sequences
Czech Academy of Sciences Publication Activity Database
Tichý, Ondřej; Šmídl, Václav
2016-01-01
Vol. 151, No. 1 (2016), pp. 90-100. ISSN 1077-3142. R&D Projects: GA ČR GA13-29225S. Institutional support: RVO:67985556. Keywords: Response function * Blind source separation * Dynamic medical imaging * Probabilistic models * Bayesian methods. Subject RIV: BB - Applied Statistics, Operational Research. Impact factor: 2.498, year: 2016. http://library.utia.cas.cz/separaty/2016/AS/tichy-0456983.pdf