WorldWideScience

Sample records for bayesian receiver operating

  1. Bayesian receiver operating characteristic estimation of multiple tests for diagnosis of bovine tuberculosis in Chadian cattle.

    Directory of Open Access Journals (Sweden)

    Borna Müller

    Full Text Available BACKGROUND: Bovine tuberculosis (BTB) today primarily affects developing countries. In Africa, the disease is present essentially across the whole continent; however, little accurate information on its distribution and prevalence is available. Attempts to evaluate diagnostic tests for BTB in naturally infected cattle are also scarce and mostly complicated by the absence of knowledge of the true disease status of the tested animals. However, diagnostic test evaluation in a given setting is a prerequisite for the implementation of local surveillance schemes and control measures. METHODOLOGY/PRINCIPAL FINDINGS: We subjected a slaughterhouse population of 954 Chadian cattle to single intra-dermal comparative cervical tuberculin (SICCT) testing and two recently developed fluorescence polarization assays (FPA). Using a Bayesian modeling approach, we computed the receiver operating characteristic (ROC) curve of each diagnostic test, the true disease prevalence in the sampled population, and the disease status of all sampled animals, without knowledge of the animals' true disease status. In our Chadian setting, SICCT performed better when the cut-off for positive test interpretation was lowered from >4 mm (the OIE standard cut-off) to >2 mm. Using this cut-off, SICCT showed a sensitivity and specificity of 66% and 89%, respectively. Both FPA tests showed sensitivities below 50% but specificities above 90%. The true disease prevalence was estimated at 8%. Altogether, 11% of the sampled animals showed gross visible tuberculous lesions. However, modeling of the BTB disease status of the sampled animals indicated that 72% of the suspected tuberculosis lesions detected during standard meat inspections were caused by pathogens other than Mycobacterium bovis. CONCLUSIONS/SIGNIFICANCE: Our results have important implications for BTB diagnosis in a high-incidence sub-Saharan African setting and demonstrate the practicability of our Bayesian approach for
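The kind of estimate this record describes can be illustrated with a much simpler single-test sketch: a grid-approximation posterior for the true prevalence given an imperfect test. The sensitivity/specificity values below are the abstract's reported SICCT figures (66%/89%); the count of test positives is invented for illustration, and the paper's actual model handles multiple tests jointly.

```python
import math

def log_binom_pmf(n, k, p):
    """Log of the binomial pmf via lgamma (avoids overflow for n = 954)."""
    return (math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
            + k * math.log(p) + (n - k) * math.log(1 - p))

def prevalence_posterior(n_tested, n_positive, se, sp, grid_size=1000):
    """Grid posterior over true prevalence with a uniform prior.

    An animal tests positive with probability pi*Se + (1 - pi)*(1 - Sp),
    where pi is the true prevalence."""
    grid = [(i + 0.5) / grid_size for i in range(grid_size)]
    logpost = [log_binom_pmf(n_tested, n_positive,
                             pi * se + (1 - pi) * (1 - sp)) for pi in grid]
    m = max(logpost)                      # stabilize before exponentiating
    w = [math.exp(lp - m) for lp in logpost]
    z = sum(w)
    return grid, [x / z for x in w]

# Se/Sp from the abstract; 160/954 positives is an invented, plausible count.
grid, post = prevalence_posterior(954, 160, se=0.66, sp=0.89)
mean_prev = sum(pi * w for pi, w in zip(grid, post))
```

With these inputs the posterior mean lands near the apparent prevalence corrected for test error, in the spirit of the Rogan-Gladen adjustment.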

  2. Predictors of Outcome in Traumatic Brain Injury: New Insight Using Receiver Operating Curve Indices and Bayesian Network Analysis.

    Directory of Open Access Journals (Sweden)

    Zsolt Zador

    Full Text Available Traumatic brain injury remains a global health problem. Understanding the relative importance of outcome predictors helps optimize our treatment strategies by informing assessment protocols, clinical decisions and trial designs. In this study we establish an importance ranking for outcome predictors based on receiver operating indices to identify key predictors of outcome and create simple predictive models. We then explore the associations between key outcome predictors using Bayesian networks to gain further insight into predictor importance. We analyzed the corticosteroid randomization after significant head injury (CRASH) trial database of 10,008 patients and included the 6,945 patients for whom demographics, injury characteristics, computed tomography (CT) findings and Glasgow Coma Scale (GCS) were recorded (a total of 13 predictors, all available to clinicians within a few hours following the injury). Predictions of clinical outcome (death or severe disability at 6 months) were performed using logistic regression models with 5-fold cross-validation. Predictive performance was measured using the standardized partial area (pAUC) under the receiver operating characteristic (ROC) curve, and we used DeLong's test for comparisons. Variable importance ranking was based on pAUC targeted at specificity (pAUCSP) and sensitivity (pAUCSE) intervals of 90-100%. Probabilistic associations were depicted using Bayesian networks. Complete AUC analysis showed very good predictive power (AUC = 0.8237, 95% CI: 0.8138-0.8336) for the complete model. Specificity-focused importance ranking highlighted age, pupillary and motor responses, obliteration of basal cisterns/3rd ventricle, and midline shift. Interestingly, when targeting model sensitivity, the highest-ranking variables were age, severe extracranial injury, verbal response, hematoma on CT and motor response. Simplified models, which included only these key predictors, had similar performance (pAUCSP = 0.6523, 95% CI: 0
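The pAUC used in this record restricts the ROC integral to a clinically relevant band. A minimal, self-contained sketch of a specificity-focused pAUC (false-positive rate in [0, 0.1], i.e. specificity 90-100%) follows; the scores and labels are synthetic and nothing below touches the CRASH data.

```python
def roc_points(scores, labels):
    """ROC (FPR, TPR) points from scores (higher = more positive).

    Assumes no tied scores, for brevity."""
    pairs = sorted(zip(scores, labels), reverse=True)
    pos = sum(labels)
    neg = len(labels) - pos
    tp = fp = 0
    pts = [(0.0, 0.0)]
    for _, y in pairs:
        if y:
            tp += 1
        else:
            fp += 1
        pts.append((fp / neg, tp / pos))
    return pts

def partial_auc(pts, fpr_max=0.1):
    """Trapezoidal area under the ROC curve for FPR in [0, fpr_max]."""
    area = 0.0
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if x0 >= fpr_max:
            break
        x1c = min(x1, fpr_max)
        # Linear interpolation of TPR at the clipped FPR.
        y1c = y0 + (y1 - y0) * ((x1c - x0) / (x1 - x0)) if x1 > x0 else y1
        area += (x1c - x0) * (y0 + y1c) / 2.0
    return area

scores = [0.9, 0.8, 0.75, 0.6, 0.55, 0.4, 0.3, 0.2]
labels = [1, 1, 0, 1, 0, 0, 1, 0]
pauc = partial_auc(roc_points(scores, labels))   # at most fpr_max = 0.1
```

The study additionally standardizes the pAUC so that models over different bands are comparable; that step is omitted here.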

  3. Unknown Quantum States and Operations, a Bayesian View

    CERN Document Server

    Fuchs, C; Fuchs, Christopher A.; Schack, Ruediger

    2004-01-01

    The classical de Finetti theorem provides an operational definition of the concept of an unknown probability in Bayesian probability theory, where probabilities are taken to be degrees of belief instead of objective states of nature. In this paper, we motivate and review two results that generalize de Finetti's theorem to the quantum mechanical setting: namely, a de Finetti theorem for quantum states and a de Finetti theorem for quantum operations. The quantum-state theorem, in a closely analogous fashion to the original de Finetti theorem, deals with exchangeable density-operator assignments and provides an operational definition of the concept of an "unknown quantum state" in quantum-state tomography. Similarly, the quantum-operation theorem gives an operational definition of an "unknown quantum operation" in quantum-process tomography. These results are especially important for a Bayesian interpretation of quantum mechanics, where quantum states and (at least some) quantum operations are taken to be states ...

  4. An Efficient Two-Fold Marginalized Bayesian Filter for Multipath Estimation in Satellite Navigation Receivers

    Directory of Open Access Journals (Sweden)

    Robertson Patrick

    2010-01-01

    Full Text Available Multipath is today still one of the most critical problems in satellite navigation, in particular in urban environments, where the received navigation signals can be affected by blockage, shadowing, and multipath reception. Latest multipath mitigation algorithms are based on the concept of sequential Bayesian estimation and improve the receiver performance by exploiting the temporal constraints of the channel dynamics. In this paper, we specifically address the problem of estimating and adjusting the number of multipath replicas that is considered by the receiver algorithm. An efficient implementation via a two-fold marginalized Bayesian filter is presented, in which a particle filter, grid-based filters, and Kalman filters are suitably combined in order to mitigate the multipath channel by efficiently estimating its time-variant parameters in a track-before-detect fashion. Results based on an experimentally derived set of channel data corresponding to a typical urban propagation environment are used to confirm the benefit of our novel approach.
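The two-fold marginalized filter in this record combines a particle filter, grid-based filters, and Kalman filters. As a hedged illustration of just the grid-based ingredient, the sketch below runs a discrete Bayes filter over the number of active multipath replicas; the transition matrix and per-step likelihoods are invented values, not channel-derived ones.

```python
def bayes_filter_step(belief, transition, likelihood):
    """One predict/update cycle of a discrete (grid-based) Bayes filter.

    belief:     P(N) over replica counts
    transition: P(N' | N) matrix for the channel dynamics
    likelihood: P(observation | N') for the current observation"""
    n = len(belief)
    # Predict: push the belief through the transition model.
    pred = [sum(belief[i] * transition[i][j] for i in range(n))
            for j in range(n)]
    # Update: weight by the observation likelihood and renormalize.
    upd = [pred[j] * likelihood[j] for j in range(n)]
    z = sum(upd)
    return [u / z for u in upd]

belief = [1 / 3] * 3                        # 0, 1, or 2 extra replicas
transition = [[0.80, 0.15, 0.05],
              [0.10, 0.80, 0.10],
              [0.05, 0.15, 0.80]]
# Two invented observation likelihoods, both favoring one replica.
for lik in ([0.2, 0.5, 0.3], [0.1, 0.7, 0.2]):
    belief = bayes_filter_step(belief, transition, lik)
```

In the paper this discrete filter is only one layer: continuous path parameters are handled by Kalman filters conditioned on the discrete hypotheses.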

  5. Operational modal analysis modeling, Bayesian inference, uncertainty laws

    CERN Document Server

    Au, Siu-Kui

    2017-01-01

    This book presents operational modal analysis (OMA), employing a coherent and comprehensive Bayesian framework for modal identification and covering stochastic modeling, theoretical formulations, computational algorithms, and practical applications. Mathematical similarities and philosophical differences between Bayesian and classical statistical approaches to system identification are discussed, allowing their mathematical tools to be shared and their results correctly interpreted. Many chapters can be used as lecture notes for the general topic they cover beyond the OMA context. After an introductory chapter (1), Chapters 2–7 present the general theory of stochastic modeling and analysis of ambient vibrations. Readers are first introduced to the spectral analysis of deterministic time series (2) and structural dynamics (3), which do not require the use of probability concepts. The concepts and techniques in these chapters are subsequently extended to a probabilistic context in Chapter 4 (on stochastic pro...

  6. Bayesian Recovery of Clipped OFDM Signals: A Receiver-based Approach

    KAUST Repository

    Al-Rabah, Abdullatif R.

    2013-05-01

    Recently, orthogonal frequency-division multiplexing (OFDM) has been adopted for high-speed wireless communications due to its robustness against multipath fading. However, one of the main fundamental drawbacks of OFDM systems is the high peak-to-average-power ratio (PAPR). Several techniques have been proposed for PAPR reduction. Most of these techniques require transmitter-based (pre-compensated) processing. Receiver-based alternatives, on the other hand, would save power and reduce transmitter complexity. With this in mind, a possible approach is to limit the amplitude of the OFDM signal to a predetermined threshold, which is equivalent to adding a sparse clipping signal; this clipping signal is then estimated at the receiver to recover the original signal. In this work, we propose a Bayesian receiver-based low-complexity clipping-signal recovery method for PAPR reduction. The method i) effectively reduces the PAPR via a simple clipping scheme at the transmitter side, ii) uses a Bayesian recovery algorithm to reconstruct the clipping signal at the receiver side by measuring part of the subcarriers, iii) performs well in the absence of statistical information about the signal (e.g. clipping level) and the noise (e.g. noise variance), and iv) is energy efficient due to its low complexity. Specifically, the proposed recovery technique is implemented in a data-aided fashion: it collects clipping information by measuring reliable data subcarriers, thus making full use of the spectrum for data transmission without the need for tone reservation. The study is extended further to discuss how the recovery of the clipping signal can be improved using features of practical OFDM systems, i.e., oversampling and the presence of multiple receivers. Simulation results demonstrate the superiority of the proposed technique over other recovery algorithms. The overall objective is to show that the receiver-based Bayesian technique is highly
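The clipping step described in this record is easy to sketch: limiting the time-domain amplitude to a threshold while preserving phase is equivalent to adding a sparse clipping signal c = x_clipped - x. The toy below uses complex Gaussian samples as a stand-in for an OFDM time-domain signal; the threshold and the signal model are illustrative assumptions, and no recovery algorithm is attempted.

```python
import random

random.seed(0)
# Stand-in for an OFDM time-domain signal (complex Gaussian samples).
x = [complex(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(256)]

def clip(signal, threshold):
    """Amplitude clipping that preserves phase (soft limiter)."""
    out = []
    for s in signal:
        a = abs(s)
        out.append(s if a <= threshold else s * (threshold / a))
    return out

xc = clip(x, threshold=2.0)
# The clipping signal is sparse: nonzero only where the amplitude exceeded
# the threshold, which is what makes sparse Bayesian recovery applicable.
c = [a - b for a, b in zip(xc, x)]
n_clipped = sum(1 for v in c if abs(v) > 1e-12)
```

Only a small fraction of samples exceed the threshold, so c has few nonzero entries relative to the signal length.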

  7. OFDM receiver for fast time-varying channels using block-sparse Bayesian learning

    DEFF Research Database (Denmark)

    Barbu, Oana-Elena; Manchón, Carles Navarro; Rom, Christian;

    2016-01-01

    We propose an iterative algorithm for OFDM receivers operating over fast time-varying channels. The design relies on the assumptions that the channel response can be characterized by a few non-negligible separable multipath components, and the temporal variation of each component gain can be well...... inference, we embed the channel estimator in a receiver structure that performs iterative channel and noise precision estimation, intercarrier interference cancellation, detection and decoding. Simulation results illustrate the superior performance of the proposed receiver over state-of-the-art receivers....

  8. Extension of Boolean algebra by a Bayesian operator: application to the definition of a Deterministic Bayesian Logic

    CERN Document Server

    Dambreville, Frederic

    2011-01-01

    This work contributes to the domains of Boolean algebra and of Bayesian probability, by proposing an algebraic extension of Boolean algebras, which implements an operator for the Bayesian conditional inference and is closed under this operator. It is known since the work of Lewis (Lewis' triviality) that it is not possible to construct such conditional operator within the space of events. Nevertheless, this work proposes an answer which complements Lewis' triviality, by the construction of a conditional operator outside the space of events, thus resulting in an algebraic extension. In particular, it is proved that any probability defined on a Boolean algebra may be extended to its algebraic extension in compliance with the multiplicative definition of the conditional probability. In the last part of this paper, a new \\emph{bivalent} logic is introduced on the basis of this algebraic extension, and basic properties are derived.

  9. Receiver-based recovery of clipped OFDM signals for PAPR reduction: A Bayesian approach

    KAUST Repository

    Ali, Anum

    2014-01-01

    Clipping is one of the simplest peak-to-average power ratio reduction schemes for orthogonal frequency division multiplexing (OFDM). Deliberately clipping the transmission signal degrades system performance, and clipping mitigation is required at the receiver for information restoration. In this paper, we acknowledge the sparse nature of the clipping signal and propose a low-complexity Bayesian clipping estimation scheme. The proposed scheme utilizes a priori information about the sparsity rate and noise variance for enhanced recovery. At the same time, the proposed scheme is robust against inaccurate estimates of the clipping signal statistics. The undistorted phase property of the clipped signal, as well as the clipping likelihood, is utilized for enhanced reconstruction. Furthermore, motivated by the nature of modern OFDM-based communication systems, we extend our clipping reconstruction approach to multiple antenna receivers and multi-user OFDM. We also address the problem of channel estimation from pilots contaminated by the clipping distortion. Numerical findings are presented that depict favorable results for the proposed scheme compared to the established sparse reconstruction schemes.

  10. Application of Bayesian networks in quantitative risk assessment of subsea blowout preventer operations.

    Science.gov (United States)

    Cai, Baoping; Liu, Yonghong; Liu, Zengkai; Tian, Xiaojie; Zhang, Yanzhen; Ji, Renjie

    2013-07-01

    This article proposes a methodology for the application of Bayesian networks in conducting quantitative risk assessment of operations in offshore oil and gas industry. The method involves translating a flow chart of operations into the Bayesian network directly. The proposed methodology consists of five steps. First, the flow chart is translated into a Bayesian network. Second, the influencing factors of the network nodes are classified. Third, the Bayesian network for each factor is established. Fourth, the entire Bayesian network model is established. Lastly, the Bayesian network model is analyzed. Subsequently, five categories of influencing factors, namely, human, hardware, software, mechanical, and hydraulic, are modeled and then added to the main Bayesian network. The methodology is demonstrated through the evaluation of a case study that shows the probability of failure on demand in closing subsea ram blowout preventer operations. The results show that mechanical and hydraulic factors have the most important effects on operation safety. Software and hardware factors have almost no influence, whereas human factors are in between. The results of the sensitivity analysis agree with the findings of the quantitative analysis. The three-axiom-based analysis partially validates the correctness and rationality of the proposed Bayesian network model.
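As a hedged stand-in for the article's full network, the sketch below combines the five factor categories named in the abstract through a simple OR gate under an independence assumption. All probabilities are invented, and the real model uses conditional probability tables and flow-chart structure rather than this closed form.

```python
# Invented P(factor causes failure) for the five categories in the abstract.
p_fail = {
    "human": 0.02,
    "hardware": 0.001,
    "software": 0.001,
    "mechanical": 0.05,
    "hydraulic": 0.04,
}

def p_top_event(p_fail):
    """P(at least one factor causes failure), assuming independent factors
    feeding an OR gate: 1 - prod(1 - p_i)."""
    p_ok = 1.0
    for p in p_fail.values():
        p_ok *= (1.0 - p)
    return 1.0 - p_ok

p = p_top_event(p_fail)
```

Even this toy reproduces the abstract's qualitative ranking: the mechanical and hydraulic terms dominate the top-event probability, while software and hardware contribute almost nothing.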

  11. Bayesian analysis of risk associated with workplace accidents in earthmoving operations

    Directory of Open Access Journals (Sweden)

    J. F. García

    2017-06-01

    Full Text Available This paper analyses the characteristics of earthmoving operations involving a workplace accident. Bayesian networks were used to identify the factors that best predicted potential risk situations. Inference studies were then conducted to analyse the interplay between different risk factors. We demonstrate the potential of Bayesian networks to describe workplace contexts and predict risk situations from a safety and production planning perspective.

  13. Insights on the Bayesian spectral density method for operational modal analysis

    Science.gov (United States)

    Au, Siu-Kui

    2016-01-01

    This paper presents a study on the Bayesian spectral density method for operational modal analysis. The method makes Bayesian inference of the modal properties by using the sample power spectral density (PSD) matrix averaged over independent sets of ambient data. In the typical case of a single data set, the record is divided into non-overlapping segments that are assumed to be independent. This study is motivated by a recent paper that reveals a mathematical equivalence of the method with the Bayesian FFT method. The latter does not require averaging concepts or the independent segment assumption. This study shows that the equivalence does not hold in reality because the theoretical long data asymptotic distribution of the PSD matrix may not be valid. A single time history can be considered long for the Bayesian FFT method but not necessarily for the Bayesian PSD method, depending on the number of segments.
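The averaging the method relies on can be sketched directly: split one record into non-overlapping segments, take each segment's periodogram, and average to form the sample PSD whose segment-independence assumption the paper questions. A naive DFT keeps the sketch dependency-free; a real implementation would use an FFT.

```python
import cmath
import math
import random

def periodogram(seg):
    """Naive-DFT periodogram |X_k|^2 / n of one segment."""
    n = len(seg)
    out = []
    for k in range(n):
        s = sum(seg[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n))
        out.append(abs(s) ** 2 / n)
    return out

def averaged_psd(x, seg_len):
    """Average periodograms over non-overlapping segments of one record."""
    segs = [x[i:i + seg_len] for i in range(0, len(x) - seg_len + 1, seg_len)]
    acc = [0.0] * seg_len
    for seg in segs:
        for k, v in enumerate(periodogram(seg)):
            acc[k] += v
    return [a / len(segs) for a in acc]

random.seed(1)
x = [random.gauss(0, 1) for _ in range(256)]   # stand-in ambient record
psd = averaged_psd(x, seg_len=32)
```

For white noise of unit variance the averaged PSD is flat with mean near one; by Parseval's identity, the sum of each periodogram equals the segment's sum of squares.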

  14. Risk-Based Operation and Maintenance Using Bayesian Networks

    DEFF Research Database (Denmark)

    Nielsen, Jannie Jessen; Sørensen, John Dalsgaard

    2011-01-01

    This paper describes how risk-based decision making can be used for maintenance planning of components exposed to degradation such as fatigue in offshore wind turbines. In fatigue models, large epistemic uncertainties are usually present. These can be reduced if monitoring results are used to upd...... to update the models, and hereby a better basis for decision making is obtained. An application example shows how a Bayesian network model can be used as a tool for updating the model and assist in risk-based decision making....

  15. Dynamic Bayesian modeling for risk prediction in credit operations

    DEFF Research Database (Denmark)

    Borchani, Hanen; Martinez, Ana Maria; Masegosa, Andres

    2015-01-01

    Our goal is to do risk prediction in credit operations, and as data is collected continuously and reported on a monthly basis, this gives rise to a streaming data classification problem. Our analysis reveals some practical problems that have not previously been thoroughly analyzed in the context...... of streaming data analysis: the class labels are not immediately available and the relevant predictive features and entities under study (in this case the set of customers) may vary over time. In order to address these problems, we propose to use a dynamic classifier with a wrapper feature subset selection...

  16. Planetary micro-rover operations on Mars using a Bayesian framework for inference and control

    Science.gov (United States)

    Post, Mark A.; Li, Junquan; Quine, Brendan M.

    2016-03-01

    With the recent progress toward the application of commercially-available hardware to small-scale space missions, it is now becoming feasible for groups of small, efficient robots based on low-power embedded hardware to perform simple tasks on other planets in the place of large-scale, heavy and expensive robots. In this paper, we describe design and programming of the Beaver micro-rover developed for Northern Light, a Canadian initiative to send a small lander and rover to Mars to study the Martian surface and subsurface. For a small, hardware-limited rover to handle an uncertain and mostly unknown environment without constant management by human operators, we use a Bayesian network of discrete random variables as an abstraction of expert knowledge about the rover and its environment, and inference operations for control. A framework for efficient construction and inference into a Bayesian network using only the C language and fixed-point mathematics on embedded hardware has been developed for the Beaver to make intelligent decisions with minimal sensor data. We study the performance of the Beaver as it probabilistically maps a simple outdoor environment with sensor models that include uncertainty. Results indicate that the Beaver and other small and simple robotic platforms can make use of a Bayesian network to make intelligent decisions in uncertain planetary environments.

  17. Using GOMS and Bayesian plan recognition to develop recognition models of operator behavior

    Science.gov (United States)

    Zaientz, Jack D.; DeKoven, Elyon; Piegdon, Nicholas; Wood, Scott D.; Huber, Marcus J.

    2006-05-01

    Trends in combat technology research point to an increasing role for uninhabited vehicles in modern warfare tactics. To support an increased span of control over these vehicles, human responsibilities need to be transformed from tedious, error-prone and cognition-intensive operations into tasks that are more supervisory and manageable, even under intensely stressful conditions. The goal is to move away from only supporting human command of low-level system functions to intention-level human-system dialogue about the operator's tasks and situation. A critical element of this process is developing the means to identify when human operators need automated assistance and to identify what assistance they need. Toward this goal, we are developing an unmanned vehicle operator task recognition system that combines work in human behavior modeling and Bayesian plan recognition. Traditionally, human behavior models have been considered generative, meaning they describe all possible valid behaviors. Basing behavior recognition on models designed for behavior generation can offer advantages in improved model fidelity and reuse. It is not clear, however, how to reconcile the structural differences between behavior recognition and behavior modeling approaches. Our current work demonstrates that by pairing a cognitive psychology derived human behavior modeling approach, GOMS, with a Bayesian plan recognition engine, ASPRN, we can translate a behavior generation model into a recognition model. We will discuss the implications for using human performance models in this manner as well as suggest how this kind of modeling may be used to support the real-time control of multiple, uninhabited battlefield vehicles and other semi-autonomous systems.

  18. Bayesian optimization analysis of containment-venting operation in a boiling water reactor severe accident

    Energy Technology Data Exchange (ETDEWEB)

    Zheng, Xiaoyu; Ishikawa, Jun; Sugiyama, Tomoyuki; Maruyama, Yu [Nuclear Safety Research Center, Japan Atomic Energy Agency, Ibaraki (Japan)

    2017-03-15

    Containment venting is one of several essential measures to protect the integrity of the final barrier of a nuclear reactor during severe accidents, by which the uncontrollable release of fission products can be avoided. The authors seek to develop an optimization approach to venting operations, from a simulation-based perspective, using an integrated severe accident code, THALES2/KICHE. The effectiveness of the containment-venting strategies needs to be verified via numerical simulations based on various settings of the venting conditions. The number of iterations, however, needs to be controlled to avoid cumbersome computational burden of integrated codes. Bayesian optimization is an efficient global optimization approach. By using a Gaussian process regression, a surrogate model of the “black-box” code is constructed. It can be updated simultaneously whenever new simulation results are acquired. With predictions via the surrogate model, upcoming locations of the most probable optimum can be revealed. The sampling procedure is adaptive. Compared with the case of pure random searches, the number of code queries is largely reduced for the optimum finding. One typical severe accident scenario of a boiling water reactor is chosen as an example. The research demonstrates the applicability of the Bayesian optimization approach to the design and establishment of containment-venting strategies during severe accidents.

  19. Partial inversion of elliptic operator to speed up computation of likelihood in Bayesian inference

    KAUST Repository

    Litvinenko, Alexander

    2017-08-09

    In this paper, we speed up the solution of inverse problems in Bayesian settings. By computing the likelihood, the most expensive part of the Bayesian formula, one compares the available measurement data with the simulated data. To get simulated data, repeated solution of the forward problem is required. This can be a great challenge. Often, the available measurement is a functional $F(u)$ of the solution $u$ or a small part of $u$. Typical examples of $F(u)$ are the solution at a point, the solution on a coarser grid, the solution in a small subdomain, or the mean value in a subdomain. It is a waste of computational resources to evaluate, first, the whole solution and then compute a part of it. In this work, we compute the functional $F(u)$ directly, without computing the full inverse operator and without computing the whole solution $u$. The main ingredients of the developed approach are the hierarchical domain decomposition technique, the finite element method and the Schur complements. To speed up computations and to reduce the storage cost, we approximate the forward operator and the Schur complement in the hierarchical matrix format. Applying the hierarchical matrix technique, we reduced the computing cost to $\mathcal{O}(k^2 n \log^2 n)$, where $k \ll n$ and $n$ is the number of degrees of freedom. Up to the $\mathcal{H}$-matrix accuracy, the computation of the functional $F(u)$ is exact. To reduce the computational resources further, we can approximate $F(u)$ on, for instance, multiple coarse meshes. The proposed method is well suited for solving multiscale problems. A disadvantage of this method is the assumption that one has to have access to the discretisation and to the procedure of assembling the Galerkin matrix.
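The "compute F(u) without the whole solution" idea rests on the Schur complement: with a 2x2 block partition of A, the components of interest u2 alone satisfy S u2 = b2 - A21 A11^{-1} b1 with S = A22 - A21 A11^{-1} A12. The toy below verifies this identity on a small invented dense system; the paper's contribution is doing the same with hierarchical matrices and domain decomposition, which this sketch does not attempt.

```python
def solve(a, b):
    """Gaussian elimination with partial pivoting (small dense systems)."""
    n = len(a)
    m = [row[:] + [bi] for row, bi in zip(a, b)]   # augmented copy
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(m[r][c]))
        m[c], m[p] = m[p], m[c]
        for r in range(c + 1, n):
            f = m[r][c] / m[c][c]
            for k in range(c, n + 1):
                m[r][k] -= f * m[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][k] * x[k]
                              for k in range(r + 1, n))) / m[r][r]
    return x

# Invented 4x4 system, partitioned into 2x2 blocks; u2 is the wanted part.
A11 = [[4.0, 1.0], [1.0, 3.0]]
A12 = [[1.0, 0.0], [0.0, 1.0]]
A21 = [[1.0, 0.0], [0.0, 1.0]]
A22 = [[5.0, 2.0], [2.0, 6.0]]
b1, b2 = [1.0, 2.0], [3.0, 4.0]

# Columns of A11^{-1} A12 and the vector A11^{-1} b1.
inv_cols = [solve(A11, [A12[0][j], A12[1][j]]) for j in range(2)]
inv_b1 = solve(A11, b1)

# Schur complement S = A22 - A21 A11^{-1} A12 and reduced right-hand side.
S = [[A22[i][j] - sum(A21[i][k] * inv_cols[j][k] for k in range(2))
      for j in range(2)] for i in range(2)]
rhs = [b2[i] - sum(A21[i][k] * inv_b1[k] for k in range(2)) for i in range(2)]
u2 = solve(S, rhs)   # only the wanted part of the solution is ever formed
```

Only small subsystems are solved; u1 is never computed, which is exactly the saving the paper scales up with $\mathcal{H}$-matrix arithmetic.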

  20. 47 CFR 25.220 - Non-conforming transmit/receive earth station operations.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 2 2010-10-01 2010-10-01 false Non-conforming transmit/receive earth station... CARRIER SERVICES SATELLITE COMMUNICATIONS Technical Standards § 25.220 Non-conforming transmit/receive... operator acknowledging that the proposed operation of the subject non-conforming earth station with...

  1. Experiments for Evaluating Application of Bayesian Inference to Situation Awareness of Human Operators in NPPs

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Seongkeun; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of)

    2014-05-15

    The purpose of this paper is to confirm whether Bayesian inference properly reflects the situation awareness of real human operators, to find the differences between ideal and practical operators, and to investigate the factors that contribute to those differences. The results show that humans do not think like computers. If humans could memorize all the information and process it as a CPU does, the success rates in both experiments would exceed 99%. In practice, however, the probability of identifying the correct malfunction was only 64.52% in the simple experiment and 51.61% in the complex experiment. Cognition is the mental processing that includes the attention of working memory, comprehending and producing language, calculating, reasoning, problem solving, and decision making. There are many reasons why human thinking differs from computation, but in these experiments we suggest that working memory is the most important factor. Human working memory is limited to about seven chunks, the so-called magical number. When more than seven sequential pieces of information arrive, people begin to forget earlier information because their working-memory capacity is exceeded. The simple experiment shows how strongly working memory affects the result. What happens if the effect of working memory is neglected? Seven subjects (3, 5, 6, 7, 8, 15, 25) had incorrect memory; they could have found the correct malfunction had their memory not been corrupted by the working-memory limit. The probability of finding the correct malfunction would then increase from 64.52% to 87.10%. The complex experiment yields a similar result: eight subjects (1, 5, 8, 9, 15, 17, 18, 30) had corrupted memory, which affected their ability to find the correct malfunction. Accounting for this, the probability would be (16+8)/31 = 77.42%.
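The percentages in this record follow from simple counts over 31 subjects per experiment; the sketch below reproduces that arithmetic. The raw counts 20/31 and 16/31 are back-calculated from the quoted 64.52% and 51.61%.

```python
subjects = 31

simple_correct = 20        # 20/31 = 64.52% found the malfunction
complex_correct = 16       # 16/31 = 51.61%

# Subjects whose recall failed under the working-memory limit:
# 7 in the simple experiment, 8 in the complex experiment.
simple_adjusted = (simple_correct + 7) / subjects     # -> 87.10%
complex_adjusted = (complex_correct + 8) / subjects   # -> 77.42%
```

The adjusted figures match the abstract's 87.10% and (16+8)/31 = 77.42%.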

  2. Epistemic-based investigation of the probability of hazard scenarios using Bayesian network for the lifting operation of floating objects

    Science.gov (United States)

    Toroody, Ahmad Bahoo; Abaiee, Mohammad Mahdi; Gholamnia, Reza; Ketabdari, Mohammad Javad

    2016-09-01

    Owing to the increase in unprecedented accidents with new root causes in almost all operational areas, the importance of risk management has dramatically risen. Risk assessment, one of the most significant aspects of risk management, has a substantial impact on the system-safety level of organizations, industries, and operations. If the causes of all kinds of failure and the interactions between them are considered, effective risk assessment can be highly accurate. A combination of traditional risk assessment approaches and modern scientific probability methods can help in realizing better quantitative risk assessment methods. Most researchers face the problem of minimal field data with respect to the probability and frequency of each failure. Because of this limitation in the availability of epistemic knowledge, it is important to conduct epistemic estimations by applying the Bayesian theory for identifying plausible outcomes. In this paper, we propose an algorithm and demonstrate its application in a case study for a light-weight lifting operation in the Persian Gulf of Iran. First, we identify potential accident scenarios and present them in an event tree format. Next, excluding human error, we use the event tree to roughly estimate the prior probability of other hazard-promoting factors using a minimal amount of field data. We then use the Success Likelihood Index Method (SLIM) to calculate the probability of human error. On the basis of the proposed event tree, we use the Bayesian network of the provided scenarios to compensate for the lack of data. Finally, we determine the resulting probability of each event based on its evidence in the epistemic estimation format by building on two Bayesian network types: the probability of hazard promotion factors and the Bayesian theory. The study results indicate that despite the lack of available information on the operation of floating objects, a satisfactory result can be achieved using epistemic data.

  3. Epistemic-Based Investigation of the Probability of Hazard Scenarios Using Bayesian Network for the Lifting Operation of Floating Objects

    Institute of Scientific and Technical Information of China (English)

    Ahmad Bahoo Toroody; Mohammad Mahdi Abaiee; Reza Gholamnia; Mohammad Javad Ketabdari

    2016-01-01

    Owing to the increase in unprecedented accidents with new root causes in almost all operational areas, the importance of risk management has dramatically risen. Risk assessment, one of the most significant aspects of risk management, has a substantial impact on the system-safety level of organizations, industries, and operations. If the causes of all kinds of failure and the interactions between them are considered, effective risk assessment can be highly accurate. A combination of traditional risk assessment approaches and modern scientific probability methods can help in realizing better quantitative risk assessment methods. Most researchers face the problem of minimal field data with respect to the probability and frequency of each failure. Because of this limitation in the availability of epistemic knowledge, it is important to conduct epistemic estimations by applying the Bayesian theory for identifying plausible outcomes. In this paper, we propose an algorithm and demonstrate its application in a case study for a light-weight lifting operation in the Persian Gulf of Iran. First, we identify potential accident scenarios and present them in an event tree format. Next, excluding human error, we use the event tree to roughly estimate the prior probability of other hazard-promoting factors using a minimal amount of field data. We then use the Success Likelihood Index Method (SLIM) to calculate the probability of human error. On the basis of the proposed event tree, we use the Bayesian network of the provided scenarios to compensate for the lack of data. Finally, we determine the resulting probability of each event based on its evidence in the epistemic estimation format by building on two Bayesian network types: the probability of hazard promotion factors and the Bayesian theory. The study results indicate that despite the lack of available information on the operation of floating objects, a satisfactory result can be achieved using epistemic data.

  4. Bayesian signaling

    OpenAIRE

    Hedlund, Jonas

    2014-01-01

    This paper introduces private sender information into a sender-receiver game of Bayesian persuasion with monotonic sender preferences. I derive properties of increasing differences related to the precision of signals and use these to fully characterize the set of equilibria robust to the intuitive criterion. In particular, all such equilibria are either separating, i.e., the sender's choice of signal reveals his private information to the receiver, or fully disclosing, i.e., the outcome of th...

  5. Highly efficient Bayesian joint inversion for receiver-based data and its application to lithospheric structure beneath the southern Korean Peninsula

    Science.gov (United States)

    Kim, Seongryong; Dettmer, Jan; Rhie, Junkee; Tkalčić, Hrvoje

    2016-07-01

    With the deployment of extensive seismic arrays, systematic and efficient parameter and uncertainty estimation is of increasing importance and can provide reliable, regional models for crustal and upper-mantle structure. We present an efficient Bayesian method for the joint inversion of surface-wave dispersion and receiver-function data that combines trans-dimensional (trans-D) model selection in an optimization phase with subsequent rigorous parameter uncertainty estimation. Parameter and uncertainty estimation depend strongly on the chosen parametrization such that meaningful regional comparison requires quantitative model selection that can be carried out efficiently at several sites. While significant progress has been made for model selection (e.g. trans-D inference) at individual sites, the lack of efficiency can prohibit application to large data volumes or cause questionable results due to lack of convergence. Studies that address large numbers of data sets have mostly ignored model selection in favour of more efficient/simple estimation techniques (i.e. focusing on uncertainty estimation but employing ad-hoc model choices). Our approach consists of a two-phase inversion that combines trans-D optimization to select the most probable parametrization with subsequent Bayesian sampling for uncertainty estimation given that parametrization. The trans-D optimization is implemented here by replacing the likelihood function with the Bayesian information criterion (BIC). The BIC provides constraints on model complexity that facilitate the search for an optimal parametrization. Parallel tempering (PT) is applied as an optimization algorithm. After optimization, the optimal model choice is identified by the minimum BIC value from all PT chains. Uncertainty estimation is then carried out in fixed dimension. Data errors are estimated as part of the inference problem by a combination of empirical and hierarchical estimation. Data covariance matrices are estimated from

  6. Reducing cost with autonomous operations of the Deep Space Network radio science receiver

    Science.gov (United States)

    Asmar, S.; Anabtawi, A.; Connally, M.; Jongeling, A.

    2003-01-01

    This paper describes the Radio Science Receiver system and the savings it has brought to mission operations. The design and implementation of remote and autonomous operations will be discussed along with the process of including user feedback along the way and lessons learned and procedures avoided.

  7. A brief history of free-response receiver operating characteristic paradigm data analysis.

    Science.gov (United States)

    Chakraborty, Dev P

    2013-07-01

    In the receiver operating characteristic paradigm the observer assigns a single rating to each image and the location of the perceived abnormality, if any, is ignored. In the free-response receiver operating characteristic paradigm the observer is free to mark and rate as many suspicious regions as are considered clinically reportable. Credit for a correct localization is given only if a mark is sufficiently close to an actual lesion; otherwise, the observer's mark is scored as a location-level false positive. Until fairly recently there existed no accepted method for analyzing the resulting relatively unstructured data containing random numbers of mark-rating pairs per image. This report reviews the history of work in this field, which has now spanned more than five decades. It introduces terminology used to describe the paradigm, proposed measures of performance (figures of merit), ways of visualizing the data (operating characteristics), and software for analyzing free-response receiver operating characteristic studies.

  8. Experiments for Evaluating Application of Bayesian Inference to Situation Awareness of Human Operators in NPPs

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Seong Keun; Seong, Poong Hyun [KAIST, Daejeon (Korea, Republic of)

    2014-08-15

    Bayesian methodology has been widely used in various research fields. It is a method of inference that uses Bayes' rule to update the estimated probability of a hypothesis as additional evidence is acquired. According to current research, malfunctions of a nuclear power plant can be detected by using Bayesian inference, which consistently accumulates newly incoming data and updates its estimate. However, those studies are based on the assumption that people reason like a perfect computer, which can be criticized and may cause problems in real-world application. Studies in cognitive psychology indicate that when the amount of information becomes large, people cannot retain all of the data, because they have a limited memory capacity, well known as working memory, and they also have limited attention. The purpose of this paper is to consider these psychological factors and confirm how much working memory and attention affect the resulting estimates based on Bayesian inference. To confirm this, experiments on human subjects are needed, and the experimental tool is the Compact Nuclear Simulator (CNS)
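    The kind of sequential Bayesian updating the studies above assume operators to perform can be sketched as follows; the prior, the likelihoods, and the stream of evidence are invented for illustration.

```python
# Sequential Bayesian update of a malfunction hypothesis H as sensor
# evidence arrives. All numbers are invented for illustration.

def bayes_update(prior, p_e_given_h, p_e_given_not_h):
    """Return P(H | e) from P(H) and the two evidence likelihoods."""
    numerator = p_e_given_h * prior
    return numerator / (numerator + p_e_given_not_h * (1.0 - prior))

prior = 0.01  # P(malfunction) before any evidence
# (P(reading | malfunction), P(reading | normal)) for each incoming reading
evidence = [(0.9, 0.1), (0.8, 0.2), (0.7, 0.3)]

# The posterior after each reading becomes the prior for the next one
for p_h, p_not_h in evidence:
    prior = bayes_update(prior, p_h, p_not_h)

print(round(prior, 4))
```

A human operator with limited working memory, by contrast, would effectively drop older terms from this product, which is exactly the deviation the experiment is designed to measure.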

  9. Receiver operating characteristics of perceptrons : Influence of sample size and prevalence

    NARCIS (Netherlands)

    Freking, Ansgar; Biehl, Michael; Braun, Christian; Kinzel, Wolfgang; Meesmann, Malte

    1999-01-01

    In many practical classification problems it is important to distinguish false positive from false negative results when evaluating the performance of the classifier. This is of particular importance for medical diagnostic tests. In this context, receiver operating characteristic (ROC) curves have b

  10. Long Length Contaminated Equipment Retrieval System Receiver Trailer and Transport Trailer Operations and Maintenance Manual

    Energy Technology Data Exchange (ETDEWEB)

    DALE, R.N.

    2000-05-01

    A system to accommodate the removal of long-length contaminated equipment (LLCE) from Hanford underground radioactive waste storage tanks was designed, procured, and demonstrated, via a project activity during the 1990s. The system is the Long Length Contaminated Equipment Removal System (LLCERS). LLCERS will be maintained and operated by Tank Farms Engineering and Operations organizations and other varied projects having a need for the system. The responsibility for the operation and maintenance of the LLCERS Receiver Trailer (RT) and Transport Trailer (TT) resides with the RPP Characterization Project Operations organization. The purpose of this document is to provide vendor supplied operating and maintenance (O & M) information for the RT and TT in a readily retrievable form. This information is provided this way instead of in a vendor information (VI) file to maintain configuration control of the operations baseline as described in RPP-6085, ''Configuration Management Plan for Long Length Contaminated Equipment Receiver and Transport Trailers''. Additional Operations Baseline documents are identified in RPP-6085.

  11. GLASS 2.0: An Operational, Multimodal, Bayesian Earthquake Data Association Engine

    Science.gov (United States)

    Benz, H.; Johnson, C. E.; Patton, J. M.; McMahon, N. D.; Earle, P. S.

    2015-12-01

    The legacy approach to automated detection and determination of hypocenters relies on arrival-time stacking algorithms. Examples of such algorithms are the associator Binder, which has been in continuous use in many USGS-supported regional seismic networks since the 1980s, and its spherical-earth successor, GLASS 1.0, in service at the USGS National Earthquake Information Center for over 10 years. The principal shortcomings of the legacy approach are that 1) it can only use phase arrival times, 2) it does not adequately address the problem of extreme variations in station density worldwide, 3) it cannot incorporate multiple phase models or statistical attributes of phases with distance, and 4) it cannot incorporate noise-model attributes of individual stations. Previously we introduced a theoretical framework for a new associator using a Bayesian kernel stacking approach to approximate a joint probability density function for hypocenter localization. More recently we added station- and phase-specific Bayesian constraints to the association process. GLASS 2.0 incorporates a multiplicity of earthquake-related data including phase arrival times, back-azimuth and slowness information from array beamforming, arrival times from waveform cross-correlation processing, and geographic constraints from real-time social media reports of ground shaking. We demonstrate its application by modeling an aftershock sequence using dozens of stations that recorded tens of thousands of earthquakes over a period of one month. We also demonstrate GLASS 2.0 performance regionally and teleseismically using the globally distributed real-time monitoring system at NEIC.

  12. Bayesian derivation of plasma equilibrium distribution function for tokamak scenarios and the associated Landau collision operator

    CERN Document Server

    Di Troia, Claudio

    2015-01-01

    A class of parametric distribution functions was proposed in [C. Di Troia, Plasma Physics and Controlled Fusion, 54, 2012] as equilibrium distribution functions (EDFs) for charged particles in fusion plasmas, representing supra-thermal particles in anisotropic equilibria for Neutral Beam Injection and Ion Cyclotron Heating scenarios. Moreover, the EDFs can also represent nearly isotropic equilibria for slowing-down $\alpha$ particles and core thermal plasma populations. These EDFs depend on constants of motion (COMs). Assuming an axisymmetric system with no equilibrium electric field, the EDF depends on the toroidal canonical momentum $P_\phi$, the kinetic energy $w$ and the magnetic moment $\mu$. In the present work, the EDFs are obtained from first principles and general hypotheses. The derivation is probabilistic and makes use of Bayes' theorem. The Bayesian argument allows us to describe how far from the prior probability distribution function (pdf), e.g. Maxwellian, the plasma is, based on the information...

  13. Receiver Operating Characteristic (ROC) Curve-based Prediction Model for Periodontal Disease Updated With the Calibrated Community Periodontal Index.

    Science.gov (United States)

    Su, Chiu-Wen; Ming-Fang Yen, Amy; Lai, Hongmin; Chen, Hsiu-Hsi; Chen, Sam Li-Sheng

    2017-07-28

    Background The accuracy of a prediction model for periodontal disease using the community periodontal index (CPI) has been assessed using the area under the receiver operating characteristic (AUROC) curve, but how the uncalibrated CPI, as measured by general dentists trained by periodontists in a large epidemiological study, affects the performance of such a prediction model has not yet been researched. Methods We conducted a two-stage design, first proposing a validation study to calibrate the CPI between a senior periodontal specialist and the trained general dentists who measured CPIs in the main study of a nationwide survey. A Bayesian hierarchical logistic regression model was applied to estimate the non-updated and updated clinical weights used for building risk scores. How the calibrated CPI affected the performance of the updated prediction model was quantified by comparing the AUROC curves of the original and the updated model. Results The estimates for the calibration of CPI obtained from the validation study were 66% sensitivity and 85% specificity. After updating, the clinical weights of each predictor were inflated, and the risk score for the highest risk category was elevated from 434 to 630. This update improved the AUROC performance of the two corresponding prediction models from 62.6% (95% CI: 61.7%-63.6%) for the non-updated model to 68.9% (95% CI: 68.0%-69.6%) for the updated one, reaching a statistically significant difference (P periodontal disease as measured by the calibrated CPI derived from a large epidemiological survey.

  14. Operator decision support system for integrated wastewater management including wastewater treatment plants and receiving water bodies.

    Science.gov (United States)

    Kim, Minsoo; Kim, Yejin; Kim, Hyosoo; Piao, Wenhua; Kim, Changwon

    2016-06-01

    An operator decision support system (ODSS) is proposed to support operators of wastewater treatment plants (WWTPs) in making appropriate decisions. This system accounts for water quality (WQ) variations in WWTP influent and effluent and in the receiving water body (RWB). The proposed system is comprised of two diagnosis modules, three prediction modules, and a scenario-based supporting module (SSM). In the diagnosis modules, the WQs of the influent and effluent WWTP and of the RWB are assessed via multivariate analysis. Three prediction modules based on the k-nearest neighbors (k-NN) method, activated sludge model no. 2d (ASM2d) model, and QUAL2E model are used to forecast WQs for 3 days in advance. To compare various operating alternatives, SSM is applied to test various predetermined operating conditions in terms of overall oxygen transfer coefficient (Kla), waste sludge flow rate (Qw), return sludge flow rate (Qr), and internal recycle flow rate (Qir). In the case of unacceptable total phosphorus (TP), SSM provides appropriate information for the chemical treatment. The constructed ODSS was tested using data collected from Geumho River, which was the RWB, and S WWTP in Daegu City, South Korea. The results demonstrate the capability of the proposed ODSS to provide WWTP operators with more objective qualitative and quantitative assessments of WWTP and RWB WQs. Moreover, the current study shows that ODSS, using data collected from the study area, can be used to identify operational alternatives through SSM at an integrated urban wastewater management level.

  15. Application of Bayesian least absolute shrinkage and selection operator (LASSO) and BayesCπ methods for genomic selection in French Holstein and Montbéliarde breeds.

    Science.gov (United States)

    Colombani, C; Legarra, A; Fritz, S; Guillaume, F; Croiseau, P; Ducrocq, V; Robert-Granié, C

    2013-01-01

    Recently, the amount of available single nucleotide polymorphism (SNP) marker data has considerably increased in dairy cattle breeds, both for research purposes and for application in commercial breeding and selection programs. Bayesian methods are currently used in the genomic evaluation of dairy cattle to handle very large sets of explanatory variables with a limited number of observations. In this study, we applied 2 Bayesian methods, BayesCπ and Bayesian least absolute shrinkage and selection operator (LASSO), to 2 genotyped and phenotyped reference populations consisting of 3,940 Holstein bulls and 1,172 Montbéliarde bulls with approximately 40,000 polymorphic SNP. We compared the accuracy of the Bayesian methods for the prediction of 3 traits (milk yield, fat content, and conception rate) with pedigree-based BLUP, genomic BLUP, partial least squares (PLS) regression, and sparse PLS regression, a variable selection PLS variant. The results showed that the correlations between observed and predicted phenotypes were similar in BayesCπ (including or not pedigree information) and Bayesian LASSO for most of the traits and whatever the breed. In the Holstein breed, Bayesian methods led to higher correlations than other approaches for fat content and were similar to genomic BLUP for milk yield and to genomic BLUP and PLS regression for the conception rate. In the Montbéliarde breed, no method dominated the others, except BayesCπ for fat content. The better performances of the Bayesian methods for fat content in Holstein and Montbéliarde breeds are probably due to the effect of the DGAT1 gene. The SNP identified by the BayesCπ, Bayesian LASSO, and sparse PLS regression methods, based on their effect on the different traits of interest, were located at almost the same position on the genome. As the Bayesian methods resulted in regressions of direct genomic values on daughter trait deviations closer to 1 than for the other methods tested in this study, Bayesian

  16. Predicting a Containership's Arrival Punctuality in Liner Operations by Using a Fuzzy Rule-Based Bayesian Network (FRBBN)

    Directory of Open Access Journals (Sweden)

    Nurul Haqimin Mohd Salleh

    2017-07-01

    One of the biggest concerns in liner operations is the punctuality of containerships. Managing the time factor has become a crucial issue in today's liner shipping operations. A 2015 statistic showed that containerships achieved an overall on-time performance of only 73%. Vessel punctuality is affected by many factors, such as port and vessel conditions and the knock-on effects of delays. This paper therefore develops a model for analyzing and predicting the arrival punctuality of a liner vessel at ports of call under uncertain environments by using a hybrid decision-making technique, the Fuzzy Rule-Based Bayesian Network (FRBBN). To ensure the practicability of the model, two container vessels were tested using the proposed model. The results show that the differences between predicted values and real arrival times are only 4.2% and 6.6%, which can be considered reasonable. This model is capable of helping liner shipping operators (LSOs) predict the arrival punctuality of their vessels at a particular port of call.

  17. Structural Design Considerations for Tubular Power Tower Receivers Operating at 650 Degrees C: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Neises, T. W.; Wagner, M. J.; Gray, A. K.

    2014-04-01

    Research of advanced power cycles has shown supercritical carbon dioxide power cycles may have thermal efficiency benefits relative to steam cycles at temperatures around 500 - 700 degrees C. To realize these benefits for CSP, it is necessary to increase the maximum outlet temperature of current tower designs. Research at NREL is investigating a concept that uses high-pressure supercritical carbon dioxide as the heat transfer fluid to achieve a 650 degrees C receiver outlet temperature. At these operating conditions, creep becomes an important factor in the design of a tubular receiver and contemporary design assumptions for both solar and traditional boiler applications must be revisited and revised. This paper discusses lessons learned for high-pressure, high-temperature tubular receiver design. An analysis of a simplified receiver tube is discussed, and the results show the limiting stress mechanisms in the tube and the impact on the maximum allowable flux as design parameters vary. Results of this preliminary analysis indicate an underlying trade-off between tube thickness and the maximum allowable flux on the tube. Future work will expand the scope of design variables considered and attempt to optimize the design based on cost and performance metrics.

  18. Uncertainty assessment via Bayesian revision of ensemble streamflow predictions in the operational river Rhine forecasting system

    NARCIS (Netherlands)

    Reggiani, P.; Renner, M.; Weerts, A.H.; Van Gelder, P.A.H.J.M.

    2009-01-01

    Ensemble streamflow forecasts obtained by using hydrological models with ensemble weather products are becoming more frequent in operational flow forecasting. The uncertainty of the ensemble forecast needs to be assessed for these products to become useful in forecasting operations. A comprehensive

  20. Adjusting for covariate effects on classification accuracy using the covariate-adjusted receiver operating characteristic curve.

    Science.gov (United States)

    Janes, Holly; Pepe, Margaret S

    2009-06-01

    Recent scientific and technological innovations have produced an abundance of potential markers that are being investigated for their use in disease screening and diagnosis. In evaluating these markers, it is often necessary to account for covariates associated with the marker of interest. Covariates may include subject characteristics, expertise of the test operator, test procedures or aspects of specimen handling. In this paper, we propose the covariate-adjusted receiver operating characteristic curve, a measure of covariate-adjusted classification accuracy. Nonparametric and semiparametric estimators are proposed, asymptotic distribution theory is provided and finite sample performance is investigated. For illustration we characterize the age-adjusted discriminatory accuracy of prostate-specific antigen as a biomarker for prostate cancer.
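    The placement-value construction that underlies covariate-adjusted ROC analysis can be sketched as follows: each case score is ranked only against control scores from its own covariate stratum, and the adjusted AUC equals one minus the mean placement value. The strata, scores, and tie convention below are invented for illustration, not taken from the paper's estimators.

```python
# Covariate-adjusted ROC sketch via within-stratum placement values.
# Scores and the age-like strata ("young"/"old") are invented.

controls = {"young": [0.1, 0.2, 0.3, 0.4], "old": [0.3, 0.5, 0.6, 0.8]}
cases    = [("young", 0.35), ("young", 0.45), ("old", 0.7), ("old", 0.9)]

def placement(stratum, score):
    """Fraction of same-stratum controls at or above the case score."""
    ref = controls[stratum]
    return sum(c >= score for c in ref) / len(ref)

pvs = [placement(z, s) for z, s in cases]
# Adjusted AUC: average probability that a case outscores a control
# drawn from the same covariate stratum
a_auc = 1 - sum(pvs) / len(pvs)
print(a_auc)
```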

  1. Optimal algorithm for automatic detection of microaneurysms based on receiver operating characteristic curve

    Science.gov (United States)

    Xu, Lili; Luo, Shuqian

    2010-11-01

    Microaneurysms (MAs) are the first manifestation of diabetic retinopathy (DR) as well as an indicator of its progression. Their automatic detection plays a key role in both mass screening and monitoring and is therefore at the core of any system for computer-assisted diagnosis of DR. The algorithm comprises the following stages: candidate detection, which extracts the patterns possibly corresponding to MAs using a mathematical-morphology black top-hat transform; feature extraction, which characterizes these candidates; and classification based on a support vector machine (SVM), which validates the MAs. Selection of the feature vector and of the SVM kernel function is very important to the algorithm. We use the receiver operating characteristic (ROC) curve to evaluate the discriminating performance of different feature vectors and different SVM kernel functions. The ROC analysis indicates that the quadratic polynomial SVM with a combination of features as the input shows the best discriminating performance.
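    The ROC-based comparison of candidate feature/kernel choices can be illustrated with a rank-based AUC, which equals the area under the empirical ROC curve. The labels and the two sets of classifier scores below are invented stand-ins for two competing SVM configurations.

```python
# Pure-Python rank-based AUC, used to compare two hypothetical
# classifier configurations on the same labelled validation set.

def roc_auc(labels, scores):
    """Probability that a positive outscores a negative (ties count 1/2),
    which equals the area under the empirical ROC curve."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

labels = [1, 1, 1, 1, 0, 0, 0, 0]
clf_a  = [0.9, 0.8, 0.7, 0.4, 0.6, 0.3, 0.2, 0.1]  # invented scores
clf_b  = [0.9, 0.5, 0.4, 0.3, 0.8, 0.7, 0.6, 0.2]  # invented scores

print(roc_auc(labels, clf_a), roc_auc(labels, clf_b))
```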

  2. Sensitivity to imputation models and assumptions in receiver operating characteristic analysis with incomplete data.

    Science.gov (United States)

    Karakaya, Jale; Karabulut, Erdem; Yucel, Recai M

    Modern statistical methods for incomplete data have been increasingly applied to a wide variety of substantive problems. Similarly, receiver operating characteristic (ROC) analysis, a method used to evaluate diagnostic tests or biomarkers in medical research, has seen increasing activity in both its development and its application. While missing-data methods have been applied in ROC analysis, the impact of model mis-specification and/or of the assumptions (e.g. missing at random) underlying the missing data has not been thoroughly studied. In this work, we study the performance of multiple imputation (MI) inference in ROC analysis. In particular, we investigate parametric and non-parametric techniques for MI inference under common missingness mechanisms. Depending on the coherency of the imputation model with the underlying data-generation mechanism, our results show that MI generally leads to well-calibrated inferences under ignorable missingness mechanisms.
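    A standard ingredient of MI inference such as that studied here is pooling the completed-data estimates with Rubin's rules: the pooled estimate is the mean across imputations, and its variance adds the average within-imputation variance to (1 + 1/m) times the between-imputation variance. The AUC estimates and variances below are invented for illustration.

```python
import statistics

def pool_rubin(estimates, variances):
    """Combine m completed-data estimates via Rubin's rules."""
    m = len(estimates)
    qbar = statistics.fmean(estimates)          # pooled point estimate
    within = statistics.fmean(variances)        # avg within-imputation var
    between = statistics.variance(estimates)    # between-imputation var
    total = within + (1 + 1 / m) * between
    return qbar, total

# AUCs and their variances from m = 5 hypothetical imputed data sets
aucs = [0.81, 0.79, 0.83, 0.80, 0.82]
vars_ = [0.0021, 0.0023, 0.0020, 0.0022, 0.0021]

auc_pooled, var_total = pool_rubin(aucs, vars_)
print(round(auc_pooled, 3))
```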

  3. Operational modal analysis of a high-rise multi-function building with dampers by a Bayesian approach

    Science.gov (United States)

    Ni, Yanchun; Lu, Xilin; Lu, Wensheng

    2017-03-01

    The field non-destructive vibration test plays an important role in the area of structural health monitoring. It assists in monitoring the health status and reducing the risk caused by the poor performance of structures. As the most economical of the various vibration tests, the ambient vibration test is the most popular and is widely used to assess the physical condition of a structure under operational service. Based on ambient vibration data, modal identification provides a significant preliminary study for model updating and damage detection during the service life of a structure. It has been proved that modal identification works well in the investigation of the dynamic performance of different kinds of structures. In this paper, the objective structure is a high-rise multi-function office building. The whole building is composed of seven three-story structural units. Each unit comprises one complete floor and two L-shaped floors to form large spaces along the vertical direction. There are 56 viscous dampers installed in the building to improve its energy dissipation capacity. Due to the special features of the structure, field vibration tests and subsequent modal identification were performed to investigate its dynamic performance. Twenty-nine setups were designed to cover all the degrees of freedom of interest. About two years later, another field test was carried out, measuring the building for 48 h to investigate the variance and distribution of the modal parameters. A Fast Bayesian FFT method was employed to perform the modal identification. This Bayesian method not only provides the most probable values of the modal parameters but also assesses the associated posterior uncertainty analytically, which is especially relevant in field vibration tests due to measurement noise, sensor alignment error, modelling error, etc. A shaking table test was also implemented, including cases with and without dampers, which assists

  4. 25 CFR 47.3 - How does a Bureau-operated school find out how much funding it will receive?

    Science.gov (United States)

    2010-04-01

    ... EDUCATION UNIFORM DIRECT FUNDING AND SUPPORT FOR BUREAU-OPERATED SCHOOLS § 47.3 How does a Bureau-operated school find out how much funding it will receive? The Office of Indian Education Programs (OIEP) will...

  5. Application of Receiver Operating Characteristic (ROC) Curves for Explosives Detection Using Different Sampling and Detection Techniques

    Directory of Open Access Journals (Sweden)

    Mimy Young

    2013-12-01

    Reported for the first time are receiver operating characteristic (ROC) curves constructed to describe the performance of a sorbent-coated disk, planar solid-phase microextraction (PSPME) unit for non-contact sampling of a variety of volatiles. The PSPME is coupled to ion mobility spectrometers (IMSs) for the detection of volatile chemical markers associated with the presence of smokeless powders, model systems of explosives containing diphenylamine (DPA), 2,4-dinitrotoluene (2,4-DNT) and nitroglycerin (NG) as the target analytes. The performance of the PSPME-IMS was compared with the widely accepted solid-phase microextraction (SPME) coupled to GC-MS. A set of optimized sampling conditions for containers of different volumes (1–45 L) with various sample amounts of explosives were studied in replicates (n = 30) to determine the true positive rates (TPR) and false positive rates (FPR) for the different scenarios. These studies were carried out in order to construct the ROC curves for two IMS instruments (a bench-top and a field-portable system) and a bench-top GC-MS system in low- and high-clutter environments. Both static and dynamic PSPME sampling were studied, in which 10–500 mg quantities of smokeless powders were detected within 10 min of static sampling and 1 min of dynamic sampling.

  6. Receiver operating characteristic (ROC) analysis of images reconstructed with iterative expectation maximization algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Takahashi, Yasuyuki; Murase, Kenya [Osaka Medical Coll., Takatsuki (Japan). Graduate School; Higashino, Hiroshi; Sogabe, Ichiro; Sakamoto, Kana

    2001-12-01

    The quality of images reconstructed by means of the maximum likelihood-expectation maximization (ML-EM) and ordered subset (OS)-EM algorithms, was examined with parameters such as the number of iterations and subsets, then compared with the quality of images reconstructed by the filtered back projection method. Phantoms showing signals inside signals, which mimicked single-photon emission computed tomography (SPECT) images of cerebral blood flow and myocardial perfusion, and phantoms showing signals around the signals obtained by SPECT of bone and tumor were used for experiments. To determine signals for recognition, SPECT images in which the signals could be appropriately recognized with a combination of fewer iterations and subsets of different sizes and densities were evaluated by receiver operating characteristic (ROC) analysis. The results of ROC analysis were applied to myocardial phantom experiments and scintigraphy of myocardial perfusion. Taking the image processing time into consideration, good SPECT images were obtained by OS-EM at iteration No. 10 and subset 5. This study will be helpful for selection of parameters such as the number of iterations and subsets when using the ML-EM or OS-EM algorithms. (author)

  7. Multivariate normally distributed biomarkers subject to limits of detection and receiver operating characteristic curve inference.

    Science.gov (United States)

    Perkins, Neil J; Schisterman, Enrique F; Vexler, Albert

    2013-07-01

    Biomarkers are of ever-increasing importance to clinical practice and epidemiologic research. Multiple biomarkers are often measured per patient. Measurement of true biomarker levels is limited by laboratory precision, specifically measuring relatively low, or high, biomarker levels resulting in undetectable levels below, or above, a limit of detection (LOD). Ignoring these missing observations or replacing them with a constant are methods commonly used although they have been shown to lead to biased estimates of several parameters of interest, including the area under the receiver operating characteristic (ROC) curve and regression coefficients. We developed asymptotically consistent, efficient estimators, via maximum likelihood techniques, for the mean vector and covariance matrix of multivariate normally distributed biomarkers affected by LOD. We also developed an approximation for the Fisher information and covariance matrix for our maximum likelihood estimations (MLEs). We apply these results to an ROC curve setting, generating an MLE for the area under the curve for the best linear combination of multiple biomarkers and accompanying confidence interval. Point and confidence interval estimates are scrutinized by simulation study, with bias and root mean square error and coverage probability, respectively, displaying behavior consistent with MLEs. An example using three polychlorinated biphenyls to classify women with and without endometriosis illustrates how the underlying distribution of multiple biomarkers with LOD can be assessed and display increased discriminatory ability over naïve methods. Properly addressing LODs can lead to optimal biomarker combinations with increased discriminatory ability that may have been ignored because of measurement obstacles. Published by Elsevier Inc.

  8. Recollection is a continuous process: Evidence from plurality memory receiver operating characteristics.

    Science.gov (United States)

    Slotnick, Scott D; Jeye, Brittany M; Dodson, Chad S

    2016-01-01

    Is recollection a continuous/graded process or a threshold/all-or-none process? Receiver operating characteristic (ROC) analysis can answer this question as the continuous model and the threshold model predict curved and linear recollection ROCs, respectively. As memory for plurality, an item's previous singular or plural form, is assumed to rely on recollection, the nature of recollection can be investigated by evaluating plurality memory ROCs. The present study consisted of four experiments. During encoding, words (singular or plural) or objects (single/singular or duplicate/plural) were presented. During retrieval, old items with the same plurality or different plurality were presented. For each item, participants made a confidence rating ranging from "very sure old", which was correct for same plurality items, to "very sure new", which was correct for different plurality items. Each plurality memory ROC was the proportion of same versus different plurality items classified as "old" (i.e., hits versus false alarms). Chi-squared analysis revealed that all of the plurality memory ROCs were adequately fit by the continuous unequal variance model, whereas none of the ROCs were adequately fit by the two-high threshold model. These plurality memory ROC results indicate recollection is a continuous process, which complements previous source memory and associative memory ROC findings.

  9. Receiver Operating Characteristic (ROC) to Determine Cut-Off Points of Biomarkers in Lung Cancer Patients

    Directory of Open Access Journals (Sweden)

    Heidi L. Weiss

    2004-01-01

    Full Text Available The role of biomarkers in disease prognosis continues to be an important investigation in many cancer studies. In order for these biomarkers to have practical application in clinical decision making regarding patient treatment and follow-up, it is common to dichotomize patients into those with low vs. high expression levels. In this study, receiver operating characteristic (ROC) curves, the area under the curve (AUC) of the ROC, sensitivity, specificity, as well as likelihood ratios were calculated to determine levels of growth factor biomarkers that best differentiate lung cancer cases versus control subjects. Selected cut-off points for p185erbB-2 and EGFR membrane appear to have good discriminating power to differentiate control tissues versus uninvolved tissues from patients with lung cancer (AUC = 89% and 90%, respectively); AUC increased to at least 90% for selected cut-off points for p185erbB-2 membrane, EGFR membrane, and FASE when comparing control versus carcinoma tissues from lung cancer cases. Using data from control subjects compared to patients with lung cancer, we presented a simple and intuitive approach to determine dichotomized levels of biomarkers and validated the value of these biomarkers as surrogate endpoints for cancer outcome.
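
    One common way to turn an ROC analysis like this into a single dichotomizing cut-off is to maximize the Youden index (sensitivity + specificity - 1); a minimal sketch with invented marker levels, not the study's data:

```python
# Hypothetical sketch: pick the cut-off maximizing the Youden index.
def best_cutoff(cases, controls):
    best_c, best_j = None, -1.0
    for c in sorted(set(cases) | set(controls)):
        sens = sum(v >= c for v in cases) / len(cases)      # sensitivity at c
        spec = sum(v < c for v in controls) / len(controls)  # specificity at c
        j = sens + spec - 1.0
        if j > best_j:
            best_c, best_j = c, j
    return best_c, best_j

cases = [5.1, 4.8, 6.0, 5.5]     # e.g. carcinoma tissue marker levels (invented)
controls = [2.0, 2.5, 3.1, 2.8]  # e.g. control tissue levels (invented)
cutoff, youden = best_cutoff(cases, controls)
print(cutoff, youden)  # 4.8 1.0
```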

  10. Measuring diagnostic and predictive accuracy in disease management: an introduction to receiver operating characteristic (ROC) analysis.

    Science.gov (United States)

    Linden, Ariel

    2006-04-01

    Diagnostic or predictive accuracy concerns are common in all phases of a disease management (DM) programme, and ultimately play an influential role in the assessment of programme effectiveness. Areas such as the identification of diseased patients, predictive modelling of future health status and costs, and risk stratification are just a few of the domains in which assessment of accuracy is beneficial, if not critical. The most commonly used analytical model for this purpose is the standard 2 x 2 table method in which sensitivity and specificity are calculated. However, there are several limitations to this approach, including the reliance on a single defined criterion or cut-off for determining a true-positive result, use of non-standardized measurement instruments and sensitivity to outcome prevalence. This paper introduces receiver operating characteristic (ROC) analysis as a more appropriate and useful technique for assessing diagnostic and predictive accuracy in DM. Its advantages include: testing accuracy across the entire range of scores and thereby not requiring a predetermined cut-off point, easily examined visual and statistical comparisons across tests or scores, and independence from outcome prevalence. Therefore, the implementation of ROC as an evaluation tool should be strongly considered in the various phases of a DM programme.
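
    The key advantage named above, testing accuracy across the entire range of scores rather than at one predetermined cut-off, can be sketched in a few lines (toy scores, not DM data):

```python
# Illustrative sketch: build an ROC curve by sweeping every threshold,
# then compute the AUC, so no single cut-off is assumed.
def roc_points(scores, labels):
    """Return (fpr, tpr) pairs for every distinct threshold."""
    thresholds = sorted(set(scores), reverse=True)
    pos = sum(labels)
    neg = len(labels) - pos
    points = [(0.0, 0.0)]
    for t in thresholds:
        tp = sum(1 for s, y in zip(scores, labels) if s >= t and y == 1)
        fp = sum(1 for s, y in zip(scores, labels) if s >= t and y == 0)
        points.append((fp / neg, tp / pos))
    return points

def auc(points):
    """Trapezoidal area under the ROC curve."""
    area = 0.0
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        area += (x1 - x0) * (y0 + y1) / 2.0
    return area

# A perfectly separating score gives AUC = 1.0
pts = roc_points([0.9, 0.8, 0.3, 0.2], [1, 1, 0, 0])
print(auc(pts))  # 1.0
```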

  11. Design of a receiver operating characteristic (ROC) study of 10:1 lossy image compression

    Science.gov (United States)

    Collins, Cary A.; Lane, David; Frank, Mark S.; Hardy, Michael E.; Haynor, David R.; Smith, Donald V.; Parker, James E.; Bender, Gregory N.; Kim, Yongmin

    1994-04-01

    The digital archiving system at Madigan Army Medical Center (MAMC) uses a 10:1 lossy data compression algorithm for most forms of computed radiography. A systematic study on the potential effect of lossy image compression on patient care has been initiated with a series of studies focused on specific diagnostic tasks. The studies are based upon the receiver operating characteristic (ROC) method of analysis for diagnostic systems. The null hypothesis is that observer performance with approximately 10:1 compressed and decompressed images is not different from using original, uncompressed images for detecting subtle pathologic findings seen on computed radiographs of bone, chest, or abdomen, when viewed on a high-resolution monitor. Our design involves collecting cases from eight pathologic categories. Truth is determined by committee using confirmatory studies performed during routine clinical practice whenever possible. Software has been developed to aid in case collection and to allow reading of the cases for the study using stand-alone Siemens Litebox workstations. Data analysis uses two methods, ROC analysis and free-response ROC (FROC) methods. This study will be one of the largest ROC/FROC studies of its kind and could benefit clinical radiology practice using PACS technology. The study design and results from a pilot FROC study are presented.

  12. Visualization of the significance of Receiver Operating Characteristics based on confidence ellipses

    Science.gov (United States)

    Sarlis, Nicholas V.; Christopoulos, Stavros-Richard G.

    2014-03-01

    The Receiver Operating Characteristics (ROC) is used for the evaluation of prediction methods in various disciplines like meteorology, geophysics, complex system physics, medicine etc. The estimation of the significance of a binary prediction method, however, remains a cumbersome task and is usually done by repeating the calculations by Monte Carlo. The FORTRAN code provided here simplifies this problem by evaluating the significance of binary predictions for a family of ellipses which are based on confidence ellipses and cover the whole ROC space. Catalogue identifier: AERY_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AERY_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 11511 No. of bytes in distributed program, including test data, etc.: 72906 Distribution format: tar.gz Programming language: FORTRAN. Computer: Any computer supporting a GNU FORTRAN compiler. Operating system: Linux, MacOS, Windows. RAM: 1 Mbyte Classification: 4.13, 9, 14. Nature of problem: The Receiver Operating Characteristics (ROC) is used for the evaluation of prediction methods in various disciplines like meteorology, geophysics, complex system physics, medicine etc. The estimation of the significance of a binary prediction method, however, remains a cumbersome task and is usually done by repeating the calculations by Monte Carlo. The FORTRAN code provided here simplifies this problem by evaluating the significance of binary predictions for a family of ellipses which are based on confidence ellipses and cover the whole ROC space. Solution method: Using the statistics of random binary predictions for a given value of the predictor threshold ɛt, one can construct the corresponding confidence ellipses. The envelope of these corresponding confidence ellipses is estimated when
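
    For contrast, a minimal sketch of the Monte Carlo procedure the abstract describes as cumbersome: estimate the significance of an observed ROC point by repeatedly drawing random predictions at the same alarm rate (all counts are invented):

```python
import random

# Sketch of the Monte Carlo baseline: how often does a random binary
# predictor, issuing the same number of alarms, do at least as well as the
# observed (hits, false alarms) point in ROC space?
def mc_significance(n_pos, n_neg, hits, false_alarms, trials=20000, seed=1):
    random.seed(seed)
    n_alarms = hits + false_alarms
    outcomes = [1] * n_pos + [0] * n_neg
    better = 0
    for _ in range(trials):
        alarmed = random.sample(range(len(outcomes)), n_alarms)
        h = sum(outcomes[i] for i in alarmed)
        if h >= hits:  # as good or better at the same alarm rate
            better += 1
    return better / trials

# 18 of 20 positives hit with only 2 of 80 negatives alarmed: clearly non-random
p = mc_significance(n_pos=20, n_neg=80, hits=18, false_alarms=2)
print(p < 0.01)  # True
```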

  13. Acousto-optic, point receiver hydrophone probe for operation up to 100 MHz.

    Science.gov (United States)

    Lewin, P A; Mu, C; Umchid, S; Daryoush, A; El-Sherif, M

    2005-12-01

    This work describes the results of initial evaluation of a wideband acousto-optic hydrophone probe designed to operate as a point receiver in the frequency range up to 100 MHz. The hydrophone was implemented as a tapered fiber optic (FO) probe sensor with a tip diameter of approximately 7 µm. Such small physical dimensions of the sensor eliminate the need for spatial averaging corrections, so that true pressure-time (p-t) waveforms can be faithfully recorded. The theoretical considerations that predicted the FO probe sensitivity to be equal to 4.3 mV/MPa are presented along with a brief description of the manufacturing process. The calibration results that verified the theoretically predicted sensitivity are also presented, along with a brief description of the improvements currently being implemented to increase this sensitivity level by approximately 20 dB. The results of preliminary measurements indicate that the fiber optic probes will exhibit a uniform frequency response and a zero phase shift in the frequency range considered. These features might be very useful in rapid complex calibration, i.e., determining both magnitude and phase response of other hydrophones by the substitution method. Also, because of their robust design and linearity, these fiber optic hydrophones could meet the challenges posed by high intensity focused ultrasound (HIFU) and other therapeutic applications. Overall, the outcome of this work shows that when fully developed, the FO probes will be well suited for high frequency measurements of ultrasound fields and will be able to complement the data collected by the current finite aperture piezoelectric PVDF hydrophones.

  14. Covariate Adjustment in Receiver Operating Characteristic Curve Analysis for PSA in Diagnosis of Prostate Cancer

    Directory of Open Access Journals (Sweden)

    Emre DEMIR

    2016-10-01

    Full Text Available Objective: In diagnostic test research, markers used to classify subjects into two groups (e.g., patient/healthy or benign/malignant), the choice of an optimal cut-off value, and the performance of diagnostic tests are evaluated with the Receiver Operating Characteristic (ROC) curve. In classification accuracy research, variables such as gender and age are commonly not balanced across groups. In these cases, covariates should be taken into account when estimating the area under the ROC curve, and a covariate-adjusted ROC analysis should be performed. This study aims to introduce the methods in the literature for covariate adjustment and to present an application with a sample from the health field. Material and Methods: We introduced the methods used in the literature for covariate adjustment and estimation of the area under ROC curves, together with an application with data from the field of urology. In this study, 105 PSA (prostate-specific antigen) measurements were taken in order to examine the covariate effect of the age variable and to assess the diagnostic performance of PSA measurements against pathologic methods. Results: The covariate effect of age was statistically significant in the ROC curve analysis of the PSA data, with a parameter estimate of 0.733 (p<0.001). According to the four methods (non-parametric (empirical), non-parametric (normal), semi-parametric (empirical), and parametric (normal)), the estimates of the area under the ROC curve obtained without the covariate effect were 0.708, 0.629, 0.709 and 0.628, respectively. The areas under the curve obtained with covariate adjustment were significantly lower than the traditional ROC estimates: 0.580, 0.577, 0.582 and 0.579. Conclusion: The area under the ROC curve should be estimated with adjustment for covariates that could affect the marker values of diagnostic tests performed in concert with matching

  15. Receiver-operating characteristic curves for somatic cell scores and California mastitis test in Valle del Belice dairy sheep

    NARCIS (Netherlands)

    Riggio, V.; Pesce, L.L.; Morreale, S.; Portolano, B.

    2013-01-01

    Using receiver-operating characteristic (ROC) curve methodology this study was designed to assess the diagnostic effectiveness of somatic cell count (SCC) and the California mastitis test (CMT) in Valle del Belice sheep, and to propose and evaluate threshold values for those tests that would optimal

  17. The precision-recall curve overcame the optimism of the receiver operating characteristic curve in rare diseases

    DEFF Research Database (Denmark)

    Ozenne, Brice; Subtil, Fabien; Maucort-Boulch, Delphine

    2015-01-01

    OBJECTIVES: Compare the area under the receiver operating characteristic curve (AUC) vs. the area under the precision-recall curve (AUPRC) in summarizing the performance of a diagnostic biomarker according to the disease prevalence. STUDY DESIGN AND SETTING: A simulation study was performed...
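
    A small simulation (not from the study) of the effect in the title: with identical score distributions, the AUC barely moves as the disease becomes rare, while average precision, which summarizes the precision-recall curve, drops sharply:

```python
import random

def auc_mw(pos_scores, neg_scores):
    """AUC as the Mann-Whitney probability that a case outranks a control."""
    wins = sum((p > n) + 0.5 * (p == n) for p in pos_scores for n in neg_scores)
    return wins / (len(pos_scores) * len(neg_scores))

def average_precision(pos_scores, neg_scores):
    """Average precision, a standard summary of the precision-recall curve."""
    scored = [(s, 1) for s in pos_scores] + [(s, 0) for s in neg_scores]
    scored.sort(key=lambda t: -t[0])
    tp, ap = 0, 0.0
    for rank, (_, y) in enumerate(scored, start=1):
        if y == 1:
            tp += 1
            ap += tp / rank  # precision at each recall step
    return ap / len(pos_scores)

random.seed(0)
cases = [random.gauss(2, 1) for _ in range(50)]        # same case distribution
common_ctrl = [random.gauss(0, 1) for _ in range(50)]  # prevalence 50%
rare_ctrl = [random.gauss(0, 1) for _ in range(5000)]  # prevalence ~1%

print(round(auc_mw(cases, common_ctrl), 2), round(auc_mw(cases, rare_ctrl), 2))
print(round(average_precision(cases, common_ctrl), 2),
      round(average_precision(cases, rare_ctrl), 2))
```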

  18. Assessing the Classification Accuracy of Early Numeracy Curriculum-Based Measures Using Receiver Operating Characteristic Curve Analysis

    Science.gov (United States)

    Laracy, Seth D.; Hojnoski, Robin L.; Dever, Bridget V.

    2016-01-01

    Receiver operating characteristic curve (ROC) analysis was used to investigate the ability of early numeracy curriculum-based measures (EN-CBM) administered in preschool to predict performance below the 25th and 40th percentiles on a quantity discrimination measure in kindergarten. Areas under the curve derived from a sample of 279 students ranged…

  19. Bayesian Intersubjectivity and Quantum Theory

    Science.gov (United States)

    Pérez-Suárez, Marcos; Santos, David J.

    2005-02-01

    Two of the major approaches to probability, namely, frequentism and (subjectivistic) Bayesian theory, are discussed, together with the replacement of frequentist objectivity with Bayesian intersubjectivity. This discussion is then expanded to Quantum Theory, as quantum states and operations can be seen as structural elements of a subjective nature.

  20. Bayesian probabilistic modeling for damage assessment in a bolted frame

    Science.gov (United States)

    Haynes, Colin; Todd, Michael

    2012-04-01

    This paper presents the development of a Bayesian framework for optimizing the design of a structural health monitoring (SHM) system. Statistical damage detection techniques are applied to a geometrically-complex, three-story structure with bolted joints. A sparse network of PZT sensor-actuators is bonded to the structure, using ultrasonic guided waves in both pulse-echo and pitch-catch modes to inspect the structure. Receiver operating characteristics are used to quantify the performance of multiple features (or detectors). The detection rate of the system is compared across different types and levels of damage. A Bayesian cost model is implemented to determine the best performing network.

  1. Bayesian biostatistics

    CERN Document Server

    Lesaffre, Emmanuel

    2012-01-01

    The growth of biostatistics has been phenomenal in recent years and has been marked by considerable technical innovation in both methodology and computational practicality. One area that has experienced significant growth is Bayesian methods. The growing use of Bayesian methodology has taken place partly due to an increasing number of practitioners valuing the Bayesian paradigm as matching that of scientific discovery. In addition, computational advances have allowed for more complex models to be fitted routinely to realistic data sets. Through examples, exercises and a combination of introd

  2. Detector evaluation for improved situational awareness: Receiver operator characteristic curve based

    NARCIS (Netherlands)

    Wuijckhuijse, A.L. van; Nieuwenhuizen, M.S.

    2016-01-01

    In military and civilian operations good situational awareness is a prerequisite to make proper decisions. The situational awareness is among others based upon intelligence, threat analysis and detection, altogether element of the so-called DIM (detection, identification, monitoring) system. In case

  4. Bayesian statistics

    OpenAIRE

    新家, 健精

    2013-01-01

    © 2012 Springer Science+Business Media, LLC. All rights reserved. Article Outline: Glossary Definition of the Subject and Introduction The Bayesian Statistical Paradigm Three Examples Comparison with the Frequentist Statistical Paradigm Future Directions Bibliography

  5. A Bayesian Method for Short-Term Probabilistic Forecasting of Photovoltaic Generation in Smart Grid Operation and Control

    Directory of Open Access Journals (Sweden)

    Gabriella Ferruzzi

    2013-02-01

    Full Text Available A new short-term probabilistic forecasting method is proposed to predict the probability density function of the hourly active power generated by a photovoltaic system. Firstly, the probability density function of the hourly clearness index is forecasted making use of a Bayesian auto regressive time series model; the model takes into account the dependence of the solar radiation on some meteorological variables, such as the cloud cover and humidity. Then, a Monte Carlo simulation procedure is used to evaluate the predictive probability density function of the hourly active power by applying the photovoltaic system model to the random sampling of the clearness index distribution. A numerical application demonstrates the effectiveness and advantages of the proposed forecasting method.
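
    A toy sketch of the second stage described above: random draws of the clearness index are pushed through a PV system model to form a predictive distribution of hourly power. The Beta(5, 2) clearness distribution and the 5 kW linear PV model are invented stand-ins, not the paper's models:

```python
import random

def pv_power(clearness, peak_kw=5.0):
    # Grossly simplified PV model: output scales linearly with clearness.
    return peak_kw * clearness

def forecast_distribution(n=10000, seed=42):
    # Monte Carlo: sample the clearness index, map each draw to power.
    random.seed(seed)
    return sorted(pv_power(random.betavariate(5, 2)) for _ in range(n))

samples = forecast_distribution()
p10 = samples[len(samples) // 10]
median = samples[len(samples) // 2]
p90 = samples[9 * len(samples) // 10]
print(round(p10, 2), round(median, 2), round(p90, 2))
```

    Quantiles of the sampled powers then summarize the predictive density, e.g. an 80% interval from p10 to p90.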

  6. Bayesian Theory

    CERN Document Server

    Bernardo, Jose M

    2000-01-01

    This highly acclaimed text, now available in paperback, provides a thorough account of key concepts and theoretical results, with particular emphasis on viewing statistical inference as a special case of decision theory. Information-theoretic concepts play a central role in the development of the theory, which provides, in particular, a detailed discussion of the problem of specification of so-called prior ignorance. The work is written from the authors' committed Bayesian perspective, but an overview of non-Bayesian theories is also provided, and each chapter contains a wide-ranging critica

  7. Bayesian SPLDA

    OpenAIRE

    Villalba, Jesús

    2015-01-01

    In this document we are going to derive the equations needed to implement a Variational Bayes estimation of the parameters of the simplified probabilistic linear discriminant analysis (SPLDA) model. This can be used to adapt SPLDA from one database to another with few development data or to implement the fully Bayesian recipe. Our approach is similar to Bishop's VB PPCA.

  8. Impact of varied center volume categories on volume-outcome relationship in children receiving ECMO for heart operations.

    Science.gov (United States)

    Rettiganti, Mallikarjuna; Seib, Paul M; Robertson, Michael J; Wilcox, Andrew; Gupta, Punkaj

    2016-09-01

    To study the volume-outcome relationship among children receiving extracorporeal membrane oxygenation (ECMO), different studies from different databases use different volume categories. The objective of this study was to evaluate if different center volume categories impact the volume-outcome relationship among children receiving ECMO for heart operations. We performed a post hoc analysis of data from an existing national database, the Pediatric Health Information System. Centers were classified into five different volume categories using different cut-offs and different variables. Mortality rates were compared between the varied volume categories using a mixed effects logistic regression model after adjusting for patient- and center-level risk factors. Data collection included demographic information, baseline characteristics, pre-ECMO risk factors, operation details, patient diagnoses, and center data. In unadjusted analysis, there was a significant relationship between center volume and mortality, with low- and medium-volume centers associated with higher mortality rates compared to high-volume centers in all volume categories, except the hierarchical clustering volume category. In contrast, there was no significant association between center volume and mortality among all volume categories in adjusted analysis. We concluded that high-volume centers were not associated with improved outcomes for the majority of the categorization schemes despite using different cut-offs and different variables for volume categorization.

  9. Interrelationships Between Receiver/Relative Operating Characteristics Display, Binomial, Logit, and Bayes' Rule Probability of Detection Methodologies

    Science.gov (United States)

    Generazio, Edward R.

    2014-01-01

    Unknown risks are introduced into failure critical systems when probability of detection (POD) capabilities are accepted without a complete understanding of the statistical method applied and the interpretation of the statistical results. The presence of this risk in the nondestructive evaluation (NDE) community is revealed in common statements about POD. These statements are often interpreted in a variety of ways and therefore, the very existence of the statements identifies the need for a more comprehensive understanding of POD methodologies. Statistical methodologies have data requirements to be met, procedures to be followed, and requirements for validation or demonstration of adequacy of the POD estimates. Risks are further enhanced due to the wide range of statistical methodologies used for determining the POD capability. Receiver/Relative Operating Characteristics (ROC) Display, simple binomial, logistic regression, and Bayes' rule POD methodologies are widely used in determining POD capability. This work focuses on Hit-Miss data to reveal the framework of the interrelationships between Receiver/Relative Operating Characteristics Display, simple binomial, logistic regression, and Bayes' Rule methodologies as they are applied to POD. Knowledge of these interrelationships leads to an intuitive and global understanding of the statistical data, procedural and validation requirements for establishing credible POD estimates.
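
    Of the methodologies listed, the simple binomial one is the easiest to sketch. Assuming hit/miss data at a single flaw size, a one-sided lower confidence bound on POD can be found by bisection on the binomial tail (the classic "29 of 29" demonstration of 90% POD at 95% confidence):

```python
import math

# Sketch of the simple binomial POD approach: lower confidence bound on
# probability of detection from hit/miss counts, via bisection.
def pod_lower_bound(hits, trials, confidence=0.95):
    def tail(p):  # P(X >= hits) for X ~ Binomial(trials, p)
        return sum(math.comb(trials, k) * p ** k * (1 - p) ** (trials - k)
                   for k in range(hits, trials + 1))
    lo, hi = 0.0, 1.0
    for _ in range(60):  # bisect on the (monotone) binomial tail
        mid = (lo + hi) / 2
        if tail(mid) < 1 - confidence:
            lo = mid
        else:
            hi = mid
    return lo

# 29 hits in 29 trials supports POD >= 0.90 at 95% confidence
print(round(pod_lower_bound(29, 29), 3))  # 0.902
```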

  10. Safety analysis, 200 Area, Savannah River Plant: Separations area operations. Receiving Basin for Offsite Fuel (Supplement 3)

    Energy Technology Data Exchange (ETDEWEB)

    Allen, P M

    1983-09-01

    Analysis of the Savannah River Plant RBOF and RRF included an evaluation of the reliability of process equipment and controls, administrative controls, and engineered safety features. The evaluation also identified potential scenarios and radiological consequences. Risks were calculated in terms of 50-year population dose commitment per year (man-rem/year) to the onsite and offsite population within an 80 km radius of RBOF and RRF, and to an individual at the plant boundary. The total 50-year onsite and offsite population radiological risks of operating the RBOF and RRF were estimated to be 1.0 man-rem/year. These risks are significantly less than the population dose of 54,000 man-rem/yr for natural background radiation in a 50-mile radius. The 50-year maximum offsite individual risk from operating the facility was estimated to be 2.1 × 10⁻⁵ rem/yr. These risks are significantly lower than the 93 mrem/yr an individual is expected to receive from natural background radiation in this area. The analysis shows that the RBOF and RRF can be operated without undue risk to onsite personnel or to the general public.

  11. Bayesian Monitoring.

    OpenAIRE

    Kirstein, Roland

    2005-01-01

    This paper presents a modification of the inspection game: The "Bayesian Monitoring" model rests on the assumption that judges are interested in enforcing compliant behavior and making correct decisions. They may base their judgements on an informative but imperfect signal which can be generated costlessly. In the original inspection game, monitoring is costly and generates a perfectly informative signal. While the inspection game has only one mixed strategy equilibrium, three Perfect Bayesia...

  12. Estimation of doses received by operators in the 1958 RB reactor accident using the MCNP5 computer code simulation

    Directory of Open Access Journals (Sweden)

    Pešić Milan P.

    2012-01-01

    Full Text Available A numerical simulation of the radiological consequences of the RB reactor reactivity excursion accident, which occurred on October 15, 1958, and an estimation of the total doses received by the operators were run by the MCNP5 computer code. The simulation was carried out under the same assumptions as those used in the 1960 IAEA-organized experimental simulation of the accident: total fission energy of 80 MJ released in the accident and the frozen positions of the operators. The time interval of exposure to high doses received by the operators has been estimated. Data on the RB1/1958 reactor core relevant to the accident are given. A short summary of the accident scenario has been updated. A 3-D model of the reactor room and the RB reactor tank, with all the details of the core, was created. For dose determination, 3-D simplified, homogenised, sexless and faceless phantoms, placed inside the reactor room, have been developed. The code was run for a number of neutron histories which have given a dose rate uncertainty of less than 2%. For the determination of radiation spectra escaping the reactor core and radiation interaction in the tissue of the phantoms, the MCNP5 code was run (in the KCODE option and "mode n p e") with 55-group neutron spectra, 35-group gamma ray spectra and 10-group electron spectra. The doses were determined by converting the flux density in the phantoms (obtained by the F4 tally) to doses using factors taken from ICRP-74, and from the deposited energy of neutrons and gamma rays (obtained by the F6 tally) in the phantoms' tissue. The time at which the operators sensed the odour of ozone is estimated for the first time and given in Appendix A.1. Calculated total absorbed and equivalent doses are compared to the previously reported ones and an attempt to understand and explain the reasons for the obtained differences has been made. A Root Cause Analysis of the accident was done and

  13. Approaches for delineating landslide hazard areas using receiver operating characteristic in an advanced calibrating precision soil erosion model

    Directory of Open Access Journals (Sweden)

    P. T. Ghazvinei

    2015-10-01

    Full Text Available Soil erosion is an undesirable natural event that causes land degradation and desertification. Identifying erosion-prone areas is a major component of preventive measures. Recent landslide damage in different regions led us to develop an erosion susceptibility map using an empirical method (RUSLE). A landslide-location map was established by interpreting satellite images. Field observation data were used to validate the intensity of soil erosion. Further, a correlation analysis was conducted using the Receiver Operating Characteristic and the frequency ratio. Results showed a satisfactory correlation between the RUSLE-based soil erosion map and the actual landslide distribution. The proposed model can effectively predict landslide events in soil-erosion areas. Such a reliable predictive model is an effective management facility for a regional landslide forecasting system.

  14. [Receiver operating characteristic analysis and the cost-benefit analysis in determination of the optimal cut-off point].

    Science.gov (United States)

    Vránová, J; Horák, J; Krátká, K; Hendrichová, M; Kovaírková, K

    2009-01-01

    An overview of the use of Receiver Operating Characteristic (ROC) analysis within medicine is provided. A survey of the theory behind the analysis is offered together with a presentation on how to create a ROC curve and how to use cost-benefit analysis to determine the optimal cut-off point or threshold. The use of ROC analysis is exemplified in the "Cost-Benefit analysis" section of the paper. In these examples, it can be seen that the determination of the optimal cut-off point is mainly influenced by the prevalence and the severity of the disease, by the risks and adverse events of treatment or the diagnostic testing, by the overall costs of treating true and false positives (TP and FP), and by the risk of deficient or non-treatment of false negative (FN) cases.
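
    The cost-benefit logic above can be sketched as choosing the operating point on an ROC curve that minimizes expected cost; the ROC points, prevalence and costs below are all invented:

```python
# Sketch: the optimal cut-off minimizes expected cost, which depends on
# prevalence and on the costs assigned to false negatives and false positives.
def expected_cost(sens, spec, prevalence, cost_fn, cost_fp):
    fn_rate = prevalence * (1 - sens)        # missed cases per subject
    fp_rate = (1 - prevalence) * (1 - spec)  # false alarms per subject
    return fn_rate * cost_fn + fp_rate * cost_fp

# (sensitivity, specificity) pairs along a hypothetical ROC curve
roc = [(0.99, 0.50), (0.90, 0.80), (0.75, 0.92), (0.60, 0.98)]

def best_operating_point(prevalence, cost_fn, cost_fp):
    return min(roc, key=lambda p: expected_cost(p[0], p[1], prevalence, cost_fn, cost_fp))

# A severe disease (missing a case is 50x worse) pushes the cut-off toward
# high sensitivity; equal costs favour high specificity.
print(best_operating_point(0.10, cost_fn=50, cost_fp=1))  # (0.99, 0.5)
print(best_operating_point(0.10, cost_fn=1, cost_fp=1))   # (0.6, 0.98)
```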

  15. Receiver-Operating-Characteristic Analysis Reveals Superiority of Scale-Dependent Wavelet and Spectral Measures for Assessing Cardiac Dysfunction

    CERN Document Server

    Thurner, S; Lowen, S B; Teich, M C; Thurner, Stefan; Feurstein, Markus C.; Lowen, Steven B.; Teich, Malvin C.

    1998-01-01

    Receiver-operating-characteristic (ROC) analysis was used to assess the suitability of various heart rate variability (HRV) measures for correctly classifying electrocardiogram records of varying lengths as normal or revealing the presence of heart failure. Scale-dependent HRV measures were found to be substantially superior to scale-independent measures (scaling exponents) for discriminating the two classes of data over a broad range of record lengths. The wavelet-coefficient standard deviation at a scale near 32 heartbeat intervals, and its spectral counterpart near 1/32 cycles per interval, provide reliable results using record lengths just minutes long. A jittered integrate-and-fire model built around a fractal Gaussian-noise kernel provides a realistic, though not perfect, simulation of heartbeat sequences.

  16. Experimental Design and Data Analysis in Receiver Operating Characteristic Studies: Lessons Learned from Reports in Radiology from 1997 to 2006

    Science.gov (United States)

    Shiraishi, Junji; Pesce, Lorenzo L.; Metz, Charles E.; Doi, Kunio

    2009-01-01

    Purpose: To provide a broad perspective concerning the recent use of receiver operating characteristic (ROC) analysis in medical imaging by reviewing ROC studies published in Radiology between 1997 and 2006 for experimental design, imaging modality, medical condition, and ROC paradigm. Materials and Methods: Two hundred ninety-five studies were obtained by conducting a literature search with PubMed with two criteria: publication in Radiology between 1997 and 2006 and occurrence of the phrase “receiver operating characteristic.” Studies returned by the query that were not diagnostic imaging procedure performance evaluations were excluded. Characteristics of the remaining studies were tabulated. Results: Two hundred thirty-three (79.0%) of the 295 studies reported findings based on observers' diagnostic judgments or objective measurements. Forty-three (14.6%) did not include human observers, with most of these reporting an evaluation of a computer-aided diagnosis system or functional data obtained with computed tomography (CT) or magnetic resonance (MR) imaging. The remaining 19 (6.4%) studies were classified as reviews or meta-analyses and were excluded from our subsequent analysis. Among the various imaging modalities, MR imaging (46.0%) and CT (25.7%) were investigated most frequently. Approximately 60% (144 of 233) of ROC studies with human observers published in Radiology included three or fewer observers. Conclusion: ROC analysis is widely used in radiologic research, confirming its fundamental role in assessing diagnostic performance. However, the ROC studies reported in Radiology were not always adequate to support clear and clinically relevant conclusions. © RSNA, 2009 Supplemental material: http://radiology.rsna.org/lookup/suppl/doi:10.1148/radiol.2533081632/-/DC1 PMID:19864510

  17. Bayesian deterministic decision making: A normative account of the operant matching law and heavy-tailed reward history dependency of choices

    Directory of Open Access Journals (Sweden)

    Hiroshi eSaito

    2014-03-01

    Full Text Available The decision-making behaviors of humans and animals adapt until they satisfy an "operant matching law" in certain types of tasks. This was first pointed out by Herrnstein in his foraging experiments on pigeons. The matching law has been one landmark for elucidating the processes underlying decision making and learning in the brain. An interesting question is whether decisions are made deterministically or probabilistically. Conventional learning models of the matching law are based on the latter idea; they assume that subjects learn the choice probabilities of the respective alternatives and decide stochastically according to those probabilities. However, it is unknown whether the matching law can be accounted for by a deterministic strategy. To answer this question, we propose several deterministic Bayesian decision-making models that hold certain incorrect beliefs about the environment. We claim that a simple model produces behavior satisfying the matching law in static settings of a foraging task but not in dynamic settings. We found that a model holding the belief that the environment is volatile works well in the dynamic foraging task and exhibits undermatching, a slight deviation from the matching law observed in many experiments. This model also demonstrates the double-exponential reward-history dependency of choices and a heavier-tailed run-length distribution, as has recently been reported in experiments on monkeys.
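
    The matching relation the abstract describes can be illustrated with a toy stochastic learner. This is not the paper's deterministic Bayesian model: it is a much-simplified sketch (function and parameter names are mine) in which each alternative's reward probability is tracked by an exponential moving average and choice fractions come to match the ratio of those estimates.

```python
import random

def simulate_matching(reward_probs, n_trials=100_000, lr=0.002, seed=1):
    """Toy learner (not the paper's Bayesian model): each alternative's
    estimated reward rate is an exponential moving average of observed
    rewards, and the probability of choosing an alternative is its share
    of the summed estimates, so choice fractions come to match the
    reward-probability ratio."""
    rng = random.Random(seed)
    est = [0.5, 0.5]        # initial reward-rate estimates
    counts = [0, 0]
    for _ in range(n_trials):
        p0 = est[0] / (est[0] + est[1])
        a = 0 if rng.random() < p0 else 1
        counts[a] += 1
        reward = 1.0 if rng.random() < reward_probs[a] else 0.0
        est[a] += lr * (reward - est[a])   # track reward probability
    return counts

counts = simulate_matching([0.4, 0.1])
# choice fraction for alternative 0 settles near 0.4 / (0.4 + 0.1) = 0.8
```

    With reward probabilities 0.4 and 0.1, the stationary point of the estimates is exactly those probabilities, so the long-run choice fraction for the richer alternative hovers around 0.8.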

  18. Bayesian programming

    CERN Document Server

    Bessiere, Pierre; Ahuactzin, Juan Manuel; Mekhnacha, Kamel

    2013-01-01

    Probability as an Alternative to Boolean Logic: While logic is the mathematical foundation of rational reasoning and the fundamental principle of computing, it is restricted to problems where information is both complete and certain. However, many real-world problems, from financial investments to email filtering, are incomplete or uncertain in nature. Probability theory and Bayesian computing together provide an alternative framework to deal with incomplete and uncertain data. Decision-Making Tools and Methods for Incomplete and Uncertain Data: Emphasizing probability as an alternative to Boolean

  19. SYNTHESIZED EXPECTED BAYESIAN METHOD OF PARAMETRIC ESTIMATE

    Institute of Scientific and Technical Information of China (English)

    Ming HAN; Yuanyao DING

    2004-01-01

    This paper develops a new method of parametric estimation, named the "synthesized expected Bayesian method". When samples of products are tested and no failure events occur, the definition of the expected Bayesian estimate is introduced and estimates of failure probability and failure rate are provided. After some failure information is introduced through an extra test, a synthesized expected Bayesian method is defined and used to estimate the failure probability, failure rate and other parameters of exponential and Weibull population distributions. Finally, calculations are performed on practical problems, showing that the synthesized expected Bayesian method is feasible and easy to operate.
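
    For intuition, here is one common variant of expected-Bayesian (E-Bayesian) estimation for the zero-failure case the abstract mentions. The specific prior choices (a Beta(1, b) prior on the failure probability p with a Uniform(0, c) hyperprior on b) and all names are my assumptions for illustration; the paper's synthesized method layers additional failure information on top of this.

```python
import math

def e_bayesian_zero_failure(n, c=5.0):
    """E-Bayesian estimate of failure probability when n units are tested
    with zero failures, assuming a Beta(1, b) prior on p and b ~ Uniform(0, c).
    For fixed b the posterior mean is 1 / (n + 1 + b); averaging over the
    hyperprior gives the closed form (1/c) * ln((n + 1 + c) / (n + 1))."""
    return math.log((n + 1 + c) / (n + 1)) / c

def e_bayesian_numeric(n, c=5.0, steps=100_000):
    """Midpoint-rule average of the fixed-b Bayes estimates, as a
    numerical cross-check of the closed form."""
    db = c / steps
    return sum(1.0 / (n + 1 + (i + 0.5) * db) for i in range(steps)) * db / c
```

    The estimate shrinks monotonically as the number of failure-free tests n grows, as one would expect of a reliability estimate.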

  20. Estimated cumulative radiation dose received by diagnostic imaging during staging and treatment of operable Ewing sarcoma 2005-2012

    Energy Technology Data Exchange (ETDEWEB)

    Johnsen, Boel [Haukeland University Hospital, Centre for Nuclear Medicine and PET, Department of Radiology, P.O. Box 1400, Bergen (Norway); Fasmer, Kristine Eldevik [Haukeland University Hospital, Department of Oncology, Medical Physics Section, Bergen (Norway); Boye, Kjetil [Norwegian Radium Hospital, Oslo University Hospital, Department of Oncology, Oslo (Norway); Rosendahl, Karen; Aukland, Stein Magnus [Haukeland University Hospital, Department of Radiology, Paediatric Section, Bergen (Norway); University of Bergen, Department of Clinical Medicine, Bergen (Norway); Trovik, Clement [University of Bergen, Department of Clinical Medicine, Bergen (Norway); Haukeland University Hospital, Department of Surgery, Orthopaedic Section, Bergen (Norway); Biermann, Martin [Haukeland University Hospital, Centre for Nuclear Medicine and PET, Department of Radiology, P.O. Box 1400, Bergen (Norway); University of Bergen, Department of Clinical Medicine, Bergen (Norway)

    2017-01-15

    Patients with Ewing sarcoma are subject to various diagnostic procedures that incur exposure to ionising radiation. The aims were to estimate the radiation doses received from all radiologic and nuclear imaging episodes during diagnosis and treatment, and to determine whether {sup 18}F-fluorodeoxyglucose positron emission tomography - computed tomography ({sup 18}F-FDG PET-CT) is a major contributor of radiation. Twenty Ewing sarcoma patients diagnosed in Norway in 2005-2012 met the inclusion criteria (age <30 years, operable disease, uncomplicated chemotherapy and surgery, no metastasis or residual disease within a year of diagnosis). Radiation doses from all imaging during the first year were calculated for each patient. The mean estimated cumulative radiation dose for all patients was 34 mSv (range: 6-70), radiography accounting for 3 mSv (range: 0.2-12), CT for 13 mSv (range: 2-28) and nuclear medicine for 18 mSv (range: 2-47). For the patients examined with PET-CT, the mean estimated cumulative effective dose was 38 mSv, of which PET-CT accounted for 14 mSv (37%). There was large variation in the number and type of examinations performed and also in the estimated cumulative radiation dose. The mean radiation dose for patients examined with PET-CT was 23% higher than for patients not examined with PET-CT. (orig.)

  1. Receiver-operating characteristic curves for somatic cell scores and California mastitis test in Valle del Belice dairy sheep.

    Science.gov (United States)

    Riggio, Valentina; Pesce, Lorenzo L; Morreale, Salvatore; Portolano, Baldassare

    2013-06-01

    Using receiver-operating characteristic (ROC) curve methodology, this study was designed to assess the diagnostic effectiveness of somatic cell count (SCC) and the California mastitis test (CMT) in Valle del Belice sheep, and to propose and evaluate threshold values for those tests that would optimally discriminate between healthy and infected udders. Milk samples (n=1357) were collected from 684 sheep in four flocks. The prevalence of infection, as determined by positive bacterial culture, was 0.36; 87.7% of the isolates were minor and 12.3% major pathogens. Of the culture-negative samples, 83.7% had an SCC… When CMT results were evaluated, the estimated area under the ROC curve was greater for glands infected with major compared to minor pathogens (0.88 vs. 0.73), whereas the area under the curve considering all pathogens was similar to that for minor pathogens (0.75). The estimated optimal thresholds were 3.00 (CMT), 2.81 (SCS for the whole sample), 2.81 (SCS for minor pathogens), and 3.33 (SCS for major pathogens). These correctly classified, respectively, 69.0%, 73.5%, 72.6% and 91.0% of infected udders in the samples. The CMT appeared only to discriminate udders infected with major pathogens. In this population, SCS appeared to be the best indirect test of the bacteriological status of the udder. Copyright © 2012 Elsevier Ltd. All rights reserved.
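
    The two computations behind a study like this, an empirical AUC and an "optimal" cutoff, are compact enough to sketch. The data below are synthetic somatic-cell-score-like values of my own invention, not the sheep data; the cutoff rule shown is Youden's J, one standard choice for "optimally discriminating" thresholds.

```python
def empirical_auc(pos, neg):
    """Area under the empirical ROC curve via the Mann-Whitney statistic:
    the probability that a random infected score exceeds a random healthy
    score, with ties counting one half."""
    wins = sum(1.0 if p > q else 0.5 if p == q else 0.0
               for p in pos for q in neg)
    return wins / (len(pos) * len(neg))

def youden_threshold(pos, neg):
    """Cutoff t (test positive when score >= t) maximizing Youden's
    J = Se + Sp - 1 over all observed score values."""
    best_t, best_j = None, -1.0
    for t in sorted(set(pos) | set(neg)):
        se = sum(p >= t for p in pos) / len(pos)
        sp = sum(q < t for q in neg) / len(neg)
        if se + sp - 1 > best_j:
            best_t, best_j = t, se + sp - 1
    return best_t, best_j

# illustrative scores only (not from the study above)
infected = [3.1, 3.5, 2.8, 4.0, 3.3]
healthy  = [2.0, 2.5, 2.9, 1.8, 2.4]
auc = empirical_auc(infected, healthy)
cutoff, j = youden_threshold(infected, healthy)
```

    On these toy scores the AUC is 0.96, and Youden's index selects a cutoff between the bulk of the healthy and infected score distributions.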

  2. Diagnostic accuracy of serum biochemical fibrosis markers in children with chronic hepatitis B evaluated by receiver operating characteristics analysis.

    Science.gov (United States)

    Lebensztejn, Dariusz Marek; Skiba, Elzbieta; Tobolczyk, Jolanta; Sobaniec-Lotowska, Maria Elzbieta; Kaczmarski, Maciej

    2005-12-07

    To investigate the diagnostic accuracy of potent serum biochemical fibrosis markers in children with chronic hepatitis B evaluated by receiver operating characteristics (ROC) analysis. We determined the serum level of apolipoprotein A-I (APO A-I), haptoglobin (HPT) and a-2 macroglobulin (A2M) with an automatic nephelometer in 63 children (age range 4-17 years, mean 10 years) with biopsy-verified chronic HBeAg-positive hepatitis B. Fibrosis stage and inflammation grade were assessed in a blinded fashion according to Batts and Ludwig. We defined mild liver fibrosis as a score < or =2 and advanced fibrosis as a score equal to 3. ROC analysis was used to calculate the power of the assays to detect advanced liver fibrosis (AccuROC, Canada). Serum concentrations of APO A-I, HPT and A2M were not significantly different in patients with chronic hepatitis B compared to controls. However, APO A-I level of 1.19 ng/L had a sensitivity of 85.7% and a specificity of 60.7% (AUC = 0.7117, P = 0.035) to predict advanced fibrosis. All other serum biochemical markers and their combination did not allow a useful prediction. None of these markers was a good predictor of histologic inflammation. Apolipoprotein A-I may be a suitable serum marker to predict advanced liver fibrosis in children with chronic hepatitis B.

  3. Diagnostic accuracy of serum biochemical fibrosis markers in children with chronic hepatitis B evaluated by receiver operating characteristics analysis

    Institute of Scientific and Technical Information of China (English)

    Dariusz Marek Lebensztejn; Elżbieta Skiba; Jolanta Tobolczyk; Maria Elżbieta Sobaniec-Lotowska; Maciej Kaczmarski

    2005-01-01

    AIM: To investigate the diagnostic accuracy of potent serum biochemical fibrosis markers in children with chronic hepatitis B evaluated by receiver operating characteristics (ROC) analysis. METHODS: We determined the serum level of apolipoprotein A-I (APO A-I), haptoglobin (HPT) and a-2 macroglobulin (A2M) with an automatic nephelometer in 63 children (age range 4-17 years, mean 10 years) with biopsy-verified chronic HBeAg-positive hepatitis B. Fibrosis stage and inflammation grade were assessed in a blinded fashion according to Batts and Ludwig. We defined mild liver fibrosis as a score ≤2 and advanced fibrosis as a score equal to 3. ROC analysis was used to calculate the power of the assays to detect advanced liver fibrosis (AccuROC, Canada). RESULTS: Serum concentrations of APO A-I, HPT and A2M were not significantly different in patients with chronic hepatitis B compared to controls. However, an APO A-I level of 1.19 ng/L had a sensitivity of 85.7% and a specificity of 60.7% (AUC = 0.7117, P = 0.035) to predict advanced fibrosis. All other serum biochemical markers and their combination did not allow a useful prediction. None of these markers was a good predictor of histologic inflammation. CONCLUSION: Apolipoprotein A-I may be a suitable serum marker to predict advanced liver fibrosis in children with chronic hepatitis B.

  4. Summary receiver operating characteristics (SROC) and hierarchical SROC models for analysis of diagnostic test evaluations of antibody ELISAs for paratuberculosis.

    Science.gov (United States)

    Toft, Nils; Nielsen, Søren S

    2009-11-15

    Critical, systematic reviews of available diagnostic test evaluations are a meticulous approach to synthesizing evidence about a diagnostic test. Often, however, such reviews find that data quality is poor due to deficiencies in the design and reporting of the test evaluations, and formal statistical comparisons are discouraged. Even when only simple summary measures are appropriate, the strong correlation between sensitivity and specificity and their dependence on differences in diagnostic threshold across studies create the need for tools to summarise the properties of the diagnostic test under investigation. This study presents summary receiver operating characteristic (SROC) analysis as a means to synthesize information from diagnostic test evaluation studies. Using data from a review of diagnostic tests for ante mortem diagnosis of paratuberculosis as an illustration, SROC and hierarchical SROC (HSROC) analyses were used to estimate the overall diagnostic accuracy of antibody ELISAs for bovine paratuberculosis while accounting for covariates: the target condition (infectious or infected) used in the test evaluation (one for the evaluation of Se and one for Sp), and the type of test (serum vs. milk). The methods gave comparable results (regarding the estimated diagnostic log odds ratio), considering the small sample size and the quality of the data. The SROC analysis found a difference in the performance of tests when the target condition for evaluation of Se was infected rather than infectious, suggesting that ELISAs are not suitable for detecting infected cattle. However, the SROC model does not take differences in sample size between study units into account, whereas the HSROC allows for both between- and within-study variation. Considering the small sample size, more credibility should be given to the results of the HSROC. For both methods the area under the (H)SROC curve was calculated and the results were comparable. 
The conclusion is that while the SROC is simpler and easier
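
    The classical SROC approach the abstract refers to is the Moses-Littenberg regression, which can be sketched in a few lines. This is a simplification (it ignores within-study sample sizes, exactly the limitation the abstract raises, and is not the hierarchical HSROC model); all names and the synthetic study points are mine.

```python
import math

def logit(p):
    return math.log(p / (1 - p))

def inv_logit(x):
    return 1 / (1 + math.exp(-x))

def moses_littenberg(studies):
    """Fit the Moses-Littenberg SROC line D = a + b*S by ordinary least
    squares, where per study D = logit(Se) - logit(FPR) is the log
    diagnostic odds ratio and S = logit(Se) + logit(FPR) tracks the
    study's diagnostic threshold. `studies` is a list of (Se, Sp) pairs."""
    D = [logit(se) - logit(1 - sp) for se, sp in studies]
    S = [logit(se) + logit(1 - sp) for se, sp in studies]
    mS, mD = sum(S) / len(S), sum(D) / len(D)
    b = (sum((s - mS) * (d - mD) for s, d in zip(S, D))
         / sum((s - mS) ** 2 for s in S))
    a = mD - b * mS
    return a, b

def sroc_se(a, b, fpr):
    """Expected sensitivity at a given false-positive rate on the fitted
    SROC curve, solved from D = a + b*S."""
    v = logit(fpr)
    return inv_logit((a + (1 + b) * v) / (1 - b))
```

    As a sanity check, study points generated from a curve with constant log diagnostic odds ratio (a = 2, b = 0) are recovered exactly by the fit.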

  5. BNFinder2: Faster Bayesian network learning and Bayesian classification.

    Science.gov (United States)

    Dojer, Norbert; Bednarz, Pawel; Podsiadlo, Agnieszka; Wilczynski, Bartek

    2013-08-15

    Bayesian Networks (BNs) are versatile probabilistic models applicable to many different biological phenomena. In biological applications the structure of the network is usually unknown and needs to be inferred from experimental data. BNFinder is a fast software implementation of an exact algorithm for finding the optimal structure of the network given a number of experimental observations. Its second version, presented in this article, represents a major improvement over the previous version. The improvements include (i) a parallelized learning algorithm leading to an order of magnitude speed-ups in BN structure learning time; (ii) inclusion of an additional scoring function based on mutual information criteria; (iii) possibility of choosing the resulting network specificity based on statistical criteria and (iv) a new module for classification by BNs, including a cross-validation scheme and classifier quality measurements with receiver operating characteristic scores. BNFinder2 is implemented in Python and freely available under the GNU general public license at the project Web site https://launchpad.net/bnfinder, together with a user's manual, introductory tutorial and supplementary methods.

  6. Introduction to Bayesian statistics

    CERN Document Server

    Bolstad, William M

    2017-01-01

    There is a strong upsurge in the use of Bayesian methods in applied statistical analysis, yet most introductory statistics texts only present frequentist methods. Bayesian statistics has many important advantages that students should learn about if they are going into fields where statistics will be used. In this Third Edition, four newly-added chapters address topics that reflect the rapid advances in the field of Bayesian statistics. The author continues to provide a Bayesian treatment of introductory statistical topics, such as scientific data gathering, discrete random variables, robust Bayesian methods, and Bayesian approaches to inference for discrete random variables, binomial proportion, Poisson, normal mean, and simple linear regression. In addition, newly-developing topics in the field are presented in four new chapters: Bayesian inference with unknown mean and variance; Bayesian inference for Multivariate Normal mean vector; Bayesian inference for Multiple Linear Regression Model; and Computati...
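
    Of the introductory topics listed, inference for a binomial proportion has the simplest worked form: a conjugate Beta prior updated by observed counts. The numbers below are an arbitrary illustration, not an example from the book.

```python
from fractions import Fraction

def update_beta(a, b, successes, failures):
    """Conjugate Bayesian update for a binomial proportion:
    Beta(a, b) prior plus observed data gives a Beta(a + s, b + f)
    posterior."""
    return a + successes, b + failures

# uniform Beta(1, 1) prior, then 7 successes in 10 trials
a, b = update_beta(1, 1, 7, 3)
posterior_mean = Fraction(a, a + b)   # (1 + 7) / (2 + 10) = 2/3
prior_mean = Fraction(1, 2)
mle = Fraction(7, 10)
```

    Note the characteristic shrinkage: the posterior mean 2/3 lies between the prior mean 1/2 and the maximum-likelihood estimate 7/10.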

  7. Machine learning-based receiver operating characteristic (ROC) curves for crisp and fuzzy classification of DNA microarrays in cancer research.

    Science.gov (United States)

    Peterson, Leif E; Coleman, Matthew A

    2008-01-01

    Receiver operating characteristic (ROC) curves were generated to obtain classification area under the curve (AUC) as a function of feature standardization, fuzzification, and sample size from nine large sets of cancer-related DNA microarrays. Classifiers used included k nearest neighbor (kNN), naïve Bayes classifier (NBC), linear discriminant analysis (LDA), quadratic discriminant analysis (QDA), learning vector quantization (LVQ1), logistic regression (LOG), polytomous logistic regression (PLOG), artificial neural networks (ANN), particle swarm optimization (PSO), constricted particle swarm optimization (CPSO), kernel regression (RBF), radial basis function networks (RBFN), gradient descent support vector machines (SVMGD), and least squares support vector machines (SVMLS). For each data set, AUC was determined for a number of combinations of sample size, total sum[-log(p)] of feature t-tests, with and without feature standardization and with (fuzzy) and without (crisp) fuzzification of features. Altogether, a total of 2,123,530 classification runs were made. At the greatest level of sample size, ANN resulted in a fitted AUC of 90%, while PSO resulted in the lowest fitted AUC of 72.1%. AUC values derived from 4NN were the most dependent on sample size, while PSO was the least. ANN depended the most on total statistical significance of features used based on sum[-log(p)], whereas PSO was the least dependent. Standardization of features increased AUC by 8.1% for PSO and -0.2% for QDA, while fuzzification increased AUC by 9.4% for PSO and reduced AUC by 3.8% for QDA. AUC determination in planned microarray experiments without standardization and fuzzification of features will benefit the most if CPSO is used for lower levels of feature significance (i.e., sum[-log(p)] ~ 50) and ANN is used for greater levels of significance (i.e., sum[-log(p)] ~ 500). When only standardization of features is performed, studies are likely to benefit most by using CPSO for low levels

  8. Bayesian Methods and Universal Darwinism

    CERN Document Server

    Campbell, John

    2010-01-01

    Bayesian methods since the time of Laplace have been understood by their practitioners as closely aligned to the scientific method. Indeed a recent champion of Bayesian methods, E. T. Jaynes, titled his textbook on the subject Probability Theory: the Logic of Science. Many philosophers of science including Karl Popper and Donald Campbell have interpreted the evolution of Science as a Darwinian process consisting of a 'copy with selective retention' algorithm abstracted from Darwin's theory of Natural Selection. Arguments are presented for an isomorphism between Bayesian Methods and Darwinian processes. Universal Darwinism, as the term has been developed by Richard Dawkins, Daniel Dennett and Susan Blackmore, is the collection of scientific theories which explain the creation and evolution of their subject matter as due to the operation of Darwinian processes. These subject matters span the fields of atomic physics, chemistry, biology and the social sciences. The principle of Maximum Entropy states that system...

  9. Bayesian artificial intelligence

    CERN Document Server

    Korb, Kevin B

    2010-01-01

    Updated and expanded, Bayesian Artificial Intelligence, Second Edition provides a practical and accessible introduction to the main concepts, foundation, and applications of Bayesian networks. It focuses on both the causal discovery of networks and Bayesian inference procedures. Adopting a causal interpretation of Bayesian networks, the authors discuss the use of Bayesian networks for causal modeling. They also draw on their own applied research to illustrate various applications of the technology.New to the Second EditionNew chapter on Bayesian network classifiersNew section on object-oriente

  10. Bayesian artificial intelligence

    CERN Document Server

    Korb, Kevin B

    2003-01-01

    As the power of Bayesian techniques has become more fully realized, the field of artificial intelligence has embraced Bayesian methodology and integrated it to the point where an introduction to Bayesian techniques is now a core course in many computer science programs. Unlike other books on the subject, Bayesian Artificial Intelligence keeps mathematical detail to a minimum and covers a broad range of topics. The authors integrate Bayesian network technology and Bayesian network learning, applying both to knowledge engineering. They emphasize understanding and intuition but also provide the algorithms and technical background needed for applications. Software, exercises, and solutions are available on the authors' website.

  11. Bayesian Graphical Models

    DEFF Research Database (Denmark)

    Jensen, Finn Verner; Nielsen, Thomas Dyhre

    2016-01-01

    Mathematically, a Bayesian graphical model is a compact representation of the joint probability distribution for a set of variables. The most frequently used type of Bayesian graphical models are Bayesian networks. The structural part of a Bayesian graphical model is a graph consisting of nodes...... is largely due to the availability of efficient inference algorithms for answering probabilistic queries about the states of the variables in the network. Furthermore, to support the construction of Bayesian network models, learning algorithms are also available. We give an overview of the Bayesian network...
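
    The compact representation described here is the factorization of the joint distribution into each node's conditional distribution given its parents. A minimal sketch for a three-node chain A -> B -> C, with arbitrary illustrative probabilities of my own choosing, also shows the kind of probabilistic query such networks answer by enumeration.

```python
from itertools import product

# Chain network A -> B -> C with arbitrary illustrative probabilities.
P_A = {True: 0.3, False: 0.7}
P_B_given_A = {True: {True: 0.9, False: 0.1},
               False: {True: 0.2, False: 0.8}}
P_C_given_B = {True: {True: 0.75, False: 0.25},
               False: {True: 0.05, False: 0.95}}

def joint(a, b, c):
    """P(A, B, C) = P(A) * P(B | A) * P(C | B): the product of each
    node's conditional distribution given its parents."""
    return P_A[a] * P_B_given_A[a][b] * P_C_given_B[b][c]

# A probabilistic query by brute-force enumeration: P(A=True | C=True).
num = sum(joint(True, b, True) for b in (True, False))
den = sum(joint(a, b, True) for a, b in product((True, False), repeat=2))
posterior = num / den
```

    Real inference algorithms exploit the graph structure instead of enumerating all states, but on a chain this small the brute-force sum makes the semantics transparent.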

  12. Receiver-operating characteristic curves and likelihood ratios: improvements over traditional methods for the evaluation and application of veterinary clinical pathology tests

    DEFF Research Database (Denmark)

    Gardner, Ian A.; Greiner, Matthias

    2006-01-01

    Receiver-operating characteristic (ROC) curves provide a cutoff-independent method for the evaluation of continuous or ordinal tests used in clinical pathology laboratories. The area under the curve is a useful overall measure of test accuracy and can be used to compare different tests (or differ...
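
    The likelihood ratios this record pairs with ROC analysis have standard closed forms, and applying one to a patient is Bayes' theorem in odds form. The sensitivity, specificity, and pretest probability below are illustrative values I chose, not figures from the article.

```python
def likelihood_ratios(se, sp):
    """Positive and negative likelihood ratios:
    LR+ = Se / (1 - Sp), LR- = (1 - Se) / Sp."""
    return se / (1 - sp), (1 - se) / sp

def post_test_probability(pretest, lr):
    """Convert the pretest probability to odds, multiply by the
    likelihood ratio, and convert back to a probability."""
    odds = pretest / (1 - pretest) * lr
    return odds / (1 + odds)

# illustrative values only
lr_pos, lr_neg = likelihood_ratios(se=0.90, sp=0.80)    # 4.5 and 0.125
p_after_positive = post_test_probability(0.20, lr_pos)  # ~0.53
```

    A test with LR+ of 4.5 lifts a 20% pretest probability to roughly 53%, which is why likelihood ratios are promoted as a cutoff-aware way to apply test results to individual animals.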

  13. Summary receiver operating characteristic curves as a technique for meta-analysis of the diagnostic performance of duplex ultrasonography in peripheral arterial disease

    NARCIS (Netherlands)

    deVries, SO; Hunink, MGM; Polak, JF

    1996-01-01

    Rationale and Objectives. We summarized and compared the diagnostic performance of duplex and color-guided duplex ultrasonography in the evaluation of peripheral arterial disease. We present our research as an example of the use of summary receiver operating characteristic (ROC) curves in a meta-ana

  14. Applied Bayesian Hierarchical Methods

    CERN Document Server

    Congdon, Peter D

    2010-01-01

    Bayesian methods facilitate the analysis of complex models and data structures. Emphasizing data applications, alternative modeling specifications, and computer implementation, this book provides a practical overview of methods for Bayesian analysis of hierarchical models.

  15. Bayesian data analysis

    CERN Document Server

    Gelman, Andrew; Stern, Hal S; Dunson, David B; Vehtari, Aki; Rubin, Donald B

    2013-01-01

    FUNDAMENTALS OF BAYESIAN INFERENCEProbability and InferenceSingle-Parameter Models Introduction to Multiparameter Models Asymptotics and Connections to Non-Bayesian ApproachesHierarchical ModelsFUNDAMENTALS OF BAYESIAN DATA ANALYSISModel Checking Evaluating, Comparing, and Expanding ModelsModeling Accounting for Data Collection Decision AnalysisADVANCED COMPUTATION Introduction to Bayesian Computation Basics of Markov Chain Simulation Computationally Efficient Markov Chain Simulation Modal and Distributional ApproximationsREGRESSION MODELS Introduction to Regression Models Hierarchical Linear

  16. Bayesian Persuasion

    OpenAIRE

    Emir Kamenica; Matthew Gentzkow

    2009-01-01

    When is it possible for one person to persuade another to change her action? We take a mechanism design approach to this question. Taking preferences and initial beliefs as given, we introduce the notion of a persuasion mechanism: a game between Sender and Receiver defined by an information structure and a message technology. We derive necessary and sufficient conditions for the existence of a persuasion mechanism that strictly benefits Sender. We characterize the optimal mechanism. Finally, ...

  17. [Meta analysis of the use of Bayesian networks in breast cancer diagnosis].

    Science.gov (United States)

    Simões, Priscyla Waleska; Silva, Geraldo Doneda da; Moretti, Gustavo Pasquali; Simon, Carla Sasso; Winnikow, Erik Paul; Nassar, Silvia Modesto; Medeiros, Lidia Rosi; Rosa, Maria Inês

    2015-01-01

    The aim of this study was to determine the accuracy of Bayesian networks in supporting breast cancer diagnoses. A systematic review and meta-analysis were carried out, including articles and papers published between January 1990 and March 2013. We included prospective and retrospective cross-sectional studies of the accuracy of diagnoses of breast lesions (target conditions) made using Bayesian networks (index test). Four primary studies that included 1,223 breast lesions were analyzed; 89.52% (444/496) of the breast cancer cases and 6.33% (46/727) of the benign lesions were positive based on the Bayesian network analysis. The area under the curve (AUC) for the summary receiver operating characteristic curve (SROC) was 0.97, with a Q* value of 0.92. Using Bayesian networks to diagnose malignant lesions increased the probability of a true positive from a pretest value of 40.03% to 90.05% and decreased the probability of a false negative to 6.44%. Therefore, our results demonstrated that Bayesian networks provide an accurate and non-invasive method to support breast cancer diagnosis.
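
    The pretest-to-posttest figures quoted above can be roughly checked by applying Bayes' theorem directly to the reported counts. This is only a back-of-envelope sketch: the raw-count calculation lands near, but not exactly on, the published 90.05% and 6.44%, which presumably came from the pooled SROC estimates rather than simple count ratios.

```python
# Counts reported in the meta-analysis record above.
tp, cancers = 444, 496     # BN-positive among malignant lesions
fp, benign = 46, 727       # BN-positive among benign lesions

se = tp / cancers                    # sensitivity, ~0.895
fpr = fp / benign                    # false-positive rate, ~0.063
prev = cancers / (cancers + benign)  # pretest probability, ~0.406

# Bayes' theorem: probability of malignancy given a positive BN output.
ppv = prev * se / (prev * se + (1 - prev) * fpr)
# Probability of malignancy given a negative BN output (false negative).
p_fn = prev * (1 - se) / (prev * (1 - se) + (1 - prev) * (1 - fpr))
```

    With these counts, a positive network output raises the probability of malignancy to roughly 0.91 and a negative output leaves roughly 0.07 residual risk, close to the values the abstract reports.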

  18. Space Shuttle RTOS Bayesian Network

    Science.gov (United States)

    Morris, A. Terry; Beling, Peter A.

    2001-01-01

    With shrinking budgets and the requirements to increase reliability and operational life of the existing orbiter fleet, NASA has proposed various upgrades for the Space Shuttle that are consistent with national space policy. The cockpit avionics upgrade (CAU), a high priority item, has been selected as the next major upgrade. The primary functions of cockpit avionics include flight control, guidance and navigation, communication, and orbiter landing support. Secondary functions include the provision of operational services for non-avionics systems such as data handling for the payloads and caution and warning alerts to the crew. Recently, a process to select the optimal commercial-off-the-shelf (COTS) real-time operating system (RTOS) for the CAU was conducted by United Space Alliance (USA) Corporation, which is a joint venture between Boeing and Lockheed Martin, the prime contractor for space shuttle operations. In order to independently assess the RTOS selection, NASA has used the Bayesian network-based scoring methodology described in this paper. Our two-stage methodology addresses the issue of RTOS acceptability by incorporating functional, performance and non-functional software measures related to reliability, interoperability, certifiability, efficiency, correctness, business, legal, product history, cost and life cycle. The first stage of the methodology involves obtaining scores for the various measures using a Bayesian network. The Bayesian network incorporates the causal relationships between the various and often competing measures of interest while also assisting the inherently complex decision analysis process with its ability to reason under uncertainty. The structure and selection of prior probabilities for the network are extracted from experts in the field of real-time operating systems. Scores for the various measures are computed using Bayesian probability. 
In the second stage, multi-criteria trade-off analyses are performed between the scores

  19. Learning Bayesian networks for discrete data

    KAUST Repository

    Liang, Faming

    2009-02-01

    Bayesian networks have received much attention in the recent literature. In this article, we propose an approach to learn Bayesian networks using the stochastic approximation Monte Carlo (SAMC) algorithm. Our approach has two nice features. Firstly, it possesses a self-adjusting mechanism and thus essentially avoids the local-trap problem suffered by conventional MCMC simulation-based approaches to learning Bayesian networks. Secondly, it falls into the class of dynamic importance sampling algorithms; the network features can be inferred by dynamically weighted averaging of the samples generated in the learning process, and the resulting estimates can have much lower variation than single model-based estimates. The numerical results indicate that our approach can mix much faster over the space of Bayesian networks than the conventional MCMC simulation-based approaches. © 2008 Elsevier B.V. All rights reserved.

  20. Bayesian Data-Model Fit Assessment for Structural Equation Modeling

    Science.gov (United States)

    Levy, Roy

    2011-01-01

    Bayesian approaches to modeling are receiving an increasing amount of attention in the areas of model construction and estimation in factor analysis, structural equation modeling (SEM), and related latent variable models. However, model diagnostics and model criticism remain relatively understudied aspects of Bayesian SEM. This article describes…

  1. Bayesian Games with Intentions

    Directory of Open Access Journals (Sweden)

    Adam Bjorndahl

    2016-06-01

    Full Text Available We show that standard Bayesian games cannot represent the full spectrum of belief-dependent preferences. However, by introducing a fundamental distinction between intended and actual strategies, we remove this limitation. We define Bayesian games with intentions, generalizing both Bayesian games and psychological games, and prove that Nash equilibria in psychological games correspond to a special class of equilibria as defined in our setting.

  2. Picturing classical and quantum Bayesian inference

    CERN Document Server

    Coecke, Bob

    2011-01-01

    We introduce a graphical framework for Bayesian inference that is sufficiently general to accommodate not just the standard case but also recent proposals for a theory of quantum Bayesian inference wherein one considers density operators rather than probability distributions as representative of degrees of belief. The diagrammatic framework is stated in the graphical language of symmetric monoidal categories and of compact structures and Frobenius structures therein, in which Bayesian inversion boils down to transposition with respect to an appropriate compact structure. We characterize classical Bayesian inference in terms of a graphical property and demonstrate that our approach eliminates some purely conventional elements that appear in common representations thereof, such as whether degrees of belief are represented by probabilities or entropic quantities. We also introduce a quantum-like calculus wherein the Frobenius structure is noncommutative and show that it can accommodate Leifer's calculus of `cond...

  3. Understanding Computational Bayesian Statistics

    CERN Document Server

    Bolstad, William M

    2011-01-01

    A hands-on introduction to computational statistics from a Bayesian point of view Providing a solid grounding in statistics while uniquely covering the topics from a Bayesian perspective, Understanding Computational Bayesian Statistics successfully guides readers through this new, cutting-edge approach. With its hands-on treatment of the topic, the book shows how samples can be drawn from the posterior distribution when the formula giving its shape is all that is known, and how Bayesian inferences can be based on these samples from the posterior. These ideas are illustrated on common statistic

  4. Bayesian statistics an introduction

    CERN Document Server

    Lee, Peter M

    2012-01-01

    Bayesian Statistics is the school of thought that combines prior beliefs with the likelihood of a hypothesis to arrive at posterior beliefs. The first edition of Peter Lee’s book appeared in 1989, but the subject has moved ever onwards, with increasing emphasis on Monte Carlo based techniques. This new fourth edition looks at recent techniques such as variational methods, Bayesian importance sampling, approximate Bayesian computation and Reversible Jump Markov Chain Monte Carlo (RJMCMC), providing a concise account of the way in which the Bayesian approach to statistics develops as wel

  5. The commercial cycle from the viewpoint of operant behavioral economics: effects of price discounts on revenues received from services

    Directory of Open Access Journals (Sweden)

    Rafael Barreiros Porto

    Full Text Available The relationship between supply and demand generates commercial cycles. Operant behavioral economics explains that these cycles are shaped by three-term bilateral contingencies: situations that create supply and demand responses which, in turn, generate reinforcing or punitive consequences that can maintain or mitigate them. This research shows how the commercial cycle of a company occurs and investigates how price discounts affect basic and differentiated service revenues according to seasonality. Based on a longitudinal design, two time-series analyses were performed using the ARIMA model, while another was carried out using Generalized Estimating Equations divided into seasonal combinations. The results show, among other things, (1) that a company handles most of the marketing context strategies and programmed consequences of services used by consumers, creating a new commercial situation for the company; (2) that the effects of price discounts on sophisticated services have a positive impact and produce higher revenues during the low season, while those related to basic services have a greater impact and produce greater revenue during the high season; and (3) that the seasonality of the greatest purchasing intensity exerts a more positive influence on revenues than the seasonality of demand characterized by heterogeneous reinforcements. These findings are useful for administering price discounts to generate maximum revenue and make it possible to better understand how the commercial cycle of a company functions.

  6. Coupled optical/thermal/fluid analysis and design requirements for operation and testing of a supercritical CO2 solar receiver.

    Energy Technology Data Exchange (ETDEWEB)

    Khivsara, Sagar [Indian Institute of Science, Bangalore (India)]

    2015-01-01

    Recent studies have found closed-loop supercritical carbon dioxide (s-CO2) Brayton cycles to be higher energy-density systems than conventional superheated-steam Rankine systems. At turbine inlet conditions of 923 K and 25 MPa, high thermal efficiency (~50%) can be achieved. Achieving these high efficiencies will make concentrating solar power (CSP) technologies a competitive alternative to current power generation methods. To incorporate an s-CO2 Brayton power cycle in a solar power tower system, the development of a solar receiver capable of providing an outlet temperature of 923 K (at 25 MPa) is necessary. To satisfy the temperature requirements of an s-CO2 Brayton cycle with recuperation and recompression, the s-CO2 must be heated by ~200 K as it passes through the solar receiver. Our objective was to develop an optical-thermal-fluid model to design and evaluate a tubular receiver that will receive a heat input of ~1 MWth from a heliostat field. We also undertook the documentation of design requirements for the development, testing and safe operation of a direct s-CO2 solar receiver. The main purpose of this document is to serve as a reference and guideline for design and testing requirements, as well as to address the technical challenges and provide initial parameters for the computational models that will be employed for the development of s-CO2 receivers.
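
    The flow rate implied by these numbers can be checked with a first-order heat balance (illustrative only; the assumed specific heat is a rough placeholder, since cp for s-CO2 near these conditions varies strongly with pressure and temperature):

```python
# First-order energy balance for the receiver: Q = m_dot * cp * dT.
Q = 1.0e6    # heat input from the heliostat field, W (~1 MWth, from the abstract)
dT = 200.0   # required temperature rise of the s-CO2, K (from the abstract)
cp = 1200.0  # assumed mean specific heat of s-CO2, J/(kg*K) -- rough placeholder

m_dot = Q / (cp * dT)  # required mass flow rate through the receiver, kg/s
print(f"required mass flow ~ {m_dot:.1f} kg/s")  # ~4.2 kg/s under these assumptions
```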

  7. Evaluation of hemoglobin performance in the assessment of iron stores in feto-maternal pairs in a high-risk population: receiver operating characteristic curve analysis

    Directory of Open Access Journals (Sweden)

    José Carlos Jaime-Pérez

    2015-06-01

    Full Text Available Objective: By applying receiver operating characteristic curve analysis, the objective of this study was to see whether hemoglobin levels reflect body iron stores in a group of pregnant women at term who, by using serum ferritin as the reference test, had a high pre-test probability of having iron deficiency anemia. Likewise, we evaluated the ability of hemoglobin and maternal serum ferritin levels to predict iron deficiency anemia in newborns. Methods: Hemoglobin and serum ferritin were measured in 187 pregnant women at term belonging to a group with a high pre-test probability of iron deficiency anemia and their newborns. Women with Hb <11.0 g/dL and newborns with cord Hb <13.0 g/dL were classified as anemic. A serum ferritin <12.0 µg/L in women and a cord blood serum ferritin <35.0 µg/L were considered to reflect empty iron stores. Receiver operating characteristic curve analysis was applied to select the cut-off points that better reflected iron stores. Results: The Hb cut-off point selected by receiver operating characteristic curve analysis in women was <11.5 g/dL (sensitivity: 60.82%, specificity: 53.33%, Youden Index: 0.450). Most of the newborns had normal Hb, which precluded this analysis. Maternal Hb <11.0 g/dL was the cut-off point that best reflected iron deficiency anemia in newborns (sensitivity: 55.88%, specificity: 57.24%, Youden Index: 0.217). The best cut-off point of maternal serum ferritin to reflect empty iron stores in newborns was <6.0 µg/L (sensitivity: 76.47%, specificity: 31.58%, Youden Index: 0.200). Conclusion: Hemoglobin concentration performed poorly in detecting iron deficiency anemia in women at term at high risk for iron deficiency and in their newborns.

  8. Evaluation of hemoglobin performance in the assessment of iron stores in feto-maternal pairs in a high-risk population: receiver operating characteristic curve analysis

    Science.gov (United States)

    Jaime-Pérez, José Carlos; García-Arellano, Gisela; Méndez-Ramírez, Nereida; González-Llano, Óscar; Gómez-Almaguer, David

    2015-01-01

    Objective By applying receiver operating characteristic curve analysis, the objective of this study was to see whether hemoglobin levels reflect body iron stores in a group of pregnant women at term who, by using serum ferritin as the reference test, had a high pre-test probability of having iron deficiency anemia. Likewise, we evaluated the ability of hemoglobin and maternal serum ferritin levels to predict iron deficiency anemia in newborns. Methods Hemoglobin and serum ferritin were measured in 187 pregnant women at term belonging to a group with a high pre-test probability of iron deficiency anemia and their newborns. Women with Hb <11.0 g/dL and newborns with cord Hb <13.0 g/dL were classified as anemic. A serum ferritin <12.0 μg/L in women and a cord blood serum ferritin <35.0 μg/L were considered to reflect empty iron stores. Receiver operating characteristic curve analysis was applied to select the cut-off points that better reflected iron stores. Results The Hb cut-off point selected by receiver operating characteristic curve analysis in women was <11.5 g/dL (sensitivity: 60.82%, specificity: 53.33%, Youden Index: 0.450). Most of the newborns had normal Hb, which precluded this analysis. Maternal Hb <11.0 g/dL was the cut-off point that best reflected iron deficiency anemia in newborns (sensitivity: 55.88%, specificity: 57.24%, Youden Index: 0.217). The best cut-off point of maternal serum ferritin to reflect empty iron stores in newborns was <6.0 μg/L (sensitivity: 76.47%, specificity: 31.58%, Youden Index: 0.200). Conclusion Hemoglobin concentration performed poorly to detect iron deficiency anemia in women at term with high risk for iron deficiency and their newborns. PMID:26041420
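
    The cut-off selection used in this study (scan candidate thresholds and keep the one maximizing the Youden index J = sensitivity + specificity - 1) can be sketched as follows, with made-up hemoglobin values rather than the study's data:

```python
def best_cutoff(values, diseased, candidates):
    """Pick the cut-off maximizing Youden's J = sensitivity + specificity - 1.
    Lower values indicate disease (e.g. Hb below the cut-off counts positive)."""
    best = None
    for c in candidates:
        tp = sum(1 for v, d in zip(values, diseased) if d and v < c)
        fn = sum(1 for v, d in zip(values, diseased) if d and v >= c)
        tn = sum(1 for v, d in zip(values, diseased) if not d and v >= c)
        fp = sum(1 for v, d in zip(values, diseased) if not d and v < c)
        sens = tp / (tp + fn)
        spec = tn / (tn + fp)
        j = sens + spec - 1
        if best is None or j > best[1]:
            best = (c, j, sens, spec)
    return best

# Hypothetical hemoglobin values (g/dL) and iron-store status (True = depleted).
hb = [9.8, 10.5, 11.2, 11.4, 11.8, 12.3, 12.9, 13.5]
depleted = [True, True, True, False, True, False, False, False]
cutoff, j, sens, spec = best_cutoff(hb, depleted, candidates=[10.0, 11.0, 11.5, 12.0])
```

Each candidate traces one point on the ROC curve; the Youden index picks the point farthest above the chance diagonal.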

  9. Bayesian Mediation Analysis

    Science.gov (United States)

    Yuan, Ying; MacKinnon, David P.

    2009-01-01

    In this article, we propose Bayesian analysis of mediation effects. Compared with conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian…

  10. Bayesian Probability Theory

    Science.gov (United States)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  11. Construction of Bayesian Networks with the Bayesian Association Rule Mining Network Algorithm

    OpenAIRE

    Octavian

    2015-01-01

    In recent years, the Bayesian Network has become a popular concept used in many areas of life, such as making decisions and determining the probability that an event will occur. Unfortunately, constructing the structure of a Bayesian Network is itself not a simple matter. This study therefore introduces the Bayesian Association Rule Mining Network algorithm to make it easier to construct a Bayesian Network based on data ...

  12. Model Diagnostics for Bayesian Networks

    Science.gov (United States)

    Sinharay, Sandip

    2006-01-01

    Bayesian networks are frequently used in educational assessments primarily for learning about students' knowledge and skills. There is a lack of works on assessing fit of Bayesian networks. This article employs the posterior predictive model checking method, a popular Bayesian model checking tool, to assess fit of simple Bayesian networks. A…

  13. Comparison of a Bayesian Network with a Logistic Regression Model to Forecast IgA Nephropathy

    Directory of Open Access Journals (Sweden)

    Michel Ducher

    2013-01-01

    Full Text Available Models are increasingly used in clinical practice to improve the accuracy of diagnosis. The aim of our work was to compare a Bayesian network to logistic regression to forecast IgA nephropathy (IgAN) from simple clinical and biological criteria. Retrospectively, we pooled the results of all biopsies (n=155) performed by nephrologists in a specialist clinical facility between 2002 and 2009. Two groups were constituted at random. The first subgroup was used to determine the parameters of the models adjusted to data by logistic regression or Bayesian network, and the second was used to compare the performances of the models using receiver operating characteristic (ROC) curves. IgAN was found (on pathology) in 44 patients. Areas under the ROC curves provided by both methods were highly significant but not different from each other. Based on the highest Youden indices, sensitivity reached 100% versus 67%, and specificity 73% versus 95%, using the Bayesian network and logistic regression, respectively. A Bayesian network is at least as efficient as logistic regression to estimate the probability of a patient suffering from IgAN, using simple clinical and biological data obtained during consultation.

  14. Comparison of a Bayesian network with a logistic regression model to forecast IgA nephropathy.

    Science.gov (United States)

    Ducher, Michel; Kalbacher, Emilie; Combarnous, François; Finaz de Vilaine, Jérome; McGregor, Brigitte; Fouque, Denis; Fauvel, Jean Pierre

    2013-01-01

    Models are increasingly used in clinical practice to improve the accuracy of diagnosis. The aim of our work was to compare a Bayesian network to logistic regression to forecast IgA nephropathy (IgAN) from simple clinical and biological criteria. Retrospectively, we pooled the results of all biopsies (n = 155) performed by nephrologists in a specialist clinical facility between 2002 and 2009. Two groups were constituted at random. The first subgroup was used to determine the parameters of the models adjusted to data by logistic regression or Bayesian network, and the second was used to compare the performances of the models using receiver operating characteristic (ROC) curves. IgAN was found (on pathology) in 44 patients. Areas under the ROC curves provided by both methods were highly significant but not different from each other. Based on the highest Youden indices, sensitivity reached 100% versus 67%, and specificity 73% versus 95%, using the Bayesian network and logistic regression, respectively. A Bayesian network is at least as efficient as logistic regression to estimate the probability of a patient suffering from IgAN, using simple clinical and biological data obtained during consultation.
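
    The area under the ROC curve used here to compare the two models equals the probability that a randomly chosen positive case is scored higher than a randomly chosen negative one (the Mann-Whitney statistic). A minimal sketch with made-up scores, not the study's data:

```python
def auc(scores, labels):
    """AUC via the probability that a positive outranks a negative (ties count 1/2)."""
    pos = [s for s, y in zip(scores, labels) if y]
    neg = [s for s, y in zip(scores, labels) if not y]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0 for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical predicted probabilities of IgAN from two models on the same patients.
labels       = [1, 1, 1, 0, 0, 0, 0]
bayes_net    = [0.9, 0.8, 0.6, 0.7, 0.3, 0.2, 0.1]
logistic_reg = [0.8, 0.7, 0.4, 0.6, 0.5, 0.2, 0.1]
auc_bn, auc_lr = auc(bayes_net, labels), auc(logistic_reg, labels)
```

Comparing `auc_bn` and `auc_lr` on the held-out subgroup mirrors the study's model comparison, though a formal test (e.g. DeLong's) is needed to claim a significant difference.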

  15. Bayesian data analysis for newcomers.

    Science.gov (United States)

    Kruschke, John K; Liddell, Torrin M

    2017-04-12

    This article explains the foundational concepts of Bayesian data analysis using virtually no mathematical notation. Bayesian ideas already match your intuitions from everyday reasoning and from traditional data analysis. Simple examples of Bayesian data analysis are presented that illustrate how the information delivered by a Bayesian analysis can be directly interpreted. Bayesian approaches to null-value assessment are discussed. The article clarifies misconceptions about Bayesian methods that newcomers might have acquired elsewhere. We discuss prior distributions and explain how they are not a liability but an important asset. We discuss the relation of Bayesian data analysis to Bayesian models of mind, and we briefly discuss what methodological problems Bayesian data analysis is not meant to solve. After you have read this article, you should have a clear sense of how Bayesian data analysis works and the sort of information it delivers, and why that information is so intuitive and useful for drawing conclusions from data.
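
    The directly interpretable posterior that this article emphasizes is easiest to see in a conjugate example: a Beta prior updated by binomial data yields a Beta posterior whose parameters are simply tallied counts (a generic illustration, not taken from the article):

```python
# Beta-Binomial updating: prior Beta(a, b), data = success/failure counts.
a, b = 1.0, 1.0      # uniform prior: no initial preference for any success rate
heads, tails = 7, 3  # observed data

a_post, b_post = a + heads, b + tails  # posterior is Beta(8, 4): just add counts
post_mean = a_post / (a_post + b_post)  # posterior mean of the success rate
prior_mean = a / (a + b)
# The posterior mean (2/3) sits between the prior mean (1/2) and the raw
# frequency (0.7), pulled toward the data as evidence accumulates.
```

Here the prior is an asset, not a liability: with little data it regularizes the estimate, and with much data its influence vanishes.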

  16. Using a Bayesian Probabilistic Forecasting Model to Analyze the Uncertainty in Real-Time Dynamic Control of the Flood Limiting Water Level for Reservoir Operation

    DEFF Research Database (Denmark)

    Liu, Dedi; Li, Xiang; Guo, Shenglian;

    2015-01-01

    ...inflow values and their uncertainties obtained from the BFS, the reservoir operation results from different schemes can be analyzed in terms of benefits, dam safety, and downstream impacts during the flood season. When the reservoir FLWL dynamic control operation is implemented, there are two fundamental...

  17. Bayesian psychometric scaling

    NARCIS (Netherlands)

    Fox, G.J.A.; Berg, van den S.M.; Veldkamp, B.P.; Irwing, P.; Booth, T.; Hughes, D.

    2015-01-01

    In educational and psychological studies, psychometric methods are involved in the measurement of constructs, and in constructing and validating measurement instruments. Assessment results are typically used to measure student proficiency levels and test characteristics. Recently, Bayesian item response...

  18. Bayesian psychometric scaling

    NARCIS (Netherlands)

    Fox, Gerardus J.A.; van den Berg, Stéphanie Martine; Veldkamp, Bernard P.; Irwing, P.; Booth, T.; Hughes, D.

    2015-01-01

    In educational and psychological studies, psychometric methods are involved in the measurement of constructs, and in constructing and validating measurement instruments. Assessment results are typically used to measure student proficiency levels and test characteristics. Recently, Bayesian item...

  19. Practical Bayesian Tomography

    CERN Document Server

    Granade, Christopher; Cory, D G

    2015-01-01

    In recent years, Bayesian methods have been proposed as a solution to a wide range of issues in quantum state and process tomography. State-of-the-art Bayesian tomography solutions suffer from three problems: numerical intractability, a lack of informative prior distributions, and an inability to track time-dependent processes. Here, we solve all three problems. First, we use modern statistical methods, as pioneered by Huszár and Houlsby and by Ferrie, to make Bayesian tomography numerically tractable. Our approach allows for practical computation of Bayesian point and region estimators for quantum states and channels. Second, we propose the first informative priors on quantum states and channels. Finally, we develop a method that allows online tracking of time-dependent states and estimates the drift and diffusion processes affecting a state. We provide source code and animated visual examples for our methods.

  20. Usefulness of pinhole collimator in differential diagnosis of metastatic disease and degenerative joint disease in the vertebrae; Evaluation by receiver operating characteristics (ROC) analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kosuda, Shigeru; Kawahara, Syunji; Ishibashi, Akihiko; Tamura, Kohei; Tsukatani, Yasushi; Fujii, Hiroshi (Okura National Hospital, Tokyo (Japan)); Kubo, Atsushi; Hashimoto, Shozo

    1989-11-01

    In order to evaluate the diagnostic efficacy of pinhole collimator (PHC) imaging combined with X-ray for vertebral metastasis, our prospective study employed receiver operating characteristic (ROC) analysis in 21 patients, 11 with osseous metastasis and 15 with degenerative joint disease in the lumbar vertebrae. PHC imaging provided better anatomic information on the extent of 99mTc-MDP accumulation. PHC vertebral scintigraphy had a considerable impact on the decision-making process, although results varied and were less satisfactory among physicians with little experience. Our study suggests that PHC imaging and X-ray film are useful in differentiating between osseous metastasis and degenerative joint disease in the vertebrae. (author).

  1. Diagnostic sensitivity of serum carcinoembryonic antigen, carbohydrate antigen 19-9, alpha-fetoprotein, and beta-human chorionic gonadotropin in esophageal carcinoma (receiver operating characteristic curve analysis)

    Directory of Open Access Journals (Sweden)

    Bhawna Bagaria

    2015-01-01

    Full Text Available Background: Esophageal carcinoma is a very lethal disease that is relatively unresponsive to therapy. The continued development of new and more effective chemotherapeutic agents and regimens offers hope that, in the future, this carcinoma may be amenable to either more effective palliative treatment or possibly increased cure. We, therefore, aimed to evaluate the marker with the best diagnostic sensitivity in esophageal carcinoma. Materials and Methods: Serum carcinoembryonic antigen (CEA), carbohydrate antigen 19-9 (CA19-9), alpha-fetoprotein (AFP), and beta-human chorionic gonadotropin (β-HCG) levels were assessed in healthy subjects (n = 50) and patients (n = 50) initially diagnosed with esophageal carcinoma by endoscopic examination and biopsy before receiving any therapy. The data were analyzed using SPSS software version 10.0 (SPSS Inc., USA) and MedCalc to estimate mean ± standard deviation and the significance of the observed differences (P value), to calculate sensitivity, and to plot receiver operating characteristic curves. Results: The sensitivity of CEA, CA19-9, AFP, and β-HCG in esophageal cancer was 38%, 18%, 10%, and 26%, respectively. Conclusion: Of the markers studied, CEA has the highest sensitivity, followed by β-HCG, CA19-9, and AFP. Although the sensitivity of tumor markers in esophageal cancer is low, they may be a useful additional parameter in the prediction of neoplasms at an early stage of tumor growth.

  2. Posterior Predictive Model Checking in Bayesian Networks

    Science.gov (United States)

    Crawford, Aaron

    2014-01-01

    This simulation study compared the utility of various discrepancy measures within a posterior predictive model checking (PPMC) framework for detecting different types of data-model misfit in multidimensional Bayesian network (BN) models. The investigated conditions were motivated by an applied research program utilizing an operational complex…

  3. Climate information based streamflow and rainfall forecasts for Huai River Basin using Hierarchical Bayesian Modeling

    Directory of Open Access Journals (Sweden)

    X. Chen

    2013-09-01

    Full Text Available A hierarchical Bayesian model for forecasting regional summer rainfall and streamflow season-ahead using exogenous climate variables for East Central China is presented. The model provides estimates of the posterior forecast probability distribution for 12 rainfall and 2 streamflow stations, accounting for parameter uncertainty and cross-site correlation. The model has a multilevel structure, with regression coefficients drawn from a common multivariate normal distribution, which results in partial pooling of information across multiple stations and better representation of parameter and posterior distribution uncertainty. The covariance structure of the residuals across stations is explicitly modeled. Model performance is tested under leave-10-out cross-validation. Frequentist and Bayesian performance metrics used include the Receiver Operating Characteristic, Reduction of Error, Coefficient of Efficiency, Rank Probability Skill Scores, and coverage by posterior credible intervals. The ability of the model to reliably forecast regional summer rainfall and streamflow season-ahead offers potential for developing adaptive water risk management strategies.
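
    The partial pooling that the multilevel structure provides can be illustrated with a simple precision-weighted shrinkage of station means toward the regional mean (a toy stand-in for the full multivariate-normal hierarchy; all numbers are invented):

```python
def partial_pool(station_means, station_vars, between_var):
    """Shrink each station mean toward the grand mean, weighted by precision.
    Noisier stations (large within-station variance) are pulled harder."""
    grand = sum(station_means) / len(station_means)
    pooled = []
    for m, v in zip(station_means, station_vars):
        w = between_var / (between_var + v)  # weight on the station's own data
        pooled.append(w * m + (1 - w) * grand)
    return pooled

# Hypothetical summer-rainfall means (mm) for four stations.
means = [300.0, 420.0, 380.0, 500.0]
variances = [400.0, 100.0, 100.0, 2500.0]  # the last station is poorly observed
shrunk = partial_pool(means, variances, between_var=900.0)
```

The poorly observed station borrows the most strength from its neighbors, which is exactly the behavior the hierarchical prior induces.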

  4. Bayesian Methods and Universal Darwinism

    Science.gov (United States)

    Campbell, John

    2009-12-01

    Bayesian methods since the time of Laplace have been understood by their practitioners as closely aligned to the scientific method. Indeed, a recent champion of Bayesian methods, E. T. Jaynes, titled his textbook on the subject Probability Theory: The Logic of Science. Many philosophers of science, including Karl Popper and Donald Campbell, have interpreted the evolution of science as a Darwinian process consisting of a 'copy with selective retention' algorithm abstracted from Darwin's theory of natural selection. Arguments are presented for an isomorphism between Bayesian methods and Darwinian processes. Universal Darwinism, as the term has been developed by Richard Dawkins, Daniel Dennett and Susan Blackmore, is the collection of scientific theories which explain the creation and evolution of their subject matter as due to the operation of Darwinian processes. These subject matters span the fields of atomic physics, chemistry, biology and the social sciences. The principle of maximum entropy states that systems will evolve to states of highest entropy subject to the constraints of scientific law. This principle may be inverted to provide illumination as to the nature of scientific law. Our best cosmological theories suggest the universe contained much less complexity during the period shortly after the Big Bang than it does at present. The scientific subject matter of atomic physics, chemistry, biology and the social sciences has been created since that time. An explanation is proposed for the existence of this subject matter as due to the evolution of constraints in the form of adaptations imposed on maximum entropy. It is argued these adaptations were discovered and instantiated through the operation of a succession of Darwinian processes.

  5. Universal Darwinism as a process of Bayesian inference

    Directory of Open Access Journals (Sweden)

    John Oberon Campbell

    2016-06-01

    Full Text Available Many of the mathematical frameworks describing natural selection are equivalent to Bayes' Theorem, also known as Bayesian updating. By definition, a process of Bayesian inference is one which involves a Bayesian update, so we may conclude that these frameworks describe natural selection as a process of Bayesian inference. Thus natural selection serves as a counter example to a widely-held interpretation that restricts Bayesian inference to human mental processes (including the endeavors of statisticians). As Bayesian inference can always be cast in terms of (variational) free energy minimization, natural selection can be viewed as comprising two components: a generative model of an 'experiment' in the external world environment, and the results of that 'experiment' or the 'surprise' entailed by predicted and actual outcomes of the 'experiment'. Minimization of free energy implies that the implicit measure of 'surprise' experienced serves to update the generative model in a Bayesian manner. This description closely accords with the mechanisms of generalized Darwinian process proposed both by Dawkins, in terms of replicators and vehicles, and Campbell, in terms of inferential systems. Bayesian inference is an algorithm for the accumulation of evidence-based knowledge. This algorithm is now seen to operate over a wide range of evolutionary processes, including natural selection, the evolution of mental models and cultural evolutionary processes, notably including science itself. The variational principle of free energy minimization may thus serve as a unifying mathematical framework for universal Darwinism, the study of evolutionary processes operating throughout nature.

  6. Universal Darwinism As a Process of Bayesian Inference.

    Science.gov (United States)

    Campbell, John O

    2016-01-01

    Many of the mathematical frameworks describing natural selection are equivalent to Bayes' Theorem, also known as Bayesian updating. By definition, a process of Bayesian Inference is one which involves a Bayesian update, so we may conclude that these frameworks describe natural selection as a process of Bayesian inference. Thus, natural selection serves as a counter example to a widely-held interpretation that restricts Bayesian Inference to human mental processes (including the endeavors of statisticians). As Bayesian inference can always be cast in terms of (variational) free energy minimization, natural selection can be viewed as comprising two components: a generative model of an "experiment" in the external world environment, and the results of that "experiment" or the "surprise" entailed by predicted and actual outcomes of the "experiment." Minimization of free energy implies that the implicit measure of "surprise" experienced serves to update the generative model in a Bayesian manner. This description closely accords with the mechanisms of generalized Darwinian process proposed both by Dawkins, in terms of replicators and vehicles, and Campbell, in terms of inferential systems. Bayesian inference is an algorithm for the accumulation of evidence-based knowledge. This algorithm is now seen to operate over a wide range of evolutionary processes, including natural selection, the evolution of mental models and cultural evolutionary processes, notably including science itself. The variational principle of free energy minimization may thus serve as a unifying mathematical framework for universal Darwinism, the study of evolutionary processes operating throughout nature.
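
    The equivalence this paper builds on can be seen in a few lines: renormalizing prior "population frequencies" by likelihood "fitness" is exactly a discrete replicator step, and iterating it is repeated Bayesian updating (a generic illustration of the claimed isomorphism, not code from the paper):

```python
def update(freqs, fitness):
    """One discrete replicator step == one Bayesian update:
    posterior ∝ prior × likelihood, with likelihood playing the role of fitness."""
    weighted = [f * w for f, w in zip(freqs, fitness)]
    total = sum(weighted)
    return [w / total for w in weighted]

# Three competing 'replicators' (hypotheses) with equal initial frequency.
freqs = [1/3, 1/3, 1/3]
fitness = [0.9, 0.5, 0.1]  # likelihood of each under the observed outcome
for _ in range(5):         # five generations / five repeated observations
    freqs = update(freqs, fitness)
# The fittest hypothesis comes to dominate as selection (or evidence) accumulates.
```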

  7. Multiple Antenna Cognitive Receivers and Signal Detection

    CERN Document Server

    Couillet, Romain

    2008-01-01

    A Bayesian inference learning process for cognitive receivers is provided in this paper. We focus on the particular case of signal detection as an explanatory example of the learning framework. Under any prior state of knowledge on the communication channel, an information-theoretic criterion is presented to decide on the presence of informative data in a noisy wireless MIMO communication. We detail the particular cases of knowledge, or absence of knowledge at the receiver, of the number of transmit antennas and the noise power. The method provides intelligence to the receiver and gives rise to a novel Bayesian signal detector. The detector is compared to the classical power detector and provides detection performance upper bounds. Simulations corroborate the theoretical results and quantify the gain achieved using the proposed Bayesian framework.
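
    The flavor of such a detector can be conveyed with a scalar toy problem: decide whether samples contain a Gaussian signal on top of noise by comparing marginal likelihoods under the two hypotheses. This is a much-simplified stand-in for the MIMO case in the paper, with invented variances and data:

```python
import math

def log_gauss(x, var):
    """Log density of a zero-mean Gaussian with variance var at x."""
    return -0.5 * (math.log(2 * math.pi * var) + x * x / var)

def log_bayes_factor(samples, noise_var, signal_var):
    """log P(data | signal present) - log P(data | noise only).
    Under H1 each sample is N(0, noise_var + signal_var); under H0, N(0, noise_var)."""
    h1 = sum(log_gauss(x, noise_var + signal_var) for x in samples)
    h0 = sum(log_gauss(x, noise_var) for x in samples)
    return h1 - h0

noise_only  = [0.3, -0.5, 0.1, -0.2, 0.4]
with_signal = [2.1, -1.8, 2.5, -2.2, 1.9]
# A positive log Bayes factor favors 'signal present'.
lbf_noise  = log_bayes_factor(noise_only,  noise_var=0.25, signal_var=4.0)
lbf_signal = log_bayes_factor(with_signal, noise_var=0.25, signal_var=4.0)
```

Unlike a plain power detector, the Bayesian version extends naturally to unknown noise power or antenna count by marginalizing those parameters out, which is the direction the paper pursues.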

  8. Computationally efficient Bayesian inference for inverse problems.

    Energy Technology Data Exchange (ETDEWEB)

    Marzouk, Youssef M.; Najm, Habib N.; Rahn, Larry A.

    2007-10-01

    Bayesian statistics provides a foundation for inference from noisy and incomplete data, a natural mechanism for regularization in the form of prior information, and a quantitative assessment of uncertainty in the inferred results. Inverse problems - representing indirect estimation of model parameters, inputs, or structural components - can be fruitfully cast in this framework. Complex and computationally intensive forward models arising in physical applications, however, can render a Bayesian approach prohibitive. This difficulty is compounded by high-dimensional model spaces, as when the unknown is a spatiotemporal field. We present new algorithmic developments for Bayesian inference in this context, showing strong connections with the forward propagation of uncertainty. In particular, we introduce a stochastic spectral formulation that dramatically accelerates the Bayesian solution of inverse problems via rapid evaluation of a surrogate posterior. We also explore dimensionality reduction for the inference of spatiotemporal fields, using truncated spectral representations of Gaussian process priors. These new approaches are demonstrated on scalar transport problems arising in contaminant source inversion and in the inference of inhomogeneous material or transport properties. We also present a Bayesian framework for parameter estimation in stochastic models, where intrinsic stochasticity may be intermingled with observational noise. Evaluation of a likelihood function may not be analytically tractable in these cases, and thus several alternative Markov chain Monte Carlo (MCMC) schemes, operating on the product space of the observations and the parameters, are introduced.

  9. Bayesian Fundamentalism or Enlightenment? On the explanatory status and theoretical contributions of Bayesian models of cognition.

    Science.gov (United States)

    Jones, Matt; Love, Bradley C

    2011-08-01

    The prominence of Bayesian modeling of cognition has increased recently largely because of mathematical advances in specifying and deriving predictions from complex probabilistic models. Much of this research aims to demonstrate that cognitive behavior can be explained from rational principles alone, without recourse to psychological or neurological processes and representations. We note commonalities between this rational approach and other movements in psychology - namely, Behaviorism and evolutionary psychology - that set aside mechanistic explanations or make use of optimality assumptions. Through these comparisons, we identify a number of challenges that limit the rational program's potential contribution to psychological theory. Specifically, rational Bayesian models are significantly unconstrained, both because they are uninformed by a wide range of process-level data and because their assumptions about the environment are generally not grounded in empirical measurement. The psychological implications of most Bayesian models are also unclear. Bayesian inference itself is conceptually trivial, but strong assumptions are often embedded in the hypothesis sets and the approximation algorithms used to derive model predictions, without a clear delineation between psychological commitments and implementational details. Comparing multiple Bayesian models of the same task is rare, as is the realization that many Bayesian models recapitulate existing (mechanistic level) theories. Despite the expressive power of current Bayesian models, we argue they must be developed in conjunction with mechanistic considerations to offer substantive explanations of cognition. We lay out several means for such an integration, which take into account the representations on which Bayesian inference operates, as well as the algorithms and heuristics that carry it out. We argue this unification will better facilitate lasting contributions to psychological theory, avoiding the pitfalls...

  10. Assessment of debris flow hazards using a Bayesian Network

    Science.gov (United States)

    Liang, Wan-jie; Zhuang, Da-fang; Jiang, Dong; Pan, Jian-jun; Ren, Hong-yan

    2012-10-01

    Comprehensive assessment of debris flow hazard risk is challenging due to the complexity and uncertainties of various related factors. A reasonable and reliable assessment should be based on sufficient data and realistic approaches. This study presents a novel approach for assessing debris flow hazard risk using BN (Bayesian Network) and domain knowledge. Based on the records of debris flow hazards and geomorphological/environmental data for the Chinese mainland, approaches based on BN, SVM (Support Vector Machine) and ANN (Artificial Neural Network) were compared. BN provided the highest values of hazard detection probability, precision, and AUC (area under the receiver operating characteristic curve). The BN model is useful for mapping and assessing debris flow hazard risk on a national scale.

  11. Bayesian Face Sketch Synthesis.

    Science.gov (United States)

    Wang, Nannan; Gao, Xinbo; Sun, Leiyu; Li, Jie

    2017-03-01

    Exemplar-based face sketch synthesis has been widely applied to both digital entertainment and law enforcement. In this paper, we propose a Bayesian framework for face sketch synthesis, which provides a systematic interpretation for understanding the common properties and intrinsic difference in different methods from the perspective of probabilistic graphical models. The proposed Bayesian framework consists of two parts: the neighbor selection model and the weight computation model. Within the proposed framework, we further propose a Bayesian face sketch synthesis method. The essential rationale behind the proposed Bayesian method is that we take the spatial neighboring constraint between adjacent image patches into consideration for both aforementioned models, while the state-of-the-art methods neglect the constraint either in the neighbor selection model or in the weight computation model. Extensive experiments on the Chinese University of Hong Kong face sketch database demonstrate that the proposed Bayesian method could achieve superior performance compared with the state-of-the-art methods in terms of both subjective perceptions and objective evaluations.

  12. DPpackage: Bayesian Semi- and Nonparametric Modeling in R

    Directory of Open Access Journals (Sweden)

    Alejandro Jara

    2011-04-01

    Full Text Available Data analysis sometimes requires the relaxation of parametric assumptions in order to gain modeling flexibility and robustness against mis-specification of the probability model. In the Bayesian context, this is accomplished by placing a prior distribution on a function space, such as the space of all probability distributions or the space of all regression functions. Unfortunately, posterior distributions ranging over function spaces are highly complex and hence sampling methods play a key role. This paper provides an introduction to a simple, yet comprehensive, set of programs for the implementation of some Bayesian nonparametric and semiparametric models in R, DPpackage. Currently, DPpackage includes models for marginal and conditional density estimation, receiver operating characteristic curve analysis, interval-censored data, binary regression data, item response data, longitudinal and clustered data using generalized linear mixed models, and regression data using generalized additive models. The package also contains functions to compute pseudo-Bayes factors for model comparison and for eliciting the precision parameter of the Dirichlet process prior, and a general purpose Metropolis sampling algorithm. To maximize computational efficiency, the actual sampling for each model is carried out using compiled C, C++ or Fortran code.

  13. Development of a prognostic naive bayesian classifier for successful treatment of nonunions.

    Science.gov (United States)

    Stojadinovic, Alexander; Kyle Potter, Benjamin; Eberhardt, John; Shawen, Scott B; Andersen, Romney C; Forsberg, Jonathan A; Shwery, Clay; Ester, Eric A; Schaden, Wolfgang

    2011-01-19

    Predictive models permitting individualized prognostication for patients with fracture nonunion are lacking. The objective of this study was to train, test, and cross-validate a Bayesian classifier for predicting fracture-nonunion healing in a population treated with extracorporeal shock wave therapy. Prospectively collected data from 349 patients with delayed fracture union or a nonunion were utilized to develop a naïve Bayesian belief network model to estimate site-specific fracture-nonunion healing in patients treated with extracorporeal shock wave therapy. Receiver operating characteristic curve analysis and tenfold cross-validation of the model were used to determine the clinical utility of the approach. Predictors of fracture-healing at six months following shock wave treatment were the time between the fracture and the first shock wave treatment, the time between the fracture and the surgery, intramedullary stabilization, the number of bone-grafting procedures, the number of extracorporeal shock wave therapy treatments, work-related injury, and the bone involved (p < 0.05 for all comparisons). These variables were all included in the naïve Bayesian belief network model. A clinically relevant Bayesian classifier was developed to predict the outcome after extracorporeal shock wave therapy for fracture nonunions. The time to treatment and the anatomic site of the fracture nonunion significantly impacted healing outcomes. Although this study population was restricted to patients treated with shock wave therapy, Bayesian-derived predictive models may be developed for application to other fracture populations at risk for nonunion. Prognostic Level II. See Instructions to Authors for a complete description of levels of evidence.

  14. Bayesian Visual Odometry

    Science.gov (United States)

    Center, Julian L.; Knuth, Kevin H.

    2011-03-01

    Visual odometry refers to tracking the motion of a body using an onboard vision system. Practical visual odometry systems combine the complementary accuracy characteristics of vision and inertial measurement units. The Mars Exploration Rovers, Spirit and Opportunity, used this type of visual odometry. The visual odometry algorithms in Spirit and Opportunity were based on Bayesian methods, but a number of simplifying approximations were needed to deal with onboard computer limitations. Furthermore, the allowable motion of the rover had to be severely limited so that computations could keep up. Recent advances in computer technology make it feasible to implement a fully Bayesian approach to visual odometry. This approach combines dense stereo vision, dense optical flow, and inertial measurements. As with all true Bayesian methods, it also determines error bars for all estimates. This approach also offers the possibility of using Micro-Electro Mechanical Systems (MEMS) inertial components, which are more economical, weigh less, and consume less power than conventional inertial components.

  15. Hybrid Batch Bayesian Optimization

    CERN Document Server

    Azimi, Javad; Fern, Xiaoli

    2012-01-01

    Bayesian Optimization aims at optimizing an unknown non-convex/concave function that is costly to evaluate. We are interested in application scenarios where concurrent function evaluations are possible. Under such a setting, BO could choose to either sequentially evaluate the function, one input at a time and wait for the output of the function before making the next selection, or evaluate the function at a batch of multiple inputs at once. These two different settings are commonly referred to as the sequential and batch settings of Bayesian Optimization. In general, the sequential setting leads to better optimization performance as each function evaluation is selected with more information, whereas the batch setting has an advantage in terms of the total experimental time (the number of iterations). In this work, our goal is to combine the strength of both settings. Specifically, we systematically analyze Bayesian optimization using Gaussian process as the posterior estimator and provide a hybrid algorithm t...

  16. Bayesian least squares deconvolution

    Science.gov (United States)

    Asensio Ramos, A.; Petit, P.

    2015-11-01

    Aims: We develop a fully Bayesian least squares deconvolution (LSD) that can be applied to the reliable detection of magnetic signals in noise-limited stellar spectropolarimetric observations using multiline techniques. Methods: We consider LSD under the Bayesian framework and we introduce a flexible Gaussian process (GP) prior for the LSD profile. This prior allows the result to automatically adapt to the presence of signal. We exploit several linear algebra identities to accelerate the calculations. The final algorithm can deal with thousands of spectral lines in a few seconds. Results: We demonstrate the reliability of the method with synthetic experiments and we apply it to real spectropolarimetric observations of magnetic stars. We are able to recover the magnetic signals using a small number of spectral lines, together with the uncertainty at each velocity bin. This allows the user to consider if the detected signal is reliable. The code to compute the Bayesian LSD profile is freely available.

  17. Bayesian Exploratory Factor Analysis

    DEFF Research Database (Denmark)

    Conti, Gabriella; Frühwirth-Schnatter, Sylvia; Heckman, James J.;

    2014-01-01

    This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study confirms the validity of the approach. The method is used to produce interpretable low dimensional aggregates...

  18. Bayesian least squares deconvolution

    CERN Document Server

    Ramos, A Asensio

    2015-01-01

    Aims. To develop a fully Bayesian least squares deconvolution (LSD) that can be applied to the reliable detection of magnetic signals in noise-limited stellar spectropolarimetric observations using multiline techniques. Methods. We consider LSD under the Bayesian framework and we introduce a flexible Gaussian Process (GP) prior for the LSD profile. This prior allows the result to automatically adapt to the presence of signal. We exploit several linear algebra identities to accelerate the calculations. The final algorithm can deal with thousands of spectral lines in a few seconds. Results. We demonstrate the reliability of the method with synthetic experiments and we apply it to real spectropolarimetric observations of magnetic stars. We are able to recover the magnetic signals using a small number of spectral lines, together with the uncertainty at each velocity bin. This allows the user to consider if the detected signal is reliable. The code to compute the Bayesian LSD profile is freely available.

  19. Perspective Biological Markers for Autism Spectrum Disorders: Advantages of the Use of Receiver Operating Characteristic Curves in Evaluating Marker Sensitivity and Specificity

    Directory of Open Access Journals (Sweden)

    Provvidenza M. Abruzzo

    2015-01-01

    Full Text Available Autism Spectrum Disorders (ASD) are a heterogeneous group of neurodevelopmental disorders. Recognized causes of ASD include genetic factors, metabolic diseases, toxic and environmental factors, and combinations of these. Available tests fail to recognize genetic abnormalities in about 70% of ASD children, in whom diagnosis is solely based on behavioral signs and symptoms, which are difficult to evaluate in very young children. Although it is advisable that specific psychotherapeutic and pedagogic interventions are initiated as early as possible, early diagnosis is hampered by the lack of nongenetic specific biological markers. In the past ten years, the scientific literature has reported dozens of neurophysiological and biochemical alterations in ASD children; however, no real biomarker has emerged. Such literature is here reviewed in the light of Receiver Operating Characteristic (ROC) analysis, a very valuable statistical tool, which evaluates the sensitivity and the specificity of biomarkers to be used in diagnostic decision making. We also apply ROC analysis to some of our previously published data and discuss the increased diagnostic value of combining more variables in one ROC curve analysis. We also discuss the use of biomarkers as a tool for advancing our understanding of nonsyndromic ASD.
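
    The diagnostic gain from combining variables in one ROC analysis, as discussed above, can be illustrated with a small simulation. All data and effect sizes below are invented for illustration (they are not from the review): two weakly informative markers are merged into a composite score, whose empirical AUC exceeds that of either marker alone.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1000
# Hypothetical biomarkers: each shifts only modestly between cases and controls.
cases_a, ctrls_a = rng.normal(0.5, 1, n), rng.normal(0, 1, n)
cases_b, ctrls_b = rng.normal(0.5, 1, n), rng.normal(0, 1, n)

def auc(pos, neg):
    # Empirical AUC: probability that a random case scores above a random control.
    return np.mean(pos[:, None] > neg[None, :])

auc_single = auc(cases_a, ctrls_a)                        # one marker alone
auc_combined = auc(cases_a + cases_b, ctrls_a + ctrls_b)  # simple composite score
```

    With these assumed distributions the single-marker AUC sits near 0.64 while the composite rises toward 0.69; a fitted combination (e.g., logistic regression) would be used in practice rather than a plain sum.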

  20. Assessing the predictive performance of risk-based water quality criteria using decision error estimates from receiver operating characteristics (ROC) analysis.

    Science.gov (United States)

    McLaughlin, Douglas B

    2012-10-01

    Field data relating aquatic ecosystem responses with water quality constituents that are potential ecosystem stressors are being used increasingly in the United States in the derivation of water quality criteria to protect aquatic life. In light of this trend, there is a need for transparent quantitative methods to assess the performance of models that predict ecological conditions using a stressor-response relationship, a response variable threshold, and a stressor variable criterion. Analysis of receiver operating characteristics (ROC analysis) has a considerable history of successful use in medical diagnostic, industrial, and other fields for similarly structured decision problems, but its use for informing water quality management decisions involving risk-based environmental criteria is less common. In this article, ROC analysis is used to evaluate predictions of ecological response variable status for 3 water quality stressor-response data sets. Information on error rates is emphasized due in part to their common use in environmental studies to describe uncertainty. One data set is comprised of simulated data, and 2 involve field measurements described previously in the literature. These data sets are also analyzed using linear regression and conditional probability analysis for comparison. Results indicate that of the methods studied, ROC analysis provides the most comprehensive characterization of prediction error rates including false positive, false negative, positive predictive, and negative predictive errors. This information may be used along with other data analysis procedures to set quality objectives for and assess the predictive performance of risk-based criteria to support water quality management decisions.
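
    The error rates emphasized above fall out of a simple confusion-matrix computation once a stressor criterion and a response threshold are fixed. A minimal sketch with made-up stressor values and a hypothetical criterion (none of these numbers come from the study):

```python
import numpy as np

# Hypothetical stressor measurements and true ecological status (1 = impaired).
stressor = np.array([1.2, 2.5, 3.1, 0.8, 4.0, 2.9, 1.5, 3.6])
impaired = np.array([0,   1,   1,   0,   1,   0,   0,   1])
criterion = 2.7  # assumed stressor criterion

pred = stressor >= criterion  # predicted "impaired" when criterion exceeded
tp = np.sum(pred & (impaired == 1))
fp = np.sum(pred & (impaired == 0))
fn = np.sum(~pred & (impaired == 1))
tn = np.sum(~pred & (impaired == 0))

false_positive_rate = fp / (fp + tn)
false_negative_rate = fn / (fn + tp)
positive_predictive_value = tp / (tp + fp)
negative_predictive_value = tn / (tn + fn)
```

    Sweeping the criterion over its range and plotting the true-positive rate against the false-positive rate at each value traces out the ROC curve itself.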

  1. [Comparison of LCD and CRT monitors for detection of pulmonary nodules and interstitial lung diseases on digital chest radiographs by using receiver operating characteristic analysis].

    Science.gov (United States)

    Ikeda, Ryuji; Katsuragawa, Shigehiko; Shimonobou, Toshiaki; Hiai, Yasuhiro; Hashida, Masahiro; Awai, Kazuo; Yamashita, Yasuyuki; Doi, Kunio

    2006-05-20

    Soft copy reading of digital images has been practiced commonly in the PACS environment. In this study, we compared liquid-crystal display (LCD) and cathode-ray tube (CRT) monitors for the detection of pulmonary nodules and interstitial lung diseases on digital chest radiographs by using receiver operating characteristic (ROC) analysis. Digital chest images with a 1000x1000 matrix size and an 8-bit grayscale were displayed on a 2-megapixel LCD or CRT monitor in each observer test. Eight and ten radiologists participated in the observer tests for detection of nodules and interstitial diseases, respectively. In each observer test, radiologists marked their confidence levels for the diagnosis of pulmonary nodules or interstitial diseases. The detection performance of the radiologists was evaluated by ROC analysis. The average Az values (area under the ROC curve) in detecting pulmonary nodules with the LCD and CRT monitors were 0.792 and 0.814, respectively. In addition, the average Az values in detecting interstitial diseases with the LCD and CRT monitors were 0.951 and 0.953, respectively. There was no statistically significant difference between LCD and CRT for the detection of either pulmonary nodules (P=0.522) or interstitial lung diseases (P=0.869). Therefore, we believe that an LCD monitor can be used instead of a CRT monitor for the diagnosis of pulmonary nodules and interstitial lung diseases in digital chest images.

  2. A receiver operating characteristic analysis approach for the assessment of the separation of female Mediterranean fruit fly (Diptera: Tephritidae) oviposition distributions.

    Science.gov (United States)

    Alonzo, Todd A; Nakas, Christos T; Papadopoulos, Nikos T; Papachristos, Dimitrios P

    2009-10-01

    Average fecundity rates and survival are the main components of fitness estimates in studies comparing the performance of insect populations. Reproduction is inherently age related in most insect species, and age-specific offspring production is very important in determining fitness components. However, comparisons of age-specific reproduction rates are not straightforward, and most studies limit analyses to comparisons of average fecundity rates and survival as the main components of the performance of insect populations. We develop a receiver operating characteristic (ROC) curve approach to compare lifetime oviposition distributions. We use empirical data from a study of Mediterranean fruit fly, Ceratitis capitata (Wiedemann) (Diptera: Tephritidae), populations, in which each fly's lifetime oviposition distribution was recorded for samples studied in natural and artificial oviposition substrates. Currently, there exists no routinely used methodology for the comparison of oviposition distributions and assessment of their separation. ROC analysis is regularly used in two-sample problems in medical biostatistics when the main task is depiction and quantification of the separation of the empirical distributions from which the data arise. Adaptation of this analysis to our data has shown that age-specific egg-laying distributions can differ even when average fecundity rates do not. Therefore, ROC analysis provides a method of gaining insight into the biological process of egg-laying patterns in relatively long-lived insects, with many practical and theoretical implications for entomological experimentation.

  3. Disadvantages of using the area under the receiver operating characteristic curve to assess imaging tests: A discussion and proposal for an alternative approach

    Energy Technology Data Exchange (ETDEWEB)

    Halligan, Steve [University College London, Centre for Medical Imaging, University College Hospital, London (United Kingdom); Altman, Douglas G. [University of Oxford, Centre for Statistics in Medicine, Oxford (United Kingdom); Mallett, Susan [University of Oxford, Department of Primary Care Health Sciences, Oxford (United Kingdom)

    2015-04-01

    The objectives are to describe the disadvantages of the area under the receiver operating characteristic curve (ROC AUC) to measure diagnostic test performance and to propose an alternative based on net benefit. We use a narrative review supplemented by data from a study of computer-assisted detection for CT colonography. We identified problems with ROC AUC. Confidence scoring by readers was highly non-normal, and score distribution was bimodal. Consequently, ROC curves were highly extrapolated with AUC mostly dependent on areas without patient data. AUC depended on the method used for curve fitting. ROC AUC does not account for prevalence or different misclassification costs arising from false-negative and false-positive diagnoses. Change in ROC AUC has little direct clinical meaning for clinicians. An alternative analysis based on net benefit is proposed, based on the change in sensitivity and specificity at clinically relevant thresholds. Net benefit incorporates estimates of prevalence and misclassification costs, and it is clinically interpretable since it reflects changes in correct and incorrect diagnoses when a new diagnostic test is introduced. ROC AUC is most useful in the early stages of test assessment whereas methods based on net benefit are more useful to assess radiological tests where the clinical context is known. Net benefit is more useful for assessing clinical impact. (orig.)

  4. Prediction of Abdominal Visceral Obesity From Body Mass Index,Waist Circumference and Waist-hip Ratio in Chinese Adults:Receiver Operating Characteristic Curves Analysis

    Institute of Scientific and Technical Information of China (English)

    WEI-PING JIA; JUN-XI LU; KUN-SAN XIANG; YU-QIAN BAO; HUI-JUAN LU; LEI CHEN

    2003-01-01

    To evaluate the sensitivity and specificity of body mass index (BMI), waist circumference (WC) and waist-to-hip ratio (WHR) measurements in diagnosing abdominal visceral obesity. Methods: BMI, WC, and WHR were assessed in 690 Chinese adults (305 men and 385 women) and compared with magnetic resonance imaging (MRI) measurements of abdominal visceral adipose tissue (VA). Receiver operating characteristic (ROC) curves were generated and used to determine the threshold point for each anthropometric parameter. Results: 1) MRI showed that 61.7% of overweight/obese individuals (BMI≥25 kg/m2) and 14.2% of normal weight (BMI<25 kg/m2) individuals had abdominal visceral obesity (VA≥100 cm2). 2) VA was positively correlated with each anthropometric variable, of which WC showed the highest correlation (r=0.73-0.77, P<0.001). 3) The best cut-off points for assessing abdominal visceral obesity were as follows: BMI of 26 kg/m2, WC of 90 cm, and WHR of 0.93, with WC being the most sensitive and specific factor. 4) Among subjects with BMI≥28 kg/m2 or WC≥95 cm, 95% of men and 90% of women appeared to have abdominal visceral obesity. Conclusion: Measurements of BMI, WC, and WHR can be used in the prediction of abdominal visceral obesity, with WC showing the best accuracy.
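
    Threshold selection of the kind described above is commonly done by maximizing the Youden index (sensitivity + specificity - 1) along the ROC curve. A sketch on simulated waist-circumference data; the distributions and the resulting cut-off are illustrative assumptions, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical waist circumference (cm): obese subjects shifted upward.
wc_normal = rng.normal(82, 8, 200)
wc_obese = rng.normal(98, 8, 200)

best_cut, best_j = None, -1.0
for cut in np.arange(70.0, 110.0, 0.5):
    sens = np.mean(wc_obese >= cut)   # sensitivity at this cut-off
    spec = np.mean(wc_normal < cut)   # specificity at this cut-off
    j = sens + spec - 1.0             # Youden index
    if j > best_j:
        best_cut, best_j = cut, j
```

    With equal spreads the optimum lands near the midpoint of the two group means; skewed or unequal-variance data shift it, which is why the study reports parameter-specific cut-offs.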

  5. Effect of non-linearity of a predictor on the shape and magnitude of its receiver-operating-characteristic curve in predicting a binary outcome.

    Science.gov (United States)

    Ho, Kwok M

    2017-08-31

    Area under a receiver-operating-characteristic (AUROC) curve is widely used in medicine to summarize the ability of a continuous predictive marker to predict a binary outcome. This study illustrated how a U-shaped or inverted U-shaped continuous predictor would affect the shape and magnitude of its ROC curve in predicting a binary outcome by comparing the ROC curves of the worst first-24-hour arterial pH values of 9549 consecutive critically ill patients in predicting hospital mortality before and after centering the predictor by its mean or median. A simulation dataset with an inverted U-shaped predictor was used to assess how this would affect the shape and magnitude of the AUROC curve. An asymmetrical U-shaped relationship between pH and hospital mortality, resulting in an inverse-sigmoidal ROC curve, was observed. The AUROC substantially increased after centering the predictor by its mean (0.611 vs 0.722, difference = 0.111, 95% confidence interval [CI] 0.087-0.135), and was further improved after centering by its median (0.611 vs 0.745, difference = 0.133, 95% CI 0.110-0.157). A sigmoidal-shaped ROC curve was observed for an inverted U-shaped predictor. In summary, a non-linear predictor can result in a biphasic-shaped ROC curve, and centering the predictor can reduce its bias towards null predictive ability.
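
    Since the AUROC is invariant under monotone transforms, shifting a predictor by its mean or median only helps if the centered values are then folded, i.e. the absolute deviation is used as the score. The sketch below assumes that interpretation and uses entirely simulated data (the risk model and its coefficients are invented): a symmetric U-shaped predictor yields an AUROC near 0.5 raw, and a much larger AUROC after folding around the median.

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical U-shaped relationship: risk is high at both extremes of x.
x = rng.normal(0.0, 1.0, 5000)
p_death = 1 / (1 + np.exp(-(3 * np.abs(x) - 3)))  # assumed risk model
died = rng.random(5000) < p_death

def auroc(score, outcome):
    # Empirical AUROC: probability a random positive outscores a random negative.
    pos, neg = score[outcome], score[~outcome]
    return np.mean(pos[:, None] > neg[None, :])

raw = auroc(x, died)                               # near 0.5: U-shape hides the signal
folded = auroc(np.abs(x - np.median(x)), died)     # absolute deviation from the median
```

    The raw score produces the biphasic ROC curve the article describes; the folded score restores a monotone stressor-outcome relationship and hence a conventional concave curve.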

  6. Probabilistic Inferences in Bayesian Networks

    OpenAIRE

    Ding, Jianguo

    2010-01-01

    This chapter summarizes the popular inference methods in Bayesian networks. The results demonstrate that evidence can be propagated across a Bayesian network along any links, whether in a forward, backward, or intercausal style. The belief updating of Bayesian networks can be obtained by various available inference techniques. Theoretically, exact inference in Bayesian networks is feasible and manageable. However, the computation involved is NP-hard. That means, in applications, in ...
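
    Backward ("diagnostic") propagation of evidence, as described above, can be demonstrated with exact inference by enumeration in a toy two-parent network. The structure and probabilities are invented for illustration:

```python
# Toy network: Rain -> WetGrass <- Sprinkler. Observing WetGrass updates
# belief about Rain, i.e. evidence propagates backward along the link.
P_rain = {True: 0.2, False: 0.8}
P_sprinkler = {True: 0.3, False: 0.7}
P_wet = {  # P(WetGrass=True | Rain, Sprinkler)
    (True, True): 0.99, (True, False): 0.9,
    (False, True): 0.8, (False, False): 0.05,
}

def posterior_rain_given_wet():
    # Exact inference by enumeration: sum the joint over all parent settings.
    num = den = 0.0
    for r in (True, False):
        for s in (True, False):
            joint = P_rain[r] * P_sprinkler[s] * P_wet[(r, s)]
            den += joint
            if r:
                num += joint
    return num / den
```

    Enumeration is exponential in the number of variables, which is the practical face of the NP-hardness noted in the record; algorithms such as variable elimination and junction trees exploit network structure to do better.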

  7. Bayesian Exploratory Factor Analysis

    DEFF Research Database (Denmark)

    Conti, Gabriella; Frühwirth-Schnatter, Sylvia; Heckman, James J.

    2014-01-01

    This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study confirms the validity of the approach. The method is used to produce interpretable low dimensional aggregates...

  8. Dynamic Bayesian network method for causal analysis between enterprise operation indexes

    Institute of Scientific and Technical Information of China (English)

    高瑞; 王双成; 杜瑞杰

    2016-01-01

    Existing methods for analyzing enterprise operation indexes emphasize either dynamic or static information and do not readily combine the two. To address this, a dynamic Bayesian network model for causal analysis among enterprise operation indexes was established. The model integrates the dynamic time-series causal relationships between time slices with the static causal relationships within each time slice, and performs both dynamic and static causal analysis through quantitative inference, without assuming linear causal relationships. Discussions with domain experts confirmed that the resulting network reflects the causal relationships contained in the data well.

  9. Bayesian methods for hackers probabilistic programming and Bayesian inference

    CERN Document Server

    Davidson-Pilon, Cameron

    2016-01-01

    Bayesian methods of inference are deeply natural and extremely powerful. However, most discussions of Bayesian inference rely on intensely complex mathematical analyses and artificial examples, making the subject inaccessible to anyone without a strong mathematical background. Now, though, Cameron Davidson-Pilon introduces Bayesian inference from a computational perspective, bridging theory to practice, freeing you to get results using computing power. Bayesian Methods for Hackers illuminates Bayesian inference through probabilistic programming with the powerful PyMC language and the closely related Python tools NumPy, SciPy, and Matplotlib. Using this approach, you can reach effective solutions in small increments, without extensive mathematical intervention. Davidson-Pilon begins by introducing the concepts underlying Bayesian inference, comparing it with other techniques and guiding you through building and training your first Bayesian model. Next, he introduces PyMC through a series of detailed examples a...

  10. The Bayesian Inventory Problem

    Science.gov (United States)

    1984-05-01

    Bayesian Approach to Demand Estimation and Inventory Provisioning," Naval Research Logistics Quarterly, Vol. 20, 1973, (p. 607-624). [4] DeGroot, Morris H... APPENDIX A: SUFFICIENT STATISTICS. A convenient reference for most of this material is DeGroot [4]. Suppose that we are sampling from a

  11. Non-linear Bayesian update of PCE coefficients

    KAUST Repository

    Litvinenko, Alexander

    2014-01-06

    Given: a physical system modeled by a PDE or ODE with uncertain coefficient q(ω), and a measurement operator Y(u(q), q), where u(q, ω) is the uncertain solution. Aim: to identify q(ω). The mapping from parameters to observations is usually not invertible, hence this inverse identification problem is generally ill-posed. To identify q(ω) we derived a non-linear Bayesian update from the variational problem associated with the conditional expectation. To reduce the cost of the Bayesian update we offer a functional approximation, e.g. a polynomial chaos expansion (PCE). New: we apply the Bayesian update to the PCE coefficients of the random coefficient q(ω) (not to the probability density function of q).
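
    The conditional-expectation update described in this record generalizes the familiar linear (Kalman-type) update, which can be sketched for a scalar parameter with a sampled ensemble. All numbers below are hypothetical and the linear gain stands in for the non-linear PCE-based update:

```python
import numpy as np

rng = np.random.default_rng(2)
# Linear Bayesian update of an uncertain scalar q from one noisy measurement.
q_true = 1.5
n = 100_000
q_prior = rng.normal(1.0, 0.5, n)                # prior ensemble for q
y_obs = q_true + 0.1 * rng.standard_normal()     # measurement y = q + noise
y_fore = q_prior + 0.1 * rng.standard_normal(n)  # forecast of the measurement

K = np.cov(q_prior, y_fore)[0, 1] / np.var(y_fore)  # gain from ensemble statistics
q_post = q_prior + K * (y_obs - y_fore)             # update every ensemble member
# The posterior mean moves toward the truth and the ensemble spread shrinks.
```

    In the record's setting the same update is applied not to samples of q but to its PCE coefficients, which keeps the posterior as a cheap functional surrogate.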

  12. Robust Medical Test Evaluation Using Flexible Bayesian Semiparametric Regression Models

    Directory of Open Access Journals (Sweden)

    Adam J. Branscum

    2013-01-01

    Full Text Available The application of Bayesian methods is increasing in modern epidemiology. Although parametric Bayesian analysis has penetrated the population health sciences, flexible nonparametric Bayesian methods have received less attention. A goal in nonparametric Bayesian analysis is to estimate unknown functions (e.g., density or distribution functions) rather than scalar parameters (e.g., means or proportions). For instance, ROC curves are obtained from the distribution functions corresponding to continuous biomarker data taken from healthy and diseased populations. Standard parametric approaches to Bayesian analysis involve distributions with a small number of parameters, where the prior specification is relatively straightforward. In the nonparametric Bayesian case, the prior is placed on an infinite-dimensional space of all distributions, which requires special methods. A popular approach to nonparametric Bayesian analysis that involves Polya tree prior distributions is described. We provide example code to illustrate how models that contain Polya tree priors can be fit using SAS software. The methods are used to evaluate the covariate-specific accuracy of the biomarker, soluble epidermal growth factor receptor, for discerning lung cancer cases from controls using a flexible ROC regression modeling framework. The application highlights the usefulness of flexible models over a standard parametric method for estimating ROC curves.

  13. A receiver operated curve-based evaluation of change in sensitivity and specificity of cotinine urinalysis for detecting active tobacco use

    Directory of Open Access Journals (Sweden)

    Yatan Pal Singh Balhara

    2013-01-01

    Full Text Available Background: Tobacco use has been associated with various carcinomas, including lung, esophagus, larynx, mouth, throat, kidney, bladder, pancreas, stomach, and cervix. Biomarkers such as the concentration of cotinine in the blood, urine, or saliva have been used as objective measures to distinguish nonusers and users of tobacco products. A change in the cut-off value of urinary cotinine to detect active tobacco use is associated with a change in the sensitivity and specificity of detection. Aim: The current study aimed at assessing the impact of using different cut-off thresholds of urinary cotinine on the sensitivity and specificity of detection of smoking and smokeless tobacco product use among psychiatric patients. Settings and Design: All the male subjects attending the psychiatry out-patient department of a tertiary care multispecialty teaching hospital constituted the sample frame for this cross-sectional study. Materials and Methods: Quantitative urinary cotinine assay was done by using ELISA kits from Calbiotech Inc., USA. We used the receiver operating characteristic (ROC) curve to assess the sensitivity and specificity of various cut-off values of urinary cotinine to identify active smokers and users of smokeless tobacco products. Results: ROC analysis of urinary cotinine levels in the detection of self-reported smoking provided an area under the curve (AUC) of 0.434. Similarly, the ROC analysis of urinary cotinine levels in the detection of self-reported smokeless tobacco use revealed an AUC of 0.44. The highest sensitivity and specificity of 100% for smoking were detected at a urinary cut-off value greater than or equal to 2.47 ng/ml. Conclusions: The choice of the cut-off value of urinary cotinine used to distinguish nonusers from active users of tobacco products impacts the sensitivity as well as the specificity of detection.

  14. Delineating a Retesting Zone Using Receiver Operating Characteristic Analysis on Serial QuantiFERON Tuberculosis Test Results in US Healthcare Workers

    Directory of Open Access Journals (Sweden)

    Wendy Thanassi

    2012-01-01

    Full Text Available Objective. To find a statistically significant separation point for the QuantiFERON Gold In-Tube (QFT) interferon gamma release assay that could define an optimal “retesting zone” for use in serially tested low-risk populations who have test “reversions” from initially positive to subsequently negative results. Method. Using receiver operating characteristic analysis (ROC) to analyze retrospective data collected from 3 major hospitals, we searched for predictors of reversion until statistically significant separation points were revealed. A confirmatory regression analysis was performed on an additional sample. Results. In 575 initially positive US healthcare workers (HCWs), 300 (52.2%) had reversions, while 275 (47.8%) had two sequential positive tests. The most statistically significant (Kappa = 0.48, chi-square = 131.0, P<0.001) separation point identified by the ROC for predicting reversion was the tuberculosis antigen minus-nil (TBag-nil) value at 1.11 International Units per milliliter (IU/mL). The second separation point was found at TBag-nil at 0.72 IU/mL (Kappa = 0.16, chi-square = 8.2, P<0.01). The model was validated by the regression analysis of 287 HCWs. Conclusion. Reversion likelihood increases as the TBag-nil approaches the manufacturer's cut-point of 0.35 IU/mL. The most statistically significant separation point between those who test repeatedly positive and those who revert is 1.11 IU/mL. Clinicians should retest low-risk individuals with initial QFT results < 1.11 IU/mL.

  15. Evaluation of the image quality of ink-jet printed paper copies of digital chest radiographs as compared with film: a receiver operating characteristic study.

    Science.gov (United States)

    Lyttkens, K; Kirkhorn, T; Kehler, M; Andersson, B; Ebbesen, A; Hochbergs, P; Jarlman, O; Lindberg, C G; Holmer, N G

    1994-05-01

    Paper copies of digital radiographs printed with the continuous ink-jet technique have proved to be of a high enough quality for demonstration purposes. We present a study on the image quality of ink-jet printed paper copies of digital chest radiographs, based on receiver operating characteristic (ROC) analysis. Eighty-three digital radiographs of a chest phantom with simulated tumors in the mediastinum and right lung, derived from a computed radiography (CR) system, were presented in two series of hard copies as ink-jet printed paper copies and as laser recorded film. The images, with a matrix of 1,760 x 2,140 pixels, were printed with a spatial resolution of 10 pixels/mm in the CR film recorder as well as in the ink-jet printer. On film, every image was recorded in two versions, one optimized for the mediastinum and one for the lungs. On paper, only one image was printed; this constituted an effort to optimize both the mediastinum and the lungs. The ink-jet printed images, printed on a matt coated paper, were viewed as on-sight images with reflected light. The examinations were reviewed by six radiologists, and ROC curves were constructed. No significant difference was found between the performance of film and that of ink-jet paper prints. Because the cost for a paper copy is only a tenth of that of film, remarkable cost reductions can be achieved by using the ink-jet technique instead. Our results show that further quality studies of ink-jet printed images are worthwhile.

  16. Reliability of overbite depth indicator (ODI) and anteroposterior dysplasia indicator (APDI) in the assessment of different vertical and sagittal dental malocclusions: a receiver operating characteristic (ROC) analysis

    Directory of Open Access Journals (Sweden)

    Farheen Fatima

    Full Text Available ABSTRACT Introduction: Differential diagnosis of skeletal and dental relationships is crucial for planning orthodontic treatment. Overbite depth indicator (ODI) and anteroposterior dysplasia indicator (APDI) were introduced in the past for the assessment of vertical and sagittal jaw relationships, respectively. Objective: The objectives of this study were to evaluate the reliability of ODI and APDI in overbite and Angle malocclusions, as well as to assess their diagnostic reliability among males and females of different age groups. Material and Methods: This study was conducted using pretreatment dental casts and lateral cephalograms of 90 subjects. For ODI, subjects were divided into three groups based on overbite (normal overbite, open bite and deep bite). Likewise, the same subjects were divided for APDI into three groups, based on Angle's malocclusion classification (dental Class I, II and III malocclusions). The Mann-Whitney U test was applied for comparison of study parameters regarding sex and different age groups. The mean values of ODI and APDI were compared among study groups by means of Kruskal-Wallis and post-hoc Dunnett T3 tests. The receiver operating characteristic (ROC) curve was applied to test diagnostic reliability. Results: No significant differences were found for ODI and APDI angles with regard to sex and age. Significant intergroup differences were found in different overbite groups and Angle's classification for ODI and APDI, respectively (p < 0.001). ROC showed 91% and 88% constancy with dental pattern in ODI and APDI, respectively. Conclusions: ODI can reliably differentiate deep bite versus normal overbite and deep bite versus open bite. APDI can reliably differentiate dental Class I, II and III malocclusions.

  17. Supporting scalable Bayesian networks using configurable discretizer actuators

    CSIR Research Space (South Africa)

    Osunmakinde, I

    2009-04-01

    Full Text Available Bayesian Networks using Configurable Discretizer Actuators Isaac Osunmakinde, SMIEEE and Antoine Bagula Department of Computer Science, Faculty of Sciences, University of Cape Town, 18 University Avenue, Rhodes Gift, 7707 Rondebosch, Cape Town, South... and limited memory space may crash during these operations. This affects business and research deliveries, and may hinder the growing usage of Bayesian networks in industries that keep massive datasets to build intelligent systems. From our practical...

  18. On Bayesian System Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Soerensen Ringi, M.

    1995-05-01

    The view taken in this thesis is that reliability, the probability that a system will perform a required function for a stated period of time, depends on a person's state of knowledge. Reliability changes as this state of knowledge changes, i.e. when new relevant information becomes available. Most existing models for system reliability prediction are developed in a classical framework of probability theory and they overlook some information that is always present. Probability is just an analytical tool to handle uncertainty, based on judgement and subjective opinions. It is argued that the Bayesian approach gives a much more comprehensive understanding of the foundations of probability than the so-called frequentistic school. A new model for system reliability prediction is given in two papers. The model encloses the fact that component failures are dependent because of a shared operational environment. The suggested model also naturally permits learning from failure data of similar components in non-identical environments. 85 refs.

  19. Quantum Bayesianism at the Perimeter

    CERN Document Server

    Fuchs, Christopher A

    2010-01-01

    The author summarizes the Quantum Bayesian viewpoint of quantum mechanics, developed originally by C. M. Caves, R. Schack, and himself. It is a view crucially dependent upon the tools of quantum information theory. Work at the Perimeter Institute for Theoretical Physics continues the development and is focused on the hard technical problem of finding a good representation of quantum mechanics purely in terms of probabilities, without amplitudes or Hilbert-space operators. The best candidate representation involves a mysterious entity called a symmetric informationally complete quantum measurement. Contemplation of it gives a way of thinking of the Born Rule as an addition to the rules of probability theory, applicable when one gambles on the consequences of interactions with physical systems. The article ends by outlining some directions for future work.

  20. Bayesian Inference in Queueing Networks

    CERN Document Server

    Sutton, Charles

    2010-01-01

    Modern Web services, such as those at Google, Yahoo!, and Amazon, handle billions of requests per day on clusters of thousands of computers. Because these services operate under strict performance requirements, a statistical understanding of their performance is of great practical interest. Such services are modeled by networks of queues, where one queue models each of the individual computers in the system. A key challenge is that the data is incomplete, because recording detailed information about every request to a heavily used system can require unacceptable overhead. In this paper we develop a Bayesian perspective on queueing models in which the arrival and departure times that are not observed are treated as latent variables. Underlying this viewpoint is the observation that a queueing model defines a deterministic transformation between the data and a set of independent variables called the service times. With this viewpoint in hand, we sample from the posterior distribution over missing data and model...

  1. Narrowband interference parameterization for sparse Bayesian recovery

    KAUST Repository

    Ali, Anum

    2015-09-11

    This paper addresses the problem of narrowband interference (NBI) in SC-FDMA systems by using tools from compressed sensing and stochastic geometry. The proposed NBI cancellation scheme exploits the frequency domain sparsity of the unknown signal and adopts a Bayesian sparse recovery procedure. This is done by keeping a few randomly chosen sub-carriers data free to sense the NBI signal at the receiver. As Bayesian recovery requires knowledge of some NBI parameters (i.e., mean, variance and sparsity rate), we use tools from stochastic geometry to obtain analytical expressions for the required parameters. Our simulation results validate the analysis and depict suitability of the proposed recovery method for NBI mitigation. © 2015 IEEE.

  2. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics", which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985, the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure-theoretical considerations, contributions to theoretical statistics an...

  3. Bayesian community detection

    DEFF Research Database (Denmark)

    Mørup, Morten; Schmidt, Mikkel N

    2012-01-01

    Many networks of scientific interest naturally decompose into clusters or communities with comparatively fewer external than internal links; however, current Bayesian models of network communities do not impose this intuitive notion of communities. We formulate a nonparametric Bayesian model for community detection consistent with an intuitive definition of communities and present a Markov chain Monte Carlo procedure for inferring the community structure. A Matlab toolbox with the proposed inference procedure is available for download. On synthetic and real networks, our model detects communities consistent with ground truth, and on real networks, it outperforms existing approaches in predicting missing links. This suggests that community structure is an important structural property of networks that should be explicitly modeled.

  4. Introduction to Bayesian statistics

    CERN Document Server

    Koch, Karl-Rudolf

    2007-01-01

    This book presents Bayes' theorem, the estimation of unknown parameters, the determination of confidence regions and the derivation of tests of hypotheses for the unknown parameters. It does so in a simple manner that is easy to comprehend. The book compares traditional and Bayesian methods with the rules of probability presented in a logical way allowing an intuitive understanding of random variables and their probability distributions to be formed.

  5. Bayesian Watermark Attacks

    OpenAIRE

    Shterev, Ivo; Dunson, David

    2012-01-01

    This paper presents an application of statistical machine learning to the field of watermarking. We propose a new attack model on additive spread-spectrum watermarking systems. The proposed attack is based on Bayesian statistics. We consider the scenario in which a watermark signal is repeatedly embedded in specific segments (signals) of the host data, possibly chosen based on a secret message bitstream. The host signal can represent a patch of pixels from an image or a video frame. We propo...

  6. Subjective Bayesian Beliefs

    DEFF Research Database (Denmark)

    Antoniou, Constantinos; Harrison, Glenn W.; Lau, Morten I.

    2015-01-01

    A large literature suggests that many individuals do not apply Bayes’ Rule when making decisions that depend on them correctly pooling prior information and sample data. We replicate and extend a classic experimental study of Bayesian updating from psychology, employing the methods of experimental economics, with careful controls for the confounding effects of risk aversion. Our results show that risk aversion significantly alters inferences on deviations from Bayes’ Rule.

  7. Reduced Bayesian Inversion

    OpenAIRE

    Himpe, Christian; Ohlberger, Mario

    2014-01-01

    Bayesian inversion of models with large state and parameter spaces proves to be computationally complex. A combined state and parameter reduction can significantly decrease the computational time and cost required for the parameter estimation. The presented technique is based on the well-known balanced truncation approach. Classically, the balancing of the controllability and observability gramians allows a truncation of discardable states. Here the underlying model, being a linear or nonline...

  8. Bayesian Independent Component Analysis

    DEFF Research Database (Denmark)

    Winther, Ole; Petersen, Kaare Brandt

    2007-01-01

    In this paper we present an empirical Bayesian framework for independent component analysis. The framework provides estimates of the sources, the mixing matrix and the noise parameters, and is flexible with respect to choice of source prior and the number of sources and sensors. Inside the engine...... in a Matlab toolbox, is demonstrated for non-negative decompositions and compared with non-negative matrix factorization....

  9. Bayesian Methods for Analyzing Structural Equation Models with Covariates, Interaction, and Quadratic Latent Variables

    Science.gov (United States)

    Lee, Sik-Yum; Song, Xin-Yuan; Tang, Nian-Sheng

    2007-01-01

    The analysis of interaction among latent variables has received much attention. This article introduces a Bayesian approach to analyze a general structural equation model that accommodates the general nonlinear terms of latent variables and covariates. This approach produces a Bayesian estimate that has the same statistically optimal properties as a…

  10. Bayesian theory and applications

    CERN Document Server

    Dellaportas, Petros; Polson, Nicholas G; Stephens, David A

    2013-01-01

    The development of hierarchical models and Markov chain Monte Carlo (MCMC) techniques forms one of the most profound advances in Bayesian analysis since the 1970s and provides the basis for advances in virtually all areas of applied and theoretical Bayesian statistics. This volume guides the reader along a statistical journey that begins with the basic structure of Bayesian theory, and then provides details on most of the past and present advances in this field. The book has a unique format. There is an explanatory chapter devoted to each conceptual advance followed by journal-style chapters that provide applications or further advances on the concept. Thus, the volume is both a textbook and a compendium of papers covering a vast range of topics. It is appropriate for a well-informed novice interested in understanding the basic approach, methods and recent applications. Because of its advanced chapters and recent work, it is also appropriate for a more mature reader interested in recent applications and devel...

  11. Wideband CMOS receivers

    CERN Document Server

    Oliveira, Luis

    2015-01-01

    This book demonstrates how to design a wideband receiver operating in current mode, in which the noise and non-linearity are reduced, implemented in a low cost single chip, using standard CMOS technology.  The authors present a solution to remove the transimpedance amplifier (TIA) block and connect directly the mixer’s output to a passive second-order continuous-time Σ∆ analog to digital converter (ADC), which operates in current-mode. These techniques enable the reduction of area, power consumption, and cost in modern CMOS receivers.

  12. An Application of Bayesian Approach in Modeling Risk of Death in an Intensive Care Unit.

    Directory of Open Access Journals (Sweden)

    Rowena Syn Yin Wong

    Full Text Available There are not many studies that attempt to model intensive care unit (ICU) risk of death in developing countries, especially in South East Asia. The aim of this study was to propose and describe the application of a Bayesian approach in modeling in-ICU deaths in a Malaysian ICU. This was a prospective study in a mixed medical-surgery ICU in a multidisciplinary tertiary referral hospital in Malaysia. Data collection included variables that were defined in the Acute Physiology and Chronic Health Evaluation IV (APACHE IV) model. A Bayesian Markov Chain Monte Carlo (MCMC) simulation approach was applied in the development of four multivariate logistic regression predictive models for the ICU, where the main outcome measure was in-ICU mortality risk. The performance of the models was assessed through overall model fit, discrimination and calibration measures. Results from the Bayesian models were also compared against results obtained using the frequentist maximum likelihood method. The study involved 1,286 consecutive ICU admissions between January 1, 2009 and June 30, 2010, of which 1,111 met the inclusion criteria. Patients who were admitted to the ICU were generally younger, predominantly male, with low co-morbidity load and mostly under mechanical ventilation. The overall in-ICU mortality rate was 18.5% and the overall mean Acute Physiology Score (APS) was 68.5. All four models exhibited good discrimination, with area under receiver operating characteristic curve (AUC) values of approximately 0.8. Calibration was acceptable (Hosmer-Lemeshow p-values > 0.05) for all models, except for model M3. Model M1 was identified as the model with the best overall performance in this study. Four prediction models were proposed, where the best model was chosen based on its overall performance in this study. This study has also demonstrated the promising potential of the Bayesian MCMC approach as an alternative in the analysis and modeling of in-ICU mortality outcomes.
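
As a hedged illustration of the Bayesian MCMC idea this record applies (not the authors' APACHE-based model), here is a minimal random-walk Metropolis sampler for a one-predictor logistic regression on invented data, with an assumed Normal(0, 5) prior on both coefficients.

```python
import math
import random

def mh_logistic(x, y, n_iter=4000, step=0.4, seed=7):
    """Random-walk Metropolis sampler for a one-predictor logistic model,
    with independent Normal(0, 5) priors on intercept a and slope b."""
    rng = random.Random(seed)

    def log_post(a, b):
        lp = -(a * a + b * b) / (2 * 5.0 ** 2)           # log prior (up to a constant)
        for xi, yi in zip(x, y):
            eta = a + b * xi
            lp += yi * eta - math.log1p(math.exp(eta))   # Bernoulli-logit log likelihood
        return lp

    a = b = 0.0
    cur = log_post(a, b)
    draws = []
    for _ in range(n_iter):
        a_p, b_p = a + rng.gauss(0, step), b + rng.gauss(0, step)
        prop = log_post(a_p, b_p)
        if math.log(rng.random()) < prop - cur:          # Metropolis accept step
            a, b, cur = a_p, b_p, prop
        draws.append((a, b))
    return draws

# Invented toy data: the outcome becomes more likely as x grows.
x = [-3, -2, -1, 0, 1, 2, 3]
y = [0, 0, 0, 1, 1, 1, 1]
draws = mh_logistic(x, y)
slope_mean = sum(b for _, b in draws[1000:]) / len(draws[1000:])  # posterior mean after burn-in
```

The study's models differ in predictors and scale, but the mechanics are the same: sample the coefficient posterior, then summarize it, rather than relying on a single maximum likelihood point estimate.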

  13. An Application of Bayesian Approach in Modeling Risk of Death in an Intensive Care Unit

    Science.gov (United States)

    Wong, Rowena Syn Yin; Ismail, Noor Azina

    2016-01-01

    Background and Objectives There are not many studies that attempt to model intensive care unit (ICU) risk of death in developing countries, especially in South East Asia. The aim of this study was to propose and describe the application of a Bayesian approach in modeling in-ICU deaths in a Malaysian ICU. Methods This was a prospective study in a mixed medical-surgery ICU in a multidisciplinary tertiary referral hospital in Malaysia. Data collection included variables that were defined in the Acute Physiology and Chronic Health Evaluation IV (APACHE IV) model. A Bayesian Markov Chain Monte Carlo (MCMC) simulation approach was applied in the development of four multivariate logistic regression predictive models for the ICU, where the main outcome measure was in-ICU mortality risk. The performance of the models was assessed through overall model fit, discrimination and calibration measures. Results from the Bayesian models were also compared against results obtained using the frequentist maximum likelihood method. Results The study involved 1,286 consecutive ICU admissions between January 1, 2009 and June 30, 2010, of which 1,111 met the inclusion criteria. Patients who were admitted to the ICU were generally younger, predominantly male, with low co-morbidity load and mostly under mechanical ventilation. The overall in-ICU mortality rate was 18.5% and the overall mean Acute Physiology Score (APS) was 68.5. All four models exhibited good discrimination, with area under receiver operating characteristic curve (AUC) values of approximately 0.8. Calibration was acceptable (Hosmer-Lemeshow p-values > 0.05) for all models, except for model M3. Model M1 was identified as the model with the best overall performance in this study. Conclusion Four prediction models were proposed, where the best model was chosen based on its overall performance in this study. This study has also demonstrated the promising potential of the Bayesian MCMC approach as an alternative in the analysis and modeling of in-ICU mortality outcomes.

  14. The NIFTY way of Bayesian signal inference

    Energy Technology Data Exchange (ETDEWEB)

    Selig, Marco, E-mail: mselig@mpa-Garching.mpg.de [Max Planck Institut für Astrophysik, Karl-Schwarzschild-Straße 1, D-85748 Garching, Germany, and Ludwig-Maximilians-Universität München, Geschwister-Scholl-Platz 1, D-80539 München (Germany)

    2014-12-05

    We introduce NIFTY, 'Numerical Information Field Theory', a software package for the development of Bayesian signal inference algorithms that operate independently from any underlying spatial grid and its resolution. A large number of Bayesian and Maximum Entropy methods for 1D signal reconstruction, 2D imaging, as well as 3D tomography, appear formally similar, but one often finds individualized implementations that are neither flexible nor easily transferable. Signal inference in the framework of NIFTY can be done in an abstract way, such that algorithms, prototyped in 1D, can be applied to real world problems in higher-dimensional settings. NIFTY as a versatile library is applicable and already has been applied in 1D, 2D, 3D and spherical settings. A recent application is the D³PO algorithm targeting the non-trivial task of denoising, deconvolving, and decomposing photon observations in high energy astronomy.

  15. Quantum-like Representation of Bayesian Updating

    Science.gov (United States)

    Asano, Masanari; Ohya, Masanori; Tanaka, Yoshiharu; Khrennikov, Andrei; Basieva, Irina

    2011-03-01

    Recently, applications of quantum mechanics to cognitive psychology have been discussed, see [1]-[11]. It was known that statistical data obtained in some experiments of cognitive psychology cannot be described by the classical probability model (Kolmogorov's model) [12]-[15]. Quantum probability is one of the most advanced mathematical models for non-classical probability. In the paper [11], we proposed a quantum-like model describing the decision-making process in a two-player game, where we used the generalized quantum formalism based on lifting of density operators [16]. In this paper, we discuss the quantum-like representation of Bayesian inference, which has been used to calculate probabilities for decision making under uncertainty. The uncertainty is described in the form of quantum superposition, and Bayesian updating is explained as a reduction of state by quantum measurement.
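
For reference, the classical Bayesian updating that this quantum-like model generalizes is just prior times likelihood, renormalized; a minimal sketch over a discrete hypothesis set:

```python
def bayes_update(prior, likelihood):
    """Classical Bayesian updating: posterior ∝ prior × likelihood."""
    unnorm = [p * l for p, l in zip(prior, likelihood)]
    z = sum(unnorm)
    return [u / z for u in unnorm]

# Two hypotheses about a coin: fair (P(heads)=0.5) vs. heads-biased (P(heads)=0.8).
# After observing one head, the biased hypothesis gains probability.
posterior = bayes_update([0.5, 0.5], [0.5, 0.8])
print([round(p, 3) for p in posterior])  # → [0.385, 0.615]
```

The quantum-like representation replaces this multiplicative update by a state reduction under quantum measurement, which is what allows it to fit data that violate the classical model.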

  16. QBism, the Perimeter of Quantum Bayesianism

    CERN Document Server

    Fuchs, Christopher A

    2010-01-01

    This article summarizes the Quantum Bayesian point of view of quantum mechanics, with special emphasis on the view's outer edges---dubbed QBism. QBism has its roots in personalist Bayesian probability theory, is crucially dependent upon the tools of quantum information theory, and most recently, has set out to investigate whether the physical world might be of a type sketched by some false-started philosophies of 100 years ago (pragmatism, pluralism, nonreductionism, and meliorism). Beyond conceptual issues, work at Perimeter Institute is focused on the hard technical problem of finding a good representation of quantum mechanics purely in terms of probabilities, without amplitudes or Hilbert-space operators. The best candidate representation involves a mysterious entity called a symmetric informationally complete quantum measurement. Contemplation of it gives a way of thinking of the Born Rule as an addition to the rules of probability theory, applicable when an agent considers gambling on the consequences of...

  17. Software Health Management with Bayesian Networks

    Science.gov (United States)

    Mengshoel, Ole; Schumann, JOhann

    2011-01-01

    Most modern aircraft, as well as other complex machinery, are equipped with diagnostic systems for their major subsystems. During operation, sensors provide important information about the subsystem (e.g., the engine) and that information is used to detect and diagnose faults. Most of these systems focus on the monitoring of a mechanical, hydraulic, or electromechanical subsystem of the vehicle or machinery. Only recently have health management systems that monitor software been developed. In this paper, we will discuss our approach of using Bayesian networks for Software Health Management (SWHM). We will discuss SWHM requirements, which make advanced reasoning capabilities for the detection and diagnosis important. Then we will present our approach to using Bayesian networks for the construction of health models that dynamically monitor a software system and are capable of detecting and diagnosing faults.
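
As a hedged sketch of the kind of inference such a health model performs (the probability tables below are invented, and real SWHM networks are far larger), consider a software fault F observed through two monitors, an error-log anomaly L and a latency spike T:

```python
# Tiny two-monitor fault network with invented conditional probabilities.
P_F = 0.01                          # prior probability of a software fault
P_L = {True: 0.90, False: 0.05}     # P(log anomaly | fault state)
P_T = {True: 0.70, False: 0.10}     # P(latency spike | fault state)

def posterior_fault(l_obs, t_obs):
    """P(F | L=l_obs, T=t_obs) by enumeration over the fault state,
    with the monitors conditionally independent given F."""
    def lik(f):
        pl = P_L[f] if l_obs else 1 - P_L[f]
        pt = P_T[f] if t_obs else 1 - P_T[f]
        return pl * pt
    num = P_F * lik(True)
    den = num + (1 - P_F) * lik(False)
    return num / den

print(round(posterior_fault(True, True), 3))   # both monitors firing → fault likely
print(round(posterior_fault(False, False), 5)) # both quiet → fault very unlikely
```

A deployed health model runs this kind of posterior computation continuously as monitor readings arrive, which is what "dynamically monitoring" the software amounts to.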

  18. Bayesian nonparametric data analysis

    CERN Document Server

    Müller, Peter; Jara, Alejandro; Hanson, Tim

    2015-01-01

    This book reviews nonparametric Bayesian methods and models that have proven useful in the context of data analysis. Rather than providing an encyclopedic review of probability models, the book’s structure follows a data analysis perspective. As such, the chapters are organized by traditional data analysis problems. In selecting specific nonparametric models, simpler and more traditional models are favored over specialized ones. The discussed methods are illustrated with a wealth of examples, including applications ranging from stylized examples to case studies from recent literature. The book also includes an extensive discussion of computational methods and details on their implementation. R code for many examples is included in on-line software pages.

  19. Applied Bayesian modelling

    CERN Document Server

    Congdon, Peter

    2014-01-01

    This book provides an accessible approach to Bayesian computing and data analysis, with an emphasis on the interpretation of real data sets. Following in the tradition of the successful first edition, this book aims to make a wide range of statistical modeling applications accessible using tested code that can be readily adapted to the reader's own applications. The second edition has been thoroughly reworked and updated to take account of advances in the field. A new set of worked examples is included. The novel aspect of the first edition was the coverage of statistical modeling using WinBU

  20. Bayesian grid matching

    DEFF Research Database (Denmark)

    Hartelius, Karsten; Carstensen, Jens Michael

    2003-01-01

    A method for locating distorted grid structures in images is presented. The method is based on the theories of template matching and Bayesian image restoration. The grid is modeled as a deformable template. Prior knowledge of the grid is described through a Markov random field (MRF) model which...... nodes and the arc prior models variations in row and column spacing across the grid. Grid matching is done by placing an initial rough grid over the image and applying an ensemble annealing scheme to maximize the posterior distribution of the grid. The method can be applied to noisy images with missing...

  1. Implementation of an Adaptive Learning System Using a Bayesian Network

    Science.gov (United States)

    Yasuda, Keiji; Kawashima, Hiroyuki; Hata, Yoko; Kimura, Hiroaki

    2015-01-01

    An adaptive learning system is proposed that incorporates a Bayesian network to efficiently gauge learners' understanding at the course-unit level. Also, learners receive content that is adapted to their measured level of understanding. The system works on an iPad via the Edmodo platform. A field experiment using the system in an elementary school…

  2. Bayesian Approach for Inconsistent Information.

    Science.gov (United States)

    Stein, M; Beer, M; Kreinovich, V

    2013-10-01

    In engineering situations, we usually have a large amount of prior knowledge that needs to be taken into account when processing data. Traditionally, the Bayesian approach is used to process data in the presence of prior knowledge. Sometimes, when we apply the traditional Bayesian techniques to engineering data, we get inconsistencies between the data and prior knowledge. These inconsistencies are usually caused by the fact that in the traditional approach, we assume that we know the exact sample values, that the prior distribution is exactly known, etc. In reality, the data is imprecise due to measurement errors, the prior knowledge is only approximately known, etc. So, a natural way to deal with the seemingly inconsistent information is to take this imprecision into account in the Bayesian approach - e.g., by using fuzzy techniques. In this paper, we describe several possible scenarios for fuzzifying the Bayesian approach. Particular attention is paid to the interaction between the estimated imprecise parameters. In this paper, to implement the corresponding fuzzy versions of the Bayesian formulas, we use straightforward computations of the related expression - which makes our computations reasonably time-consuming. Computations in the traditional (non-fuzzy) Bayesian approach are much faster - because they use algorithmically efficient reformulations of the Bayesian formulas. We expect that similar reformulations of the fuzzy Bayesian formulas will also drastically decrease the computation time and thus, enhance the practical use of the proposed methods.

  3. Classification using Bayesian neural nets

    NARCIS (Netherlands)

    J.C. Bioch (Cor); O. van der Meer; R. Potharst (Rob)

    1995-01-01

    textabstractRecently, Bayesian methods have been proposed for neural networks to solve regression and classification problems. These methods claim to overcome some difficulties encountered in the standard approach such as overfitting. However, an implementation of the full Bayesian approach to neura

  4. Inference in hybrid Bayesian networks

    DEFF Research Database (Denmark)

    Langseth, Helge; Nielsen, Thomas Dyhre; Rumí, Rafael

    2009-01-01

    Since the 1980s, Bayesian Networks (BNs) have become increasingly popular for building statistical models of complex systems. This is particularly true for boolean systems, where BNs often prove to be a more efficient modelling framework than traditional reliability-techniques (like fault trees...... decade's research on inference in hybrid Bayesian networks. The discussions are linked to an example model for estimating human reliability....

  5. Interactive Instruction in Bayesian Inference

    DEFF Research Database (Denmark)

    Khan, Azam; Breslav, Simon; Hornbæk, Kasper

    2017-01-01

    An instructional approach is presented to improve human performance in solving Bayesian inference problems. Starting from the original text of the classic Mammography Problem, the textual expression is modified and visualizations are added according to Mayer’s principles of instruction...... that an instructional approach to improving human performance in Bayesian inference is a promising direction....

  6. Bayesian regression of piecewise homogeneous Poisson processes

    Directory of Open Access Journals (Sweden)

    Diego Sevilla

    2015-12-01

    Full Text Available In this paper, a Bayesian method for piecewise regression is adapted to handle counting-process data distributed as Poisson. A numerical code in Mathematica is developed and tested analyzing simulated data. The resulting method is valuable for detecting breaking points in the count rate of time series for Poisson processes. Received: 2 November 2015, Accepted: 27 November 2015; Edited by: R. Dickman; Reviewed by: M. Hutter, Australian National University, Canberra, Australia.; DOI: http://dx.doi.org/10.4279/PIP.070018 Cite as: D J R Sevilla, Papers in Physics 7, 070018 (2015).
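    The core idea, scoring each candidate breaking point by the marginal likelihood of the two resulting Poisson segments, can be sketched in a few lines of Python. This is a simplified single-breakpoint illustration with a conjugate Gamma(1, 1) prior on each segment rate, not the paper's Mathematica code; the data are invented:

```python
import math

def log_marginal(counts, a=1.0, b=1.0):
    """Log marginal likelihood of i.i.d. Poisson counts under a Gamma(a, b)
    prior on the rate (conjugate, so the integral has a closed form)."""
    s, m = sum(counts), len(counts)
    out = a * math.log(b) - math.lgamma(a)
    out += math.lgamma(a + s) - (a + s) * math.log(b + m)
    out -= sum(math.lgamma(x + 1) for x in counts)
    return out

def best_breakpoint(counts):
    """MAP breakpoint under a uniform prior over split positions."""
    best_t, best_lp = None, -float("inf")
    for t in range(1, len(counts)):
        lp = log_marginal(counts[:t]) + log_marginal(counts[t:])
        if lp > best_lp:
            best_t, best_lp = t, lp
    return best_t

data = [2, 3, 1, 2, 3, 2, 9, 11, 8, 10, 12, 9]  # count rate jumps after index 6
```

    Here `best_breakpoint(data)` recovers the jump at index 6; multiple breakpoints would require summing or searching over segmentations.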

  7. Bayesian inference with ecological applications

    CERN Document Server

    Link, William A

    2009-01-01

    This text is written to provide a mathematically sound but accessible and engaging introduction to Bayesian inference specifically for environmental scientists, ecologists and wildlife biologists. It emphasizes the power and usefulness of Bayesian methods in an ecological context. The advent of fast personal computers and easily available software has simplified the use of Bayesian and hierarchical models. One obstacle remains for ecologists and wildlife biologists, namely the near absence of Bayesian texts written specifically for them. The book includes many relevant examples, is supported by software and examples on a companion website and will become an essential grounding in this approach for students and research ecologists. Engagingly written text specifically designed to demystify a complex subject Examples drawn from ecology and wildlife research An essential grounding for graduate and research ecologists in the increasingly prevalent Bayesian approach to inference Companion website with analyt...

  8. Bayesian Inference on Gravitational Waves

    Directory of Open Access Journals (Sweden)

    Asad Ali

    2015-12-01

    Full Text Available The Bayesian approach is increasingly becoming popular among the astrophysics data analysis communities. However, the Pakistan statistics communities are unaware of this fertile interaction between the two disciplines. Bayesian methods have been in use to address astronomical problems since the very birth of Bayesian probability in the eighteenth century. Today the Bayesian methods for the detection and parameter estimation of gravitational waves have solid theoretical grounds with a strong promise for realistic applications. This article aims to introduce the Pakistan statistics communities to the applications of Bayesian Monte Carlo methods in the analysis of gravitational wave data, with an overview of the Bayesian signal detection and estimation methods and a demonstration with a couple of simplified examples.

  9. Approximate Bayesian computation.

    Directory of Open Access Journals (Sweden)

    Mikael Sunnåker

    Full Text Available Approximate Bayesian computation (ABC constitutes a class of computational methods rooted in Bayesian statistics. In all model-based statistical inference, the likelihood function is of central importance, since it expresses the probability of the observed data under a particular statistical model, and thus quantifies the support data lend to particular values of parameters and to choices among different models. For simple models, an analytical formula for the likelihood function can typically be derived. However, for more complex models, an analytical formula might be elusive or the likelihood function might be computationally very costly to evaluate. ABC methods bypass the evaluation of the likelihood function. In this way, ABC methods widen the realm of models for which statistical inference can be considered. ABC methods are mathematically well-founded, but they inevitably make assumptions and approximations whose impact needs to be carefully assessed. Furthermore, the wider application domain of ABC exacerbates the challenges of parameter estimation and model selection. ABC has rapidly gained popularity over recent years and in particular for the analysis of complex problems arising in biological sciences (e.g., in population genetics, ecology, epidemiology, and systems biology).
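    The "bypass the likelihood" step the abstract describes is easiest to see in rejection ABC: simulate data from parameters drawn from the prior, and keep only the parameters whose simulations resemble the observations. A minimal sketch, using a toy Bernoulli model where the sample mean happens to be a sufficient summary (the model and tolerances are illustrative assumptions):

```python
import random

def abc_rejection(observed_mean, n_obs, n_sims=20000, tol=0.02, seed=0):
    """Rejection ABC for the success probability p of a Bernoulli sample,
    with the sample mean as summary statistic."""
    rng = random.Random(seed)
    accepted = []
    for _ in range(n_sims):
        p = rng.random()                        # draw p from a Uniform(0, 1) prior
        sim = sum(rng.random() < p for _ in range(n_obs)) / n_obs
        if abs(sim - observed_mean) <= tol:     # keep p when simulated data match
            accepted.append(p)
    return accepted

post = abc_rejection(observed_mean=0.7, n_obs=50)
est = sum(post) / len(post)  # approximate posterior mean of p
```

    No likelihood is ever evaluated; the accepted draws approximate the posterior, and the quality of the approximation depends on the summary statistic and the tolerance `tol` — exactly the assumptions the abstract warns must be assessed.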

  10. A comparison of Bayesian and non-linear regression methods for robust estimation of pharmacokinetics in DCE-MRI and how it affects cancer diagnosis.

    Science.gov (United States)

    Dikaios, Nikolaos; Atkinson, David; Tudisca, Chiara; Purpura, Pierpaolo; Forster, Martin; Ahmed, Hashim; Beale, Timothy; Emberton, Mark; Punwani, Shonit

    2017-03-01

    The aim of this work is to compare Bayesian Inference for nonlinear models with commonly used traditional non-linear regression (NR) algorithms for estimating tracer kinetics in Dynamic Contrast Enhanced Magnetic Resonance Imaging (DCE-MRI). The algorithms are compared in terms of accuracy and reproducibility under different initialization settings. Further, it is investigated how a more robust estimation of tracer kinetics affects cancer diagnosis. The derived tracer kinetics from the Bayesian algorithm were validated against traditional NR algorithms (i.e. Levenberg-Marquardt, simplex) in terms of accuracy on a digital DCE phantom and in terms of goodness-of-fit (Kolmogorov-Smirnov test) on ROI-based concentration time courses from two different patient cohorts. The first cohort consisted of 76 men, 20 of whom had significant peripheral zone prostate cancer (any cancer-core-length (CCL) with Gleason>3+3 or any-grade with CCL>=4mm) following transperineal template prostate mapping biopsy. The second cohort consisted of 9 healthy volunteers and 24 patients with head and neck squamous cell carcinoma. The diagnostic ability of the derived tracer kinetics was assessed with receiver operating characteristic area under curve (ROC AUC) analysis. The Bayesian algorithm accurately recovered the ground-truth tracer kinetics for the digital DCE phantom, consistently improving the Structural Similarity Index (SSIM) across the 50 different initializations compared to NR. For optimized initialization, the Bayesian algorithm did not significantly improve the fitting accuracy on either patient cohort, and it only significantly improved the ve ROC AUC on the HN population from ROC AUC=0.56 for the simplex to ROC AUC=0.76. For both cohorts, the values and the diagnostic ability of tracer kinetic parameters estimated with the Bayesian algorithm were not affected by their initialization. To conclude, the Bayesian algorithm led to a more accurate and reproducible quantification of tracer kinetic

  11. How to practise Bayesian statistics outside the Bayesian church: What philosophy for Bayesian statistical modelling?

    NARCIS (Netherlands)

    Borsboom, D.; Haig, B.D.

    2013-01-01

    Unlike most other statistical frameworks, Bayesian statistical inference is wedded to a particular approach in the philosophy of science (see Howson & Urbach, 2006); this approach is called Bayesianism. Rather than being concerned with model fitting, this position in the philosophy of science primar

  12. Implementing Bayesian Vector Autoregressions

    Directory of Open Access Journals (Sweden)

    Richard M. Todd

    1988-03-01

    Full Text Available This paper discusses how the Bayesian approach can be used to construct a type of multivariate forecasting model known as a Bayesian vector autoregression (BVAR). In doing so, we mainly explain Doan, Litterman, and Sims' (1984) propositions on how to estimate a BVAR based on a certain family of prior probability distributions, indexed by a fairly small set of hyperparameters. There is also a discussion on how to specify a BVAR and set up a BVAR database. A 4-variable model is used to illustrate the BVAR approach.
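    The estimation step reduces, for a fixed noise variance, to a conjugate normal posterior for the VAR coefficient matrix. A minimal sketch with a single hyperparameter and a random-walk prior mean, a crude stand-in for the Doan-Litterman-Sims (Minnesota-style) prior rather than their actual specification; the two-variable data are simulated:

```python
import numpy as np

def bvar_posterior_mean(Y, lam=1.0):
    """Posterior mean of VAR(1) coefficients under an independent Gaussian
    prior centred on a random walk, with prior precision lam and unit
    noise variance (a crude stand-in for a Minnesota-style prior)."""
    X, Z = Y[:-1], Y[1:]          # lagged regressors and one-step-ahead targets
    k = X.shape[1]
    B0 = np.eye(k)                # prior mean: each series follows its own lag
    # Conjugate normal posterior mean: (X'X + lam I)^{-1} (X'Z + lam B0)
    return np.linalg.solve(X.T @ X + lam * np.eye(k), X.T @ Z + lam * B0)

rng = np.random.default_rng(0)
A = np.array([[0.8, 0.1], [0.0, 0.5]])    # true VAR(1) transition matrix
Y = np.zeros((200, 2))
for t in range(1, 200):
    Y[t] = Y[t - 1] @ A.T + 0.1 * rng.standard_normal(2)

B = bvar_posterior_mean(Y)                # estimates A.T, shrunk toward identity
```

    The hyperparameter `lam` plays the role of the small set of hyperparameters indexing the prior family: larger values shrink the estimate harder toward the random-walk prior mean.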

  13. A bayesian belief network for reliability assessment

    Energy Technology Data Exchange (ETDEWEB)

    Gran, Bjoern Axel; Helminen, Atte

    2001-02-15

    The research programme at the Halden Project on software assessment is augmented through a joint project with VTT Automation. The objective of this co-operative project is to combine previously presented Bayesian Belief Networks for a software safety standard with BBNs on the reliability estimation of software-based digital systems. The results on applying BBN methodology to a software safety standard are based upon previous research by the Halden Project, while the results on the reliability estimation are based on a Master's Thesis by Helminen. The report should be considered as a progress report in the more long-term activity on the use of BBNs as support for safety assessment of programmable systems. In this report it is discussed how the two approaches can be merged together into one Bayesian Network, and the problems with merging are pinpointed. The report also presents and discusses the approaches applied by the Halden Project and VTT, including the differences in the expert judgement of the parameters used in the Bayesian Network. Finally, the report gives some experimental results based on observations from applying the method for an evaluation of a real, safety-related programmable system that has been developed according to the avionic standard DO-178B. This demonstrates how hard and soft evidence can be combined for a reliability assessment. The use of Bayesian Networks provides a framework, combining consistent application of probability calculus with the ability to model complex structures, as e.g. standards, as a simple understandable network, where all possible evidence can be introduced to the reliability estimation in a compatible way. (Author)

  14. Bayesian inference in geomagnetism

    Science.gov (United States)

    Backus, George E.

    1988-01-01

    The inverse problem in empirical geomagnetic modeling is investigated, with critical examination of recently published studies. Particular attention is given to the use of Bayesian inference (BI) to select the damping parameter lambda in the uniqueness portion of the inverse problem. The mathematical bases of BI and stochastic inversion are explored, with consideration of bound-softening problems and resolution in linear Gaussian BI. The problem of estimating the radial magnetic field B(r) at the earth core-mantle boundary from surface and satellite measurements is then analyzed in detail, with specific attention to the selection of lambda in the studies of Gubbins (1983) and Gubbins and Bloxham (1985). It is argued that the selection method is inappropriate and leads to lambda values much larger than those that would result if a reasonable bound on the heat flow at the CMB were assumed.

  15. Bayesian Geostatistical Design

    DEFF Research Database (Denmark)

    Diggle, Peter; Lophaven, Søren Nymand

    2006-01-01

    This paper describes the use of model-based geostatistics for choosing the set of sampling locations, collectively called the design, to be used in a geostatistical analysis. Two types of design situation are considered. These are retrospective design, which concerns the addition of sampling...... locations to, or deletion of locations from, an existing design, and prospective design, which consists of choosing positions for a new set of sampling locations. We propose a Bayesian design criterion which focuses on the goal of efficient spatial prediction whilst allowing for the fact that model...... parameter values are unknown. The results show that in this situation a wide range of interpoint distances should be included in the design, and the widely used regular design is often not the best choice....

  16. Bayesian Rose Trees

    CERN Document Server

    Blundell, Charles; Heller, Katherine A

    2012-01-01

    Hierarchical structure is ubiquitous in data across many domains. There are many hierarchical clustering methods, frequently used by domain experts, which strive to discover this structure. However, most of these methods limit discoverable hierarchies to those with binary branching structure. This limitation, while computationally convenient, is often undesirable. In this paper we explore a Bayesian hierarchical clustering algorithm that can produce trees with arbitrary branching structure at each node, known as rose trees. We interpret these trees as mixtures over partitions of a data set, and use a computationally efficient, greedy agglomerative algorithm to find the rose trees which have high marginal likelihood given the data. Lastly, we perform experiments which demonstrate that rose trees are better models of data than the typical binary trees returned by other hierarchical clustering algorithms.

  17. Bayesian Causal Induction

    CERN Document Server

    Ortega, Pedro A

    2011-01-01

    Discovering causal relationships is a hard task, often hindered by the need for intervention, and often requiring large amounts of data to resolve statistical uncertainty. However, humans quickly arrive at useful causal relationships. One possible reason is that humans use strong prior knowledge; and rather than encoding hard causal relationships, they encode beliefs over causal structures, allowing for sound generalization from the observations they obtain from directly acting in the world. In this work we propose a Bayesian approach to causal induction which allows modeling beliefs over multiple causal hypotheses and predicting the behavior of the world under causal interventions. We then illustrate how this method extracts causal information from data containing interventions and observations.

  18. Book review: Bayesian analysis for population ecology

    Science.gov (United States)

    Link, William A.

    2011-01-01

    Brian Dennis described the field of ecology as “fertile, uncolonized ground for Bayesian ideas.” He continued: “The Bayesian propagule has arrived at the shore. Ecologists need to think long and hard about the consequences of a Bayesian ecology. The Bayesian outlook is a successful competitor, but is it a weed? I think so.” (Dennis 2004)

  19. Bayesian adaptive methods for clinical trials

    CERN Document Server

    Berry, Scott M; Muller, Peter

    2010-01-01

    Already popular in the analysis of medical device trials, adaptive Bayesian designs are increasingly being used in drug development for a wide variety of diseases and conditions, from Alzheimer's disease and multiple sclerosis to obesity, diabetes, hepatitis C, and HIV. Written by leading pioneers of Bayesian clinical trial designs, Bayesian Adaptive Methods for Clinical Trials explores the growing role of Bayesian thinking in the rapidly changing world of clinical trial analysis. The book first summarizes the current state of clinical trial design and analysis and introduces the main ideas and potential benefits of a Bayesian alternative. It then gives an overview of basic Bayesian methodological and computational tools needed for Bayesian clinical trials. With a focus on Bayesian designs that achieve good power and Type I error, the next chapters present Bayesian tools useful in early (Phase I) and middle (Phase II) clinical trials as well as two recent Bayesian adaptive Phase II studies: the BATTLE and ISP...

  20. Hydrodynamic Characteristics and Operation Security Study of Solar Receiver for Badaling Solar Tower Power Plant

    Institute of Scientific and Technical Information of China (English)

    高维; 徐蕙; 徐二树; 余强

    2012-01-01

    This paper took the superheating cavity receiver of the Badaling 1 MW solar power tower plant as its study object, developed a hydrodynamic simulation model of the receiver based on the working principle of the superheating cavity receiver, simulated the hydrodynamic characteristics of the cavity receiver, and obtained the law governing how the working-medium mass flow in the different heating surfaces changes with solar radiation. The results can provide guidance for designing the receiver system and for formulating the control and operating strategy of a solar tower power plant.

  1. Current trends in Bayesian methodology with applications

    CERN Document Server

    Upadhyay, Satyanshu K; Dey, Dipak K; Loganathan, Appaia

    2015-01-01

    Collecting Bayesian material scattered throughout the literature, Current Trends in Bayesian Methodology with Applications examines the latest methodological and applied aspects of Bayesian statistics. The book covers biostatistics, econometrics, reliability and risk analysis, spatial statistics, image analysis, shape analysis, Bayesian computation, clustering, uncertainty assessment, high-energy astrophysics, neural networking, fuzzy information, objective Bayesian methodologies, empirical Bayes methods, small area estimation, and many more topics. Each chapter is self-contained and focuses on

  2. Bayesian Diagnostic Network: A Powerful Model for Representation and Reasoning of Engineering Diagnostic Knowledge

    Institute of Scientific and Technical Information of China (English)

    HU Zhao-yong

    2005-01-01

    Engineering diagnosis is essential to the operation of industrial equipment. The key to successful diagnosis is correct knowledge representation and reasoning. The Bayesian network is a powerful tool for it. This paper utilizes the Bayesian network to represent and reason about diagnostic knowledge, in a model named the Bayesian diagnostic network. It provides a three-layer topologic structure based on operating conditions, possible faults and corresponding symptoms. The paper also discusses an approximate stochastic sampling algorithm. Then a practical Bayesian network for gas turbine diagnosis is constructed on a platform developed under a Visual C++ environment. It shows that the Bayesian network is a powerful model for representing and reasoning about diagnostic knowledge, and that the three-layer structure and the approximate algorithm are also effective.
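    The condition -> fault -> symptom topology can be illustrated with a chain so small that exact enumeration suffices (the paper itself needs approximate stochastic sampling for realistic networks). All probabilities below are invented for illustration:

```python
def p_fault_given_symptom():
    """Exact enumeration in a tiny condition -> fault -> symptom chain
    mirroring the three-layer topology (all numbers invented)."""
    p_cond = {True: 0.3, False: 0.7}     # P(harsh operating condition)
    p_fault = {True: 0.4, False: 0.05}   # P(fault | condition)
    p_sym = {True: 0.9, False: 0.2}      # P(symptom | fault)
    num = den = 0.0
    for c in (True, False):
        for f in (True, False):
            joint = (p_cond[c]
                     * (p_fault[c] if f else 1 - p_fault[c])
                     * p_sym[f])         # weight of observing symptom = True
            den += joint
            if f:
                num += joint
    return num / den                     # P(fault | symptom observed)

posterior = p_fault_given_symptom()
```

    Observing the symptom raises the fault probability from the prior 0.155 to roughly 0.45; in a real diagnostic network the same computation runs over many conditions, faults, and symptoms, which is where sampling-based approximation comes in.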

  3. Irregular-Time Bayesian Networks

    CERN Document Server

    Ramati, Michael

    2012-01-01

    In many fields observations are performed irregularly along time, due to either measurement limitations or lack of a constant immanent rate. While discrete-time Markov models (as Dynamic Bayesian Networks) introduce either inefficient computation or an information loss to reasoning about such processes, continuous-time Markov models assume either a discrete state space (as Continuous-Time Bayesian Networks), or a flat continuous state space (as stochastic differential equations). To address these problems, we present a new modeling class called Irregular-Time Bayesian Networks (ITBNs), generalizing Dynamic Bayesian Networks, allowing substantially more compact representations, and increasing the expressivity of the temporal dynamics. In addition, a globally optimal solution is guaranteed when learning temporal systems, provided that they are fully observed at the same irregularly spaced time-points, and a semiparametric subclass of ITBNs is introduced to allow further adaptation to the irregular nature of t...

  4. Bayesian nonparametric adaptive control using Gaussian processes.

    Science.gov (United States)

    Chowdhary, Girish; Kingravi, Hassan A; How, Jonathan P; Vela, Patricio A

    2015-03-01

    Most current model reference adaptive control (MRAC) methods rely on parametric adaptive elements, in which the number of parameters of the adaptive element is fixed a priori, often through expert judgment. An example of such adaptive elements is radial basis function networks (RBFNs), with RBF centers preallocated based on the expected operating domain. If the system operates outside of the expected operating domain, this adaptive element can become noneffective in capturing and canceling the uncertainty, thus rendering the adaptive controller only semiglobal in nature. This paper investigates a Gaussian process-based Bayesian MRAC architecture (GP-MRAC), which leverages the power and flexibility of GP Bayesian nonparametric models of uncertainty. The GP-MRAC does not require the centers to be preallocated, can inherently handle measurement noise, and enables MRAC to handle a broader set of uncertainties, including those that are defined as distributions over functions. We use stochastic stability arguments to show that GP-MRAC guarantees good closed-loop performance with no prior domain knowledge of the uncertainty. Online implementable GP inference methods are compared in numerical simulations against RBFN-MRAC with preallocated centers and are shown to provide better tracking and improved long-term learning.
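    The nonparametric model that GP-MRAC builds on is ordinary Gaussian-process regression, which needs no preallocated centers and returns an uncertainty estimate alongside the prediction. A minimal sketch of the GP posterior with an RBF kernel (hyperparameters and data are illustrative, and this is plain regression, not the MRAC architecture itself):

```python
import numpy as np

def gp_predict(X, y, Xs, ell=1.0, sf=1.0, noise=0.1):
    """GP regression posterior mean and variance at test inputs Xs,
    with an RBF kernel of length scale ell and signal scale sf."""
    def k(A, B):
        d = A[:, None] - B[None, :]
        return sf**2 * np.exp(-0.5 * (d / ell) ** 2)
    K = k(X, X) + noise**2 * np.eye(len(X))   # training covariance + noise
    Ks = k(Xs, X)
    alpha = np.linalg.solve(K, y)
    mean = Ks @ alpha
    var = sf**2 - np.sum(Ks * np.linalg.solve(K, Ks.T).T, axis=1)
    return mean, var

X = np.linspace(0, 2 * np.pi, 15)             # training inputs
y = np.sin(X)                                 # uncertainty to be learned
mean, var = gp_predict(X, y, np.array([np.pi / 2]))
```

    The posterior variance is what lets an adaptive controller know where its model of the uncertainty is trustworthy; far from the data, `var` grows back toward the prior variance `sf**2`.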

  5. Bayesian Tracking of Visual Objects

    Science.gov (United States)

    Zheng, Nanning; Xue, Jianru

    Tracking objects in image sequences involves performing motion analysis at the object level, which is becoming an increasingly important technology in a wide range of computer video applications, including video teleconferencing, security and surveillance, video segmentation, and editing. In this chapter, we focus on sequential Bayesian estimation techniques for visual tracking. We first introduce the sequential Bayesian estimation framework, which acts as the theoretic basis for visual tracking. Then, we present approaches to constructing representation models for specific objects.

  6. Bayesian Networks and Influence Diagrams

    DEFF Research Database (Denmark)

    Kjærulff, Uffe Bro; Madsen, Anders Læsø

     Probabilistic networks, also known as Bayesian networks and influence diagrams, have become one of the most promising technologies in the area of applied artificial intelligence, offering intuitive, efficient, and reliable methods for diagnosis, prediction, decision making, classification......, troubleshooting, and data mining under uncertainty. Bayesian Networks and Influence Diagrams: A Guide to Construction and Analysis provides a comprehensive guide for practitioners who wish to understand, construct, and analyze intelligent systems for decision support based on probabilistic networks. Intended...

  7. Bayesian Inference: with ecological applications

    Science.gov (United States)

    Link, William A.; Barker, Richard J.

    2010-01-01

    This text provides a mathematically rigorous yet accessible and engaging introduction to Bayesian inference with relevant examples that will be of interest to biologists working in the fields of ecology, wildlife management and environmental studies as well as students in advanced undergraduate statistics. This text opens the door to Bayesian inference, taking advantage of modern computational efficiencies and easily accessible software to evaluate complex hierarchical models.

  8. A Focused Bayesian Information Criterion

    OpenAIRE

    Georges Nguefack-Tsague; Ingo Bulla

    2014-01-01

    Myriads of model selection criteria (Bayesian and frequentist) have been proposed in the literature aiming at selecting a single model regardless of its intended use. An honorable exception in the frequentist perspective is the “focused information criterion” (FIC) aiming at selecting a model based on the parameter of interest (focus). This paper takes the same view in the Bayesian context; that is, a model may be good for one estimand but bad for another. The proposed method exploits the Bay...

  9. Bayesian analysis of CCDM Models

    OpenAIRE

    Jesus, J. F.; Valentim, R.; Andrade-Oliveira, F.

    2016-01-01

    Creation of Cold Dark Matter (CCDM), in the context of Einstein Field Equations, leads to negative creation pressure, which can be used to explain the accelerated expansion of the Universe. In this work we tested six different spatially flat models for matter creation using statistical tools in the light of SN Ia data: Akaike Information Criterion (AIC), Bayesian Information Criterion (BIC) and Bayesian Evidence (BE). These approaches allow comparison of models considering goodness of fit and numbe...

  10. Bayesian Methods for Statistical Analysis

    OpenAIRE

    Puza, Borek

    2015-01-01

    Bayesian methods for statistical analysis is a book on statistical methods for analysing a wide variety of data. The book consists of 12 chapters, starting with basic concepts and covering numerous topics, including Bayesian estimation, decision theory, prediction, hypothesis testing, hierarchical models, Markov chain Monte Carlo methods, finite population inference, biased sampling and nonignorable nonresponse. The book contains many exercises, all with worked solutions, including complete c...

  11. Optimal Bayesian Adaptive Design for Test-Item Calibration

    NARCIS (Netherlands)

    Linden, van der Wim J.; Ren, Hao

    2015-01-01

    An optimal adaptive design for test-item calibration based on Bayesian optimality criteria is presented. The design adapts the choice of field-test items to the examinees taking an operational adaptive test using both the information in the posterior distributions of their ability parameters and the

  12. Dynamic Batch Bayesian Optimization

    CERN Document Server

    Azimi, Javad; Fern, Xiaoli

    2011-01-01

    Bayesian optimization (BO) algorithms try to optimize an unknown function that is expensive to evaluate using a minimum number of evaluations/experiments. Most of the proposed algorithms in BO are sequential, where only one experiment is selected at each iteration. This method can be time inefficient when each experiment takes a long time and more than one experiment can be run concurrently. On the other hand, requesting a fixed-size batch of experiments at each iteration causes performance inefficiency in BO compared to the sequential policies. In this paper, we present an algorithm that requests a batch of experiments at each time step t, where the batch size p_t is dynamically determined at each step. Our algorithm is based on the observation that the sequence of experiments selected by the sequential policy can sometimes be almost independent from each other. Our algorithm identifies such scenarios and requests those experiments at the same time without degrading the performance. We evaluate our proposed method us...

  13. Bayesian parameter estimation for chiral effective field theory

    Science.gov (United States)

    Wesolowski, Sarah; Furnstahl, Richard; Phillips, Daniel; Klco, Natalie

    2016-09-01

    The low-energy constants (LECs) of a chiral effective field theory (EFT) interaction in the two-body sector are fit to observable data using a Bayesian parameter estimation framework. By using Bayesian prior probability distributions (pdfs), we quantify relevant physical expectations such as LEC naturalness and include them in the parameter estimation procedure. The final result is a posterior pdf for the LECs, which can be used to propagate uncertainty resulting from the fit to data to the final observable predictions. The posterior pdf also allows an empirical test of operator redundancy and other features of the potential. We compare results of our framework with other fitting procedures, interpreting the underlying assumptions in Bayesian probabilistic language. We also compare results from fitting all partial waves of the interaction simultaneously to cross section data compared to fitting to extracted phase shifts, appropriately accounting for correlations in the data. Supported in part by the NSF and DOE.

  14. Bayesian population finding with biomarkers in a randomized clinical trial.

    Science.gov (United States)

    Morita, Satoshi; Müller, Peter

    2017-03-03

    The identification of good predictive biomarkers allows investigators to optimize the target population for a new treatment. We propose a novel utility-based Bayesian population finding (BaPoFi) method to analyze data from a randomized clinical trial with the aim of finding a sensitive patient population. Our approach is based on casting the population finding process as a formal decision problem together with a flexible probability model, Bayesian additive regression trees (BART), to summarize observed data. The proposed method evaluates enhanced treatment effects in patient subpopulations based on counter-factual modeling of responses to new treatment and control for each patient. In extensive simulation studies, we examine the operating characteristics of the proposed method. We compare with a Bayesian regression-based method that implements shrinkage estimates of subgroup-specific treatment effects. For illustration, we apply the proposed method to data from a randomized clinical trial.

  15. Bayesian Model Selection and Statistical Modeling

    CERN Document Server

    Ando, Tomohiro

    2010-01-01

    Bayesian model selection is a fundamental part of the Bayesian statistical modeling process. The quality of these solutions usually depends on the goodness of the constructed Bayesian model. Realizing how crucial this issue is, many researchers and practitioners have been extensively investigating the Bayesian model selection problem. This book provides comprehensive explanations of the concepts and derivations of the Bayesian approach for model selection and related criteria, including the Bayes factor, the Bayesian information criterion (BIC), the generalized BIC, and the pseudo marginal lik
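    The BIC mentioned above penalizes the maximized log-likelihood by k log n, where k counts free parameters and n the observations. A toy comparison of an intercept-only model against a straight-line fit on truly linear data (the data and models are invented for illustration, not from the book):

```python
import math
import random

def bic(log_likelihood, k, n):
    """Bayesian information criterion: -2 log L + k log n (lower is better)."""
    return -2.0 * log_likelihood + k * math.log(n)

def gaussian_loglik(residuals):
    """Maximized Gaussian log-likelihood, with sigma^2 set to its MLE."""
    n = len(residuals)
    s2 = sum(r * r for r in residuals) / n
    return -0.5 * n * (math.log(2 * math.pi * s2) + 1)

rng = random.Random(1)
xs = [i / 20 for i in range(40)]
ys = [2.0 + 3.0 * x + rng.gauss(0, 0.1) for x in xs]   # truly linear data

# Model 1: intercept only.  Model 2: least-squares straight line.
mean_y = sum(ys) / len(ys)
res1 = [y - mean_y for y in ys]
xbar = sum(xs) / len(xs)
beta = (sum((x - xbar) * (y - mean_y) for x, y in zip(xs, ys))
        / sum((x - xbar) ** 2 for x in xs))
res2 = [y - (mean_y + beta * (x - xbar)) for x, y in zip(xs, ys)]

bic1 = bic(gaussian_loglik(res1), k=2, n=len(ys))      # mean + variance
bic2 = bic(gaussian_loglik(res2), k=3, n=len(ys))      # slope, intercept, variance
```

    Despite its extra parameter, the linear model wins (bic2 < bic1) because its likelihood gain far outweighs the log n penalty; the generalized BIC and Bayes factors covered in the book refine exactly this trade-off.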

  16. Bayesian seismic AVO inversion

    Energy Technology Data Exchange (ETDEWEB)

    Buland, Arild

    2002-07-01

    A new linearized AVO inversion technique is developed in a Bayesian framework. The objective is to obtain posterior distributions for P-wave velocity, S-wave velocity and density. Distributions for other elastic parameters can also be assessed, for example acoustic impedance, shear impedance and P-wave to S-wave velocity ratio. The inversion algorithm is based on the convolutional model and a linearized weak contrast approximation of the Zoeppritz equation. The solution is represented by a Gaussian posterior distribution with explicit expressions for the posterior expectation and covariance, hence exact prediction intervals for the inverted parameters can be computed under the specified model. The explicit analytical form of the posterior distribution provides a computationally fast inversion method. Tests on synthetic data show that all inverted parameters were almost perfectly retrieved when the noise approached zero. With realistic noise levels, acoustic impedance was the best determined parameter, while the inversion provided practically no information about the density. The inversion algorithm has also been tested on a real 3-D dataset from the Sleipner Field. The results show good agreement with well logs but the uncertainty is high. The stochastic model includes uncertainties of both the elastic parameters, the wavelet and the seismic and well log data. The posterior distribution is explored by Markov chain Monte Carlo simulation using the Gibbs sampler algorithm. The inversion algorithm has been tested on a seismic line from the Heidrun Field with two wells located on the line. The uncertainty of the estimated wavelet is low. In the Heidrun examples the effect of including uncertainty of the wavelet and the noise level was marginal with respect to the AVO inversion results. We have developed a 3-D linearized AVO inversion method with spatially coupled model parameters where the objective is to obtain posterior distributions for P-wave velocity, S
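    The "Gaussian posterior distribution with explicit expressions" that makes this inversion fast is the standard closed form for a linear forward model with Gaussian prior and noise. A minimal numerical sketch (the forward matrix, noise level, and three-parameter model are invented; the real method uses the convolutional model and the linearized Zoeppritz operator):

```python
import numpy as np

def gaussian_posterior(G, d, m0, Cm, Ce):
    """Closed-form posterior mean/covariance for d = G m + e, with prior
    m ~ N(m0, Cm) and noise e ~ N(0, Ce)."""
    CmGt = Cm @ G.T
    S = G @ CmGt + Ce                    # covariance of the data
    K = CmGt @ np.linalg.inv(S)          # gain matrix
    mean = m0 + K @ (d - G @ m0)
    cov = Cm - K @ G @ Cm
    return mean, cov

rng = np.random.default_rng(3)
G = rng.standard_normal((30, 3))         # stand-in for the linearized AVO operator
m_true = np.array([1.0, -0.5, 2.0])      # stand-in elastic parameters
d = G @ m_true + 0.05 * rng.standard_normal(30)
mean, cov = gaussian_posterior(G, d, m0=np.zeros(3),
                               Cm=np.eye(3), Ce=0.05**2 * np.eye(30))
```

    Because both mean and covariance are explicit, exact prediction intervals follow directly from `cov`, with no sampling needed — the Sleipner/Heidrun examples add MCMC only once the wavelet and noise are treated as uncertain too.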

  17. Bayesian microsaccade detection

    Science.gov (United States)

    Mihali, Andra; van Opheusden, Bas; Ma, Wei Ji

    2017-01-01

    Microsaccades are high-velocity fixational eye movements, with special roles in perception and cognition. The default microsaccade detection method is to determine when the smoothed eye velocity exceeds a threshold. We have developed a new method, Bayesian microsaccade detection (BMD), which performs inference based on a simple statistical model of eye positions. In this model, a hidden state variable changes between drift and microsaccade states at random times. The eye position is a biased random walk with different velocity distributions for each state. BMD generates samples from the posterior probability distribution over the eye state time series given the eye position time series. Applied to simulated data, BMD recovers the “true” microsaccades with fewer errors than alternative algorithms, especially at high noise. Applied to EyeLink eye tracker data, BMD detects almost all the microsaccades detected by the default method, but also apparent microsaccades embedded in high noise—although these can also be interpreted as false positives. Next we apply the algorithms to data collected with a Dual Purkinje Image eye tracker, whose higher precision justifies defining the inferred microsaccades as ground truth. When we add artificial measurement noise, the inferences of all algorithms degrade; however, at noise levels comparable to EyeLink data, BMD recovers the “true” microsaccades with 54% fewer errors than the default algorithm. Though unsuitable for online detection, BMD has other advantages: It returns probabilities rather than binary judgments, and it can be straightforwardly adapted as the generative model is refined. We make our algorithm available as a software package. PMID:28114483
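    The generative model BMD inverts can be simulated directly. The sketch below is a deliberately minimal version under stated assumptions (a single 1-D position trace, symmetric state switching, Gaussian steps); all parameter values are illustrative only:

    ```python
    import numpy as np

    def simulate_eye_trace(T=500, p_switch=0.01, drift_sd=0.02, ms_sd=0.5, seed=0):
        """Simulate the two-state generative model assumed by BMD-style detectors:
        a hidden state toggles between drift (0) and microsaccade (1) at random
        times, and eye position is a random walk whose step size depends on state."""
        rng = np.random.default_rng(seed)
        states = np.zeros(T, dtype=int)
        pos = np.zeros(T)
        for t in range(1, T):
            # the hidden state flips with small probability at each time step
            states[t] = states[t - 1] ^ (rng.random() < p_switch)
            step_sd = ms_sd if states[t] == 1 else drift_sd
            pos[t] = pos[t - 1] + rng.normal(0.0, step_sd)
        return states, pos

    states, pos = simulate_eye_trace()
    ```

    Detection then amounts to inferring the posterior over `states` given only `pos`, which is where a sampling-based method retains an advantage over a fixed velocity threshold when measurement noise is high.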

  18. Bayesian Discovery of Linear Acyclic Causal Models

    CERN Document Server

    Hoyer, Patrik O

    2012-01-01

    Methods for automated discovery of causal relationships from non-interventional data have received much attention recently. A widely used and well understood model family is given by linear acyclic causal models (recursive structural equation models). For Gaussian data both constraint-based methods (Spirtes et al., 1993; Pearl, 2000) (which output a single equivalence class) and Bayesian score-based methods (Geiger and Heckerman, 1994) (which assign relative scores to the equivalence classes) are available. In contrast, all current methods able to utilize non-Gaussianity in the data (Shimizu et al., 2006; Hoyer et al., 2008) always return only a single graph or a single equivalence class, and so are fundamentally unable to express the degree of certainty attached to that output. In this paper we develop a Bayesian score-based approach able to take advantage of non-Gaussianity when estimating linear acyclic causal models, and we empirically demonstrate that, at least on very modest size networks, its accur...

  19. Maximum margin Bayesian network classifiers.

    Science.gov (United States)

    Pernkopf, Franz; Wohlmayr, Michael; Tschiatschek, Sebastian

    2012-03-01

    We present a maximum margin parameter learning algorithm for Bayesian network classifiers using a conjugate gradient (CG) method for optimization. In contrast to previous approaches, we maintain the normalization constraints on the parameters of the Bayesian network during optimization, i.e., the probabilistic interpretation of the model is not lost. This enables us to handle missing features in discriminatively optimized Bayesian networks. In experiments, we compare the classification performance of maximum margin parameter learning to conditional likelihood and maximum likelihood learning approaches. Discriminative parameter learning significantly outperforms generative maximum likelihood estimation for naive Bayes and tree augmented naive Bayes structures on all considered data sets. Furthermore, maximizing the margin dominates the conditional likelihood approach in terms of classification performance in most cases. We provide results for a recently proposed maximum margin optimization approach based on convex relaxation. While the classification results are highly similar, our CG-based optimization is computationally up to orders of magnitude faster. Margin-optimized Bayesian network classifiers achieve classification performance comparable to support vector machines (SVMs) using fewer parameters. Moreover, we show that unanticipated missing feature values during classification can be easily processed by discriminatively optimized Bayesian network classifiers, a case where discriminative classifiers usually require mechanisms to complete unknown feature values in the data first.

  20. A Focused Bayesian Information Criterion

    Directory of Open Access Journals (Sweden)

    Georges Nguefack-Tsague

    2014-01-01

    Full Text Available Myriads of model selection criteria (Bayesian and frequentist) have been proposed in the literature, aiming at selecting a single model regardless of its intended use. An honorable exception in the frequentist perspective is the “focused information criterion” (FIC), aiming at selecting a model based on the parameter of interest (the focus). This paper takes the same view in the Bayesian context; that is, a model may be good for one estimand but bad for another. The proposed method exploits the Bayesian model averaging (BMA) machinery to obtain a new criterion, the focused Bayesian model averaging (FoBMA), for which the best model is the one whose estimate is closest to the BMA estimate. In particular, for two models, this criterion reduces to the classical Bayesian model selection scheme of choosing the model with the highest posterior probability. The new method is applied in linear regression, logistic regression, and survival analysis. This criterion is especially important in epidemiological studies in which the objective is often to determine a risk factor (the focus) for a disease, adjusting for potential confounding factors.
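    Given per-model posterior probabilities and per-model estimates of the focus parameter, the FoBMA rule is a one-liner. The numbers below are hypothetical; this sketch only illustrates the selection rule described above:

    ```python
    import numpy as np

    def fobma_select(estimates, post_probs):
        """FoBMA-style selection: the BMA estimate is the posterior-probability
        weighted average of per-model estimates of the focus parameter; the
        chosen model is the one whose own estimate is closest to that average."""
        estimates = np.asarray(estimates, dtype=float)
        post_probs = np.asarray(post_probs, dtype=float)
        bma = np.sum(post_probs * estimates)            # BMA estimate of the focus
        return int(np.argmin(np.abs(estimates - bma))), bma

    # hypothetical focus-parameter estimates and posterior model probabilities
    best, bma = fobma_select([1.0, 1.4, 3.0], [0.5, 0.3, 0.2])
    ```

    For exactly two models, the estimate nearest the weighted average always belongs to the higher-probability model, so the rule reduces to classical Bayesian model selection, as the abstract notes.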

  1. A Bayesian Game-Theoretic Approach for Distributed Resource Allocation in Fading Multiple Access Channels

    Directory of Open Access Journals (Sweden)

    Gaoning He

    2010-01-01

    Full Text Available A Bayesian game-theoretic model is developed to design and analyze the resource allocation problem in K-user fading multiple access channels (MACs), where the users are assumed to selfishly maximize their average achievable rates with incomplete information about the fading channel gains. In such a game-theoretic study, the central question is whether a Bayesian equilibrium exists, and if so, whether the network operates efficiently at the equilibrium point. We prove that there exists exactly one Bayesian equilibrium in our game. Furthermore, we study the network sum-rate maximization problem by assuming that the users coordinate according to a symmetric strategy profile. This result also serves as an upper bound for the Bayesian equilibrium. Finally, simulation results are provided to show the network efficiency at the unique Bayesian equilibrium and to compare it with other strategies.

  2. Using robust Bayesian network to estimate the residuals of fluoroquinolone antibiotic in soil.

    Science.gov (United States)

    Li, Xuewen; Xie, Yunfeng; Li, Lianfa; Yang, Xunfeng; Wang, Ning; Wang, Jinfeng

    2015-11-01

    Prediction of antibiotic pollution and its consequences is difficult, due to the uncertainties and complexities associated with multiple related factors. This article employed domain knowledge and spatial data to construct a Bayesian network (BN) model to assess fluoroquinolone antibiotic (FQs) pollution in the soil of an intensive vegetable cultivation area. The results show: (1) The relationships between FQs pollution and contributory factors: three factors (cultivation methods, crop rotations, and chicken manure types) were consistently identified as predictors in the topological structures of three FQs, indicating their importance in FQs pollution; deduced with domain knowledge, the cultivation methods are determined by the crop rotations, which require different nutrients (derived from the manure) according to different plant biomass. (2) The performance of the BN model: the integrative robust Bayesian network model achieved the highest detection probability (pd) of high risk and the largest receiver operating characteristic (ROC) area, since it incorporates domain knowledge and model uncertainty. Our encouraging findings have implications for the use of BNs as a robust approach to assessment of FQs pollution and for informing decisions on appropriate remedial measures.

  3. An Advanced Bayesian Method for Short-Term Probabilistic Forecasting of the Generation of Wind Power

    Directory of Open Access Journals (Sweden)

    Antonio Bracale

    2015-09-01

    Full Text Available Currently, among renewable distributed generation systems, wind generators are receiving a great deal of interest due to the great economic, technological, and environmental incentives they involve. However, the uncertainties due to the intermittent nature of wind energy make it difficult to operate electrical power systems optimally and make decisions that satisfy the needs of all the stakeholders of the electricity energy market. Thus, there is increasing interest in determining how to forecast wind power production accurately. Most of the methods that have been published in the relevant literature provided deterministic forecasts, even though great interest has been focused recently on probabilistic forecast methods. In this paper, an advanced probabilistic method is proposed for short-term forecasting of wind power production. A mixture of two Weibull distributions was used as a probability function to model the uncertainties associated with wind speed. Then, a Bayesian inference approach with a particularly effective autoregressive integrated moving-average model was used to determine the parameters of the mixture Weibull distribution. Numerical applications also are presented to provide evidence of the forecasting performance of the Bayesian-based approach.
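    A two-component Weibull mixture of the kind described can be written down directly. The shapes, scales, and mixture weight below are hypothetical stand-ins for the parameters the paper obtains via Bayesian inference with an ARIMA model:

    ```python
    import numpy as np

    def weibull_pdf(v, shape, scale):
        """Weibull density, the standard model for wind-speed uncertainty."""
        v = np.asarray(v, dtype=float)
        return (shape / scale) * (v / scale) ** (shape - 1) * np.exp(-(v / scale) ** shape)

    def mixture_pdf(v, w, params1, params2):
        """Two-component Weibull mixture with weight w on the first component."""
        return w * weibull_pdf(v, *params1) + (1 - w) * weibull_pdf(v, *params2)

    # hypothetical (shape, scale) pairs for low- and high-wind regimes
    v = np.linspace(0.01, 30.0, 1000)
    p = mixture_pdf(v, 0.6, (2.0, 6.0), (3.0, 12.0))
    area = p.sum() * (v[1] - v[0])   # crude integral; should be close to 1
    ```

    Probabilistic forecasts then fall out naturally: quantiles of the mixture give prediction intervals for wind power rather than a single deterministic value.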

  4. Continuously rethinking the definition of influenza for surveillance systems: a Dependent Bayesian Expert System.

    Science.gov (United States)

    Alemi, Farrokh; Atherton, Martin J; Pattie, David C; Torii, Manabu

    2013-08-01

    In the Electronic Surveillance System for the Early Notification of Community-based Epidemics (ESSENCE), influenza was originally defined by a list of 29 and later by a list of 12 diagnosis codes. This article describes a dependent Bayesian procedure designed to improve the ESSENCE system and exploit multiple sources of information without being biased by redundancy. We obtained 13,096 cases within the Armed Forces Health Longitudinal Technological Application electronic medical records that included an influenza laboratory test. A Dependent Bayesian Expert System (D-BESt) was used to predict influenza from diagnoses, symptoms, reason for visit, temperature, month of visit, category of enrollment, and demographics. For each case, D-BESt sequentially selects the most discriminating piece of information, calculates its likelihood ratio conditioned on previously selected information, and updates the case's probability of influenza. When the analysis was limited to definitions based on diagnoses and was applied to a sample of patients for whom laboratory tests had been ordered, the areas under the receiver operating characteristic curve (AUCs) for the previous (29-diagnosis) and current (12-diagnosis) ESSENCE lists and the D-BESt algorithm were, respectively, 0.47, 0.36, and 0.77. Including other sources of information further improved the AUC for D-BESt to 0.79. At the best cutoff point for D-BESt, where the receiver operating characteristic curve for D-BESt is farthest from the diagonal line, the D-BESt algorithm correctly classified 84% of cases (specificity = 88%, sensitivity = 62%). In comparison, the current ESSENCE approach of using a list of 12 diagnoses correctly classified only 31% of this sample of cases (specificity = 29%, sensitivity = 42%). False alarms in ESSENCE surveillance systems can be reduced if a probabilistic dynamic learning system is used.
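    The core update in a system like D-BESt is multiplicative in odds space: each selected clue contributes a likelihood ratio. The sketch below omits D-BESt's key refinement of conditioning each ratio on previously selected information, and the likelihood-ratio values are made up for illustration:

    ```python
    def update_probability(prior, likelihood_ratios):
        """Sequentially update P(influenza) with one likelihood ratio per clue:
        posterior odds = prior odds * LR1 * LR2 * ..."""
        odds = prior / (1.0 - prior)
        for lr in likelihood_ratios:
            odds *= lr
        return odds / (1.0 + odds)

    # hypothetical clues: a diagnosis code (LR=8), fever (LR=3), month of visit (LR=0.9)
    p = update_probability(0.08, [8.0, 3.0, 0.9])
    ```

    Conditioning the ratios on previously selected clues, as D-BESt does, is what prevents correlated sources (say, a fever reading and a fever diagnosis code) from being double-counted.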

  5. Free will in Bayesian and inverse Bayesian inference-driven endo-consciousness.

    Science.gov (United States)

    Gunji, Yukio-Pegio; Minoura, Mai; Kojima, Kei; Horry, Yoichi

    2017-06-27

    How can we link challenging issues related to consciousness and/or qualia with natural science? The introduction of the endo-perspective, instead of the exo-perspective, as proposed by Matsuno, Rössler, and Gunji, is considered one of the most promising candidate approaches. Here, we distinguish the endo- from the exo-perspective in terms of whether the external is or is not directly operated on. In the endo-perspective, the external can be neither perceived nor recognized directly; rather, one can only indirectly summon something outside of the perspective, which can be illustrated by a causation-reversal pair. On one hand, causation logically proceeds from the cause to the effect. On the other hand, a reversal from the effect to the cause is non-logical and is equipped with a metaphorical structure. We argue that the differences between exo- and endo-perspectives result not from the difference between Western and Eastern cultures, but from differences between modernism and animism. Here, a causation-reversal pair is described using a pair of upward (from premise to consequence) and downward (from consequence to premise) causation and a pair of Bayesian and inverse Bayesian inference (BIB inference). Accordingly, the notion of endo-consciousness is proposed as an agent equipped with BIB inference. We also argue that BIB inference can yield both highly efficient computations through Bayesian inference and robust computations through inverse Bayesian inference. By adapting a logical model of the free will theorem to BIB inference, we show that endo-consciousness can explain free will as a regression of the controllability of voluntary action. Copyright © 2017. Published by Elsevier Ltd.

  6. Probability biases as Bayesian inference

    Directory of Open Access Journals (Sweden)

    Andre; C. R. Martins

    2006-11-01

    Full Text Available In this article, I will show how several observed biases in human probabilistic reasoning can be partially explained as good heuristics for making inferences in an environment where probabilities have uncertainties associated with them. Previous results show that the weight functions and the observed violations of coalescing and stochastic dominance can be understood from a Bayesian point of view. We will review those results and see that Bayesian methods should also be used as part of the explanation behind other known biases. That means that, although the observed errors are still errors, they can be understood as adaptations to the solution of real-life problems. Heuristics that allow fast evaluations and mimic a Bayesian inference would be an evolutionary advantage, since they would give us an efficient way of making decisions. In that sense, it should be no surprise that humans reason with probability as it has been observed.

  7. Introduction to Bayesian modelling in dental research.

    Science.gov (United States)

    Gilthorpe, M S; Maddick, I H; Petrie, A

    2000-12-01

    To explain the concepts and application of Bayesian modelling and how it can be applied to the analysis of dental research data. Methodological in nature, this article introduces Bayesian modelling through hypothetical dental examples. The synthesis of RCT results with previous evidence, including expert opinion, is used to illustrate full Bayesian modelling. Meta-analysis, in the form of empirical Bayesian modelling, is introduced. An example of full Bayesian modelling is described for the synthesis of evidence from several studies that investigate the success of root canal treatment. Hierarchical (Bayesian) modelling is demonstrated for a survey of childhood caries, where surface data is nested within subjects. Bayesian methods enhance interpretation of research evidence through the synthesis of information from multiple sources. Bayesian modelling is now readily accessible to clinical researchers and is able to augment the application of clinical decision making in the development of guidelines and clinical practice.

  8. Bayesian modeling using WinBUGS

    CERN Document Server

    Ntzoufras, Ioannis

    2009-01-01

    A hands-on introduction to the principles of Bayesian modeling using WinBUGS Bayesian Modeling Using WinBUGS provides an easily accessible introduction to the use of WinBUGS programming techniques in a variety of Bayesian modeling settings. The author provides an accessible treatment of the topic, offering readers a smooth introduction to the principles of Bayesian modeling with detailed guidance on the practical implementation of key principles. The book begins with a basic introduction to Bayesian inference and the WinBUGS software and goes on to cover key topics, including: Markov Chain Monte Carlo algorithms in Bayesian inference Generalized linear models Bayesian hierarchical models Predictive distribution and model checking Bayesian model and variable evaluation Computational notes and screen captures illustrate the use of both WinBUGS as well as R software to apply the discussed techniques. Exercises at the end of each chapter allow readers to test their understanding of the presented concepts and all ...

  9. Electronic warfare receivers and receiving systems

    CERN Document Server

    Poisel, Richard A

    2014-01-01

    Receiver systems are considered the core of electronic warfare (EW) intercept systems. Without them, the fundamental purpose of such systems is null and void. This book considers the major elements that make up receiver systems and the receivers that go in them. This resource provides system design engineers with techniques for the design and development of EW receivers for modern modulations (spread spectrum) in addition to receivers for older, common modulation formats. Each major module in these receivers is considered in detail. Design information is included as well as performance tradeoffs o

  10. Bayesian Missile System Reliability from Point Estimates

    Science.gov (United States)

    2014-10-28

    The report applies the Maximum Entropy Principle (MEP) to convert point estimates to probability distributions to be used as priors for Bayesian reliability analysis of missile data, and illustrates this approach by applying the priors to a Bayesian reliability model of a missile system. Subject terms: priors, Bayesian, missile.

  11. 3D Bayesian contextual classifiers

    DEFF Research Database (Denmark)

    Larsen, Rasmus

    2000-01-01

    We extend a series of multivariate Bayesian 2-D contextual classifiers to 3-D by specifying a simultaneous Gaussian distribution for the feature vectors as well as a prior distribution of the class variables of a pixel and its 6 nearest 3-D neighbours.

  12. Attention in a bayesian framework

    DEFF Research Database (Denmark)

    Whiteley, Louise Emma; Sahani, Maneesh

    2012-01-01

    The behavioral phenomena of sensory attention are thought to reflect the allocation of a limited processing resource, but there is little consensus on the nature of the resource or why it should be limited. Here we argue that a fundamental bottleneck emerges naturally within Bayesian models... These phenomena include both selective phenomena, where attention is invoked by cues that point to particular stimuli, and integrative phenomena, where attention is invoked dynamically by endogenous processing. However, most previous Bayesian accounts of attention have focused on describing relatively simple experimental...

  13. Bayesian test and Kuhn's paradigm

    Institute of Scientific and Technical Information of China (English)

    Chen Xiaoping

    2006-01-01

    Kuhn's theory of paradigm reveals a pattern of scientific progress in which normal science alternates with scientific revolution. But Kuhn greatly underrated the role of scientific testing in this pattern, because he focused all his attention on the hypothetico-deductive schema instead of the Bayesian schema. This paper employs the Bayesian schema to re-examine Kuhn's theory of paradigm, to uncover its logical and rational components, and to illustrate the tensional structure of logic and belief, rationality and irrationality, in the process of scientific revolution.

  14. Perception, illusions and Bayesian inference.

    Science.gov (United States)

    Nour, Matthew M; Nour, Joseph M

    2015-01-01

    Descriptive psychopathology makes a distinction between veridical perception and illusory perception. In both cases a perception is tied to a sensory stimulus, but in illusions the perception is of a false object. This article re-examines this distinction in light of new work in theoretical and computational neurobiology, which views all perception as a form of Bayesian statistical inference that combines sensory signals with prior expectations. Bayesian perceptual inference can solve the 'inverse optics' problem of veridical perception and provides a biologically plausible account of a number of illusory phenomena, suggesting that veridical and illusory perceptions are generated by precisely the same inferential mechanisms.
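    The standard toy model of such perceptual inference combines a Gaussian prior with a Gaussian likelihood into a precision-weighted posterior. The numbers below are purely illustrative:

    ```python
    def combine(prior_mean, prior_var, obs, obs_var):
        """Posterior for a Gaussian prior and a Gaussian likelihood: a
        precision-weighted average of prior expectation and sensory signal."""
        w = prior_var / (prior_var + obs_var)   # weight on the sensory evidence
        post_mean = prior_mean + w * (obs - prior_mean)
        post_var = prior_var * obs_var / (prior_var + obs_var)
        return post_mean, post_var

    # a strong prior (low variance) pulls a noisy signal toward the expectation
    m, v = combine(prior_mean=0.0, prior_var=1.0, obs=4.0, obs_var=3.0)
    ```

    When the prior is strong relative to the sensory noise, the posterior sits far from the stimulus itself, which is one way the same inferential machinery can account for illusory as well as veridical percepts.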

  15. Bayesian prediction of placebo analgesia in an instrumental learning model

    Science.gov (United States)

    Jung, Won-Mo; Lee, Ye-Seul; Wallraven, Christian; Chae, Younbyoung

    2017-01-01

    Placebo analgesia can be primarily explained by the Pavlovian conditioning paradigm in which a passively applied cue becomes associated with less pain. In contrast, instrumental conditioning employs an active paradigm that might be more similar to clinical settings. In the present study, an instrumental conditioning paradigm involving a modified trust game in a simulated clinical situation was used to induce placebo analgesia. Additionally, Bayesian modeling was applied to predict the placebo responses of individuals based on their choices. Twenty-four participants engaged in a medical trust game in which decisions to receive treatment from either a doctor (more effective with high cost) or a pharmacy (less effective with low cost) were made after receiving a reference pain stimulus. In the conditioning session, the participants received lower levels of pain following both choices, while high pain stimuli were administered in the test session even after making the decision. The choice-dependent pain in the conditioning session was modulated in terms of both intensity and uncertainty. Participants reported significantly less pain when they chose the doctor or the pharmacy for treatment compared to the control trials. The predicted pain ratings based on Bayesian modeling showed significant correlations with the actual reports from participants for both of the choice categories. The instrumental conditioning paradigm allowed for the active choice of optional cues and was able to induce the placebo analgesia effect. Additionally, Bayesian modeling successfully predicted pain ratings in a simulated clinical situation that fits well with placebo analgesia induced by instrumental conditioning. PMID:28225816

  16. Plug & Play object oriented Bayesian networks

    DEFF Research Database (Denmark)

    Bangsø, Olav; Flores, J.; Jensen, Finn Verner

    2003-01-01

    Object oriented Bayesian networks have proven themselves useful in recent years. The idea of applying an object oriented approach to Bayesian networks has extended their scope to larger domains that can be divided into autonomous but interrelated entities. Object oriented Bayesian networks have b...

  17. A Bayesian Nonparametric Approach to Test Equating

    Science.gov (United States)

    Karabatsos, George; Walker, Stephen G.

    2009-01-01

    A Bayesian nonparametric model is introduced for score equating. It is applicable to all major equating designs, and has advantages over previous equating models. Unlike the previous models, the Bayesian model accounts for positive dependence between distributions of scores from two tests. The Bayesian model and the previous equating models are…

  18. Bayesian Model Averaging for Propensity Score Analysis

    Science.gov (United States)

    Kaplan, David; Chen, Jianshen

    2013-01-01

    The purpose of this study is to explore Bayesian model averaging in the propensity score context. Previous research on Bayesian propensity score analysis does not take into account model uncertainty. In this regard, an internally consistent Bayesian framework for model building and estimation must also account for model uncertainty. The…

  19. Bayesian networks and food security - An introduction

    NARCIS (Netherlands)

    Stein, A.

    2004-01-01

    This paper gives an introduction to Bayesian networks. Networks are defined and put into a Bayesian context. Directed acyclical graphs play a crucial role here. Two simple examples from food security are addressed. Possible uses of Bayesian networks for implementation and further use in decision sup

  20. Bayesian Analysis of High Dimensional Classification

    Science.gov (United States)

    Mukhopadhyay, Subhadeep; Liang, Faming

    2009-12-01

    Modern data mining and bioinformatics have presented an important playground for statistical learning techniques, where the number of input variables is possibly much larger than the sample size of the training data. In supervised learning, logistic regression or probit regression can be used to model a binary output and form perceptron classification rules based on Bayesian inference. In these cases, there is much interest in searching for sparse models in the high-dimensional regression/classification setup. We first discuss two common challenges in analyzing high-dimensional data. The first is the curse of dimensionality: the complexity of many existing algorithms scales exponentially with the dimensionality of the space, so the algorithms soon become computationally intractable and therefore inapplicable in many real applications. The second is multicollinearity among the predictors, which severely slows down the algorithms. To make Bayesian analysis operational in high dimensions, we propose a novel Hierarchical Stochastic Approximation Monte Carlo algorithm (HSAMC), which overcomes the curse of dimensionality and the multicollinearity of predictors in high dimensions, and also possesses a self-adjusting mechanism to avoid local minima separated by high energy barriers. Models and methods are illustrated by simulations inspired by the field of genomics. Numerical results indicate that HSAMC can work as a general model selection sampler in a high-dimensional complex model space.

  1. Bayesian Inference and Online Learning in Poisson Neuronal Networks.

    Science.gov (United States)

    Huang, Yanping; Rao, Rajesh P N

    2016-08-01

    Motivated by the growing evidence for Bayesian computation in the brain, we show how a two-layer recurrent network of Poisson neurons can perform both approximate Bayesian inference and learning for any hidden Markov model. The lower-layer sensory neurons receive noisy measurements of hidden world states. The higher-layer neurons infer a posterior distribution over world states via Bayesian inference from inputs generated by sensory neurons. We demonstrate how such a neuronal network with synaptic plasticity can implement a form of Bayesian inference similar to Monte Carlo methods such as particle filtering. Each spike in a higher-layer neuron represents a sample of a particular hidden world state. The spiking activity across the neural population approximates the posterior distribution over hidden states. In this model, variability in spiking is regarded not as a nuisance but as an integral feature that provides the variability necessary for sampling during inference. We demonstrate how the network can learn the likelihood model, as well as the transition probabilities underlying the dynamics, using a Hebbian learning rule. We present results illustrating the ability of the network to perform inference and learning for arbitrary hidden Markov models.
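    The particle-filtering analogy can be made concrete with a bootstrap filter for a two-state hidden Markov model with Poisson (spike-count) emissions. Everything below, the rates, transition matrix, and observations, is a made-up toy rather than the paper's network model:

    ```python
    import numpy as np

    def particle_filter(obs, trans, emit_rate, n_particles=2000, seed=0):
        """Bootstrap particle filter for a 2-state HMM with Poisson emissions,
        the kind of sampling-based inference the spiking network approximates."""
        rng = np.random.default_rng(seed)
        K = trans.shape[0]
        particles = rng.integers(0, K, n_particles)
        posts = []
        for y in obs:
            # propagate each particle through the transition model
            particles = np.array([rng.choice(K, p=trans[s]) for s in particles])
            # weight by the Poisson likelihood (the 1/y! factor cancels on normalization)
            lam = emit_rate[particles]
            w = np.exp(-lam) * lam ** y
            w /= w.sum()
            # resample, so particle frequencies approximate the posterior
            particles = rng.choice(particles, size=n_particles, p=w)
            posts.append(np.mean(particles == 1))
        return np.array(posts)

    trans = np.array([[0.95, 0.05], [0.05, 0.95]])
    emit_rate = np.array([1.0, 8.0])           # low vs high firing rate per state
    obs = np.array([0, 1, 0, 9, 7, 8, 1, 0])   # synthetic spike counts
    post = particle_filter(obs, trans, emit_rate)
    ```

    Each resampled particle plays the role the paper assigns to a spike, a sample of one hidden world state, and the fraction of particles in a state approximates its posterior probability at that time step.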

  2. Airline Sustainability Modeling: A New Framework with Application of Bayesian Structural Equation Modeling

    Directory of Open Access Journals (Sweden)

    Hashem Salarzadeh Jenatabadi

    2016-11-01

    Full Text Available There are many factors which could influence the sustainability of airlines. The main purpose of this study is to introduce a framework for a financial sustainability index and model it based on structural equation modeling (SEM) with maximum likelihood and Bayesian predictors. The introduced framework includes economic performance, operational performance, cost performance, and financial performance. Based on both Bayesian SEM (Bayesian-SEM) and Classical SEM (Classical-SEM), it was found that economic performance together with operational performance and cost performance is significantly related to the financial performance index. Four mathematical indices, root mean square error, coefficient of determination, mean absolute error, and mean absolute percentage error, are employed to compare the efficiency of Bayesian-SEM and Classical-SEM in predicting airline financial performance. The outputs confirmed that the framework with Bayesian prediction delivered a good fit with the data, whereas the framework predicted with the Classical-SEM approach did not produce a well-fitting model. The reasons for this discrepancy between Classical and Bayesian predictions, as well as the potential advantages and caveats of applying the Bayesian approach in airline sustainability studies, are discussed.

  3. Bayesian Classification of Image Structures

    DEFF Research Database (Denmark)

    Goswami, Dibyendu; Kalkan, Sinan; Krüger, Norbert

    2009-01-01

    In this paper, we describe work on Bayesian classifiers for distinguishing between homogeneous structures, textures, edges and junctions. We build semi-local classifiers from hand-labeled images to distinguish between these four different kinds of structures based on the concept of intrinsic dimensi...

  4. Bayesian Analysis of Experimental Data

    Directory of Open Access Journals (Sweden)

    Lalmohan Bhar

    2013-10-01

    Full Text Available Analysis of experimental data from a Bayesian point of view has been considered. Appropriate methodology has been developed for application to designed experiments, with a normal-gamma distribution used as the prior distribution. The developed methodology has been applied to real experimental data taken from long-term fertilizer experiments.

  5. Bayesian Networks and Influence Diagrams

    DEFF Research Database (Denmark)

    Kjærulff, Uffe Bro; Madsen, Anders Læsø

    Bayesian Networks and Influence Diagrams: A Guide to Construction and Analysis, Second Edition, provides a comprehensive guide for practitioners who wish to understand, construct, and analyze intelligent systems for decision support based on probabilistic networks. This new edition contains six new...

  6. Bayesian image restoration, using configurations

    DEFF Research Database (Denmark)

    Thorarinsdottir, Thordis

    configurations are expressed in terms of the mean normal measure of the random set. These probabilities are used as prior probabilities in a Bayesian image restoration approach. Estimation of the remaining parameters in the model is outlined for salt and pepper noise. The inference in the model is discussed...

  7. Bayesian image restoration, using configurations

    DEFF Research Database (Denmark)

    Thorarinsdottir, Thordis Linda

    2006-01-01

    configurations are expressed in terms of the mean normal measure of the random set. These probabilities are used as prior probabilities in a Bayesian image restoration approach. Estimation of the remaining parameters in the model is outlined for the salt and pepper noise. The inference in the model is discussed...

  8. Differentiated Bayesian Conjoint Choice Designs

    NARCIS (Netherlands)

    Z. Sándor (Zsolt); M. Wedel (Michel)

    2003-01-01

    Previous conjoint choice design construction procedures have produced a single design that is administered to all subjects. This paper proposes to construct a limited set of different designs. The designs are constructed in a Bayesian fashion, taking into account prior uncertainty about

  9. 3-D contextual Bayesian classifiers

    DEFF Research Database (Denmark)

    Larsen, Rasmus

    In this paper we will consider extensions of a series of Bayesian 2-D contextual classification procedures proposed by Owen (1984), Hjort & Mohn (1984), Welch & Salter (1971) and Haslett (1985) to 3 spatial dimensions. It is evident that compared to classical pixelwise classification further...

  10. Bayesian inference for Hawkes processes

    DEFF Research Database (Denmark)

    Rasmussen, Jakob Gulddahl

    The Hawkes process is a practically and theoretically important class of point processes, but parameter-estimation for such a process can pose various problems. In this paper we explore and compare two approaches to Bayesian inference. The first approach is based on the so-called conditional...

  11. Bayesian inference for Hawkes processes

    DEFF Research Database (Denmark)

    Rasmussen, Jakob Gulddahl

    2013-01-01

    The Hawkes process is a practically and theoretically important class of point processes, but parameter-estimation for such a process can pose various problems. In this paper we explore and compare two approaches to Bayesian inference. The first approach is based on the so-called conditional...
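
    For context, Bayesian inference for a Hawkes process requires evaluating its likelihood inside the sampler. A sketch of the log-likelihood for a Hawkes process with an exponential kernel, assuming the common parameterisation λ(t) = μ + Σ αβ·exp(−β(t−tᵢ)) (not necessarily the parameterisation used in the paper):

```python
import math

def hawkes_loglik(mu, alpha, beta, times, T):
    """Log-likelihood of a Hawkes process on [0, T] with exponential
    kernel alpha*beta*exp(-beta*dt); illustrative parameterisation."""
    ll = 0.0
    A = 0.0          # recursive term: sum_{j<i} exp(-beta*(t_i - t_j))
    prev = None
    for t in times:
        if prev is not None:
            A = math.exp(-beta * (t - prev)) * (1.0 + A)
        ll += math.log(mu + alpha * beta * A)
        prev = t
    # compensator: integral of the intensity over [0, T]
    comp = mu * T + alpha * sum(1.0 - math.exp(-beta * (T - t)) for t in times)
    return ll - comp
```

    With `alpha = 0` this reduces to the homogeneous Poisson log-likelihood n·log(μ) − μT, a useful sanity check; a Metropolis-Hastings sampler would evaluate this function at each proposed parameter value.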

  12. Bayesian Evidence and Model Selection

    CERN Document Server

    Knuth, Kevin H; Malakar, Nabin K; Mubeen, Asim M; Placek, Ben

    2014-01-01

    In this paper we review the concept of the Bayesian evidence and its application to model selection. The theory is presented along with a discussion of analytic, approximate and numerical techniques. Application to several practical examples within the context of signal processing are discussed.

  13. Bayesian inference for Hawkes processes

    DEFF Research Database (Denmark)

    Rasmussen, Jakob Gulddahl

    The Hawkes process is a practically and theoretically important class of point processes, but parameter-estimation for such a process can pose various problems. In this paper we explore and compare two approaches to Bayesian inference. The first approach is based on the so-called conditional...

  14. Bayesian stable isotope mixing models

    Science.gov (United States)

    In this paper we review recent advances in Stable Isotope Mixing Models (SIMMs) and place them into an over-arching Bayesian statistical framework which allows for several useful extensions. SIMMs are used to quantify the proportional contributions of various sources to a mixtur...

  15. Naive Bayesian for Email Filtering

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The paper presents a method of email filtering based on Naive Bayesian theory that can effectively filter junk mail and illegal mail. Furthermore, the keys to implementation are discussed in detail. The filtering model is obtained from a training set of email. The filtering can be done without the user's specification of filtering rules.
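
    The filtering idea above can be sketched as a multinomial naive Bayes classifier with Laplace smoothing; the training messages below are hypothetical and the implementation is a minimal illustration, not the paper's system:

```python
import math
from collections import Counter

def train(spam_docs, ham_docs):
    """Count word frequencies per class for a naive Bayes filter."""
    spam_counts = Counter(w for d in spam_docs for w in d.split())
    ham_counts = Counter(w for d in ham_docs for w in d.split())
    vocab = set(spam_counts) | set(ham_counts)
    return spam_counts, ham_counts, vocab, len(spam_docs), len(ham_docs)

def is_spam(model, doc):
    """Compare class log-posteriors with Laplace (+1) smoothing."""
    spam_counts, ham_counts, vocab, n_spam, n_ham = model
    v = len(vocab)
    log_spam = math.log(n_spam / (n_spam + n_ham))
    log_ham = math.log(n_ham / (n_spam + n_ham))
    ts, th = sum(spam_counts.values()), sum(ham_counts.values())
    for w in doc.split():
        log_spam += math.log((spam_counts[w] + 1) / (ts + v))
        log_ham += math.log((ham_counts[w] + 1) / (th + v))
    return log_spam > log_ham

model = train(["win money now", "free money offer"],
              ["meeting agenda attached", "lunch tomorrow"])
```

    No hand-written rules appear anywhere: the decision boundary is learned entirely from the labeled training set, which is the point the abstract makes.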

  16. ANALYSIS OF BAYESIAN CLASSIFIER ACCURACY

    Directory of Open Access Journals (Sweden)

    Felipe Schneider Costa

    2013-01-01

    Full Text Available The naïve Bayes classifier is considered one of the most effective classification algorithms today, competing with more modern and sophisticated classifiers. Despite being based on the unrealistic (naïve) assumption that all variables are independent given the output class, the classifier provides proper results. However, depending on the scenario utilized (network structure, number of samples or training cases, number of variables), the network may not provide appropriate results. This study uses a variable-selection process, using the chi-squared test to verify the existence of dependence between variables in the data model, in order to identify the reasons which prevent a Bayesian network from providing good performance. A detailed analysis of the data is also proposed, unlike other existing work, as well as adjustments in the case of limit values between two adjacent classes. Furthermore, variable weights, calculated with the mutual information function, are used in the calculation of a posteriori probabilities. Tests were applied to both a naïve Bayesian network and a hierarchical Bayesian network. After testing, a significant reduction in error rate was observed. The naïve Bayesian network showed a drop in error rate from twenty-five percent to five percent, considering the initial results of the classification process. In the hierarchical network, there was not only a fifteen percent drop in error rate, but the final result came to zero.

  17. Bayesian tests of measurement invariance

    NARCIS (Netherlands)

    Verhagen, A.J.; Fox, J.P.

    2013-01-01

    Random item effects models provide a natural framework for the exploration of violations of measurement invariance without the need for anchor items. Within the random item effects modelling framework, Bayesian tests (Bayes factor, deviance information criterion) are proposed which enable multiple m

  18. Bayesian NL interpretation and learning

    NARCIS (Netherlands)

    Zeevat, H.

    2011-01-01

    Everyday natural language communication is normally successful, even though contemporary computational linguistics has shown that NL is characterised by a very high degree of ambiguity, and the results of stochastic methods are not good enough to explain the high success rate. Bayesian natural language

  19. Bayesian analysis of binary sequences

    Science.gov (United States)

    Torney, David C.

    2005-03-01

    This manuscript details Bayesian methodology for "learning by example", with binary n-sequences encoding the objects under consideration. Priors prove influential; conformable priors are described. Laplace approximation of Bayes integrals yields posterior likelihoods for all n-sequences. This involves the optimization of a definite function over a convex domain--efficiently effectuated by the sequential application of the quadratic program.

  20. Bayesian Networks and Influence Diagrams

    DEFF Research Database (Denmark)

    Kjærulff, Uffe Bro; Madsen, Anders Læsø

    Bayesian Networks and Influence Diagrams: A Guide to Construction and Analysis, Second Edition, provides a comprehensive guide for practitioners who wish to understand, construct, and analyze intelligent systems for decision support based on probabilistic networks. This new edition contains six new...

  1. Bayesian Alternation During Tactile Augmentation

    Directory of Open Access Journals (Sweden)

    Caspar Mathias Goeke

    2016-10-01

    Full Text Available A large number of studies suggest that the integration of multisensory signals by humans is well described by Bayesian principles. However, there are very few reports about cue combination between a native and an augmented sense. In particular, we asked whether adult participants are able to integrate an augmented sensory cue with existing native sensory information. For the purpose of this study we built a tactile augmentation device and compared different hypotheses of how untrained adult participants combine information from a native and an augmented sense. In a two-interval forced choice (2IFC) task, while subjects were blindfolded and seated on a rotating platform, our sensory augmentation device translated information on whole-body yaw rotation into tactile stimulation. Three conditions were realized: tactile stimulation only (augmented condition), rotation only (native condition), and both augmented and native information (bimodal condition). Participants had to choose the one of two consecutive rotations with the higher angular rotation. For the analysis, we fitted the participants' responses with a probit model and calculated the just noticeable difference (JND). We then compared several models for predicting bimodal from unimodal responses. An objective Bayesian alternation model yielded a better prediction (χ²red = 1.67) than the Bayesian integration model (χ²red = 4.34). A non-Bayesian winner-takes-all model, which used either only native or only augmented values per subject for prediction, showed slightly higher accuracy (χ²red = 1.64). However, the performance of the Bayesian alternation model could be substantially improved (χ²red = 1.09) by utilizing subjective weights obtained from a questionnaire. As a result, the subjective Bayesian alternation model predicted bimodal performance most accurately among all tested models. These results suggest that information from augmented and existing sensory modalities in
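
    The Bayesian integration model that the study tests makes a concrete quantitative prediction: if JNDs are proportional to the noise standard deviations of the cues, the bimodal variance is the harmonic sum of the unimodal variances, so the integrated JND falls below both unimodal JNDs. A sketch of that standard prediction (the example JND values are hypothetical):

```python
def integrated_jnd(jnd_native, jnd_augmented):
    """Predicted bimodal JND under optimal Bayesian cue integration:
    variances (proportional to JND^2) combine as a harmonic sum."""
    v1, v2 = jnd_native ** 2, jnd_augmented ** 2
    return (v1 * v2 / (v1 + v2)) ** 0.5

# integration predicts a bimodal JND below either unimodal JND
bimodal = integrated_jnd(3.0, 4.0)
```

    An alternation model, by contrast, predicts no such improvement, since only one cue is consulted on any given trial; this difference is what the χ²red comparison above discriminates.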

  2. Bayesian Networks as a Decision Tool for O&M of Offshore Wind Turbines

    DEFF Research Database (Denmark)

    Nielsen, Jannie Jessen; Sørensen, John Dalsgaard

    2010-01-01

    Costs to operation and maintenance (O&M) of offshore wind turbines are large. This paper presents how influence diagrams can be used to assist in rational decision making for O&M. An influence diagram is a graphical representation of a decision tree based on Bayesian Networks. Bayesian Networks...... offer efficient Bayesian updating of a damage model when imperfect information from inspections/monitoring is available. The extension to an influence diagram offers the calculation of expected utilities for decision alternatives, and can be used to find the optimal strategy among different alternatives...

  3. Minimum mean square error estimation and approximation of the Bayesian update

    KAUST Repository

    Litvinenko, Alexander

    2015-01-07

    Given: a physical system modeled by a PDE or ODE with uncertain coefficient q(w), and a measurement operator Y(u(q); q), where u(q; w) is the uncertain solution. Aim: to identify q(w). The mapping from parameters to observations is usually not invertible, hence this inverse identification problem is generally ill-posed. To identify q(w) we derived a non-linear Bayesian update from the variational problem associated with conditional expectation. To reduce the cost of the Bayesian update we offer a functional approximation, e.g. a polynomial chaos expansion (PCE). New: we derive linear, quadratic, etc. approximations of the full Bayesian update.
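
    The linear approximation of the Bayesian update reduces to the familiar Kalman-type MMSE formula q ← q + K(y_obs − y) with gain K = C_qy / C_yy. A scalar, sample-based sketch under that assumption (not the authors' PCE-based implementation):

```python
def linear_bayesian_update(q_samples, y_samples, y_obs):
    """Scalar linear MMSE (Kalman-like) update of parameter samples q
    given forecast observations y and a measurement y_obs (sketch)."""
    n = len(q_samples)
    qm = sum(q_samples) / n
    ym = sum(y_samples) / n
    # sample covariances between parameter and forecast observation
    c_qy = sum((q - qm) * (y - ym) for q, y in zip(q_samples, y_samples)) / n
    c_yy = sum((y - ym) ** 2 for y in y_samples) / n
    K = c_qy / c_yy
    return [q + K * (y_obs - y) for q, y in zip(q_samples, y_samples)]

# toy forward model y = 2q: observing y_obs = 4 should pull q toward 2
updated = linear_bayesian_update([1.0, 2.0, 3.0], [2.0, 4.0, 6.0], 4.0)
```

    When the forward map is exactly linear and noise-free, as in this toy example, the update collapses the ensemble onto the true parameter; the paper's quadratic and higher-order terms improve on this for non-linear maps.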

  4. Bayesian analysis of rare events

    Energy Technology Data Exchange (ETDEWEB)

    Straub, Daniel, E-mail: straub@tum.de; Papaioannou, Iason; Betz, Wolfgang

    2016-06-01

    In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
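
    The reinterpretation underlying BUS can be shown in its naive form: draw from the prior and accept with probability L(θ)/c, where c bounds the likelihood. A toy sketch under that assumption (the actual BUS framework replaces this loop with FORM, importance sampling, or Subset Simulation precisely because plain rejection is inefficient for rare events):

```python
import random

def bus_rejection(prior_sample, likelihood, c, n):
    """Rejection-sampling view of Bayesian updating: accept a prior
    draw theta when u < L(theta)/c, with c >= max L (sketch)."""
    posterior = []
    for _ in range(n):
        theta = prior_sample()
        if random.random() < likelihood(theta) / c:
            posterior.append(theta)
    return posterior

random.seed(0)
# toy example: uniform prior on [0, 1], likelihood proportional to theta,
# so the posterior density is 2*theta with mean 2/3
post = bus_rejection(random.random, lambda t: t, 1.0, 20000)
```

    The accepted draws are exact posterior samples, which is what lets established rare-event estimators be reused for the Bayesian updating problem.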

  5. A Probability-based Evolutionary Algorithm with Mutations to Learn Bayesian Networks

    Directory of Open Access Journals (Sweden)

    Sho Fukuda

    2014-12-01

    Full Text Available Bayesian networks are regarded as one of the essential tools to analyze causal relationships between events from data. Learning the structure of highly reliable Bayesian networks from data as quickly as possible is one of the important problems that several studies have tried to achieve. In recent years, probability-based evolutionary algorithms have been proposed as a new efficient approach to learning Bayesian networks. In this paper, we target one of the probability-based evolutionary algorithms, called PBIL (Probability-Based Incremental Learning), and propose a new mutation operator. Through performance evaluation, we found that the proposed mutation operator has good performance in learning Bayesian networks.
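
    For orientation, plain PBIL maintains a probability vector over bit positions and shifts it toward the best sample each generation; mutation operators perturb the probability vector itself. A minimal sketch on a one-max toy objective (the mutation shown is a generic perturbation for illustration, not the operator proposed in the paper, and all parameter values are arbitrary):

```python
import random

def pbil(fitness, n_bits, pop=20, lr=0.1, mut_rate=0.02, gens=200, rng=None):
    """Plain PBIL: sample bit strings from a probability vector, move the
    vector toward the generation's best sample, lightly mutate the vector."""
    rng = rng or random.Random(0)
    p = [0.5] * n_bits
    best = None
    for _ in range(gens):
        samples = [[1 if rng.random() < pi else 0 for pi in p]
                   for _ in range(pop)]
        samples.sort(key=fitness, reverse=True)
        leader = samples[0]
        if best is None or fitness(leader) > fitness(best):
            best = leader
        for i in range(n_bits):
            p[i] = (1 - lr) * p[i] + lr * leader[i]
            if rng.random() < mut_rate:   # mutate the probability vector
                p[i] = min(1.0, max(0.0, p[i] + rng.uniform(-0.05, 0.05)))
    return best

# one-max: maximise the number of ones in the bit string
solution = pbil(sum, 16)
```

    In the structure-learning setting, each bit would encode the presence of a candidate edge and the fitness would be a network score, with the same update loop.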

  6. STATISTICAL BAYESIAN ANALYSIS OF EXPERIMENTAL DATA.

    Directory of Open Access Journals (Sweden)

    AHLAM LABDAOUI

    2012-12-01

    Full Text Available The Bayesian researcher should know the basic ideas underlying Bayesian methodology and the computational tools used in modern Bayesian econometrics. Some of the most important methods of posterior simulation are Monte Carlo integration, importance sampling, Gibbs sampling and the Metropolis-Hastings algorithm. The Bayesian should also be able to put the theory and computational tools together in the context of substantive empirical problems. We focus primarily on recent developments in Bayesian computation, and then on particular models. Inevitably, we combine theory and computation in the context of particular models. Although we have tried to be reasonably complete in terms of covering the basic ideas of Bayesian theory and the computational tools most commonly used by the Bayesian, there is no way we can cover all the classes of models used in econometrics. We propose applications to analysis of variance and the linear regression model.
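
    Of the posterior-simulation methods listed, Metropolis-Hastings is the most general. A minimal random-walk Metropolis sketch for a one-dimensional posterior, targeting a standard normal as a toy example (step size and seed are arbitrary choices):

```python
import math
import random

def metropolis_hastings(log_post, x0, step, n, rng=None):
    """Random-walk Metropolis sampler for a 1-D log-posterior (sketch)."""
    rng = rng or random.Random(1)
    x, chain = x0, []
    for _ in range(n):
        prop = x + rng.gauss(0.0, step)
        log_ratio = log_post(prop) - log_post(x)
        if log_ratio >= 0 or rng.random() < math.exp(log_ratio):
            x = prop                     # accept the proposal
        chain.append(x)                  # otherwise keep the current state
    return chain

# toy target: standard normal log-density (up to an additive constant)
chain = metropolis_hastings(lambda t: -0.5 * t * t, 0.0, 1.0, 50000)
```

    Because only the log-posterior ratio is needed, the normalising constant of the posterior never has to be computed, which is what makes the method practical for the econometric models the abstract mentions.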

  7. Bayesian methods for measures of agreement

    CERN Document Server

    Broemeling, Lyle D

    2009-01-01

    Using WinBUGS to implement Bayesian inferences of estimation and testing hypotheses, Bayesian Methods for Measures of Agreement presents useful methods for the design and analysis of agreement studies. It focuses on agreement among the various players in the diagnostic process.The author employs a Bayesian approach to provide statistical inferences based on various models of intra- and interrater agreement. He presents many examples that illustrate the Bayesian mode of reasoning and explains elements of a Bayesian application, including prior information, experimental information, the likelihood function, posterior distribution, and predictive distribution. The appendices provide the necessary theoretical foundation to understand Bayesian methods as well as introduce the fundamentals of programming and executing the WinBUGS software.Taking a Bayesian approach to inference, this hands-on book explores numerous measures of agreement, including the Kappa coefficient, the G coefficient, and intraclass correlation...

  8. Receiver-exciter controller design

    Science.gov (United States)

    Jansma, P. A.

    1982-01-01

    A description of the general design of both the block 3 and block 4 receiver-exciter controllers for the Deep Space Network (DSN) Mark IV-A System is presented along with the design approach. The controllers are designed to enable the receiver-exciter subsystem (RCV) to be configured, calibrated, initialized and operated from a central location via high level instructions. The RECs are designed to be operated under the control of the DMC subsystem. The instructions are in the form of standard subsystem blocks (SSBs) received via the local area network (LAN). The centralized control provided by RECs and other DSCC controllers in Mark IV-A is intended to reduce DSN operations costs from the Mark III era.

  9. Comparison between differently priced devices for digital capture of X-ray films using computed tomography as a gold standard: a multireader-multicase receiver operating characteristic curve study.

    Science.gov (United States)

    Salazar, Antonio J; Camacho, Juan Camilo; Aguirre, Diego Andrés

    2011-05-01

    Film digitizers are a specialized technology that is available for scanning X-ray radiographs; however, their cost makes them unaffordable for developing countries. Thus, less expensive alternatives are used. The purpose of this study was to compare three devices for digital capture of X-ray films: a film digitizer (US $15,000), a flatbed scanner (US $1800), and a 10-megapixel digital camera (US $450), in terms of diagnostic accuracy, defined as the area under the receiver operating characteristic curve, with computed tomography as the gold standard. The sample included 136 chest X-ray cases with computed tomography confirmation of the presence or absence of pneumothorax, interstitial opacities, or nodules. The readers were six radiologists who made observations of eight variables for each digital capture of the X-ray films: three main variables to determine the accuracy in the detection of the above-mentioned pathologies, four secondary variables to categorize other pathological classifications, and one variable regarding digital image quality. The receiver operating characteristic curves for each device and pathology were very similar. For the main variables, there was no statistically significant difference in diagnostic accuracy between the devices. For the secondary variables, >84% of cases were correctly classified, even those that were classified with the lowest image quality. High accuracy was determined for the three main variables (0.75 to 0.96), indicating good performance for all tested devices, despite their very different prices. Choosing a device for a teleradiology service should involve additional factors, such as capture time, maintenance concerns, and training requirements.
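
    The area under the ROC curve used as the accuracy measure above equals the Mann-Whitney statistic: the probability that a randomly chosen positive case receives a higher score than a randomly chosen negative one, counting ties as one half. A small sketch of that computation (scores are hypothetical reader ratings):

```python
def auc(scores_pos, scores_neg):
    """Area under the ROC curve via the Mann-Whitney statistic:
    the probability a positive case outscores a negative one."""
    wins = 0.0
    for sp in scores_pos:
        for sn in scores_neg:
            if sp > sn:
                wins += 1.0
            elif sp == sn:
                wins += 0.5          # ties count as half a win
    return wins / (len(scores_pos) * len(scores_neg))

# perfect separation gives 1.0; chance-level performance gives 0.5
perfect = auc([0.9, 0.8], [0.3, 0.1])
```

    Values in the 0.75 to 0.96 range reported for the three devices therefore correspond to a 75% to 96% chance that an affected case is rated higher than an unaffected one.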

  10. Bayesian inference of chemical kinetic models from proposed reactions

    KAUST Repository

    Galagali, Nikhil

    2015-02-01

    © 2014 Elsevier Ltd. Bayesian inference provides a natural framework for combining experimental data with prior knowledge to develop chemical kinetic models and quantify the associated uncertainties, not only in parameter values but also in model structure. Most existing applications of Bayesian model selection methods to chemical kinetics have been limited to comparisons among a small set of models, however. The significant computational cost of evaluating posterior model probabilities renders traditional Bayesian methods infeasible when the model space becomes large. We present a new framework for tractable Bayesian model inference and uncertainty quantification using a large number of systematically generated model hypotheses. The approach involves imposing point-mass mixture priors over rate constants and exploring the resulting posterior distribution using an adaptive Markov chain Monte Carlo method. The posterior samples are used to identify plausible models, to quantify rate constant uncertainties, and to extract key diagnostic information about model structure-such as the reactions and operating pathways most strongly supported by the data. We provide numerical demonstrations of the proposed framework by inferring kinetic models for catalytic steam and dry reforming of methane using available experimental data.

  11. General and Local: Averaged k-Dependence Bayesian Classifiers

    Directory of Open Access Journals (Sweden)

    Limin Wang

    2015-06-01

    Full Text Available The inference of a general Bayesian network has been shown to be an NP-hard problem, even for approximate solutions. Although the k-dependence Bayesian (KDB) classifier can construct classifiers at arbitrary points (values of k) along the attribute dependence spectrum, it cannot identify the changes of interdependencies when attributes take different values. Local KDB, which learns in the framework of KDB, is proposed in this study to describe the local dependencies implicated in each test instance. Based on the analysis of functional dependencies, substitution-elimination resolution, a new type of semi-naive Bayesian operation, is proposed to substitute or eliminate generalization to achieve accurate estimation of conditional probability distributions while reducing computational complexity. The final classifier, averaged k-dependence Bayesian (AKDB) classifiers, will average the output of KDB and local KDB. Experimental results on the repository of machine learning databases from the University of California Irvine (UCI) showed that AKDB has significant advantages in zero-one loss and bias relative to naive Bayes (NB), tree-augmented naive Bayes (TAN), averaged one-dependence estimators (AODE), and KDB. Moreover, KDB and local KDB show mutually complementary characteristics with respect to variance.

  12. Using Bayesian neural networks to classify forest scenes

    Science.gov (United States)

    Vehtari, Aki; Heikkonen, Jukka; Lampinen, Jouko; Juujarvi, Jouni

    1998-10-01

    We present results that compare the performance of Bayesian learning methods for neural networks on the task of classifying forest scenes into trees and background. The classification task is demanding due to the texture richness of the trees, occlusions of the forest scene objects and diverse lighting conditions under operation. This makes it difficult to determine which image features are optimal for the classification. A natural way to proceed is to extract many different types of potentially suitable features and to evaluate their usefulness in later processing stages. One approach to cope with a large number of features is to use Bayesian methods to control the model complexity. Bayesian learning uses a prior on model parameters, combines this with evidence from training data, and then integrates over the resulting posterior to make predictions. With this method, we can use large networks and many features without fear of overfitting. For this classification task we compare two Bayesian learning methods for multi-layer perceptron (MLP) neural networks: (1) the evidence framework of MacKay, which uses a Gaussian approximation to the posterior weight distribution and maximizes with respect to hyperparameters; and (2) a Markov Chain Monte Carlo (MCMC) method due to Neal, in which the posterior distribution of the network parameters is numerically integrated using MCMC. As baseline classifiers for comparison we use (3) an MLP early-stop committee, (4) K-nearest-neighbor and (5) Classification And Regression Trees.

  13. The Mechanism of Company Accounts Receivable Management

    Directory of Open Access Journals (Sweden)

    Halyna Yamnenko

    2017-02-01

    Full Text Available The relevance of accounts receivable management lies in its ability to influence the replenishment of the company's working capital. It is therefore necessary to create a specific mechanism for managing accounts receivable in the company. The article analyses the components of this mechanism and the factors that significantly affect its operation. The goal of accounts receivable management is to collect funds and to minimize outstanding receivables.

  14. Target distribution in cooperative combat based on Bayesian optimization algorithm

    Institute of Scientific and Technical Information of China (English)

    Shi Zhifu; Zhang An; Wang Anli

    2006-01-01

    Target distribution in cooperative combat is a difficult and important problem. We build up the optimization model according to the rules of fire distribution, and study the optimization model with the Bayesian optimization algorithm (BOA). The BOA can estimate the joint probability distribution of the variables with a Bayesian network, and new candidate solutions can be generated from the joint distribution. A simulation example verified that the method can be used to solve this complex problem, that the operation is quick, and that the solution found is the best.

  15. Doubly Bayesian Analysis of Confidence in Perceptual Decision-Making.

    OpenAIRE

    Aitchison, L.; Bang, D; Bahrami, B.; Latham, P.E.

    2015-01-01

    Humans stand out from other animals in that they are able to explicitly report on the reliability of their internal operations. This ability, which is known as metacognition, is typically studied by asking people to report their confidence in the correctness of some decision. However, the computations underlying confidence reports remain unclear. In this paper, we present a fully Bayesian method for directly comparing models of confidence. Using a visual two-interval forced-choice task, we te...

  16. Evaluation of the accuracy of positioning a GPS receiver operating under different vegetation covers

    Directory of Open Access Journals (Sweden)

    Rubens Angulo Filho

    2002-01-01

    Full Text Available To evaluate the planimetric positioning accuracy of a GPS receiver (Trimble Pro-XL) operating under different conditions of vegetation cover (pasture, rubber trees, eucalyptus and pine trees), six control points were located randomly in the study areas. For comparison, their coordinates were first obtained by a conventional topographic surveying method, according to NBR 13133 (Execution of Topographic Survey) of the Brazilian Surveying Standards. Afterwards, the GPS receiver was positioned on those control points, keeping the data acquisition rate at 1 s while varying the occupation time (1, 5 and 10 min); post-processed differential correction (DGPS) was then applied to the positioning data. According to the methodology applied and the results obtained, it was possible to separate the planimetric positioning accuracies, according to the type of vegetation cover, into two groups: without and with tree canopy cover, thus confirming the interference of the canopy in the reception of the signals emitted by the GPS satellites. Increasing the occupation time improved planimetric positioning accuracy, which confirms that the choice of survey methodology is fundamental for obtaining good positioning results.

  17. Research on Operation Security of Solar Thermal Tower Power Plant Receiver

    Institute of Scientific and Technical Information of China (English)

    高维; 徐蕙; 徐二树; 余强

    2013-01-01

    The superheating cavity receiver of the Badaling 1 MW solar thermal power plant successfully produced superheated steam in August 2011. Because the direct normal irradiation changes with time, the flux density on the evaporating heating surfaces is highly non-uniform, and the superheater has undergone deformation. Based on the structure of the evaporating heating surfaces of the Badaling 1 MW receiver, a 7-channel evaporator heating-surface dynamic simulation model was developed. The model captures how the working-fluid flow in each evaporating surface varies with flux density and can better simulate the dynamic characteristics of the receiver. On this basis, the paper analyzes the operational security of the superheating cavity receiver through simulation experiments. For the evaporating heating surfaces, the most dangerous working condition is a sudden increase in solar irradiance at high load; for the superheater, the most dangerous working condition is a sudden closing of the turbine regulating valve at high load.

  18. A new Bayesian network-based risk stratification model for prediction of short-term and long-term LVAD mortality.

    Science.gov (United States)

    Loghmanpour, Natasha A; Kanwar, Manreet K; Druzdzel, Marek J; Benza, Raymond L; Murali, Srinivas; Antaki, James F

    2015-01-01

    Existing risk assessment tools for patient selection for left ventricular assist devices (LVADs) such as the Destination Therapy Risk Score and HeartMate II Risk Score (HMRS) have limited predictive ability. This study aims to overcome the limitations of traditional statistical methods by performing the first application of Bayesian analysis to the comprehensive Interagency Registry for Mechanically Assisted Circulatory Support dataset and comparing it to HMRS. We retrospectively analyzed 8,050 continuous flow LVAD patients and 226 preimplant variables. We then derived Bayesian models for mortality at each of five time end-points postimplant (30 days, 90 days, 6 months, 1 year, and 2 years), achieving accuracies of 95%, 90%, 90%, 83%, and 78%, Kappa values of 0.43, 0.37, 0.37, 0.45, and 0.43, and areas under the receiver operating characteristic (ROC) curve of 91%, 82%, 82%, 80%, and 81%, respectively. This was in comparison to the HMRS, with an ROC area of 57% and 60% at 90 days and 1 year, respectively. Preimplant interventions, such as dialysis, ECMO, and ventilators were major contributing risk markers. Bayesian models have the ability to reliably represent the complex causal relations of multiple variables on clinical outcomes. Their potential to develop a reliable risk stratification tool for use in clinical decision making on LVAD patients encourages further investigation.

  19. Bayesian phylogeny analysis via stochastic approximation Monte Carlo

    KAUST Repository

    Cheon, Sooyoung

    2009-11-01

    Monte Carlo methods have received much attention in the recent literature of phylogeny analysis. However, the conventional Markov chain Monte Carlo algorithms, such as the Metropolis-Hastings algorithm, tend to get trapped in a local mode in simulating from the posterior distribution of phylogenetic trees, rendering the inference ineffective. In this paper, we apply an advanced Monte Carlo algorithm, the stochastic approximation Monte Carlo algorithm, to Bayesian phylogeny analysis. Our method is compared with two popular Bayesian phylogeny software, BAMBE and MrBayes, on simulated and real datasets. The numerical results indicate that our method outperforms BAMBE and MrBayes. Among the three methods, SAMC produces the consensus trees which have the highest similarity to the true trees, and the model parameter estimates which have the smallest mean square errors, but costs the least CPU time. © 2009 Elsevier Inc. All rights reserved.

  20. Bayesian phylogeny analysis via stochastic approximation Monte Carlo.

    Science.gov (United States)

    Cheon, Sooyoung; Liang, Faming

    2009-11-01

    Monte Carlo methods have received much attention in the recent literature on phylogeny analysis. However, conventional Markov chain Monte Carlo algorithms, such as the Metropolis-Hastings algorithm, tend to get trapped in a local mode when simulating from the posterior distribution of phylogenetic trees, rendering the inference ineffective. In this paper, we apply an advanced Monte Carlo algorithm, the stochastic approximation Monte Carlo (SAMC) algorithm, to Bayesian phylogeny analysis. Our method is compared with two popular Bayesian phylogeny software packages, BAMBE and MrBayes, on simulated and real datasets. The numerical results indicate that our method outperforms BAMBE and MrBayes. Among the three methods, SAMC produces the consensus trees with the highest similarity to the true trees and the model parameter estimates with the smallest mean square errors, yet costs the least CPU time.

  1. Bayesian Variable Selection in Cost-Effectiveness Analysis

    Directory of Open Access Journals (Sweden)

    Miguel A. Negrín

    2010-04-01

    Full Text Available Linear regression models are often used to represent the cost and effectiveness of medical treatment. The covariates used may include sociodemographic variables, such as age, gender or race; clinical variables, such as initial health status, years of treatment or the existence of concomitant illnesses; and a binary variable indicating the treatment received. However, most studies estimate only one model, which usually includes all the covariates. This procedure ignores the question of uncertainty in model selection. In this paper, we examine four alternative Bayesian variable selection methods that have been proposed. In this analysis, we estimate the inclusion probability of each covariate in the real model conditional on the data. Variable selection can be useful for estimating incremental effectiveness and incremental cost, through Bayesian model averaging, as well as for subgroup analysis.
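The inclusion probabilities described here can be approximated by weighting every candidate covariate subset with a BIC-based proxy for its marginal likelihood; a toy sketch with simulated cost data (the covariate roles and the BIC approximation are illustrative assumptions, not the authors' exact method):

```python
import itertools
import math
import numpy as np

rng = np.random.default_rng(0)
n = 100
# Hypothetical covariates: a noise variable, illness severity, treatment indicator
X_full = np.column_stack([rng.normal(size=n), rng.normal(size=n), rng.integers(0, 2, n)])
cost = 5.0 + 2.0 * X_full[:, 1] + 1.5 * X_full[:, 2] + rng.normal(size=n)

def bic(y, X):
    """BIC of an OLS fit with intercept; lower is better."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    rss = float(np.sum((y - X1 @ beta) ** 2))
    return len(y) * math.log(rss / len(y)) + X1.shape[1] * math.log(len(y))

# Weight each of the 2^3 candidate models by exp(-BIC/2), a rough
# stand-in for its marginal likelihood, then average over models
models = list(itertools.product([0, 1], repeat=3))
w = np.array([math.exp(-0.5 * bic(cost, X_full[:, [j for j in range(3) if m[j]]]))
              for m in models])
w /= w.sum()
inclusion = [sum(wi for m, wi in zip(models, w) if m[j]) for j in range(3)]
print([round(p, 2) for p in inclusion])  # high for the two real predictors
```

Exhaustive enumeration is feasible only for a handful of covariates; the paper's MCMC-based selection methods scale to larger models.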

  2. Bayesian Inference for Radio Observations

    CERN Document Server

    Lochner, Michelle; Zwart, Jonathan T L; Smirnov, Oleg; Bassett, Bruce A; Oozeer, Nadeem; Kunz, Martin

    2015-01-01

    (Abridged) New telescopes like the Square Kilometre Array (SKA) will push into a new sensitivity regime and expose systematics, such as direction-dependent effects, that could previously be ignored. Current methods for handling such systematics rely on alternating best estimates of instrumental calibration and models of the underlying sky, which can lead to inaccurate uncertainty estimates and biased results because such methods ignore any correlations between parameters. These deconvolution algorithms produce a single image that is assumed to be a true representation of the sky, when in fact it is just one realisation of an infinite ensemble of images compatible with the noise in the data. In contrast, here we report a Bayesian formalism that simultaneously infers both systematics and science. Our technique, Bayesian Inference for Radio Observations (BIRO), determines all parameters directly from the raw data, bypassing image-making entirely, by sampling from the joint posterior probability distribution. Thi...

  3. Deep Learning and Bayesian Methods

    Science.gov (United States)

    Prosper, Harrison B.

    2017-03-01

    A revolution is underway in which deep neural networks are routinely used to solve diffcult problems such as face recognition and natural language understanding. Particle physicists have taken notice and have started to deploy these methods, achieving results that suggest a potentially significant shift in how data might be analyzed in the not too distant future. We discuss a few recent developments in the application of deep neural networks and then indulge in speculation about how such methods might be used to automate certain aspects of data analysis in particle physics. Next, the connection to Bayesian methods is discussed and the paper ends with thoughts on a significant practical issue, namely, how, from a Bayesian perspective, one might optimize the construction of deep neural networks.

  4. Bayesian approach to rough set

    CERN Document Server

    Marwala, Tshilidzi

    2007-01-01

    This paper proposes an approach to training rough set models using a Bayesian framework trained with the Markov chain Monte Carlo (MCMC) method. The prior probabilities are constructed from the prior knowledge that good rough set models have fewer rules. Markov chain Monte Carlo sampling is conducted by sampling in the rough set granule space, and the Metropolis algorithm is used as the acceptance criterion. The proposed method is tested on estimating the risk of HIV given demographic data. The results obtained show that the proposed approach is able to achieve an average accuracy of 58%, with the accuracy varying up to 66%. In addition, the Bayesian rough set gives the probabilities of the estimated HIV status as well as the linguistic rules describing how the demographic parameters drive the risk of HIV.
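The Metropolis acceptance step mentioned above is generic; a minimal sketch on a toy target (a standard normal log-posterior standing in for the rough-set model posterior):

```python
import math
import random

random.seed(42)

def metropolis(logpost, x0, steps=20000, scale=1.0):
    """Random-walk Metropolis: propose x', accept with prob min(1, p(x')/p(x))."""
    x, lp = x0, logpost(x0)
    samples = []
    for _ in range(steps):
        prop = x + random.gauss(0.0, scale)
        lp_prop = logpost(prop)
        if random.random() < math.exp(min(0.0, lp_prop - lp)):  # acceptance criterion
            x, lp = prop, lp_prop
        samples.append(x)
    return samples

# Toy target: standard normal log-density (a rough-set posterior would go here)
draws = metropolis(lambda v: -0.5 * v * v, x0=0.0)
print(round(sum(draws) / len(draws), 2))  # sample mean near 0
```

For the paper's discrete granule space, the Gaussian proposal would be replaced by a move in that space, but the acceptance rule is unchanged.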

  5. BAYESIAN IMAGE RESTORATION, USING CONFIGURATIONS

    Directory of Open Access Journals (Sweden)

    Thordis Linda Thorarinsdottir

    2011-05-01

    Full Text Available In this paper, we develop a Bayesian procedure for removing noise from images that can be viewed as noisy realisations of random sets in the plane. The procedure utilises recent advances in configuration theory for noise-free random sets, where the probabilities of observing the different boundary configurations are expressed in terms of the mean normal measure of the random set. These probabilities are used as prior probabilities in a Bayesian image restoration approach. Estimation of the remaining parameters in the model is outlined for salt-and-pepper noise. The inference in the model is discussed in detail for 3 × 3 and 5 × 5 configurations, and examples of the performance of the procedure are given.

  6. Bayesian Source Separation and Localization

    CERN Document Server

    Knuth, K H

    1998-01-01

    The problem of mixed signals occurs in many different contexts; one of the most familiar being acoustics. The forward problem in acoustics consists of finding the sound pressure levels at various detectors resulting from sound signals emanating from the active acoustic sources. The inverse problem consists of using the sound recorded by the detectors to separate the signals and recover the original source waveforms. In general, the inverse problem is unsolvable without additional information. This general problem is called source separation, and several techniques have been developed that utilize maximum entropy, minimum mutual information, and maximum likelihood. In previous work, it has been demonstrated that these techniques can be recast in a Bayesian framework. This paper demonstrates the power of the Bayesian approach, which provides a natural means for incorporating prior information into a source model. An algorithm is developed that utilizes information regarding both the statistics of the amplitudes...

  7. Deep Learning and Bayesian Methods

    Directory of Open Access Journals (Sweden)

    Prosper Harrison B.

    2017-01-01

    Full Text Available A revolution is underway in which deep neural networks are routinely used to solve diffcult problems such as face recognition and natural language understanding. Particle physicists have taken notice and have started to deploy these methods, achieving results that suggest a potentially significant shift in how data might be analyzed in the not too distant future. We discuss a few recent developments in the application of deep neural networks and then indulge in speculation about how such methods might be used to automate certain aspects of data analysis in particle physics. Next, the connection to Bayesian methods is discussed and the paper ends with thoughts on a significant practical issue, namely, how, from a Bayesian perspective, one might optimize the construction of deep neural networks.

  8. Bayesian priors for transiting planets

    CERN Document Server

    Kipping, David M

    2016-01-01

    As astronomers push towards discovering ever-smaller transiting planets, it is increasingly common to deal with low signal-to-noise ratio (SNR) events, where the choice of priors plays an influential role in Bayesian inference. In the analysis of exoplanet data, the selection of priors is often treated as a nuisance, with observers typically defaulting to uninformative distributions. Such treatments miss a key strength of the Bayesian framework, especially in the low SNR regime, where even weak a priori information is valuable. When estimating the parameters of a low-SNR transit, two key pieces of information are known: (i) the planet has the correct geometric alignment to transit and (ii) the transit event exhibits sufficient signal-to-noise to have been detected. These represent two forms of observational bias. Accordingly, when fitting transits, the model parameter priors should not follow the intrinsic distributions of said terms, but rather those of both the intrinsic distributions and the observational ...

  9. Bayesian inference on proportional elections.

    Directory of Open Access Journals (Sweden)

    Gabriel Hideki Vatanabe Brunello

    Full Text Available Polls for majoritarian voting systems usually show estimates of the percentage of votes for each candidate. However, proportional vote systems do not necessarily guarantee the candidate with the most percentage of votes will be elected. Thus, traditional methods used in majoritarian elections cannot be applied on proportional elections. In this context, the purpose of this paper was to perform a Bayesian inference on proportional elections considering the Brazilian system of seats distribution. More specifically, a methodology to answer the probability that a given party will have representation on the chamber of deputies was developed. Inferences were made on a Bayesian scenario using the Monte Carlo simulation technique, and the developed methodology was applied on data from the Brazilian elections for Members of the Legislative Assembly and Federal Chamber of Deputies in 2010. A performance rate was also presented to evaluate the efficiency of the methodology. Calculations and simulations were carried out using the free R statistical software.
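Brazilian chamber seats are allocated by a highest-averages (D'Hondt-style) rule, so the probability of a party gaining representation can be simulated by Monte Carlo over uncertain vote shares; a toy sketch with made-up shares (not the paper's data or its exact seat formula):

```python
import random

random.seed(1)

def dhondt(votes, seats):
    """Highest-averages (D'Hondt) allocation of seats among party vote totals."""
    alloc = [0] * len(votes)
    for _ in range(seats):
        quotients = [v / (a + 1) for v, a in zip(votes, alloc)]
        alloc[quotients.index(max(quotients))] += 1
    return alloc

# Hypothetical poll: mean vote shares with Gaussian sampling noise, 10 seats
means = [0.40, 0.35, 0.15, 0.10]
trials, wins = 5000, 0
for _ in range(trials):
    shares = [max(1e-9, random.gauss(m, 0.03)) for m in means]
    if dhondt(shares, 10)[3] >= 1:  # did the smallest party get a seat?
        wins += 1
print(round(wins / trials, 2))  # estimated probability of representation
```

This is the core idea the paper formalizes: propagate polling uncertainty through the seat-allocation rule and read off the probability of representation.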

  10. Managing Medicare receivables after PIP.

    Science.gov (United States)

    Loria, L S

    1987-04-01

    The luxury of PIP is gone and managing cash flow will become more important than ever before. The hospital industry has come a long way in the development of automated billing systems and related recordkeeping since PIP was first introduced. The performance of an operations review of the accounts receivable management system should improve the effectiveness and efficiency of operations and significantly improve cash flow.

  11. Elements of Bayesian experimental design

    Energy Technology Data Exchange (ETDEWEB)

    Sivia, D.S. [Rutherford Appleton Lab., Oxon (United Kingdom)

    1997-09-01

    We consider some elements of the Bayesian approach that are important for optimal experimental design. While the underlying principles used are very general, and are explained in detail in a recent tutorial text, they are applied here to the specific case of characterising the inferential value of different resolution peakshapes. This particular issue was considered earlier by Silver, Sivia and Pynn (1989, 1990a, 1990b), and the following presentation confirms and extends the conclusions of their analysis.

  12. Multiview Bayesian Correlated Component Analysis

    DEFF Research Database (Denmark)

    Kamronn, Simon Due; Poulsen, Andreas Trier; Hansen, Lars Kai

    2015-01-01

    …are identical. Here we propose a hierarchical probabilistic model that can infer the level of universality in such multiview data, from completely unrelated representations, corresponding to canonical correlation analysis, to identical representations as in correlated component analysis. This new model, which we denote Bayesian correlated component analysis, evaluates favorably against three relevant algorithms in simulated data. A well-established benchmark EEG data set is used to further validate the new model and infer the variability of spatial representations across multiple subjects.

  13. Bayesian analysis for kaon photoproduction

    Energy Technology Data Exchange (ETDEWEB)

    Marsainy, T., E-mail: tmart@fisika.ui.ac.id; Mart, T., E-mail: tmart@fisika.ui.ac.id [Department Fisika, FMIPA, Universitas Indonesia, Depok 16424 (Indonesia)

    2014-09-25

    We have investigated the contribution of the nucleon resonances in the kaon photoproduction process by using an established statistical decision-making method, i.e. the Bayesian method. This method not only evaluates the model over its entire parameter space, but also takes the prior information and experimental data into account. The result indicates that certain resonances have larger probabilities to contribute to the process.

  14. Bayesian priors and nuisance parameters

    CERN Document Server

    Gupta, Sourendu

    2016-01-01

    Bayesian techniques are widely used to obtain spectral functions from correlators. We suggest a technique to rid the results of nuisance parameters, ie, parameters which are needed for the regularization but cannot be determined from data. We give examples where the method works, including a pion mass extraction with two flavours of staggered quarks at a lattice spacing of about 0.07 fm. We also give an example where the method does not work.

  15. Bayesian kinematic earthquake source models

    Science.gov (United States)

    Minson, S. E.; Simons, M.; Beck, J. L.; Genrich, J. F.; Galetzka, J. E.; Chowdhury, F.; Owen, S. E.; Webb, F.; Comte, D.; Glass, B.; Leiva, C.; Ortega, F. H.

    2009-12-01

    Most coseismic, postseismic, and interseismic slip models are based on highly regularized optimizations which yield one solution which satisfies the data given a particular set of regularizing constraints. This regularization hampers our ability to answer basic questions such as whether seismic and aseismic slip overlap or instead rupture separate portions of the fault zone. We present a Bayesian methodology for generating kinematic earthquake source models with a focus on large subduction zone earthquakes. Unlike classical optimization approaches, Bayesian techniques sample the ensemble of all acceptable models presented as an a posteriori probability density function (PDF), and thus we can explore the entire solution space to determine, for example, which model parameters are well determined and which are not, or what is the likelihood that two slip distributions overlap in space. Bayesian sampling also has the advantage that all a priori knowledge of the source process can be used to mold the a posteriori ensemble of models. Although very powerful, Bayesian methods have up to now been of limited use in geophysical modeling because they are only computationally feasible for problems with a small number of free parameters due to what is called the "curse of dimensionality." However, our methodology can successfully sample solution spaces of many hundreds of parameters, which is sufficient to produce finite fault kinematic earthquake models. Our algorithm is a modification of the tempered Markov chain Monte Carlo (tempered MCMC or TMCMC) method. In our algorithm, we sample a "tempered" a posteriori PDF using many MCMC simulations running in parallel and evolutionary computation in which models which fit the data poorly are preferentially eliminated in favor of models which better predict the data. We present results for both synthetic test problems as well as for the 2007 Mw 7.8 Tocopilla, Chile earthquake, the latter of which is constrained by InSAR, local high

  16. Bayesian Sampling using Condition Indicators

    DEFF Research Database (Denmark)

    Faber, Michael H.; Sørensen, John Dalsgaard

    2002-01-01

    …This allows for a Bayesian formulation of the indicators, whereby the experience and expertise of the inspection personnel may be fully utilized and consistently updated as frequentistic information is collected. The approach is illustrated on an example considering a concrete structure subject to corrosion. It is shown how half-cell potential measurements may be utilized to update the probability of excessive repair after 50 years.
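The updating scheme described can be illustrated with a conjugate beta-binomial toy model, where an inspector's prior on the corroded fraction is updated by indicator outcomes (all numbers hypothetical):

```python
# Expert prior on the fraction of corroded elements, as a Beta distribution
a, b = 2.0, 8.0                      # prior mean 0.2: inspector expects ~20%
# Hypothetical half-cell potential indications (1 = corrosion indicated)
indications = [1, 0, 0, 1, 1, 0, 0, 0, 1, 0]
a += sum(indications)                # conjugate Beta-binomial update
b += len(indications) - sum(indications)
print(round(a / (a + b), 2))  # → 0.3 (posterior mean fraction corroded)
```

The conjugacy means each inspection campaign simply increments the Beta parameters, which is what makes consistent updating over a structure's service life cheap.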

  17. Bayesian second law of thermodynamics.

    Science.gov (United States)

    Bartolotta, Anthony; Carroll, Sean M; Leichenauer, Stefan; Pollack, Jason

    2016-08-01

    We derive a generalization of the second law of thermodynamics that uses Bayesian updates to explicitly incorporate the effects of a measurement of a system at some point in its evolution. By allowing an experimenter's knowledge to be updated by the measurement process, this formulation resolves a tension between the fact that the entropy of a statistical system can sometimes fluctuate downward and the information-theoretic idea that knowledge of a stochastically evolving system degrades over time. The Bayesian second law can be written as ΔH(ρ_{m},ρ)+〈Q〉_{F|m}≥0, where ΔH(ρ_{m},ρ) is the change in the cross entropy between the original phase-space probability distribution ρ and the measurement-updated distribution ρ_{m} and 〈Q〉_{F|m} is the expectation value of a generalized heat flow out of the system. We also derive refined versions of the second law that bound the entropy increase from below by a non-negative number, as well as Bayesian versions of integral fluctuation theorems. We demonstrate the formalism using simple analytical and numerical examples.

  18. Bayesian second law of thermodynamics

    Science.gov (United States)

    Bartolotta, Anthony; Carroll, Sean M.; Leichenauer, Stefan; Pollack, Jason

    2016-08-01

    We derive a generalization of the second law of thermodynamics that uses Bayesian updates to explicitly incorporate the effects of a measurement of a system at some point in its evolution. By allowing an experimenter's knowledge to be updated by the measurement process, this formulation resolves a tension between the fact that the entropy of a statistical system can sometimes fluctuate downward and the information-theoretic idea that knowledge of a stochastically evolving system degrades over time. The Bayesian second law can be written as ΔH(ρ_{m},ρ)+〈Q〉_{F|m}≥0, where ΔH(ρ_{m},ρ) is the change in the cross entropy between the original phase-space probability distribution ρ and the measurement-updated distribution ρ_{m}, and 〈Q〉_{F|m} is the expectation value of a generalized heat flow out of the system. We also derive refined versions of the second law that bound the entropy increase from below by a non-negative number, as well as Bayesian versions of integral fluctuation theorems. We demonstrate the formalism using simple analytical and numerical examples.

  19. Bayesian calibration of simultaneity in audiovisual temporal order judgments.

    Directory of Open Access Journals (Sweden)

    Shinya Yamamoto

    Full Text Available After repeated exposures to two successive audiovisual stimuli presented in one frequent order, participants eventually perceive a pair separated by some lag time in the same order as occurring simultaneously (lag adaptation). In contrast, we previously found that perceptual changes occurred in the opposite direction in response to tactile stimuli, conforming to Bayesian integration theory (Bayesian calibration). We further showed, in theory, that the effect of Bayesian calibration cannot be observed when the lag adaptation was fully operational. This led to the hypothesis that Bayesian calibration affects judgments regarding the order of audiovisual stimuli, but that this effect is concealed behind the lag adaptation mechanism. In the present study, we showed that lag adaptation is pitch-insensitive using two sounds at 1046 and 1480 Hz. This enabled us to cancel lag adaptation by associating one pitch with sound-first stimuli and the other with light-first stimuli. When we presented each type of stimulus (high- or low-tone) in a different block, the point of simultaneity shifted to "sound-first" for the pitch associated with sound-first stimuli, and to "light-first" for the pitch associated with light-first stimuli. These results are consistent with lag adaptation. In contrast, when we delivered each type of stimulus in a randomized order, the point of simultaneity shifted to "light-first" for the pitch associated with sound-first stimuli, and to "sound-first" for the pitch associated with light-first stimuli. The results clearly show that Bayesian calibration is pitch-specific and is at work behind pitch-insensitive lag adaptation during temporal order judgment of audiovisual stimuli.

  20. Bayesian networks for fMRI: a primer.

    Science.gov (United States)

    Mumford, Jeanette A; Ramsey, Joseph D

    2014-02-01

    Bayesian network analysis is an attractive approach for studying the functional integration of brain networks, as it includes both the locations of connections between regions of the brain (functional connectivity) and more importantly the direction of the causal relationship between the regions (directed functional connectivity). Further, these approaches are more attractive than other functional connectivity analyses in that they can often operate on larger sets of nodes and run searches over a wide range of candidate networks. An important study by Smith et al. (2011) illustrated that many Bayesian network approaches did not perform well in identifying the directionality of connections in simulated single-subject data. Since then, new Bayesian network approaches have been developed that have overcome the failures in the Smith work. Additionally, an important discovery was made that shows a preprocessing step used in the Smith data puts some of the Bayesian network methods at a disadvantage. This work provides a review of Bayesian network analyses, focusing on the methods used in the Smith work as well as methods developed since 2011 that have improved estimation performance. Importantly, only approaches that have been specifically designed for fMRI data perform well, as they have been tailored to meet the challenges of fMRI data. Although this work does not suggest a single best model, it describes the class of models that perform best and highlights the features of these models that allow them to perform well on fMRI data. Specifically, methods that rely on non-Gaussianity to direct causal relationships in the network perform well.
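The closing observation, that non-Gaussianity can orient causal links, can be illustrated with a toy linear non-Gaussian pair: regressing in the true causal direction leaves a residual independent of the regressor, while the reverse direction does not. A crude sketch (dependence measured via correlation of squared values; this is an illustration, not a method from the reviewed papers):

```python
import random

random.seed(0)

def corr(u, v):
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    cov = sum((a - mu) * (b - mv) for a, b in zip(u, v)) / n
    su = (sum((a - mu) ** 2 for a in u) / n) ** 0.5
    sv = (sum((b - mv) ** 2 for b in v) / n) ** 0.5
    return cov / (su * sv)

def residual_dependence(cause, effect):
    """Regress effect on cause; measure (crudely) whether the residual
    still depends on the cause via the correlation of squared values."""
    n = len(cause)
    mc, me = sum(cause) / n, sum(effect) / n
    b = sum((c - mc) * (e - me) for c, e in zip(cause, effect)) / \
        sum((c - mc) ** 2 for c in cause)
    resid = [e - b * c for c, e in zip(cause, effect)]
    return abs(corr([r * r for r in resid], [c * c for c in cause]))

# Linear model with uniform (non-Gaussian) noise: x truly causes y
n = 4000
x = [random.uniform(-1, 1) for _ in range(n)]
y = [xi + random.uniform(-1, 1) for xi in x]
print(residual_dependence(x, y) < residual_dependence(y, x))  # → True
```

With Gaussian noise both directions would look symmetric, which is why the non-Gaussianity assumption is doing the work here.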

  1. 12th Brazilian Meeting on Bayesian Statistics

    CERN Document Server

    Louzada, Francisco; Rifo, Laura; Stern, Julio; Lauretto, Marcelo

    2015-01-01

    Through refereed papers, this volume focuses on the foundations of the Bayesian paradigm; their comparison to objectivistic or frequentist Statistics counterparts; and the appropriate application of Bayesian foundations. This research in Bayesian Statistics is applicable to data analysis in biostatistics, clinical trials, law, engineering, and the social sciences. EBEB, the Brazilian Meeting on Bayesian Statistics, is held every two years by the ISBrA, the International Society for Bayesian Analysis, one of the most active chapters of the ISBA. The 12th meeting took place March 10-14, 2014 in Atibaia. Interest in foundations of inductive Statistics has grown recently in accordance with the increasing availability of Bayesian methodological alternatives. Scientists need to deal with the ever more difficult choice of the optimal method to apply to their problem. This volume shows how Bayes can be the answer. The examination and discussion on the foundations work towards the goal of proper application of Bayesia...

  2. Survey on post-traumatic growth of malignant tumor patients receiving post-operative chemotherapy

    Institute of Scientific and Technical Information of China (English)

    汪娟; 张平; 宋旭红; 李晓燕

    2012-01-01

    Objective: To understand the post-traumatic growth of malignant tumor patients receiving post-operative chemotherapy, and provide evidence for nurses to adopt psychological care for patients with malignant tumor. Methods: A total of 230 malignant tumor patients receiving post-operative chemotherapy were investigated in terms of their demographic data and post-traumatic growth. Results: The total score of the Post Traumatic Growth Inventory (PTGI) of malignant tumor patients was 67.33±14.17, with the score of Appreciation of Life being the highest, followed by Spiritual Change, Relating to Others, Personal Strength, and New Possibilities. PTGI scores had significant differences between genders, age groups, marital statuses, residential situations, and disease conditions, and among varied monthly incomes and courses of disease (P<0.05, P<0.01). Conclusion: Post-traumatic growth in malignant tumor patients receiving post-operative chemotherapy is at a moderate level. Gender, age, social support and condition of illness are precipitating factors of PTG. Nurses should be able to recognize positive mental changes among malignant tumor patients and take individualized psychological measures.

  3. Compiling Relational Bayesian Networks for Exact Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred; Chavira, Mark; Darwiche, Adnan

    2004-01-01

    We describe a system for exact inference with relational Bayesian networks as defined in the publicly available Primula tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference by evaluating and differentiating these circuits in time linear in their size. We report on experimental results showing the successful compilation, and efficient inference, on relational Bayesian networks whose Primula-generated propositional instances have thousands of variables, and whose jointrees have clusters…

  4. Anomaly Detection and Attribution Using Bayesian Networks

    Science.gov (United States)

    2014-06-01

    Andrew Kirk, Jonathan Legg and Edwin El-Mahassni (National Security and …) … detection in Bayesian networks, enabling both the detection and explanation of anomalous cases in a dataset. By exploiting the structure of a Bayesian network, our algorithm is able to efficiently search for local maxima of data conflict between closely related variables. Benchmark tests using…

  5. Revisiting the effectiveness of interventions to decrease surgical site infections in colorectal surgery: A Bayesian perspective.

    Science.gov (United States)

    Phatak, Uma R; Pedroza, Claudia; Millas, Stefanos G; Chang, George J; Lally, Kevin P; Kao, Lillian S

    2012-08-01

    To evaluate the evidence for interventions to decrease surgical site infections (SSIs) in colorectal operations using Bayesian meta-analysis. Interventions other than appropriate administration of prophylactic antibiotics to prevent SSIs have not been adopted widely, in part because of the lack of recommendations for these interventions based on traditional meta-analyses. Bayesian methods can provide probabilities of specific thresholds of benefit, which may be more useful in guiding clinical decision making. We hypothesized that Bayesian meta-analytic methods would complement the interpretation of traditional analyses regarding the effectiveness of interventions to decrease SSIs. We conducted a systematic search of the Cochrane database for reviews of interventions to decrease SSIs after colorectal surgery other than prophylactic antibiotics. Traditional and Bayesian meta-analyses were performed using RevMan (Nordic Cochrane Center, Copenhagen, Denmark) and WinBUGS (MRC Biostatistics Unit, Cambridge, UK). Bayesian posterior probabilities of any benefit, defined as a relative risk of <1, were calculated. On Bayesian analysis, several interventions that did not result in "significant" decreases in SSIs using traditional analytic methods had a >85% probability of benefit. Also, nonuse of 2 interventions (mechanical bowel preparation and adhesive drapes) had a high probability of decreasing SSIs compared with their use. Bayesian probabilities and traditional point estimates of treatment effect yield similar information in terms of potential effectiveness. Bayesian meta-analysis, however, provides complementary information on the probability of a large magnitude of effect. The clinical impact of using Bayesian methods to inform decisions about which interventions to institute first or which interventions to combine requires further study. Copyright © 2012 Mosby, Inc. All rights reserved.
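The posterior probability of benefit used here can be sketched under a normal approximation with a flat prior and fixed-effect pooling (the trial numbers below are invented, not from the Cochrane reviews):

```python
import math

# Invented trials: estimated log relative risks (log RR) and standard errors
log_rr = [-0.35, -0.10, -0.25, 0.05]
se = [0.30, 0.25, 0.40, 0.35]

# Fixed-effect pooling with a flat prior: the posterior for the pooled log RR
# is normal, with precision-weighted mean and variance 1 / (sum of precisions)
w = [1.0 / s ** 2 for s in se]
post_mean = sum(wi * x for wi, x in zip(w, log_rr)) / sum(w)
post_sd = math.sqrt(1.0 / sum(w))

# Posterior probability of any benefit: P(RR < 1) = P(log RR < 0)
p_benefit = 0.5 * (1.0 + math.erf((0.0 - post_mean) / (post_sd * math.sqrt(2.0))))
print(round(p_benefit, 2))  # → 0.85
```

This reproduces the paper's key point: a pooled effect can fall short of "significance" (the 95% interval here crosses zero) while still carrying a high posterior probability of benefit.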

  6. Discriminating Threshold of Driving Fatigue Based on the Electroencephalography Sample Entropy by Receiver Operating Characteristic Curve Analysis

    Institute of Scientific and Technical Information of China (English)

    赵晓华; 许士丽; 荣建; 张兴俭

    2013-01-01

    In order to acquire an objective and accurate driving fatigue threshold, electroencephalography (EEG) signals of drivers were collected from a driving simulator, and the time-domain characteristics of the EEG signals in sober and mentally fatigued states were comparatively analyzed. Considering the different complexity of EEG signals in the sober and fatigued states, the sample entropy of the EEG signals was calculated to characterize the complexity of the signals and used as the index for identifying driving fatigue. Based on the obtained EEG sample entropy, receiver operating characteristic (ROC) curve analysis was introduced to obtain the discriminating threshold of driving fatigue. The results indicate that when the EEG sample entropy value is between 0.32 and 0.71, the driver is in the transitional period of fatigue and may be in a fatigue state; a sample entropy of less than 0.605 can be identified as the threshold of driving fatigue, with an accuracy of 0.95.
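Sample entropy itself is straightforward to compute; a minimal SampEn(m, r) sketch, with the tolerance r taken as a fraction of the series standard deviation as is conventional (the signals below are synthetic, not EEG):

```python
import math
import random

def sample_entropy(x, m=2, r=0.2):
    """SampEn(m, r): r is a fraction of the series' standard deviation."""
    mean = sum(x) / len(x)
    tol = r * (sum((v - mean) ** 2 for v in x) / len(x)) ** 0.5
    def matches(mm):
        # Count template pairs of length mm within tolerance (Chebyshev distance)
        templates = [x[i:i + mm] for i in range(len(x) - mm + 1)]
        count = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= tol:
                    count += 1
        return count
    return -math.log(matches(m + 1) / matches(m))

random.seed(0)
regular = [math.sin(0.2 * i) for i in range(300)]
noisy = [random.gauss(0.0, 1.0) for _ in range(300)]
print(sample_entropy(regular) < sample_entropy(noisy))  # → True
```

Lower values indicate a more regular, predictable signal, which is why reduced EEG sample entropy is used above as a fatigue marker.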

  7. 49 CFR 393.88 - Television receivers.

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 5 2010-10-01 2010-10-01 false Television receivers. 393.88 Section 393.88... NECESSARY FOR SAFE OPERATION Miscellaneous Parts and Accessories § 393.88 Television receivers. Any motor vehicle equipped with a television viewer, screen or other means of visually receiving a...

  8. PedExpert: a computer program for the application of Bayesian networks to human paternity testing.

    Science.gov (United States)

    Gomes, R R; Campos, S V A; Pena, S D J

    2009-01-01

    PedExpert is a Windows-based Bayesian network software, especially constructed to solve problems in parentage testing that are complex because of missing genetic information on the alleged father and/or because they involve genetic mutations. PedExpert automates the creation and manipulation of Bayesian networks, implementing algorithms that convert pedigrees and sets of indispensable information (genotypes, allele frequencies, mutation rates) into Bayesian networks. This program has a novel feature that can incorporate information about gene mutations into tables of conditional probabilities of transmission of alleles from the alleged father to the child, without adding new nodes to the network. This permits using the same Bayesian network in different modes, for analysis of cases that include mutations or not. PedExpert is user-friendly and greatly reduces the time of analysis for complex cases of paternity testing, eliminating most sources of logical and operational error.
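The Bayesian-network computation generalizes the classic single-locus paternity index; a hand-calculable sketch with invented allele frequencies (heterozygous alleged father, no mutation, neutral prior):

```python
# Single-locus sketch: mother {a, b}, child {a, c}, alleged father {c, d}.
# The child's paternal allele must therefore be c.
p_c = 0.05                 # assumed population frequency of allele c
pi = 0.5 / p_c             # paternity index: heterozygous father passes c w.p. 1/2
prior = 0.5                # conventional neutral prior probability of paternity
posterior = pi * prior / (pi * prior + (1 - prior))
print(round(posterior, 3))  # → 0.909
```

Tools like PedExpert are needed precisely when this closed-form calculation breaks down, e.g. with missing relatives or mutation-adjusted transmission probabilities.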

  9. 3rd Bayesian Young Statisticians Meeting

    CERN Document Server

    Lanzarone, Ettore; Villalobos, Isadora; Mattei, Alessandra

    2017-01-01

    This book is a selection of peer-reviewed contributions presented at the third Bayesian Young Statisticians Meeting, BAYSM 2016, Florence, Italy, June 19-21. The meeting provided a unique opportunity for young researchers, M.S. students, Ph.D. students, and postdocs dealing with Bayesian statistics to connect with the Bayesian community at large, to exchange ideas, and to network with others working in the same field. The contributions develop and apply Bayesian methods in a variety of fields, ranging from the traditional (e.g., biostatistics and reliability) to the most innovative ones (e.g., big data and networks).

  10. Variational bayesian method of estimating variance components.

    Science.gov (United States)

    Arakawa, Aisaku; Taniguchi, Masaaki; Hayashi, Takeshi; Mikawa, Satoshi

    2016-07-01

    We developed a Bayesian analysis approach using a variational inference method, a so-called variational Bayesian method, to determine the posterior distributions of variance components. This variational Bayesian method and an alternative Bayesian method using Gibbs sampling were compared in estimating genetic and residual variance components from both simulated data and publicly available real pig data. In the simulated data set, we observed a strong bias toward overestimation of genetic variance with the variational Bayesian method in the case of low heritability and small population size, and less bias was detected with larger population sizes in both methods examined. No differences in the estimates of variance components between the variational Bayesian method and Gibbs sampling were found in the real pig data. However, the posterior distributions of the variance components obtained with the variational Bayesian method had shorter tails than those obtained with Gibbs sampling. Consequently, the posterior standard deviations of the genetic and residual variances from the variational Bayesian method were lower than those from Gibbs sampling. The computing time required was much shorter with the variational Bayesian method than with Gibbs sampling.
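    The flavor of the comparison can be reproduced on a toy model. Below is the textbook coordinate-ascent variational Bayes for a single Gaussian with unknown mean and precision, a stand-in for estimating a variance component; the paper's animal model is far richer, and all prior values here are illustrative.

```python
import numpy as np

def vb_normal_gamma(x, mu0=0.0, lam0=1e-3, a0=1e-3, b0=1e-3, iters=50):
    """Variational Bayes for x_i ~ N(mu, 1/tau) with a Normal-Gamma prior,
    factorized as q(mu) q(tau). Coordinate-ascent updates; the posterior
    mean of the precision tau is a_n / b_n."""
    x = np.asarray(x, dtype=float)
    n, xbar = len(x), np.mean(x)
    e_tau = 1.0  # initial guess for E[tau]
    for _ in range(iters):
        # q(mu) = Normal(mu_n, 1/prec_mu), using the current E[tau]
        mu_n = (lam0 * mu0 + n * xbar) / (lam0 + n)
        prec_mu = (lam0 + n) * e_tau
        # q(tau) = Gamma(a_n, b_n), using the current moments of q(mu)
        a_n = a0 + (n + 1) / 2.0
        e_sq = np.sum((x - mu_n) ** 2) + n / prec_mu
        e_prior = lam0 * ((mu_n - mu0) ** 2 + 1.0 / prec_mu)
        b_n = b0 + 0.5 * (e_sq + e_prior)
        e_tau = a_n / b_n
    return mu_n, a_n, b_n
```

    On data simulated with unit variance, a_n / b_n converges near 1; the record's observation is that such variational posteriors tend to have shorter tails than their Gibbs-sampling counterparts.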

  11. Learning dynamic Bayesian networks with mixed variables

    DEFF Research Database (Denmark)

    Bøttcher, Susanne Gammelgaard

    This paper considers dynamic Bayesian networks for discrete and continuous variables. We treat only the case where the distribution of the variables is conditional Gaussian. We show how to learn the parameters and structure of a dynamic Bayesian network and also how the Markov order can be learned. An automated procedure for specifying prior distributions for the parameters in a dynamic Bayesian network is presented; it is a simple extension of the procedure for ordinary Bayesian networks. Finally, Wölfer's sunspot numbers are analyzed.

  12. Soft-In Soft-Output Detection in the Presence of Parametric Uncertainty via the Bayesian EM Algorithm

    Directory of Open Access Journals (Sweden)

    Gallo A. S.

    2005-01-01

    We investigate the application of the Bayesian expectation-maximization (BEM) technique to the design of soft-in soft-out (SISO) detection algorithms for wireless communication systems operating over channels affected by parametric uncertainty. First, the BEM algorithm is described in detail and its relationship with the well-known expectation-maximization (EM) technique is explained. Then, some of its applications are illustrated. In particular, the problems of SISO detection of spread-spectrum, single-carrier, and multicarrier space-time block coded signals are analyzed. Numerical results show that BEM-based detectors perform closely to maximum likelihood (ML) receivers endowed with perfect channel state information as long as channel variations are not too fast.

  13. A Bayesian Model to Predict Right Ventricular Failure Following Left Ventricular Assist Device Therapy.

    Science.gov (United States)

    Loghmanpour, Natasha A; Kormos, Robert L; Kanwar, Manreet K; Teuteberg, Jeffrey J; Murali, Srinivas; Antaki, James F

    2016-09-01

    This study investigates the use of a Bayesian statistical model to address the limited predictive capacity of existing risk scores derived from multivariate analyses. This is based on the hypothesis that it is necessary to consider the interrelationships and conditional probabilities among independent variables to achieve sufficient statistical accuracy. Right ventricular failure (RVF) continues to be a major adverse event following left ventricular assist device (LVAD) implantation. Data used for this study were derived from 10,909 adult patients from the Interagency Registry for Mechanically Assisted Circulatory Support (INTERMACS) who had a primary LVAD implanted between December 2006 and March 2014. An initial set of 176 pre-implantation variables were considered. RVF post-implant was categorized by time of onset as acute, early, or late. For each of these endpoints, a separate tree-augmented naïve Bayes model was constructed using the most predictive variables, employing an open-source Bayesian inference engine. The acute RVF model consisted of 33 variables including systolic pulmonary artery pressure (PAP), white blood cell count, left ventricular ejection fraction, cardiac index, sodium levels, and lymphocyte percentage. The early RVF model consisted of 34 variables, including systolic PAP, pre-albumin, lactate dehydrogenase level, INTERMACS profile, right ventricular ejection fraction, pro-B-type natriuretic peptide, age, heart rate, tricuspid regurgitation, and body mass index. The late RVF model included 33 variables and was predicted mostly by peripheral vascular resistance, model for end-stage liver disease score, albumin level, lymphocyte percentage, and mean and diastolic PAP. The accuracy of all Bayesian models was between 91% and 97%, with an area under the receiver operating characteristic curve between 0.83 and 0.90, sensitivity of 90%, and specificity between 98% and 99%, significantly outperforming previously published risk scores. A Bayesian prognostic

  14. BaTMAn: Bayesian Technique for Multi-image Analysis

    CERN Document Server

    Casado, J; García-Benito, R; Guidi, G; Choudhury, O S; Bellocchi, E; Sánchez, S; Díaz, A I

    2016-01-01

    This paper describes the Bayesian Technique for Multi-image Analysis (BaTMAn), a novel image segmentation technique based on Bayesian statistics, whose main purpose is to characterize an astronomical dataset containing spatial information and perform a tessellation based on the measurements and errors provided as input. The algorithm will iteratively merge spatial elements as long as they are statistically consistent with carrying the same information (i.e. signal compatible with being identical within the errors). We illustrate its operation and performance with a set of test cases that comprises both synthetic and real Integral-Field Spectroscopic (IFS) data. Our results show that the segmentations obtained by BaTMAn adapt to the underlying structure of the data, regardless of the precise details of their morphology and the statistical properties of the noise. The quality of the recovered signal represents an improvement with respect to the input, especially in those regions where the signal is actually con...

  15. Assessment of Breast Cancer Risk in an Iranian Female Population Using Bayesian Networks with Varying Node Number

    Science.gov (United States)

    Rezaianzadeh, Abbas; Sepandi, Mojtaba; Rahimikazerooni, Salar

    2016-11-01

    Objective: As a source of information, medical data can contain hidden relationships. However, the high volume of datasets and the complexity of decision-making in medicine make analysis and interpretation difficult, and processing steps may be needed before the data can be used by clinicians in their work. This study focused on the use of Bayesian models with different numbers of nodes to aid clinicians in breast cancer risk estimation. Methods: Bayesian networks (BNs) built from a retrospectively collected dataset including mammographic details, risk-factor exposure, and clinical findings were assessed for prediction of the probability of breast cancer in individual patients. Area under the receiver-operating characteristic curve (AUC), accuracy, sensitivity, specificity, and positive and negative predictive values were used to evaluate discriminative performance. Result: A network incorporating selected features performed better (AUC = 0.94) than one incorporating all the features (AUC = 0.93). The results revealed no significant difference among the 3 models regarding performance indices at the 5% significance level. Conclusion: BNs could effectively discriminate malignant from benign abnormalities and accurately predict the risk of breast cancer in individuals. Moreover, the overall performance of the 9-node BN was better, and owing to its lower number of nodes it might more readily be applied in clinical settings.
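    The performance indices listed above can be computed without special tooling. A minimal sketch follows, using the Mann-Whitney form of the AUC and confusion-matrix rates at a fixed cut-off; the 0.5 cut-off is an assumption for illustration, not a value from the study.

```python
import numpy as np

def discrimination_metrics(y_true, p_hat, cutoff=0.5):
    """AUC (rank statistic), sensitivity, specificity, PPV, and NPV for a
    probabilistic classifier evaluated against binary labels."""
    y_true = np.asarray(y_true)
    p_hat = np.asarray(p_hat)
    pos, neg = p_hat[y_true == 1], p_hat[y_true == 0]
    # AUC = P(score_pos > score_neg), with ties counted half
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    auc = (greater + 0.5 * ties) / (len(pos) * len(neg))
    y_pred = (p_hat >= cutoff).astype(int)
    tp = np.sum((y_pred == 1) & (y_true == 1))
    tn = np.sum((y_pred == 0) & (y_true == 0))
    fp = np.sum((y_pred == 1) & (y_true == 0))
    fn = np.sum((y_pred == 0) & (y_true == 1))
    return {
        "auc": auc,
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }
```

    Applied to a BN's predicted probabilities over a test set, this yields exactly the indices the study reports.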

  16. Bayesian flood forecasting methods: A review

    Science.gov (United States)

    Han, Shasha; Coulibaly, Paulin

    2017-08-01

    Over the past few decades, floods have been among the most common and most widely distributed natural disasters in the world. If floods could be accurately forecasted in advance, their negative impacts could be greatly reduced. It is widely recognized that quantification and reduction of the uncertainty associated with hydrologic forecasts is of great importance for flood estimation and rational decision making. The Bayesian forecasting system (BFS) offers an ideal theoretical framework for uncertainty quantification that can be developed for probabilistic flood forecasting via any deterministic hydrologic model. It provides a suitable theoretical structure, empirically validated models, and reasonable analytic-numerical computation methods, and it can be developed into various Bayesian forecasting approaches. This paper presents a comprehensive review of Bayesian forecasting approaches applied to flood forecasting from 1999 to the present. The review starts with an overview of the fundamentals of the BFS and recent advances in it, followed by BFS applications in river stage forecasting and real-time flood forecasting; it then moves to a critical analysis evaluating the advantages and limitations of Bayesian forecasting methods and other predictive uncertainty assessment approaches in flood forecasting, and finally discusses future research directions in Bayesian flood forecasting. Results show that the Bayesian flood forecasting approach is an effective and advanced way for flood estimation: it considers all sources of uncertainty and produces a predictive distribution of the river stage, river discharge, or runoff, and thus gives more accurate and reliable flood forecasts. Some emerging Bayesian forecasting methods (e.g., the ensemble Bayesian forecasting system and Bayesian multi-model combination) were shown to overcome the limitations of a single model or fixed model weights and to effectively reduce predictive uncertainty. In recent years, various Bayesian flood forecasting approaches have been

  17. Bayesian phylogeography finds its roots.

    Directory of Open Access Journals (Sweden)

    Philippe Lemey

    2009-09-01

    As a key factor in endemic and epidemic dynamics, the geographical distribution of viruses has frequently been interpreted in the light of their genetic histories. Unfortunately, inference of historical dispersal or migration patterns of viruses has mainly been restricted to model-free heuristic approaches that provide little insight into the temporal setting of the spatial dynamics. The introduction of probabilistic models of evolution, however, offers unique opportunities to engage in this statistical endeavor. Here we introduce a Bayesian framework for inference, visualization and hypothesis testing of phylogeographic history. By implementing character mapping in Bayesian software that samples time-scaled phylogenies, we enable the reconstruction of timed viral dispersal patterns while accommodating phylogenetic uncertainty. Standard Markov model inference is extended with a stochastic search variable selection procedure that identifies the parsimonious descriptions of the diffusion process. In addition, we propose priors that can incorporate geographical sampling distributions or characterize alternative hypotheses about the spatial dynamics. To visualize the spatial and temporal information, we summarize inferences using virtual globe software. We describe how Bayesian phylogeography compares with previous parsimony analysis in the investigation of the influenza A H5N1 origin and H5N1 epidemiological linkage among sampling localities. Analysis of rabies in West African dog populations reveals how virus diffusion may enable endemic maintenance through continuous epidemic cycles. From these analyses, we conclude that our phylogeographic framework will be an important asset in molecular epidemiology that can easily be generalized to infer biogeography from genetic data for many organisms.

  18. Bayesian homeopathy: talking normal again.

    Science.gov (United States)

    Rutten, A L B

    2007-04-01

    Homeopathy has a communication problem: important homeopathic concepts are not understood by conventional colleagues. Homeopathic terminology seems to be comprehensible only after practical experience of homeopathy. The main problem lies in the different handling of diagnosis. In conventional medicine, diagnosis is the starting point for randomised controlled trials to determine the effect of treatment. In homeopathy, diagnosis is combined with other symptoms and personal traits of the patient to guide treatment and predict response. Broadening our scope to include diagnostic as well as treatment research opens the possibility of multifactorial reasoning. Adopting Bayesian methodology opens the possibility of investigating homeopathy in everyday practice and of describing some aspects of homeopathy in conventional terms.

  19. Approximation for Bayesian Ability Estimation.

    Science.gov (United States)

    1987-02-18

    [Abstract garbled in the source. Recoverable fragments indicate that the marginal posterior pdfs of the ability and item parameters are approximated using the inverse of the Hessian of the log-posterior evaluated at the mode, and that under regularity conditions the marginal posterior pdf of the ability parameter is asymptotically normal. References cited include Journal of Educational Statistics, 11, 33-56, and Lindley, D.V. (1980), Approximate Bayesian methods, Trabajos de Estadística, 31.]

  20. Bayesian Query-Focused Summarization

    CERN Document Server

    Daumé, Hal

    2009-01-01

    We present BayeSum (for "Bayesian summarization"), a model for sentence extraction in query-focused summarization. BayeSum leverages the common case in which multiple documents are relevant to a single query. Using these documents as reinforcement for query terms, BayeSum is not afflicted by the paucity of information in short queries. We show that approximate inference in BayeSum is possible on large data sets and results in a state-of-the-art summarization system. Furthermore, we show how BayeSum can be understood as a justified query expansion technique in the language modeling for IR framework.

  1. Bayesian tests of measurement invariance.

    Science.gov (United States)

    Verhagen, A J; Fox, J P

    2013-11-01

    Random item effects models provide a natural framework for the exploration of violations of measurement invariance without the need for anchor items. Within the random item effects modelling framework, Bayesian tests (Bayes factor, deviance information criterion) are proposed which enable multiple marginal invariance hypotheses to be tested simultaneously. The performance of the tests is evaluated with a simulation study which shows that the tests have high power and low Type I error rate. Data from the European Social Survey are used to test for measurement invariance of attitude towards immigrant items and to show that background information can be used to explain cross-national variation in item functioning.

  2. Numeracy, frequency, and Bayesian reasoning

    Directory of Open Access Journals (Sweden)

    Gretchen B. Chapman

    2009-02-01

    Previous research has demonstrated that Bayesian reasoning performance is improved if uncertainty information is presented as natural frequencies rather than single-event probabilities. A questionnaire study of 342 college students replicated this effect but also found that the performance-boosting benefits of the natural frequency presentation occurred primarily for participants who scored high in numeracy. This finding suggests that even comprehension and manipulation of natural frequencies requires a certain threshold of numeracy abilities, and that the beneficial effects of natural frequency presentation may not be as general as previously believed.
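    The manipulation at issue is usually illustrated with a diagnostic-testing problem. The numbers below are the classic hypothetical ones (1% prevalence, 80% sensitivity, 9.6% false-positive rate), not data from this study; both formats give the same posterior, and the natural-frequency version simply re-expresses it as counts.

```python
# Single-event probabilities: apply Bayes' theorem directly.
prior, sens, fpr = 0.01, 0.80, 0.096
posterior = prior * sens / (prior * sens + (1 - prior) * fpr)

# Natural frequencies: the same numbers as counts out of 1000 people.
n = 1000
sick = n * prior                 # 10 people have the disease
sick_pos = sick * sens           # 8 of them test positive
healthy_pos = (n - sick) * fpr   # 95.04 of the 990 healthy test positive
posterior_freq = sick_pos / (sick_pos + healthy_pos)

assert abs(posterior - posterior_freq) < 1e-12
print(round(posterior, 3))  # 0.078
```

    The count framing ("8 of the roughly 103 positives are actually sick") is what the natural-frequency presentation exploits.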

  3. Thermal resistance model for CSP central receivers

    Science.gov (United States)

    de Meyer, O. A. J.; Dinter, F.; Govender, S.

    2016-05-01

    The receiver design and heliostat field aiming strategy play a vital role in the heat transfer efficiency of the receiver. In molten salt external receivers, the common operating temperature of the heat transfer fluid or molten salt ranges between 285°C to 565°C. The optimum output temperature of 565°C is achieved by adjusting the mass flow rate of the molten salt through the receiver. The reflected solar radiation onto the receiver contributes to the temperature rise in the molten salt by means of heat transfer. By investigating published work on molten salt external receiver operating temperatures, corresponding receiver tube surface temperatures and heat losses, a model has been developed to obtain a detailed thermographic representation of the receiver. The steady state model uses a receiver flux map as input to determine: i) heat transfer fluid mass flow rate through the receiver to obtain the desired molten salt output temperature of 565°C, ii) receiver surface temperatures iii) receiver tube temperatures iv) receiver efficiency v) pressure drop across the receiver and vi) corresponding tube strain per panel.
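    The mass-flow adjustment described above follows from a steady-state energy balance, Q = ṁ·cp·ΔT. A minimal sketch, assuming a representative solar-salt heat capacity of about 1520 J/(kg·K) (an assumed value, not one taken from the model):

```python
def required_mass_flow(q_absorbed_w, t_in_c=285.0, t_out_c=565.0, cp=1520.0):
    """Molten-salt mass flow rate [kg/s] needed to reach the target outlet
    temperature, from the energy balance Q = m_dot * cp * (t_out - t_in).
    cp ~ 1520 J/(kg K) is an assumed representative value for solar salt."""
    return q_absorbed_w / (cp * (t_out_c - t_in_c))
```

    For example, a receiver absorbing 100 MW over the 285-565 °C rise needs roughly 235 kg/s of salt; the full model distributes this balance over the receiver flux map to recover tube temperatures and losses.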

  4. ANUBIS: artificial neuromodulation using a Bayesian inference system.

    Science.gov (United States)

    Smith, Benjamin J H; Saaj, Chakravarthini M; Allouis, Elie

    2013-01-01

    Gain tuning is a crucial part of controller design and depends not only on an accurate understanding of the system in question, but also on the designer's ability to predict what disturbances and other perturbations the system will encounter throughout its operation. This letter presents ANUBIS (artificial neuromodulation using a Bayesian inference system), a novel biologically inspired technique for automatically tuning controller parameters in real time. ANUBIS is based on the Bayesian brain concept and modifies it by incorporating a model of the neuromodulatory system comprising four artificial neuromodulators. It has been applied to the controller of EchinoBot, a prototype walking rover for Martian exploration. ANUBIS has been implemented at three levels of the controller; gait generation, foot trajectory planning using Bézier curves, and foot trajectory tracking using a terminal sliding mode controller. We compare the results to a similar system that has been tuned using a multilayer perceptron. The use of Bayesian inference means that the system retains mathematical interpretability, unlike other intelligent tuning techniques, which use neural networks, fuzzy logic, or evolutionary algorithms. The simulation results show that ANUBIS provides significant improvements in efficiency and adaptability of the three controller components; it allows the robot to react to obstacles and uncertainties faster than the system tuned with the MLP, while maintaining stability and accuracy. As well as advancing rover autonomy, ANUBIS could also be applied to other situations where operating conditions are likely to change or cannot be accurately modeled in advance, such as process control. In addition, it demonstrates one way in which neuromodulation could fit into the Bayesian brain framework.

  5. Ensemble Bayesian forecasting system Part I: Theory and algorithms

    Science.gov (United States)

    Herr, Henry D.; Krzysztofowicz, Roman

    2015-05-01

    The ensemble Bayesian forecasting system (EBFS), whose theory was published in 2001, is developed for the purpose of quantifying the total uncertainty about a discrete-time, continuous-state, non-stationary stochastic process such as a time series of stages, discharges, or volumes at a river gauge. The EBFS is built of three components: an input ensemble forecaster (IEF), which simulates the uncertainty associated with random inputs; a deterministic hydrologic model (of any complexity), which simulates physical processes within a river basin; and a hydrologic uncertainty processor (HUP), which simulates the hydrologic uncertainty (an aggregate of all uncertainties except input). It works as a Monte Carlo simulator: an ensemble of time series of inputs (e.g., precipitation amounts) generated by the IEF is transformed deterministically through a hydrologic model into an ensemble of time series of outputs, which is next transformed stochastically by the HUP into an ensemble of time series of predictands (e.g., river stages). Previous research indicated that in order to attain an acceptable sampling error, the ensemble size must be on the order of hundreds (for probabilistic river stage forecasts and probabilistic flood forecasts) or even thousands (for probabilistic stage transition forecasts). The computing time needed to run the hydrologic model this many times renders the straightforward simulations operationally infeasible. This motivates the development of the ensemble Bayesian forecasting system with randomization (EBFSR), which takes full advantage of the analytic meta-Gaussian HUP and generates multiple ensemble members after each run of the hydrologic model; this auxiliary randomization reduces the required size of the meteorological input ensemble and makes it operationally feasible to generate a Bayesian ensemble forecast of large size. Such a forecast quantifies the total uncertainty, is well calibrated against the prior (climatic) distribution of
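    The EBFSR idea (one expensive model run, several cheap HUP draws per run) can be caricatured in a few lines. Everything here is a stand-in: the linear "hydrologic model" and the Gaussian error model are hypothetical placeholders for the deterministic model and the meta-Gaussian HUP.

```python
import numpy as np

rng = np.random.default_rng(42)

def hydrologic_model(precip):
    """Stand-in deterministic rainfall-runoff model (hypothetical)."""
    return 2.0 + 0.8 * precip

def hup_sample(model_output, n_draws, sigma=0.3):
    """Hydrologic uncertainty processor: here a simple Gaussian error
    model around the deterministic output; the actual meta-Gaussian HUP
    is richer."""
    return model_output + sigma * rng.standard_normal(n_draws)

# Input ensemble forecaster: 50 precipitation members.
precip_members = rng.gamma(shape=2.0, scale=5.0, size=50)

# Randomization trick: draw several predictand members per model run, so
# the expensive model runs 50 times but the forecast ensemble has 500.
stage_ensemble = np.concatenate(
    [hup_sample(hydrologic_model(p), n_draws=10) for p in precip_members]
)
print(stage_ensemble.shape)  # (500,)
```

    Quantiles of `stage_ensemble` then play the role of the probabilistic river-stage forecast.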

  6. A Bayesian Approach for Nonlinear Structural Equation Models with Dichotomous Variables Using Logit and Probit Links

    Science.gov (United States)

    Lee, Sik-Yum; Song, Xin-Yuan; Cai, Jing-Heng

    2010-01-01

    Analysis of ordered binary and unordered binary data has received considerable attention in social and psychological research. This article introduces a Bayesian approach, which has several nice features in practical applications, for analyzing nonlinear structural equation models with dichotomous data. We demonstrate how to use the software…

  7. Modeling Diagnostic Assessments with Bayesian Networks

    Science.gov (United States)

    Almond, Russell G.; DiBello, Louis V.; Moulder, Brad; Zapata-Rivera, Juan-Diego

    2007-01-01

    This paper defines Bayesian network models and examines their applications to IRT-based cognitive diagnostic modeling. These models are especially suited to building inference engines designed to be synchronous with the finer grained student models that arise in skills diagnostic assessment. Aspects of the theory and use of Bayesian network models…

  8. Using Bayesian Networks to Improve Knowledge Assessment

    Science.gov (United States)

    Millan, Eva; Descalco, Luis; Castillo, Gladys; Oliveira, Paula; Diogo, Sandra

    2013-01-01

    In this paper, we describe the integration and evaluation of an existing generic Bayesian student model (GBSM) into an existing computerized testing system within the Mathematics Education Project (PmatE--Projecto Matematica Ensino) of the University of Aveiro. This generic Bayesian student model had been previously evaluated with simulated…

  9. Nonparametric Bayesian Modeling of Complex Networks

    DEFF Research Database (Denmark)

    Schmidt, Mikkel Nørgaard; Mørup, Morten

    2013-01-01

    Modeling structure in complex networks using Bayesian nonparametrics makes it possible to specify flexible model structures and infer the adequate model complexity from the observed data. This article provides a gentle introduction to nonparametric Bayesian modeling of complex networks: Using... for complex networks can be derived and point out relevant literature...

  10. Bayesian analysis of exoplanet and binary orbits

    OpenAIRE

    Schulze-Hartung, Tim; Launhardt, Ralf; Henning, Thomas

    2012-01-01

    We introduce BASE (Bayesian astrometric and spectroscopic exoplanet detection and characterisation tool), a novel program for the combined or separate Bayesian analysis of astrometric and radial-velocity measurements of potential exoplanet hosts and binary stars. The capabilities of BASE are demonstrated using all publicly available data of the binary Mizar A.

  12. Bayesian Decision Theoretical Framework for Clustering

    Science.gov (United States)

    Chen, Mo

    2011-01-01

    In this thesis, we establish a novel probabilistic framework for the data clustering problem from the perspective of Bayesian decision theory. The Bayesian decision theory view justifies the important questions: what is a cluster and what a clustering algorithm should optimize. We prove that the spectral clustering (to be specific, the…

  13. Computational Advances for and from Bayesian Analysis

    OpenAIRE

    Andrieu, C.; Doucet, A.; Robert, C. P.

    2004-01-01

    The emergence in the past years of Bayesian analysis in many methodological and applied fields as the solution to the modeling of complex problems cannot be dissociated from major changes in its computational implementation. We show in this review how the advances in Bayesian analysis and statistical computation are intermingled.

  14. Particle identification in ALICE: a Bayesian approach

    NARCIS (Netherlands)

    Adam, J.; et al. (ALICE Collaboration)

    2016-01-01

    We present a Bayesian approach to particle identification (PID) within the ALICE experiment. The aim is to more effectively combine the particle identification capabilities of its various detectors. After a brief explanation of the adopted methodology and formalism, the performance of the Bayesian
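    At its core, the approach multiplies per-detector likelihoods by prior species abundances and normalizes. A toy sketch with made-up numbers (two detectors, three species; none of the values come from ALICE):

```python
import numpy as np

# Toy Bayesian particle identification: combine per-detector likelihoods
# for each species hypothesis with prior abundances (all hypothetical).
species = ["pion", "kaon", "proton"]
priors = np.array([0.7, 0.2, 0.1])          # assumed relative abundances
like_det1 = np.array([0.5, 0.3, 0.05])      # likelihood of signal in detector 1
like_det2 = np.array([0.4, 0.35, 0.1])      # likelihood of signal in detector 2

post = priors * like_det1 * like_det2       # unnormalized posterior
post /= post.sum()                          # normalize over hypotheses
```

    The detector responses enter only through their likelihoods, which is what lets the method combine detectors with very different response shapes.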

  15. Bayesian credible interval construction for Poisson statistics

    Institute of Scientific and Technical Information of China (English)

    ZHU Yong-Sheng

    2008-01-01

    The construction of the Bayesian credible (confidence) interval for a Poisson observable including both signal and background, with and without systematic uncertainties, is presented. Introducing a conditional probability that satisfies the requirement that the background be no larger than the observed number of events to construct the Bayesian credible interval is also discussed. A Fortran routine, BPOCI, has been developed to implement the calculation.
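The interval described above can be sketched numerically: with a flat prior on the signal mean s ≥ 0 and a known background b, the posterior is p(s | n) ∝ (s + b)^n e^{-(s + b)}, and a central credible interval follows from its quantiles. A minimal Python sketch on a numerical grid (illustrative only, not the BPOCI routine, and ignoring systematic uncertainties):

```python
import math

def poisson_credible_interval(n_obs, b=0.0, cl=0.68, steps=20000):
    """Central Bayesian credible interval for a Poisson signal mean s >= 0,
    with known background b and a flat prior on s.
    Posterior: p(s | n) proportional to (s + b)**n * exp(-(s + b))."""
    grid_max = b + n_obs + 10.0 * math.sqrt(n_obs + 1.0) + 10.0
    ds = grid_max / steps
    dens = [(i * ds + b) ** n_obs * math.exp(-(i * ds + b)) for i in range(steps + 1)]
    # Normalize with the trapezoid rule, then scan the CDF for the two quantiles.
    total = sum((dens[i] + dens[i + 1]) * 0.5 * ds for i in range(steps))
    lo_q, hi_q = (1.0 - cl) / 2.0, 1.0 - (1.0 - cl) / 2.0
    cum, lo, hi = 0.0, None, None
    for i in range(steps):
        cum += (dens[i] + dens[i + 1]) * 0.5 * ds / total
        if lo is None and cum >= lo_q:
            lo = (i + 1) * ds
        if cum >= hi_q:
            hi = (i + 1) * ds
            break
    return lo, hi
```

For zero observed events and no background, the 90% central interval comes out near (0.05, 3.0), the textbook flat-prior result.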

  16. Using Bayesian Networks to Improve Knowledge Assessment

    Science.gov (United States)

    Millan, Eva; Descalco, Luis; Castillo, Gladys; Oliveira, Paula; Diogo, Sandra

    2013-01-01

    In this paper, we describe the integration and evaluation of an existing generic Bayesian student model (GBSM) into an existing computerized testing system within the Mathematics Education Project (PmatE--Projecto Matematica Ensino) of the University of Aveiro. This generic Bayesian student model had been previously evaluated with simulated…

  17. The Bayesian Revolution Approaches Psychological Development

    Science.gov (United States)

    Shultz, Thomas R.

    2007-01-01

    This commentary reviews five articles that apply Bayesian ideas to psychological development, some with psychology experiments, some with computational modeling, and some with both experiments and modeling. The reviewed work extends the current Bayesian revolution into tasks often studied in children, such as causal learning and word learning, and…

  18. Advances in Bayesian Modeling in Educational Research

    Science.gov (United States)

    Levy, Roy

    2016-01-01

    In this article, I provide a conceptually oriented overview of Bayesian approaches to statistical inference and contrast them with frequentist approaches that currently dominate conventional practice in educational research. The features and advantages of Bayesian approaches are illustrated with examples spanning several statistical modeling…

  19. Bayesian Network for multiple hypothesis tracking

    NARCIS (Netherlands)

    W.P. Zajdel; B.J.A. Kröse

    2002-01-01

    For flexible camera-to-camera tracking of multiple objects, we model the objects' behavior with a Bayesian network and combine it with the multiple hypothesis framework that associates observations with objects. Bayesian networks offer a possibility to factor complex, joint distributions into a product...

  20. Bayesian Statistics for Biological Data: Pedigree Analysis

    Science.gov (United States)

    Stanfield, William D.; Carlton, Matthew A.

    2004-01-01

    Bayes' formula is applied to the biological problem of pedigree analysis to show that Bayes' formula and non-Bayesian or "classical" methods of probability calculation give different answers. First-year college biology students can be introduced to Bayesian statistics.
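As a concrete pedigree example of the kind the abstract alludes to (the specific pedigree is illustrative, not taken from the paper): for a woman whose prior probability of carrying an X-linked recessive allele is 1/2, each unaffected son halves the evidence for carrier status, and Bayes' formula gives the posterior.

```python
def carrier_posterior(prior_carrier, n_unaffected_sons):
    """Posterior probability that a mother carries an X-linked recessive allele
    given that all n of her sons are unaffected.
    Likelihood per unaffected son: 1/2 if she is a carrier, 1 if not."""
    num = prior_carrier * 0.5 ** n_unaffected_sons
    return num / (num + (1.0 - prior_carrier) * 1.0)

# Prior 1/2; after three unaffected sons the posterior falls to 1/9,
# whereas a "classical" reading that ignores the sons would leave it at 1/2.
posterior = carrier_posterior(0.5, 3)
```

This is exactly the discrepancy between Bayesian and non-Bayesian answers that the article exploits pedagogically.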

  1. Refinement of Bayesian Network Structures upon New Data

    DEFF Research Database (Denmark)

    Zeng, Yifeng; Xiang, Yanping; Pacekajus, Saulius

    2010-01-01

    Refinement of Bayesian network (BN) structures using new data becomes more and more relevant. Some work has been done in this area; however, one problem has not been considered yet – what to do when new data have fewer or more attributes than the existing model. In both cases, data contain important...... knowledge and every effort must be made in order to extract it. In this paper, we propose a general merging algorithm to deal with situations when new data have a different set of attributes. The merging algorithm updates sufficient statistics when new data are received. It expands the flexibility of BN...

  2. Highly Sensitive Optical Receivers

    CERN Document Server

    Schneider, Kerstin

    2006-01-01

    Highly Sensitive Optical Receivers primarily treats the circuit design of optical receivers with external photodiodes. Continuous-mode and burst-mode receivers are compared. The monograph first summarizes the basics of III/V photodetectors, transistor and noise models, bit-error rate, sensitivity and analog circuit design, thus enabling readers to understand the circuits described in the main part of the book. In order to cover the topic comprehensively, detailed descriptions of receivers for optical data communication in general and, in particular, optical burst-mode receivers in deep-sub-µm CMOS are presented. Numerous detailed and elaborate illustrations facilitate better understanding.

  3. Adaptive Methods within a Sequential Bayesian Approach for Structural Health Monitoring

    Science.gov (United States)

    Huff, Daniel W.

    Structural integrity is an important characteristic of performance for critical components used in applications such as aeronautics, materials, construction and transportation. When appraising the structural integrity of these components, evaluation methods must be accurate. In addition to possessing capability to perform damage detection, the ability to monitor the level of damage over time can provide extremely useful information in assessing the operational worthiness of a structure and in determining whether the structure should be repaired or removed from service. In this work, a sequential Bayesian approach with active sensing is employed for monitoring crack growth within fatigue-loaded materials. The monitoring approach is based on predicting crack damage state dynamics and modeling crack length observations. Since fatigue loading of a structural component can change while in service, an interacting multiple model technique is employed to estimate probabilities of different loading modes and incorporate this information in the crack length estimation problem. For the observation model, features are obtained from regions of high signal energy in the time-frequency plane and modeled for each crack length damage condition. Although this observation model approach exhibits high classification accuracy, the resolution characteristics can change depending upon the extent of the damage. Therefore, several different transmission waveforms and receiver sensors are considered to create multiple modes for making observations of crack damage. Resolution characteristics of the different observation modes are assessed using a predicted mean squared error criterion and observations are obtained using the predicted, optimal observation modes based on these characteristics. Calculation of the predicted mean square error metric can be computationally intensive, especially if performed in real time, and an approximation method is proposed. With this approach, the real time

  4. Development of a Bayesian Belief Network Model for personalized prognostic risk assessment in colon carcinomatosis.

    Science.gov (United States)

    Stojadinovic, Alexander; Nissan, Aviram; Eberhardt, John; Chua, Terence C; Pelz, Joerg O W; Esquivel, Jesus

    2011-02-01

    Multimodality therapy in selected patients with peritoneal carcinomatosis is gaining acceptance. Treatment-directing decision support tools are needed to individualize care and select patients best suited for cytoreductive surgery +/- hyperthermic intraperitoneal chemotherapy (CRS +/- HIPEC). The purpose of this study is to develop a predictive model that could support surgical decisions in patients with colon carcinomatosis. Fifty-three patients were enrolled in a prospective study collecting 31 clinical-pathological, treatment-related, and outcome data. The population was characterized by disease presentation, performance status, extent of peritoneal cancer (Peritoneal Cancer Index, PCI), primary tumor histology, and nodal staging. These preoperative parameters were analyzed using step-wise machine-learned Bayesian Belief Networks (BBN) to develop a predictive model for overall survival (OS) in patients considered for CRS +/- HIPEC. Area-under-the-curve from receiver-operating-characteristics curves of OS predictions was calculated to determine the model's positive and negative predictive value. Model structure defined three predictors of OS: severity of symptoms (performance status), PCI, and ability to undergo CRS +/- HIPEC. Patients with PCI 20, who were not considered surgical candidates. Cross validation of the BBN model robustly classified OS (area-under-the-curve = 0.71). The model's positive predictive value and negative predictive value are 63.3 per cent and 68.3 per cent, respectively. This exploratory study supports the utility of Bayesian classification for developing decision support tools, which assess case-specific relative risk for a given patient for oncological outcomes based on clinically relevant classifiers of survival. Further prospective studies to validate the BBN model-derived prognostic assessment tool are warranted.
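The reported positive and negative predictive values follow directly from a 2×2 confusion matrix of model predictions against outcomes. A sketch with hypothetical counts chosen only to reproduce the reported 63.3% / 68.3% figures (the study's actual cross-validation counts are not given in the abstract):

```python
def predictive_values(tp, fp, tn, fn):
    """Positive and negative predictive value from a 2x2 confusion matrix."""
    return tp / (tp + fp), tn / (tn + fn)

# Hypothetical counts (not from the study) consistent with PPV = 63.3%, NPV = 68.3%:
ppv, npv = predictive_values(tp=19, fp=11, tn=41, fn=19)
```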

  5. Hepatitis disease detection using Bayesian theory

    Science.gov (United States)

    Maseleno, Andino; Hidayati, Rohmah Zahroh

    2017-02-01

    This paper presents hepatitis disease diagnosis using Bayesian theory, for a better understanding of the theory. In this research, we used Bayesian theory for detecting hepatitis disease and displaying the result of the diagnosis process. Bayesian algorithm theory was rediscovered and perfected by Laplace; the basic idea is to use the known prior probability and the conditional probability density parameters, applying Bayes' theorem to calculate the corresponding posterior probability, and then use the obtained posterior probability to infer and make decisions. Bayesian methods combine existing knowledge, prior probabilities, with additional knowledge derived from new data, the likelihood function. The initial symptoms of hepatitis include malaise, fever and headache; we compute the probability of hepatitis given the presence of malaise, fever, and headache. The result revealed that Bayesian theory successfully identified the existence of hepatitis disease.
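Under the common naive-Bayes simplification (symptoms conditionally independent given disease status), the posterior described above is a product of the prior and per-symptom likelihoods. All probabilities below are hypothetical placeholders, not values from the paper:

```python
def naive_bayes_posterior(prior, lik_given_disease, lik_given_healthy):
    """P(disease | symptoms), assuming symptoms are conditionally independent
    given disease status (naive Bayes)."""
    p_d, p_h = prior, 1.0 - prior
    for l_d, l_h in zip(lik_given_disease, lik_given_healthy):
        p_d *= l_d
        p_h *= l_h
    return p_d / (p_d + p_h)

# Hypothetical numbers: prior P(hepatitis) = 0.05 and, for (malaise, fever, headache),
# P(symptom | hepatitis) = 0.8, 0.7, 0.6; P(symptom | no hepatitis) = 0.1, 0.05, 0.1.
p = naive_bayes_posterior(0.05, [0.8, 0.7, 0.6], [0.1, 0.05, 0.1])
```

Even with a small prior, three jointly unlikely-under-health symptoms push the posterior close to 1, which is the qualitative point of the paper's worked example.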

  6. Bayesian natural language semantics and pragmatics

    CERN Document Server

    Zeevat, Henk

    2015-01-01

    The contributions in this volume focus on the Bayesian interpretation of natural languages, which is widely used in areas of artificial intelligence, cognitive science, and computational linguistics. This is the first volume to take up topics in Bayesian Natural Language Interpretation and make proposals based on information theory, probability theory, and related fields. The methodologies offered here extend to the target semantic and pragmatic analyses of computational natural language interpretation. Bayesian approaches to natural language semantics and pragmatics are based on methods from signal processing and the causal Bayesian models pioneered especially by Pearl. In signal processing, the Bayesian method finds the most probable interpretation by finding the one that maximizes the product of the prior probability and the likelihood of the interpretation. It thus stresses the importance of a production model for interpretation as in Grice's contributions to pragmatics or in interpretation by abduction.

  7. 2nd Bayesian Young Statisticians Meeting

    CERN Document Server

    Bitto, Angela; Kastner, Gregor; Posekany, Alexandra

    2015-01-01

    The Second Bayesian Young Statisticians Meeting (BAYSM 2014) and the research presented here facilitate connections among researchers using Bayesian Statistics by providing a forum for the development and exchange of ideas. WU Vienna University of Business and Economics hosted BAYSM 2014 from September 18th to 19th. The guidance of renowned plenary lecturers and senior discussants is a critical part of the meeting and this volume, which follows publication of contributions from BAYSM 2013. The meeting's scientific program reflected the variety of fields in which Bayesian methods are currently employed or could be introduced in the future. Three brilliant keynote lectures by Chris Holmes (University of Oxford), Christian Robert (Université Paris-Dauphine), and Mike West (Duke University), were complemented by 24 plenary talks covering the major topics Dynamic Models, Applications, Bayesian Nonparametrics, Biostatistics, Bayesian Methods in Economics, and Models and Methods, as well as a lively poster session ...

  8. Sparse-grid, reduced-basis Bayesian inversion: Nonaffine-parametric nonlinear equations

    Science.gov (United States)

    Chen, Peng; Schwab, Christoph

    2016-07-01

    We extend the reduced basis (RB) accelerated Bayesian inversion methods for affine-parametric, linear operator equations which are considered in [16,17] to non-affine, nonlinear parametric operator equations. We generalize the analysis of sparsity of parametric forward solution maps in [20] and of Bayesian inversion in [48,49] to the fully discrete setting, including Petrov-Galerkin high-fidelity ("HiFi") discretization of the forward maps. We develop adaptive, stochastic collocation based reduction methods for the efficient computation of reduced bases on the parametric solution manifold. The nonaffinity and nonlinearity with respect to (w.r.t.) the distributed, uncertain parameters and the unknown solution is collocated; specifically, by the so-called Empirical Interpolation Method (EIM). For the corresponding Bayesian inversion problems, computational efficiency is enhanced in two ways: first, expectations w.r.t. the posterior are computed by adaptive quadratures with dimension-independent convergence rates proposed in [49]; the present work generalizes [49] to account for the impact of the PG discretization in the forward maps on the convergence rates of the Quantities of Interest (QoI for short). Second, we propose to perform the Bayesian estimation only w.r.t. a parsimonious, RB approximation of the posterior density. Based on the approximation results in [49], the infinite-dimensional parametric, deterministic forward map and operator admit N-term RB and EIM approximations which converge at rates which depend only on the sparsity of the parametric forward map. In several numerical experiments, the proposed algorithms exhibit dimension-independent convergence rates which equal, at least, the currently known rate estimates for N-term approximation. We propose to accelerate Bayesian estimation by first offline construction of reduced basis surrogates of the Bayesian posterior density. 
The parsimonious surrogates can then be employed for online data assimilation

  10. BAYESIAN BICLUSTERING FOR PATIENT STRATIFICATION.

    Science.gov (United States)

    Khakabimamaghani, Sahand; Ester, Martin

    2016-01-01

    The move from Empirical Medicine towards Personalized Medicine has attracted attention to Stratified Medicine (SM). Some methods for patient stratification, the central task of SM, are provided in the literature; however, significant open issues remain. First, it is still unclear if integrating different datatypes will help in detecting disease subtypes more accurately, and, if not, which datatype(s) are most useful for this task. Second, it is not clear how we can compare different methods of patient stratification. Third, as most of the proposed stratification methods are deterministic, there is a need for investigating the potential benefits of applying probabilistic methods. To address these issues, we introduce a novel integrative Bayesian biclustering method, called B2PS, for patient stratification and propose methods for evaluating the results. Our experimental results demonstrate the superiority of B2PS over a popular state-of-the-art method and the benefits of Bayesian approaches. Our results agree with the intuition that transcriptomic data forms a better basis for patient stratification than genomic data.

  11. Flexible Bayesian Human Fecundity Models.

    Science.gov (United States)

    Kim, Sungduk; Sundaram, Rajeshwari; Buck Louis, Germaine M; Pyper, Cecilia

    2012-12-01

    Human fecundity is an issue of considerable interest for both epidemiological and clinical audiences, and is dependent upon a couple's biologic capacity for reproduction coupled with behaviors that place a couple at risk for pregnancy. Bayesian hierarchical models have been proposed to better model the conception probabilities by accounting for the acts of intercourse around the day of ovulation, i.e., during the fertile window. These models can be viewed in the framework of a generalized nonlinear model with an exponential link. However, a fixed choice of link function may not always provide the best fit, leading to potentially biased estimates for probability of conception. Motivated by this, we propose a general class of models for fecundity by relaxing the choice of the link function under the generalized nonlinear model framework. We use a sample from the Oxford Conception Study (OCS) to illustrate the utility and fit of this general class of models for estimating human conception. Our findings reinforce the need for attention to be paid to the choice of link function in modeling conception, as it may bias the estimation of conception probabilities. Various properties of the proposed models are examined and a Markov chain Monte Carlo sampling algorithm was developed for implementing the Bayesian computations. The deviance information criterion measure and logarithm of pseudo marginal likelihood are used for guiding the choice of links. The supplemental material section contains technical details of the proof of the theorem stated in the paper, and contains further simulation results and analysis.

  12. Bayesian modeling in conjoint analysis

    Directory of Open Access Journals (Sweden)

    Janković-Milić Vesna

    2010-01-01

    Statistical analysis in marketing is largely influenced by the availability of various types of data. There has been a sudden increase in the number and types of information available to market researchers over the last decade. In such conditions, traditional statistical methods have a limited ability to solve problems related to the expression of market uncertainty. The aim of this paper is to highlight the advantages of Bayesian inference as an alternative approach to classical inference. Multivariate statistical methods offer extremely powerful tools to achieve many goals of marketing research. One of these methods is conjoint analysis, which provides a quantitative measure of the relative importance of product or service attributes in relation to the other attributes. The application of this method involves interviewing consumers, where they express their preferences, and statistical analysis provides numerical indicators of each attribute's utility. One of the main objections to the discrete choice method in conjoint analysis is that it estimates utility only at the aggregate level, expressing the average utility for all respondents in the survey. Application of hierarchical Bayesian models enables capturing of individual utility ratings for each attribute level.

  13. Multi-reader multi-case studies using the area under the receiver operator characteristic curve as a measure of diagnostic accuracy: systematic review with a focus on quality of data reporting.

    Directory of Open Access Journals (Sweden)

    Thaworn Dendumrongsup

    We examined the design, analysis, and reporting in multi-reader multi-case (MRMC) research studies using the area under the receiver-operating characteristic curve (ROC AUC) as a measure of diagnostic performance. We performed a systematic literature review from 2005 to 2013 inclusive to identify a minimum of 50 studies. Articles of diagnostic test accuracy in humans were identified via their citation of key methodological articles dealing with MRMC ROC AUC. Two researchers in consensus then extracted information from primary articles relating to study characteristics and design, methods for reporting study outcomes, model fitting, model assumptions, presentation of results, and interpretation of findings. Results were summarized and presented with a descriptive analysis. Sixty-four full papers were retrieved from 475 identified citations, and ultimately 49 articles describing 51 studies were reviewed and extracted. Radiological imaging was the index test in all. Most studies focused on lesion detection vs. characterization and used fewer than 10 readers. Only 6 (12%) studies trained readers in advance to use the confidence scale used to build the ROC curve. Overall, description of confidence scores, the ROC curve, and its analysis was often incomplete. For example, 21 (41%) studies presented no ROC curve and only 3 (6%) described the distribution of confidence scores. Of 30 studies presenting curves, only 4 (13%) presented the data points underlying the curve, thereby allowing assessment of extrapolation. The mean change in AUC was 0.05 (-0.05 to 0.28). Non-significant change in AUC was attributed to underpowering rather than the diagnostic test failing to improve diagnostic accuracy. Data reporting in MRMC studies using ROC AUC as an outcome measure is frequently incomplete, hampering understanding of methods and the reliability of results and study conclusions. Authors using this analysis should be encouraged to provide a full description of their methods and results.
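For studies that do report reader confidence scores, ROC AUC can be computed without fitting a curve at all, via the Mann-Whitney identity: AUC equals the probability that a randomly chosen positive case receives a higher score than a randomly chosen negative one, with ties counting one half. A minimal sketch:

```python
def roc_auc(pos_scores, neg_scores):
    """Area under the ROC curve via the Mann-Whitney identity:
    P(random positive scores higher than random negative), ties counting 1/2."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))
```

Publishing the raw scores behind such a calculation is precisely the kind of reporting the review found to be missing.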

  14. Differentiation between microcystin contaminated and uncontaminated fish by determination of unconjugated MCs using an ELISA anti-Adda test based on receiver-operating characteristic curves threshold values: application to Tinca tinca from natural ponds.

    Science.gov (United States)

    Moreno, Isabel María; Herrador, M Ángeles; Atencio, Loyda; Puerto, María; González, A Gustavo; Cameán, Ana María

    2011-02-01

    The aim of this study was to evaluate whether the enzyme-linked immunosorbent assay (ELISA) anti-Adda technique could be used to monitor free microcystins (MCs) in biological samples from fish naturally exposed to toxic cyanobacteria, by using receiver operating characteristic (ROC) curve software to establish an optimal cut-off value for MCs. The cut-off value determined by ROC curve analysis in tench (Tinca tinca) exposed to MCs under laboratory conditions was 5.90 μg MCs/kg tissue dry weight (d.w.), with a sensitivity of 93.3%. This value was applied to fish samples from natural ponds (Extremadura, Spain) in order to assess their potential MCs bioaccumulation, by classifying samples as either true positive (TP), false positive (FP), true negative (TN), or false negative (FN). In this work, it has been demonstrated that toxic cyanobacteria, mainly Microcystis aeruginosa, Aphanizomenon issatchenkoi, and Anabaena spiroides, were present in two of these ponds, Barruecos de Abajo (BDown) and Barruecos de Arriba (BUp). MCs were detected in waters from both ponds with an anti-MC-LR ELISA immunoassay, at similar levels (between 3.8-6.5 μg MC-LR equivalent/L in the BDown pond and 4.8-6.0 μg MC-LR equivalent/L in BUp). The MCs cut-off values were applied to livers from fish collected from these two ponds using the ELISA anti-Adda technique. A total of 83% of samples from the BDown pond and only 42% from BUp were TP, with values of free MCs higher than 8.8 μg MCs/kg tissue (d.w.).
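A common way ROC software picks such a cut-off is Youden's J statistic (sensitivity + specificity − 1); whether the study's software used exactly this criterion is not stated in the abstract, so the sketch below is illustrative only, with toy scores:

```python
def youden_cutoff(scores, labels):
    """Choose the cut-off maximizing Youden's J = sensitivity + specificity - 1.
    labels: 1 = truly positive (e.g. MC-exposed), 0 = negative;
    higher score = more positive. Assumes both classes are present."""
    best_j, best_cut = -1.0, None
    for cut in sorted(set(scores)):
        tp = sum(1 for s, y in zip(scores, labels) if y == 1 and s >= cut)
        fn = sum(1 for s, y in zip(scores, labels) if y == 1 and s < cut)
        tn = sum(1 for s, y in zip(scores, labels) if y == 0 and s < cut)
        fp = sum(1 for s, y in zip(scores, labels) if y == 0 and s >= cut)
        j = tp / (tp + fn) + tn / (tn + fp) - 1.0
        if j > best_j:
            best_j, best_cut = j, cut
    return best_cut, best_j
```

On perfectly separable toy data the chosen cut-off splits the two classes with J = 1; on real ELISA data J trades sensitivity against specificity, as with the 93.3% sensitivity reported above.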

  15. A unifying probabilistic Bayesian approach to derive electron density from MRI for radiation therapy treatment planning

    Science.gov (United States)

    Sudhan Reddy Gudur, Madhu; Hara, Wendy; Le, Quynh-Thu; Wang, Lei; Xing, Lei; Li, Ruijiang

    2014-11-01

    MRI significantly improves the accuracy and reliability of target delineation in radiation therapy for certain tumors due to its superior soft tissue contrast compared to CT. A treatment planning process with MRI as the sole imaging modality will eliminate systematic CT/MRI co-registration errors, reduce cost and radiation exposure, and simplify clinical workflow. However, MRI lacks the key electron density information necessary for accurate dose calculation and generating reference images for patient setup. The purpose of this work is to develop a unifying method to derive electron density from standard T1-weighted MRI. We propose to combine both intensity and geometry information into a unifying probabilistic Bayesian framework for electron density mapping. For each voxel, we compute two conditional probability density functions (PDFs) of electron density given its: (1) T1-weighted MRI intensity, and (2) geometry in a reference anatomy, obtained by deformable image registration between the MRI of the atlas and test patient. The two conditional PDFs containing intensity and geometry information are combined into a unifying posterior PDF, whose mean value corresponds to the optimal electron density value under the mean-square error criterion. We evaluated the algorithm's accuracy of electron density mapping and its ability to detect bone in the head for eight patients, using an additional patient as the atlas or template. Mean absolute HU error between the estimated and true CT, as well as receiver operating characteristics for bone detection (HU > 200) were calculated. The performance was compared with a global intensity approach based on T1 and no density correction (set whole head to water). The proposed technique significantly reduced the errors in electron density estimation, with a mean absolute HU error of 126, compared with 139 for deformable registration (p = 2 × 10⁻⁴), 283 for the intensity approach (p = 2 × 10⁻⁶) and 282 without density
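If the two conditional PDFs are approximated as Gaussians (an assumption of this sketch, not necessarily of the paper), their normalized product is again Gaussian, and the posterior mean, which is the minimum-mean-square-error estimate, is the precision-weighted average of the two means:

```python
def fuse_gaussian_pdfs(mu_intensity, var_intensity, mu_geometry, var_geometry):
    """Normalized product of two Gaussian conditional PDFs of electron density.
    The posterior mean (the MMSE estimate) is the precision-weighted average,
    and the posterior variance is the inverse of the summed precisions."""
    w_i, w_g = 1.0 / var_intensity, 1.0 / var_geometry
    mean = (w_i * mu_intensity + w_g * mu_geometry) / (w_i + w_g)
    return mean, 1.0 / (w_i + w_g)

# The narrower (more certain) intensity estimate dominates the fused mean:
mean, var = fuse_gaussian_pdfs(100.0, 400.0, 140.0, 1600.0)
```

The fused variance is always smaller than either input variance, which is why combining intensity and geometry cues reduces estimation error.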

  17. Bayesian Action&Perception: Representing the World in the Brain

    Directory of Open Access Journals (Sweden)

    Gerald E. Loeb

    2014-10-01

    Theories of perception seek to explain how sensory data are processed to identify previously experienced objects, but they usually do not consider the decisions and effort that go into acquiring the sensory data. Identification of objects according to their tactile properties requires active exploratory movements. The sensory data thereby obtained depend on the details of those movements, which human subjects change rapidly and seemingly capriciously. Bayesian Exploration is an algorithm that uses prior experience to decide which next exploratory movement should provide the most useful data to disambiguate the most likely possibilities. In previous studies, a simple robot equipped with a biomimetic tactile sensor and operated according to Bayesian Exploration performed in a manner similar to, and actually better than, humans on a texture identification task. Expanding on this, Bayesian Action&Perception refers to the construction and querying of an associative memory of previously experienced entities containing both sensory data and the motor programs that elicited them. We hypothesize that this memory can be queried (i) to identify useful next exploratory movements during identification of an unknown entity (action for perception), (ii) to characterize whether an unknown entity is fit for purpose (perception for action), or (iii) to recall what actions might be feasible for a known entity (Gibsonian affordance). The biomimetic design of this mechatronic system may provide insights into the neuronal basis of biological action and perception.
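One common way to formalize "the most useful next movement" is to pick the movement that minimizes the expected entropy of the posterior over candidate objects. Whether Bayesian Exploration uses exactly this criterion is not stated in the abstract, so the following is a hedged sketch with toy texture hypotheses:

```python
import math

def expected_posterior_entropy(prior, obs_model):
    """Expected Shannon entropy (bits) of the posterior over hypotheses after
    one movement. obs_model[h][o] = P(observation o | hypothesis h, movement)."""
    n_obs = len(next(iter(obs_model.values())))
    h_exp = 0.0
    for o in range(n_obs):
        p_o = sum(prior[h] * obs_model[h][o] for h in prior)
        if p_o == 0.0:
            continue
        post = [prior[h] * obs_model[h][o] / p_o for h in prior]
        h_exp += p_o * -sum(p * math.log2(p) for p in post if p > 0.0)
    return h_exp

def best_movement(prior, movements):
    """Pick the exploratory movement whose expected posterior entropy is lowest."""
    return min(movements, key=lambda m: expected_posterior_entropy(prior, movements[m]))

# Two candidate textures; "press" is uninformative, "slide" separates them well:
prior = {"silk": 0.5, "denim": 0.5}
movements = {
    "press": {"silk": [0.5, 0.5], "denim": [0.5, 0.5]},
    "slide": {"silk": [0.9, 0.1], "denim": [0.1, 0.9]},
}
```

An agent driven by this criterion keeps choosing the movement expected to shrink its uncertainty fastest, which matches the qualitative description of Bayesian Exploration above.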

  18. Low complexity MIMO receivers

    CERN Document Server

    Bai, Lin; Yu, Quan

    2014-01-01

    Multiple-input multiple-output (MIMO) systems can increase the spectral efficiency of wireless communications. However, interference becomes the major drawback, leading to high computational complexity at both transmitter and receiver. In particular, the complexity of MIMO receivers can be prohibitively high. As an efficient mathematical tool for devising low-complexity approaches that mitigate the interference in MIMO systems, lattice reduction (LR) has been widely studied and employed over the last decade. The co-authors of this book are the world's leading experts on MIMO receivers, and here they share the key findings of their research over the years. They detail a range of key techniques for receiver design when multiple transmitted and received signals are available. The authors first introduce the principles of signal detection and LR in mathematical terms. They then move on to discuss the use of LR in low-complexity MIMO receiver design with respect to different aspects, including uncoded MIMO detection...

  19. Delphi Accounts Receivable Module -

    Data.gov (United States)

    Department of Transportation — The Delphi accounts receivable module contains the following data elements, including but not limited to: customer information, cash receipts, line of accounting details, bill...

  20. A Comparison of Hierarchical and Non-Hierarchical Bayesian Approaches for Fitting Allometric Larch (Larix spp.) Biomass Equations

    Directory of Open Access Journals (Sweden)

    Dongsheng Chen

    2016-01-01

    Full Text Available Accurate biomass estimations are important for assessing and monitoring forest carbon storage. Bayesian theory has been widely applied to tree biomass models. Recently, a hierarchical Bayesian approach has received increasing attention for improving biomass models. In this study, tree biomass data were obtained by sampling 310 trees from 209 permanent sample plots from larch plantations in six regions across China. Non-hierarchical and hierarchical Bayesian approaches were used to model allometric biomass equations. We found that the total, root, stem wood, stem bark, branch and foliage biomass model relationships were statistically significant (p-values < 0.001) for both the non-hierarchical and hierarchical Bayesian approaches, but the hierarchical Bayesian approach increased the goodness-of-fit statistics over the non-hierarchical Bayesian approach. The R2 values of the hierarchical approach were higher than those of the non-hierarchical approach by 0.008, 0.018, 0.020, 0.003, 0.088 and 0.116 for the total tree, root, stem wood, stem bark, branch and foliage models, respectively. The hierarchical Bayesian approach significantly improved the accuracy of the biomass model (except for the stem bark) and can reflect regional differences by using random parameters to improve the regional-scale model accuracy.
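
    The shrinkage effect that distinguishes the hierarchical fit can be illustrated with a crude empirical-Bayes stand-in (not the full MCMC model of the paper): region-level slopes of a log-log allometric equation are pulled toward the pooled slope. All data and the shrinkage weight below are synthetic and hypothetical.

```python
import numpy as np

rng = np.random.default_rng(10)

# Synthetic allometric data: ln(biomass) = a + b * ln(diameter), with a
# region-specific slope b for each of six regions.
regions, n = 6, 40
b_true = rng.normal(2.4, 0.15, regions)          # region-specific slopes
log_d = rng.uniform(2.0, 3.5, (regions, n))      # ln(diameter at breast height)
log_m = -2.0 + b_true[:, None] * log_d + rng.normal(0, 0.2, (regions, n))

def slope(x, y):
    """Ordinary least-squares slope of y on x."""
    return np.cov(x, y)[0, 1] / np.var(x, ddof=1)

b_pooled = slope(log_d.ravel(), log_m.ravel())   # one slope for all regions
b_region = np.array([slope(log_d[r], log_m[r]) for r in range(regions)])

# Shrink each regional slope toward the pooled value; kappa stands in for the
# ratio of within-region noise to between-region variance in the full model.
kappa = 0.3
b_hier = kappa * b_pooled + (1 - kappa) * b_region
print(np.round(b_hier, 2))
```

    The hierarchical posterior means behave like this weighted compromise: regions with little or noisy data borrow strength from the pooled fit, which is why the paper sees better regional-scale accuracy from random parameters.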

  1. Bayesian approach to noninferiority trials for proportions.

    Science.gov (United States)

    Gamalo, Mark A; Wu, Rui; Tiwari, Ram C

    2011-09-01

    Noninferiority trials are unique because they depend on historical information in order to make a meaningful interpretation of their results. Hence, a direct application of the Bayesian paradigm of sequential learning is particularly useful in the analysis. This paper describes a Bayesian procedure for testing noninferiority in two-arm studies with a binary primary endpoint that allows the incorporation of historical data on an active control via the use of informative priors. In particular, the posteriors of the response in historical trials are assumed as priors for its corresponding parameters in the current trial, where that treatment serves as the active control. The Bayesian procedure includes a fully Bayesian method and two normal approximation methods on the prior and/or on the posterior distributions. Then a common Bayesian decision criterion is used, but with two prespecified cutoff levels - one for the approximation methods and the other for the fully Bayesian method - to determine whether the experimental treatment is noninferior to the active control. This criterion is evaluated and compared with the frequentist method using simulation studies, in keeping with the regulatory requirement that new methods must protect the type I error rate and arrive at conclusions similar to existing standard strategies. Results show that both methods arrive at comparable conclusions of noninferiority when applied to a modified real data set. The advantage of the proposed Bayesian approach lies in its ability to provide posterior probabilities for effect sizes of the experimental treatment over the active control.
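
    The core computation of such a fully Bayesian test can be sketched with conjugate Beta posteriors, where the historical control data supply the informative prior for the control arm. All counts, the margin, and the implied cutoff below are hypothetical, chosen only to illustrate the mechanics.

```python
import numpy as np

rng = np.random.default_rng(1)

# Historical active-control trial (hypothetical: 120/150 responders); its Beta
# posterior under a uniform prior becomes the prior for the control arm.
a_c, b_c = 1 + 120, 1 + 30

x_t, n_t = 90, 110        # current trial, experimental arm responders / size
x_c, n_c = 80, 105        # current trial, active-control arm responders / size

post_t = rng.beta(1 + x_t, 1 + n_t - x_t, 100_000)        # uniform prior
post_c = rng.beta(a_c + x_c, b_c + n_c - x_c, 100_000)    # informative prior

margin = 0.10             # noninferiority margin on the response-rate scale
prob_ni = np.mean(post_t - post_c > -margin)
print(f"P(noninferiority) = {prob_ni:.3f}")
```

    Noninferiority is declared when this posterior probability exceeds the prespecified cutoff, and the same Monte Carlo draws give posterior probabilities for any effect size of interest.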

  2. Study of TEC fluctuation via stochastic models and Bayesian inversion

    Science.gov (United States)

    Bires, A.; Roininen, L.; Damtie, B.; Nigussie, M.; Vanhamäki, H.

    2016-11-01

    We propose stochastic processes to be used to model total electron content (TEC) observations. Based on this, we model the rate of change of TEC (ROT) variation during ionospheric quiet conditions with stationary processes. During ionospheric disturbed conditions, for example when irregularity in the ionospheric electron density distribution occurs, the stationarity assumption over long time periods is no longer valid. In these cases, we estimate the parameters over short time scales, during which we can assume stationarity. We show the relationship between the new method and the commonly used TEC characterization parameters ROT and the ROT Index (ROTI). We construct our parametric model within the framework of Bayesian statistical inverse problems and hence give the solution as an a posteriori probability distribution. The Bayesian framework allows us to model measurement errors systematically. Similarly, we mitigate variation of TEC due to factors which are not of ionospheric origin, such as the motion of satellites relative to the receiver, by incorporating a priori knowledge in the Bayesian model. In practical computations, we draw the so-called maximum a posteriori estimates, which are our ROT and ROTI estimates, from the posterior distribution. Because the algorithm allows ROTI to be estimated at each observation time, the estimator does not depend on the period of time chosen for the ROTI computation. We verify the method by analyzing TEC data recorded by a GPS receiver located in Ethiopia (11.6°N, 37.4°E). The results indicate that TEC fluctuations caused by ionospheric irregularities can be effectively detected and quantified from the estimated ROT and ROTI values.
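
    The conventional characterization parameters that the method is benchmarked against can be computed directly: ROT is the time derivative of TEC (TECU/min) and ROTI is the standard deviation of ROT over a short window. The TEC record below is synthetic, with an irregularity-driven fluctuation switched on halfway through.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic 30 s TEC samples (TECU): a smooth trend plus a burst of
# irregularity-driven fluctuation in the second half of the record.
t = np.arange(0, 3600, 30.0)                     # one hour, 30 s cadence
tec = 20 + 5 * np.sin(2 * np.pi * t / 86400)
tec[t >= 1800] += rng.normal(0, 0.4, (t >= 1800).sum())

# ROT in TECU/min.
rot = np.diff(tec) / (np.diff(t) / 60.0)

# ROTI: standard deviation of ROT over a sliding 5-minute (10-sample) window.
win = 10
roti = np.array([rot[i:i + win].std() for i in range(len(rot) - win + 1)])
print(roti[:5].mean(), roti[-5:].mean())         # quiet vs disturbed levels
```

    The fixed window length is exactly the limitation the abstract points at: the Bayesian estimator produces ROT/ROTI estimates at every observation time instead of once per window.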

  3. Aggregated Residential Load Modeling Using Dynamic Bayesian Networks

    Energy Technology Data Exchange (ETDEWEB)

    Vlachopoulou, Maria; Chin, George; Fuller, Jason C.; Lu, Shuai

    2014-09-28

    It is already obvious that the future power grid will have to address higher demand for power and energy and to incorporate renewable resources with different energy generation patterns. Demand response (DR) schemes could successfully be used to manage and balance power supply and demand under the operating conditions of the future power grid. To achieve that, more advanced tools for DR management of operations and planning are necessary, ones that can estimate the available capacity from DR resources. In this research, a Dynamic Bayesian Network (DBN) is derived, trained, and tested that can model the aggregated load of Heating, Ventilation, and Air Conditioning (HVAC) systems. DBNs can provide flexible and powerful tools for both operations and planning, due to their unique analytical capabilities. The DBN model's accuracy and flexibility of use are demonstrated by testing the model under different operational scenarios.

  4. HIGH-EFFICIENCY INFRARED RECEIVER

    Directory of Open Access Journals (Sweden)

    A. K. Esman

    2016-01-01

    Full Text Available Recent research and development show promising use of high-performance solid-state receivers of electromagnetic radiation. These receivers are based on low-barrier Schottky diodes. The approach of designing receivers on the basis of delta-doped low-barrier Schottky diodes with beam leads and without bias is developing especially actively, because for uncooled receivers of microwave radiation these diodes have virtually no competition. The purpose of this work is to improve the main parameters and characteristics that determine the practical relevance of receivers of mid-infrared electromagnetic radiation at room operating temperature, by modifying the electrode configuration of the diode and optimizing the distance between the electrodes. The proposed original design of an integrated receiver of mid-infrared radiation based on low-barrier Schottky diodes with beam leads makes it possible to tune its main parameters and characteristics effectively. Simulations of the electromagnetic characteristics of the proposed receiver, performed with the software package HFSS, whose finite element method computes the behavior of electromagnetic fields on arbitrary geometry with predetermined material properties, have shown that when the inner parts of the electrodes of the low-barrier Schottky diode are given a concentric elliptical convex-concave shape, the reflection losses can be reduced to -57.75 dB and the standing wave ratio to 1.003, while the directivity increases up to 23 at a wavelength of 6.09 μm. In this configuration, the rounded radii of the inner parts of the anode and cathode electrodes are 212 nm and 318 nm respectively, and the gap between them is 106 nm. These parameters will improve the efficiency of the developed infrared optical and electronic equipment intended for various purposes in the mid-infrared wavelength range.

  5. Nonparametric Bayesian inference in biostatistics

    CERN Document Server

    Müller, Peter

    2015-01-01

    Nonparametric Bayesian (BNP) approaches play an ever-expanding role in biostatistical inference, from proteomics to clinical trials. As chapters in this book demonstrate, BNP has important uses in the clinical sciences and in inference for issues like unknown partitions in genomics. Many research problems involve an abundance of data and require flexible and complex probability models beyond the traditional parametric approaches. As this book's expert contributors show, BNP approaches can be the answer. Survival analysis, in particular survival regression, has traditionally used BNP, but BNP's potential is now very broad. This applies to important tasks like arrangement of patients into clinically meaningful subpopulations and segmenting the genome into functionally distinct regions. This book is designed to both review and introduce application areas for BNP. While existing books provide theoretical foundations, this book connects theory to practice through engaging examples and research questions. Chapters c...

  6. Bayesian networks in educational assessment

    CERN Document Server

    Almond, Russell G; Steinberg, Linda S; Yan, Duanli; Williamson, David M

    2015-01-01

    Bayesian inference networks, a synthesis of statistics and expert systems, have advanced reasoning under uncertainty in medicine, business, and social sciences. This innovative volume is the first comprehensive treatment exploring how they can be applied to design and analyze innovative educational assessments. Part I develops Bayes nets’ foundations in assessment, statistics, and graph theory, and works through the real-time updating algorithm. Part II addresses parametric forms for use with assessment, model-checking techniques, and estimation with the EM algorithm and Markov chain Monte Carlo (MCMC). A unique feature is the volume’s grounding in Evidence-Centered Design (ECD) framework for assessment design. This “design forward” approach enables designers to take full advantage of Bayes nets’ modularity and ability to model complex evidentiary relationships that arise from performance in interactive, technology-rich assessments such as simulations. Part III describes ECD, situates Bayes nets as ...

  7. Bayesian anti-sparse coding

    CERN Document Server

    Elvira, Clément; Dobigeon, Nicolas

    2015-01-01

    Sparse representations have proven their efficiency in solving a wide class of inverse problems encountered in signal and image processing. Conversely, enforcing the information to be spread uniformly over representation coefficients exhibits relevant properties in various applications such as digital communications. Anti-sparse regularization can be naturally expressed through an $\ell_{\infty}$-norm penalty. This paper derives a probabilistic formulation of such representations. A new probability distribution, referred to as the democratic prior, is first introduced. Its main properties as well as three random variate generators for this distribution are derived. Then this probability distribution is used as a prior to promote anti-sparsity in a Gaussian linear inverse problem, yielding a fully Bayesian formulation of anti-sparse coding. Two Markov chain Monte Carlo (MCMC) algorithms are proposed to generate samples according to the posterior distribution. The first one is a standard Gibbs sampler. The seco...

  8. Hedging Strategies for Bayesian Optimization

    CERN Document Server

    Brochu, Eric; de Freitas, Nando

    2010-01-01

    Bayesian optimization with Gaussian processes has become an increasingly popular tool in the machine learning community. It is efficient and can be used when very little is known about the objective function, making it popular in expensive black-box optimization scenarios. It is able to do this by sampling the objective using an acquisition function which incorporates the model's estimate of the objective and the uncertainty at any given point. However, there are several different parameterized acquisition functions in the literature, and it is often unclear which one to use. Instead of using a single acquisition function, we adopt a portfolio of acquisition functions governed by an online multi-armed bandit strategy. We describe the method, which we call GP-Hedge, and show that this method almost always outperforms the best individual acquisition function.
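
    The portfolio mechanism behind GP-Hedge can be sketched in isolation: each strategy nominates a point, one nomination is chosen with probability softmax of the accumulated gains, and every strategy's gain is updated with the objective value at its own nominee. The three "acquisition functions" below are hypothetical stand-ins, not real EI/PI/UCB computations on a Gaussian process.

```python
import numpy as np

rng = np.random.default_rng(3)

def hedge_probs(gains, eta=1.0):
    """Hedge selection probabilities: a numerically stable softmax of gains."""
    p = np.exp(eta * (gains - gains.max()))
    return p / p.sum()

f = lambda x: -(x - 0.3) ** 2                    # objective to maximize
nominees = [lambda: 0.3 + rng.normal(0, 0.05),   # a well-calibrated strategy
            lambda: rng.uniform(0, 1),           # pure exploration
            lambda: 0.9]                         # a consistently poor strategy

gains = np.zeros(3)
for _ in range(200):
    xs = [nominate() for nominate in nominees]
    k = rng.choice(3, p=hedge_probs(gains))      # index of the point evaluated next
    gains += np.array([f(x) for x in xs])        # reward each strategy's nominee
print(hedge_probs(gains).round(3))
```

    Over the rounds the selection probability concentrates on whichever strategy keeps nominating high-value points, which is the sense in which GP-Hedge "almost always" tracks the best individual acquisition function. (In the full algorithm the rewards come from the GP posterior mean rather than the true objective.)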

  9. Bayesian Inference with Optimal Maps

    CERN Document Server

    Moselhy, Tarek A El

    2011-01-01

    We present a new approach to Bayesian inference that entirely avoids Markov chain simulation, by constructing a map that pushes forward the prior measure to the posterior measure. Existence and uniqueness of a suitable measure-preserving map is established by formulating the problem in the context of optimal transport theory. We discuss various means of explicitly parameterizing the map and computing it efficiently through solution of an optimization problem, exploiting gradient information from the forward model when possible. The resulting algorithm overcomes many of the computational bottlenecks associated with Markov chain Monte Carlo. Advantages of a map-based representation of the posterior include analytical expressions for posterior moments and the ability to generate arbitrary numbers of independent posterior samples without additional likelihood evaluations or forward solves. The optimization approach also provides clear convergence criteria for posterior approximation and facilitates model selectio...
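
    The simplest instance of such a map is the one-dimensional Gaussian case, where the monotone map from prior to posterior is linear and posterior samples come for free, with no Markov chain. The numbers below are an illustrative conjugate example, not the parameterized maps of the paper.

```python
import numpy as np

rng = np.random.default_rng(4)

# 1-D Gaussian prior N(m0, s0^2); conjugate update with one observation
# y = 1.5 under unit observation noise gives posterior N(m1, s1^2).
m0, s0 = 0.0, 2.0
s1 = (1 / s0**2 + 1 / 1.0**2) ** -0.5
m1 = s1**2 * (m0 / s0**2 + 1.5 / 1.0**2)

# The monotone transport map T pushes the prior forward to the posterior.
T = lambda x: m1 + (s1 / s0) * (x - m0)
prior_samples = rng.normal(m0, s0, 200_000)
post_samples = T(prior_samples)                  # independent posterior samples
print(round(post_samples.mean(), 3), round(post_samples.std(), 3))
```

    Once the map is in hand, arbitrarily many independent posterior samples cost only map evaluations, which is the advantage the abstract highlights over MCMC.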

  10. Bayesian Approach to Inverse Problems

    CERN Document Server

    2008-01-01

    Many scientific, medical or engineering problems raise the issue of recovering some physical quantities from indirect measurements; for instance, detecting or quantifying flaws or cracks within a material from acoustic or electromagnetic measurements at its surface is an essential problem of non-destructive evaluation. The concept of inverse problems precisely originates from the idea of inverting the laws of physics to recover a quantity of interest from measurable data. Unfortunately, most inverse problems are ill-posed, which means that precise and stable solutions are not easy to devise. Regularization is the key concept to solve inverse problems. The goal of this book is to deal with inverse problems and regularized solutions using Bayesian statistical tools, with a particular view to signal and image estimation.

  11. A Bayesian Reflection on Surfaces

    Directory of Open Access Journals (Sweden)

    David R. Wolf

    1999-10-01

    Full Text Available The topic of this paper is a novel Bayesian continuous-basis field representation and inference framework. Within this paper several problems are solved: the maximally informative inference of continuous-basis fields, that is, where the basis for the field is itself a continuous object and not representable in a finite manner; the tradeoff between accuracy of representation in terms of information learned, and memory or storage capacity in bits; the approximation of probability distributions so that a maximal amount of information about the object being inferred is preserved; and an information-theoretic justification for multigrid methodology. The maximally informative field inference framework is described in full generality and denoted the Generalized Kalman Filter. The Generalized Kalman Filter allows the update of field knowledge from previous knowledge at any scale, and new data, to new knowledge at any other scale. An application example instance, the inference of continuous surfaces from measurements (for example, camera image data), is presented.

  12. MACROECONOMIC FORECASTING USING BAYESIAN VECTOR AUTOREGRESSIVE APPROACH

    Directory of Open Access Journals (Sweden)

    D. Tutberidze

    2017-04-01

    Full Text Available There are many arguments that can be advanced to support the forecasting activities of business entities. The underlying argument in favor of forecasting is that managerial decisions depend significantly on a proper evaluation of future trends, as market conditions are constantly changing and require a detailed analysis of future dynamics. The article discusses the importance of using a sound macro-econometric tool by suggesting the idea of conditional forecasting through a Vector Autoregressive (VAR) modeling framework. Under this framework, a macroeconomic model for the Georgian economy is constructed with the few variables believed to be shaping the business environment. Based on the model, forecasts of macroeconomic variables are produced, and three types of scenarios are analyzed - a baseline and two alternative ones. The results of the study provide confirmatory evidence that the suggested methodology adequately addresses the research phenomenon and can be used widely by business entities in responding to their strategic and operational planning challenges. Given this set-up, it is shown empirically that the Bayesian Vector Autoregressive approach provides reasonable forecasts for the variables of interest.
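
    A minimal Bayesian VAR(1) can be sketched by noting that a Normal prior on the coefficient matrix makes the posterior mean a ridge-regularized least-squares estimate. The three-variable system, its coefficients, and the shrinkage strength below are synthetic stand-ins for a small macro model, not the Georgian data.

```python
import numpy as np

rng = np.random.default_rng(5)

# Simulate a stationary three-variable VAR(1): y_t = A y_{t-1} + noise.
T, k = 200, 3
A_true = np.array([[0.5, 0.1, 0.0],
                   [0.0, 0.6, 0.1],
                   [0.1, 0.0, 0.4]])
y = np.zeros((T, k))
for t in range(1, T):
    y[t] = y[t - 1] @ A_true.T + rng.normal(0, 0.1, k)

# Posterior mean under a zero-mean Normal prior = ridge regression.
X, Y = y[:-1], y[1:]
lam = 0.1                                   # prior precision (mild shrinkage)
A_post = np.linalg.solve(X.T @ X + lam * np.eye(k), X.T @ Y).T

# Iterated one-step forecasts from the last observed state (a baseline path).
h, path = 4, [y[-1]]
for _ in range(h):
    path.append(A_post @ path[-1])
print(np.round(A_post, 2))
```

    Conditional scenarios are then produced by fixing the future path of one variable and iterating the remaining equations; richer priors (e.g. Minnesota-style shrinkage toward a random walk) change only the penalty term.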

  13. Bayesian Model Selection for LISA Pathfinder

    CERN Document Server

    Karnesis, Nikolaos; Sopuerta, Carlos F; Gibert, Ferran; Armano, Michele; Audley, Heather; Congedo, Giuseppe; Diepholz, Ingo; Ferraioli, Luigi; Hewitson, Martin; Hueller, Mauro; Korsakova, Natalia; Plagnol, Eric; Vitale, and Stefano

    2013-01-01

    The main goal of the LISA Pathfinder (LPF) mission is to fully characterize the acceleration noise models and to test key technologies for future space-based gravitational-wave observatories similar to the LISA/eLISA concept. The Data Analysis (DA) team has developed complex three-dimensional models of the LISA Technology Package (LTP) experiment on board LPF. These models are used for simulations, but more importantly, they will be used for parameter estimation purposes during flight operations. One of the tasks of the DA team is to identify the physical effects that contribute significantly to the properties of the instrument noise. One way of approaching this problem is to recover the essential parameters of the LTP which describe the data. Thus, we want to define the simplest model that efficiently explains the observations. To do so, adopting a Bayesian framework, one has to estimate the so-called Bayes factor between two competing models. In our analysis, we use three main different methods to estimate...
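
    The role of the Bayes factor in picking the simplest adequate model can be shown on a toy nested comparison with a closed-form answer (not one of the estimation methods used for the LTP models): M0 says the data have zero mean, M1 puts a Gaussian prior on an unknown mean.

```python
import numpy as np

rng = np.random.default_rng(6)

# M0: y_i ~ N(0, 1).  M1: y_i ~ N(mu, 1) with mu ~ N(0, tau^2).
# The marginal likelihoods are Gaussian, so the log Bayes factor is analytic:
# log BF10 = -0.5 log(1 + n tau^2) + (n ybar)^2 tau^2 / (2 (1 + n tau^2)).
def log_bf10(y, tau=1.0):
    n, ybar = len(y), y.mean()
    return (-0.5 * np.log(1 + n * tau**2)
            + (n * ybar) ** 2 * tau**2 / (2 * (1 + n * tau**2)))

y_null = rng.normal(0.0, 1.0, 50)     # data generated under M0
y_eff = rng.normal(0.8, 1.0, 50)      # data generated with a real effect
print(log_bf10(y_null), log_bf10(y_eff))   # negative favors M0, positive M1
```

    The first term is an automatic Occam penalty against the more flexible model, so the extra parameter is kept only when the data demand it, which is exactly the "simplest model that efficiently explains the observations" criterion.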

  14. Bayesian Networks in Intrusion Detection Systems

    Directory of Open Access Journals (Sweden)

    M. Mehdi

    2007-01-01

    Full Text Available Intrusion detection systems (IDSs) have been widely used to overcome security threats in computer networks. Anomaly-based approaches have the advantage of being able to detect previously unknown attacks, but they suffer from the difficulty of building robust models of acceptable behaviour, which may result in a large number of false alarms caused by incorrect classification of events in current systems. We propose a new approach to an anomaly intrusion detection system (IDS). It consists of building a reference behaviour model and using a Bayesian classification procedure associated with an unsupervised learning algorithm to evaluate the deviation between current and reference behaviour. Continuous re-estimation of model parameters allows for real-time operation. The use of recursive log-likelihood and entropy estimation as measures for monitoring model degradation related to behaviour changes, together with the associated model update, shows that the accuracy of the event classification process is significantly improved by our proposed approach, while missed alarms are reduced.
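
    The likelihood-based monitoring idea can be sketched as: fit a reference model of normal behaviour, then flag events whose log-likelihood under that model falls below a threshold set from the reference data itself. The Gaussian feature model and all numbers below are illustrative assumptions, not the paper's classifier.

```python
import numpy as np

rng = np.random.default_rng(7)

# Reference model of normal behaviour: independent Gaussian features.
ref = rng.normal(0, 1, (2000, 4))          # feature vectors of normal events
mu, sigma = ref.mean(0), ref.std(0)

def loglik(x):
    """Log-likelihood of event feature vector(s) under the reference model."""
    z = (x - mu) / sigma
    return -0.5 * np.sum(z**2 + np.log(2 * np.pi * sigma**2), axis=-1)

threshold = np.percentile(loglik(ref), 1)  # 1% false-alarm budget

normal_event = rng.normal(0, 1, 4)
attack_event = rng.normal(4, 1, 4)         # deviates strongly from the reference
print(loglik(normal_event) > threshold, loglik(attack_event) > threshold)
```

    Re-estimating `mu` and `sigma` recursively as traffic drifts, and watching the likelihood of recent events, corresponds to the model-degradation monitoring described in the abstract.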

  15. Compiling Relational Bayesian Networks for Exact Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred; Darwiche, Adnan; Chavira, Mark

    2006-01-01

    We describe in this paper a system for exact inference with relational Bayesian networks as defined in the publicly available PRIMULA tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference by evaluating and differentiating these circuits in time linear in their size. We report on experimental results showing successful compilation and efficient inference on relational Bayesian networks, whose PRIMULA-generated propositional instances have thousands of variables, and whose jointrees have clusters...

  16. Bayesian models a statistical primer for ecologists

    CERN Document Server

    Hobbs, N Thompson

    2015-01-01

    Bayesian modeling has become an indispensable tool for ecological research because it is uniquely suited to deal with complexity in a statistically coherent way. This textbook provides a comprehensive and accessible introduction to the latest Bayesian methods-in language ecologists can understand. Unlike other books on the subject, this one emphasizes the principles behind the computations, giving ecologists a big-picture understanding of how to implement this powerful statistical approach. Bayesian Models is an essential primer for non-statisticians. It begins with a definition of probabili

  17. An Enhancement of Bayesian Inference Network for Ligand-Based Virtual Screening using Features Selection

    Directory of Open Access Journals (Sweden)

    Ali Ahmed

    2011-01-01

    Full Text Available Problem statement: Similarity-based Virtual Screening (VS) deals with a large amount of data containing irrelevant and/or redundant fragments or features. The recent use of Bayesian networks as an alternative to existing tools for similarity-based VS has received noticeable attention from researchers in the field of chemoinformatics. Approach: To this end, different models of Bayesian network have been developed. In this study, we enhance the Bayesian Inference Network (BIN) using a subset of selected molecular features. Results: In this approach, a few features were filtered from the molecular fingerprint features based on a feature selection approach. Conclusion: Simulated virtual screening experiments with MDL Drug Data Report (MDDR) data sets showed that the proposed method provides simple ways of enhancing the cost-effectiveness of ligand-based virtual screening searches, especially for higher-diversity data sets.

  18. The Diagnosis of Reciprocating Machinery by Bayesian Networks

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    A Bayesian Network is a reasoning tool based on probability theory and has many advantages that other reasoning tools do not have. This paper discusses the basic theory of Bayesian networks and studies the problems in constructing Bayesian networks. The paper also constructs a Bayesian diagnosis network of a reciprocating compressor. The example helps us to draw a conclusion that Bayesian diagnosis networks can diagnose reciprocating machinery effectively.

  19. A Bayesian Combination Forecasting Model for Retail Supply Chain Coordination

    Directory of Open Access Journals (Sweden)

    W.J. Wang

    2014-04-01

    Full Text Available Retailing plays an important part in modern economic development, and supply chain coordination is a research focus in retail operations management. This paper reviews the collaborative forecasting process within the collaborative planning, forecasting and replenishment framework of the retail supply chain. A Bayesian combination forecasting model is proposed to integrate multiple forecasting resources and coordinate forecasting processes among partners in the retail supply chain. Based on simulation results for retail sales, the effectiveness of this combination forecasting model is demonstrated for coordinating the collaborative forecasting processes, resulting in an improvement of demand forecasting accuracy in the retail supply chain.
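
    A Bayesian combination of forecasts can be sketched by weighting each partner's forecaster with its posterior probability given past forecast errors, then averaging next-period forecasts with those weights. The demand series, error model, and forecast values below are synthetic stand-ins, not the paper's simulation setup.

```python
import numpy as np

rng = np.random.default_rng(8)

# Past retail demand and three partners' historical forecasts of it.
actual = rng.normal(100, 5, 30)
forecasts = np.vstack([actual + rng.normal(0, 2, 30),   # a sharp forecaster
                       actual + rng.normal(0, 6, 30),   # a noisier forecaster
                       actual + rng.normal(3, 6, 30)])  # noisy and biased

# Posterior model weights under a Gaussian error model with equal priors.
sigma = 4.0                                   # assumed forecast-error scale
loglik = -0.5 * np.sum((forecasts - actual) ** 2, axis=1) / sigma**2
w = np.exp(loglik - loglik.max())
w /= w.sum()

next_f = np.array([102.0, 108.0, 111.0])      # partners' next-period forecasts
print(w.round(3), round(float(w @ next_f), 1))
```

    The combined forecast automatically leans on whichever partner has been most accurate, while still updating the weights as new demand observations arrive.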

  20. Implementing relevance feedback in ligand-based virtual screening using Bayesian inference network.

    Science.gov (United States)

    Abdo, Ammar; Salim, Naomie; Ahmed, Ali

    2011-10-01

    Recently, the use of the Bayesian network as an alternative to existing tools for similarity-based virtual screening has received noticeable attention from researchers in the chemoinformatics field. The main aim of the Bayesian network model is to improve the retrieval effectiveness of similarity-based virtual screening. To this end, different models of the Bayesian network have been developed. In our previous works, the retrieval performance of the Bayesian network was observed to improve significantly when multiple reference structures or fragment weightings were used. In this article, the authors enhance the Bayesian inference network (BIN) using the relevance feedback information. In this approach, a few high-ranking structures of unknown activity were filtered from the outputs of BIN, based on a single active reference structure, to form a set of active reference structures. This set of active reference structures was used in two distinct techniques for carrying out such BIN searching: reweighting the fragments in the reference structures and group fusion techniques. Simulated virtual screening experiments with three MDL Drug Data Report data sets showed that the proposed techniques provide simple ways of enhancing the cost-effectiveness of ligand-based virtual screening searches, especially for higher diversity data sets.

  1. A Bayesian approach to matched field processing in uncertain ocean environments

    Institute of Scientific and Technical Information of China (English)

    LI Jianlong; PAN Xiang

    2008-01-01

    An approach to Bayesian Matched Field Processing (MFP) was discussed for the uncertain ocean environment. In this approach, uncertainty knowledge is modeled and the spatial and temporal data received by the array are fully used. Therefore, a mechanism for MFP is found which combines model-based and data-driven methods of uncertain field processing. By theoretical derivation, simulation analysis and validation against experimental array data at sea, we find that (1) the basic components of Bayesian matched field processors are the corresponding sets of Bartlett matched field processors, MVDR (minimum variance distortionless response) matched field processors, etc.; (2) Bayesian MVDR/Bartlett MFP are the weighted sums of the MVDR/Bartlett MFP, where the weighting coefficients are the values of the a posteriori probability; (3) in the uncertain ocean environment, Bayesian MFP can locate the source more correctly than MVDR MFP or Bartlett MFP; (4) Bayesian MFP can better suppress sidelobes of the ambiguity surfaces.

  2. An overview of Approximate Bayesian computation

    Directory of Open Access Journals (Sweden)

    Baragatti Meïli

    2014-01-01

    Full Text Available Approximate Bayesian computation techniques, also called likelihood-free methods, are among the most satisfactory approaches to intractable likelihood problems. This overview presents recent results since their introduction in population genetics about ten years ago.

  3. A Bayesian Concept Learning Approach to Crowdsourcing

    DEFF Research Database (Denmark)

    Viappiani, Paolo Renato; Zilles, Sandra; Hamilton, Howard J.

    2011-01-01

    We develop a Bayesian approach to concept learning for crowdsourcing applications. A probabilistic belief over possible concept definitions is maintained and updated according to (noisy) observations from experts, whose behaviors are modeled using discrete types. We propose recommendation...

  4. Bayesian analysis for the social sciences

    CERN Document Server

    Jackman, Simon

    2009-01-01

    Bayesian methods are increasingly being used in the social sciences, as the problems encountered lend themselves so naturally to the subjective qualities of Bayesian methodology. This book provides an accessible introduction to Bayesian methods, tailored specifically for social science students. It contains lots of real examples from political science, psychology, sociology, and economics, exercises in all chapters, and detailed descriptions of all the key concepts, without assuming any background in statistics beyond a first course. It features examples of how to implement the methods using WinBUGS - the most widely used Bayesian analysis software in the world - and R - an open-source statistical software. The book is supported by a website featuring WinBUGS and R code, and data sets.

  5. Bayesian Uncertainty Analyses Via Deterministic Model

    Science.gov (United States)

    Krzysztofowicz, R.

    2001-05-01

    Rational decision-making requires that the total uncertainty about a variate of interest (a predictand) be quantified in terms of a probability distribution, conditional on all available information and knowledge. Suppose the state-of-knowledge is embodied in a deterministic model, which is imperfect and outputs only an estimate of the predictand. Fundamentals are presented of three Bayesian approaches to producing a probability distribution of the predictand via any deterministic model. The Bayesian Processor of Output (BPO) quantifies the total uncertainty in terms of a posterior distribution, conditional on model output. The Bayesian Processor of Ensemble (BPE) quantifies the total uncertainty in terms of a posterior distribution, conditional on an ensemble of model output. The Bayesian Forecasting System (BFS) decomposes the total uncertainty into input uncertainty and model uncertainty, which are characterized independently and then integrated into a predictive distribution.
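
    The BPO idea can be sketched in the all-Gaussian case, where the joint distribution of (predictand, model output) is learned from past pairs and the posterior given a new output is a closed-form conditional. The data and numbers below are synthetic and illustrative, not the BPO calibration machinery itself.

```python
import numpy as np

rng = np.random.default_rng(9)

# Past pairs: true predictand values w and the deterministic model's outputs x.
w = rng.normal(10, 3, 500)
x = 0.8 * w + 2 + rng.normal(0, 1.5, 500)   # imperfect, biased model output

# Gaussian conditional: W | X = x_new is normal with these moments.
cov = np.cov(w, x)
def posterior(x_new):
    mean = w.mean() + cov[0, 1] / cov[1, 1] * (x_new - x.mean())
    var = cov[0, 0] - cov[0, 1] ** 2 / cov[1, 1]
    return mean, float(np.sqrt(var))

m, s = posterior(12.0)
print(round(m, 2), round(s, 2))   # a full predictive distribution, not a point estimate
```

    The posterior spread quantifies the total uncertainty left after seeing the model output, which is what a rational decision-maker needs rather than the deterministic estimate alone.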

  6. GPstuff: Bayesian Modeling with Gaussian Processes

    NARCIS (Netherlands)

    Vanhatalo, J.; Riihimaki, J.; Hartikainen, J.; Jylänki, P.P.; Tolvanen, V.; Vehtari, A.

    2013-01-01

    The GPstuff toolbox is a versatile collection of Gaussian process models and computational tools required for Bayesian inference. The tools include, among others, various inference methods, sparse approximations and model assessment methods.

  7. Domino effect analysis using Bayesian networks.

    Science.gov (United States)

    Khakzad, Nima; Khan, Faisal; Amyotte, Paul; Cozzani, Valerio

    2013-02-01

    A new methodology is introduced based on Bayesian networks, both to model domino effect propagation patterns and to estimate the domino effect probability at different levels. The flexible structure and the unique modeling techniques offered by Bayesian networks make it possible to analyze domino effects through a probabilistic framework, considering synergistic effects, noisy probabilities, and common cause failures. Further, the uncertainties and the complex interactions among the domino effect components are captured using the Bayesian network. The probabilities of events are updated in the light of new information, and the most probable path of the domino effect is determined on the basis of the new data gathered. This study shows how probability updating helps to update the domino effect model either qualitatively or quantitatively. The methodology is applied to a hypothetical example and also to an earlier-studied case study. These examples accentuate the effectiveness of Bayesian networks in modeling domino effects in a processing facility. © 2012 Society for Risk Analysis.
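    One of the modeling devices the abstract mentions, noisy probabilities for escalation, can be sketched with a noisy-OR gate, a standard Bayesian-network construction for combining independent causes (the example values are hypothetical, not from the paper):

```python
def noisy_or(parent_probs, escalation_probs):
    """P(secondary accident) when each primary event i occurs with
    probability p_i and, if it occurs, escalates with probability q_i,
    the paths acting independently (noisy-OR combination)."""
    p_no_escalation = 1.0
    for p, q in zip(parent_probs, escalation_probs):
        p_no_escalation *= 1.0 - p * q
    return 1.0 - p_no_escalation

# Two primary fires, each certain to occur, each escalating with probability 0.5:
print(noisy_or([1.0, 1.0], [0.5, 0.5]))  # -> 0.75
```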

  8. An Intuitive Dashboard for Bayesian Network Inference

    Science.gov (United States)

    Reddy, Vikas; Charisse Farr, Anna; Wu, Paul; Mengersen, Kerrie; Yarlagadda, Prasad K. D. V.

    2014-03-01

    Current Bayesian network software packages provide good graphical interfaces for users who design and develop Bayesian networks for various applications. However, the intended end-users of these networks may not necessarily find such an interface appealing, and at times it can be overwhelming, particularly when the number of nodes in the network is large. To circumvent this problem, this paper presents an intuitive dashboard, which provides an additional layer of abstraction, enabling end-users to easily perform inferences over Bayesian networks. Unlike most software packages, which display the nodes and arcs of the network, the developed tool organises the nodes based on cause-and-effect relationships, making the user interaction more intuitive and friendly. In addition to performing various types of inferences, users can conveniently use the tool to verify the behaviour of the developed Bayesian network. The tool has been developed using the Qt and SMILE libraries in C++.

  9. Variational Bayesian Approximation methods for inverse problems

    Science.gov (United States)

    Mohammad-Djafari, Ali

    2012-09-01

    Variational Bayesian Approximation (VBA) methods are recent tools for effective Bayesian computations. In this paper, these tools are used for inverse problems where the prior models include hidden variables and where the estimation of the hyperparameters also has to be addressed. In particular, two specific prior models (Student-t and mixture-of-Gaussians models) are considered and details of the algorithms are given.

  10. Bayesian Modeling of a Human MMORPG Player

    CERN Document Server

    Synnaeve, Gabriel

    2010-01-01

    This paper describes an application of Bayesian programming to the control of an autonomous avatar in a multiplayer role-playing game (the example is based on World of Warcraft). We model a particular task, which consists of choosing what to do and to select which target in a situation where allies and foes are present. We explain the model in Bayesian programming and show how we could learn the conditional probabilities from data gathered during human-played sessions.

  11. Bayesian Modeling of a Human MMORPG Player

    Science.gov (United States)

    Synnaeve, Gabriel; Bessière, Pierre

    2011-03-01

    This paper describes an application of Bayesian programming to the control of an autonomous avatar in a multiplayer role-playing game (the example is based on World of Warcraft). We model a particular task, which consists of choosing what to do and to select which target in a situation where allies and foes are present. We explain the model in Bayesian programming and show how we could learn the conditional probabilities from data gathered during human-played sessions.

  12. Fuzzy Functional Dependencies and Bayesian Networks

    Institute of Scientific and Technical Information of China (English)

    LIU WeiYi(刘惟一); SONG Ning(宋宁)

    2003-01-01

    Bayesian networks have become a popular technique for representing and reasoning with probabilistic information. The fuzzy functional dependency is an important kind of data dependencies in relational databases with fuzzy values. The purpose of this paper is to set up a connection between these data dependencies and Bayesian networks. The connection is done through a set of methods that enable people to obtain the most information of independent conditions from fuzzy functional dependencies.

  13. A Bayesian classifier for symbol recognition

    OpenAIRE

    Barrat, Sabine; Tabbone, Salvatore; Nourrissier, Patrick

    2007-01-01

    URL : http://www.buyans.com/POL/UploadedFile/134_9977.pdf; International audience; We present in this paper an original adaptation of Bayesian networks to the symbol recognition problem. More precisely, a descriptor combination method is presented which significantly improves the recognition rate compared with the rates obtained by each descriptor alone. In this perspective, we use a simple Bayesian classifier, called naive Bayes. In fact, probabilistic graphical models, more spec...

  14. Bayesian Inversion of Seabed Scattering Data

    Science.gov (United States)

    2014-09-30

    ...uncertainties as well as parameter values, thereby quantifying the information content of the data to resolve the model parameters. Bayesian inversion... A method based on the deviance information criterion to determine the dominant scattering mechanism is in development. This work has been compiled in two... [Report: Bayesian Inversion of Seabed Scattering Data (Special Research Award in Ocean Acoustics), Gavin A.M.W. Steininger, School of Earth & Ocean]

  15. Bayesian target tracking based on particle filter

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    To deal with nonlinear or non-Gaussian problems, particle filters have been studied by many researchers. Based on the particle filter, an extended Kalman filter (EKF) proposal function is applied to Bayesian target tracking. Novel techniques such as the Markov chain Monte Carlo (MCMC) method and the resampling step are also introduced into Bayesian target tracking. The simulation results confirm that the improved particle filter with these techniques outperforms the basic one.
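    A bootstrap particle filter of the kind this line of work builds on can be sketched for a scalar random-walk target observed in Gaussian noise (a basic filter with multinomial resampling; the EKF proposal and MCMC move step mentioned in the abstract are refinements on top of this, and all parameter values here are illustrative):

```python
import numpy as np

def particle_filter(ys, n=1000, q=1.0, r=1.0, seed=0):
    """Bootstrap particle filter for x_t = x_{t-1} + N(0, q),
    y_t = x_t + N(0, r). Returns the posterior mean at each step."""
    rng = np.random.default_rng(seed)
    parts = rng.normal(0.0, 1.0, n)          # initial particle cloud
    means = []
    for y in ys:
        parts = parts + rng.normal(0.0, np.sqrt(q), n)   # propagate
        logw = -0.5 * (y - parts) ** 2 / r               # Gaussian likelihood
        w = np.exp(logw - logw.max())
        w /= w.sum()
        parts = parts[rng.choice(n, size=n, p=w)]        # multinomial resampling
        means.append(parts.mean())
    return np.array(means)

# Feeding a constant observation drives the filter mean toward it.
est = particle_filter([5.0] * 20)
```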

  16. ProFit: Bayesian galaxy fitting tool

    Science.gov (United States)

    Robotham, A. S. G.; Taranu, D.; Tobar, R.

    2016-12-01

    ProFit is a Bayesian galaxy fitting tool that uses the fast C++ image generation library libprofit (ascl:1612.003) and a flexible R interface to a large number of likelihood samplers. It offers a fully featured Bayesian interface to galaxy model fitting (also called profiling), using mostly the same standard inputs as other popular codes (e.g. GALFIT ascl:1104.010), but it is also able to use complex priors and a number of likelihoods.

  17. Bayesian Procedures for Identifying Aberrant Response-Time Patterns in Adaptive Testing

    NARCIS (Netherlands)

    van der Linden, Willem J.; Guo, Fanmin

    In order to identify aberrant response-time patterns on educational and psychological tests, it is important to be able to separate the speed at which the test taker operates from the time the items require. A lognormal model for response times with this feature was used to derive a Bayesian

  18. Bayesian Procedures for Identifying Aberrant Response-Time Patterns in Adaptive Testing

    NARCIS (Netherlands)

    Linden, van der Wim J.; Guo, Fanmin

    2008-01-01

    In order to identify aberrant response-time patterns on educational and psychological tests, it is important to be able to separate the speed at which the test taker operates from the time the items require. A lognormal model for response times with this feature was used to derive a Bayesian procedu

  19. Bayesian Procedures for Identifying Aberrant Response-Time Patterns in Adaptive Testing

    NARCIS (Netherlands)

    van der Linden, Willem J.; Guo, Fanmin

    2008-01-01

    In order to identify aberrant response-time patterns on educational and psychological tests, it is important to be able to separate the speed at which the test taker operates from the time the items require. A lognormal model for response times with this feature was used to derive a Bayesian procedu

  20. Planning of O&M for Offshore Wind Turbines using Bayesian Graphical Models

    DEFF Research Database (Denmark)

    Nielsen, Jannie Jessen; Sørensen, John Dalsgaard

    2010-01-01

    The costs of operation and maintenance (O&M) for offshore wind turbines are large, and risk-based planning of O&M has the potential to reduce these costs. This paper presents how Bayesian graphical models can be used to establish a probabilistic damage model and include data from imperfect...

  1. Philosophy and the practice of Bayesian statistics.

    Science.gov (United States)

    Gelman, Andrew; Shalizi, Cosma Rohilla

    2013-02-01

    A substantial school in the philosophy of science identifies Bayesian inference with inductive inference and even rationality as such, and seems to be strengthened by the rise and practical success of Bayesian statistics. We argue that the most successful forms of Bayesian statistics do not actually support that particular philosophy but rather accord much better with sophisticated forms of hypothetico-deductivism. We examine the actual role played by prior distributions in Bayesian models, and the crucial aspects of model checking and model revision, which fall outside the scope of Bayesian confirmation theory. We draw on the literature on the consistency of Bayesian updating and also on our experience of applied work in social science. Clarity about these matters should benefit not just philosophy of science, but also statistical practice. At best, the inductivist view has encouraged researchers to fit and compare models without checking them; at worst, theorists have actively discouraged practitioners from performing model checking because it does not fit into their framework.

  2. An assessment of Bayesian bias estimator for numerical weather prediction

    Directory of Open Access Journals (Sweden)

    J. Son

    2008-12-01

    Full Text Available Various statistical methods are used to process operational Numerical Weather Prediction (NWP products with the aim of reducing forecast errors and they often require sufficiently large training data sets. Generating such a hindcast data set for this purpose can be costly and a well designed algorithm should be able to reduce the required size of these data sets.

    This issue is investigated with the relatively simple case of bias correction, by comparing a Bayesian algorithm of bias estimation with the conventionally used empirical method. As available forecast data sets are not large enough for a comprehensive test, synthetically generated time series representing the analysis (truth and forecast are used to increase the sample size. Since these synthetic time series retained the statistical characteristics of the observations and operational NWP model output, the results of this study can be extended to real observation and forecasts and this is confirmed by a preliminary test with real data.

    By using the climatological mean and standard deviation of the meteorological variable in consideration and the statistical relationship between the forecast and the analysis, the Bayesian bias estimator outperforms the empirical approach in terms of the accuracy of the estimated bias, and it can reduce the required size of the training sample by a factor of 3. This advantage of the Bayesian approach is due to the fact that it is less liable to the sampling error in consecutive sampling. These results suggest that a carefully designed statistical procedure may reduce the need for the costly generation of large hindcast datasets.
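    A conjugate-normal sketch of the idea (not the paper's exact estimator; the prior and noise variances below are illustrative) shows how climatological information shrinks the bias estimate when the training sample is short:

```python
def bayes_bias(errors, prior_mean=0.0, prior_var=1.0, noise_var=4.0):
    """Posterior mean of a constant forecast bias b, given
    forecast-minus-analysis errors e_t = b + noise, noise ~ N(0, noise_var),
    and a climatology-based prior b ~ N(prior_mean, prior_var)."""
    n = len(errors)
    post_var = 1.0 / (1.0 / prior_var + n / noise_var)
    return post_var * (prior_mean / prior_var + sum(errors) / noise_var)

# With a single training error of 2.0, the estimate is shrunk toward the
# prior, unlike the empirical mean, which would report 2.0 outright.
```

The shrinkage toward climatology is what makes the Bayesian estimator less liable to sampling error on short training sets.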

  3. Bayesian analysis of rotating machines - A statistical approach to estimate and track the fundamental frequency

    DEFF Research Database (Denmark)

    Pedersen, Thorkild Find

    2003-01-01

    Rotating and reciprocating mechanical machines emit acoustic noise and vibrations when they operate. Typically, the noise and vibrations are concentrated in narrow frequency bands related to the running speed of the machine. The frequency of the running speed is referred to as the fundamental...... of an adaptive comb filter is derived for tracking non-stationary signals. The estimation problem is then rephrased in terms of the Bayesian statistical framework. In the Bayesian framework both parameters and observations are considered stochastic processes. The result of the estimation is an expression...

  4. A Bayesian subgroup analysis with a zero-enriched Polya Urn scheme.

    Science.gov (United States)

    Sivaganesan, S; Laud, Purushottam W; Müller, Peter

    2011-02-20

    We introduce a new approach to inference for subgroups in clinical trials. We use Bayesian model selection, and a threshold on posterior model probabilities to identify subgroup effects for reporting. For each covariate of interest, we define a separate class of models, and use the posterior probability associated with each model and the threshold to determine the existence of a subgroup effect. As usual in Bayesian clinical trial design we compute frequentist operating characteristics, and achieve the desired error probabilities by choosing an appropriate threshold(s) for the posterior probabilities. 2010 John Wiley & Sons, Ltd.

  5. Bayesian inference for OPC modeling

    Science.gov (United States)

    Burbine, Andrew; Sturtevant, John; Fryer, David; Smith, Bruce W.

    2016-03-01

    The use of optical proximity correction (OPC) demands increasingly accurate models of the photolithographic process. Model building and inference techniques in the data science community have seen great strides in the past two decades which make better use of available information. This paper aims to demonstrate the predictive power of Bayesian inference as a method for parameter selection in lithographic models by quantifying the uncertainty associated with model inputs and wafer data. Specifically, the method combines the model builder's prior information about each modelling assumption with the maximization of each observation's likelihood as a Student's t-distributed random variable. Through the use of a Markov chain Monte Carlo (MCMC) algorithm, a model's parameter space is explored to find the most credible parameter values. During parameter exploration, the parameters' posterior distributions are generated by applying Bayes' rule, using a likelihood function and the a priori knowledge supplied. The MCMC algorithm used, an affine invariant ensemble sampler (AIES), is implemented by initializing many walkers which semi-independently explore the space. The convergence of these walkers to global maxima of the likelihood volume determines the parameter values' highest density intervals (HDI) to reveal champion models. We show that this method of parameter selection provides insights into the data that traditional methods do not and outline continued experiments to vet the method.
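    The stretch move at the heart of an affine-invariant ensemble sampler is compact enough to sketch for a one-dimensional posterior (following Goodman and Weare's construction; the target density sampled here is a stand-in, not the paper's lithographic model):

```python
import numpy as np

def aies_sample(logp, n_walkers=20, n_steps=2000, a=2.0, seed=1):
    """1-D affine-invariant ensemble sampler (stretch move).
    Returns samples from the second half of the run."""
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, 1.0, n_walkers)
    lp = np.array([logp(v) for v in x])
    chain = []
    for _ in range(n_steps):
        for i in range(n_walkers):
            j = rng.integers(n_walkers - 1)
            j = j if j < i else j + 1        # pick a different walker
            # draw z from g(z) ~ 1/sqrt(z) on [1/a, a] via inverse CDF
            z = (rng.random() * (np.sqrt(a) - 1 / np.sqrt(a))
                 + 1 / np.sqrt(a)) ** 2
            y = x[j] + z * (x[i] - x[j])     # stretch move proposal
            lpy = logp(y)
            # in d = 1 the Jacobian factor z**(d-1) is 1
            if np.log(rng.random()) < lpy - lp[i]:
                x[i], lp[i] = y, lpy
        chain.append(x.copy())
    return np.concatenate(chain[n_steps // 2:])

# Sampling a standard normal log-density recovers mean 0, std 1.
samples = aies_sample(lambda t: -0.5 * t * t)
```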

  6. Bayesian analysis of cosmic structures

    CERN Document Server

    Kitaura, Francisco-Shu

    2011-01-01

    We revise the Bayesian inference steps required to analyse the cosmological large-scale structure. Here we place special emphasis on the complications which arise due to the non-Gaussian character of the galaxy and matter distribution. In particular, we investigate the advantages and limitations of the Poisson-lognormal model and discuss how to extend this work. With the lognormal prior, using the Hamiltonian sampling technique and on scales of about 4 h^{-1} Mpc, we find that the over-dense regions are excellently reconstructed; however, under-dense regions (void statistics) are quantitatively poorly recovered. Contrary to the maximum a posteriori (MAP) solution, which was shown to over-estimate the density in the under-dense regions, we obtain lower densities than in N-body simulations. This is due to the fact that the MAP solution is conservative whereas the full posterior yields samples which are consistent with the prior statistics. The lognormal prior is not able to capture the full non-linear regime at scales ...

  7. Bayesian demography 250 years after Bayes.

    Science.gov (United States)

    Bijak, Jakub; Bryant, John

    2016-01-01

    Bayesian statistics offers an alternative to classical (frequentist) statistics. It is distinguished by its use of probability distributions to describe uncertain quantities, which leads to elegant solutions to many difficult statistical problems. Although Bayesian demography, like Bayesian statistics more generally, is around 250 years old, only recently has it begun to flourish. The aim of this paper is to review the achievements of Bayesian demography, address some misconceptions, and make the case for wider use of Bayesian methods in population studies. We focus on three applications: demographic forecasts, limited data, and highly structured or complex models. The key advantages of Bayesian methods are the ability to integrate information from multiple sources and to describe uncertainty coherently. Bayesian methods also allow for including additional (prior) information next to the data sample. As such, Bayesian approaches are complementary to many traditional methods, which can be productively re-expressed in Bayesian terms.

  8. Validation of transiting planet candidates: a Bayesian view

    Science.gov (United States)

    Díaz, Rodrigo Fernando; Almenara, Jose Manuel; Santerne, Alexandre

    2015-08-01

    Transiting candidate validation is essentially a Bayesian model comparison problem: different models, all explaining the observations comparably well, compete for the support of the available data. It has, however, two particularities that render it very complex and difficult to tackle: i) the relevant data sets are of diverse nature (transit light curves, broad band photometry, high angular resolution images, radial velocity observations, etc.), and ii) the models representing each hypothesis are highly non-linear and in some cases make the computation of the likelihood very time consuming. Despite its clear Bayesian nature, the planet validation problem has received in the past mainly a frequentist treatment (BLENDER). Other techniques exist, but they employ unrealistic models that increase speed but only partially exploit the available datasets (ValFast). The Planet Analysis and Small Transit Investigation Software (PASTIS) was developed keeping these issues and the characteristics of the problem in mind. It aims at computing the Bayesian evidence for a full set of false positive scenarios and the planet hypothesis, modelling in all cases the available data self-consistently, thus producing robust and rigorous Bayes factors for all models of interest. Its object-oriented architecture also permits constructing a vast set of false positive models easily. I will review some key results of the planet validation technique, showing the limitations and dangers of some approaches and of the validation technique in general. I will also describe the PASTIS tool and present our results on CoRoT-22 b, Kepler-22 b, and other transiting candidates.

  9. Using Bayesian networks to analyze occupational stress caused by work demands: preventing stress through social support.

    Science.gov (United States)

    García-Herrero, Susana; Mariscal, M A; Gutiérrez, J M; Ritzel, Dale O

    2013-08-01

    Occupational stress is a major health hazard and a serious challenge to the effective operation of any company and represents a major problem for both individuals and organizations. Previous research has shown that high demands (e.g. workload, emotional) combined with low resources (e.g. support, control, rewards) are associated with adverse health (e.g. psychological, physical) and organizational impacts (e.g. reduced job satisfaction, sickness absence). The objective of the present work is to create a model to analyze how social support reduces the occupational stress caused by work demands. This study used existing Spanish national data on working conditions collected by the Spanish Ministry of Labour and Immigration in 2007, where 11,054 workers were interviewed by questionnaire. A probabilistic model was built using Bayesian networks to explain the relationships between work demands and occupational stress. The model also explains how social support contributes positively to reducing stress levels. The variables studied were intellectually demanding work, overwork, workday, stress, and social support. The results show the importance of social support and of receiving help from supervisors and co-workers in preventing occupational stress. The study provides a new methodology that explains and quantifies the effects of intellectually demanding work, overwork, and workday on occupational stress. Also, the study quantifies the importance of social support in reducing occupational stress.

  10. Comparison of Two Gas Selection Methodologies: An Application of Bayesian Model Averaging

    Energy Technology Data Exchange (ETDEWEB)

    Renholds, Andrea S.; Thompson, Sandra E.; Anderson, Kevin K.; Chilton, Lawrence K.

    2006-03-31

    One goal of hyperspectral imagery analysis is the detection and characterization of plumes. Characterization includes identifying the gases in the plumes, which is a model selection problem. Two gas selection methods compared in this report are Bayesian model averaging (BMA) and minimum Akaike information criterion (AIC) stepwise regression (SR). Simulated spectral data from a three-layer radiance transfer model were used to compare the two methods. Test gases were chosen to span the types of spectra observed, which exhibit peaks ranging from broad to sharp. The size and complexity of the search libraries were varied. Background materials were chosen to either replicate a remote area of eastern Washington or feature many common background materials. For many cases, BMA and SR performed the detection task comparably in terms of the receiver operating characteristic curves. For some gases, BMA performed better than SR when the size and complexity of the search library increased. This is encouraging because we expect improved BMA performance upon incorporation of prior information on background materials and gases.
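    The model-averaging step can be illustrated with BIC weights, a common large-sample approximation to BMA posterior model probabilities (the regression setup below is generic, not the report's three-layer radiance model):

```python
import numpy as np
from itertools import combinations

def bic_model_weights(X, y, max_size=2):
    """Approximate BMA posterior model probabilities with BIC weights over
    all regressor subsets of size up to max_size (an illustrative sketch)."""
    n = len(y)
    bics = {}
    cols = range(X.shape[1])
    for k in range(1, max_size + 1):
        for s in combinations(cols, k):
            Xs = X[:, list(s)]
            beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
            rss = np.sum((y - Xs @ beta) ** 2)
            bics[s] = n * np.log(rss / n) + len(s) * np.log(n)
    b = np.array(list(bics.values()))
    w = np.exp(-0.5 * (b - b.min()))         # relative model weights
    w /= w.sum()
    return dict(zip(bics.keys(), w))

# Models containing the true regressor dominate the posterior weight;
# superfluous regressors are penalized by the log(n) complexity term.
```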

  11. Risk-Based Operation and Maintenance Using Bayesian Networks

    DEFF Research Database (Denmark)

    Nielsen, Jannie Jessen; Sørensen, John Dalsgaard

    2011-01-01

    This paper describes how risk-based decision making can be used for maintenance planning of components exposed to degradation such as fatigue in offshore wind turbines. In fatigue models, large epistemic uncertainties are usually present. These can be reduced if monitoring results are used...

  12. Mobile sensor network noise reduction and recalibration using a Bayesian network

    Science.gov (United States)

    Xiang, Y.; Tang, Y.; Zhu, W.

    2016-02-01

    People are becoming increasingly interested in mobile air quality sensor network applications. By eliminating the inaccuracies caused by spatial and temporal heterogeneity of pollutant distributions, this method shows great potential for atmospheric research. However, systems based on low-cost air quality sensors often suffer from sensor noise and drift. For the sensing systems to operate stably and reliably in real-world applications, those problems must be addressed. In this work, we exploit the correlation of different types of sensors caused by cross sensitivity to help identify and correct the outlier readings. By employing a Bayesian network based system, we are able to recover the erroneous readings and recalibrate the drifted sensors simultaneously. Our method improves upon the state-of-the-art Bayesian belief network techniques by incorporating the virtual evidence and adjusting the sensor calibration functions recursively. Specifically, we have (1) designed a system based on the Bayesian belief network to detect and recover the abnormal readings, (2) developed methods to update the sensor calibration functions in-field without requiring ground truth, and (3) extended the Bayesian network with virtual evidence for in-field sensor recalibration. To validate our technique, we have tested it with metal oxide sensors measuring NO2, CO, and O3 in a real-world deployment. Compared with the existing Bayesian belief network techniques, results based on our experiment setup demonstrate that our system can reduce error by 34.1 % and recover 4 times more data on average.

  13. A cryogenic receiver for EPR.

    Science.gov (United States)

    Narkowicz, R; Ogata, H; Reijerse, E; Suter, D

    2013-12-01

    Cryogenic probes have significantly increased the sensitivity of NMR. Here, we present a compact EPR receiver design capable of cryogenic operation. Compared to room temperature operation, it reduces the noise by a factor of ≈2.5. We discuss in detail the design and analyze the resulting noise performance. At low microwave power, the input noise density closely follows the emission of a cooled 50 Ω resistor over the whole measurement range from 20 K up to room temperature. To minimize the influence of the microwave source noise, we use high microwave efficiency (≈1.1-1.7 mT W^(-1/2)) planar microresonators. Their efficient conversion of microwave power to magnetic field permits EPR measurements with very low power levels, typically ranging from a few μW down to fractions of nW. Copyright © 2013 Elsevier Inc. All rights reserved.

  14. An introduction to Gaussian Bayesian networks.

    Science.gov (United States)

    Grzegorczyk, Marco

    2010-01-01

    The extraction of regulatory networks and pathways from postgenomic data is important for drug discovery and development, as the extracted pathways reveal how genes or proteins regulate each other. Following up on the seminal paper of Friedman et al. (J Comput Biol 7:601-620, 2000), Bayesian networks have been widely applied as a popular tool to this end in systems biology research. Their popularity stems from the tractability of the marginal likelihood of the network structure, which is a consistent scoring scheme in the Bayesian context. This score is based on an integration over the entire parameter space, for which highly expensive computational procedures have to be applied when using more complex models based on differential equations; for example, see (Bioinformatics 24:833-839, 2008). This chapter gives an introduction to reverse engineering regulatory networks and pathways with Gaussian Bayesian networks, that is Bayesian networks with the probabilistic BGe scoring metric [see (Geiger and Heckerman 235-243, 1995)]. In the BGe model, the data are assumed to stem from a Gaussian distribution and a normal-Wishart prior is assigned to the unknown parameters. Gaussian Bayesian network methodology for analysing static observational, static interventional as well as dynamic (observational) time series data will be described in detail in this chapter. Finally, we apply these Bayesian network inference methods (1) to observational and interventional flow cytometry (protein) data from the well-known RAF pathway to evaluate the global network reconstruction accuracy of Bayesian network inference and (2) to dynamic gene expression time series data of nine circadian genes in Arabidopsis thaliana to reverse engineer the unknown regulatory network topology for this domain.

  15. A Bayesian Approach for Sensor Optimisation in Impact Identification

    Directory of Open Access Journals (Sweden)

    Vincenzo Mallardo

    2016-11-01

    Full Text Available This paper presents a Bayesian approach for optimizing the position of sensors aimed at impact identification in composite structures under operational conditions. The uncertainty in the sensor data has been represented by statistical distributions of the recorded signals. An optimisation strategy based on the genetic algorithm is proposed to find the best sensor combination aimed at locating impacts on composite structures. A Bayesian-based objective function is adopted in the optimisation procedure as an indicator of the performance of meta-models developed for different sensor combinations to locate various impact events. To represent a real structure under operational load and to increase the reliability of the Structural Health Monitoring (SHM system, the probability of malfunctioning sensors is included in the optimisation. The reliability and the robustness of the procedure are tested with experimental and numerical examples. Finally, the proposed optimisation algorithm is applied to a composite stiffened panel for both the uniform and non-uniform probability of impact occurrence.

  16. A boosted Bayesian multiresolution classifier for prostate cancer detection from digitized needle biopsies.

    Science.gov (United States)

    Doyle, Scott; Feldman, Michael; Tomaszewski, John; Madabhushi, Anant

    2012-05-01

    Diagnosis of prostate cancer (CaP) currently involves examining tissue samples for CaP presence and extent via a microscope, a time-consuming and subjective process. With the advent of digital pathology, computer-aided algorithms can now be applied to disease detection on digitized glass slides. The size of these digitized histology images (hundreds of millions of pixels) presents a formidable challenge for any computerized image analysis program. In this paper, we present a boosted Bayesian multiresolution (BBMR) system to identify regions of CaP on digital biopsy slides. Such a system would serve as an important preceding step to a Gleason grading algorithm, where the objective would be to score the invasiveness and severity of the disease. In the first step, our algorithm decomposes the whole-slide image into an image pyramid comprising multiple resolution levels. Regions identified as cancer via a Bayesian classifier at lower resolution levels are subsequently examined in greater detail at higher resolution levels, thereby allowing for rapid and efficient analysis of large images. At each resolution level, ten image features are chosen from a pool of over 900 first-order statistical, second-order co-occurrence, and Gabor filter features using an AdaBoost ensemble method. The BBMR scheme, operating on 100 images obtained from 58 patients, yielded: 1) areas under the receiver operating characteristic curve (AUC) of 0.84, 0.83, and 0.76, respectively, at the lowest, intermediate, and highest resolution levels and 2) an eightfold savings in terms of computational time compared to running the algorithm directly at full (highest) resolution. The BBMR model outperformed (in terms of AUC): 1) individual features (no ensemble) and 2) a random forest classifier ensemble obtained by bagging multiple decision tree classifiers. 
The apparent drop-off in AUC at higher image resolutions is due to lack of fine detail in the expert annotation of CaP and is not an artifact of the
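    The AUC figures reported above have a direct probabilistic reading via the Mann-Whitney statistic: the probability that a randomly chosen cancerous region scores higher than a randomly chosen benign one. A brute-force sketch (example scores are hypothetical):

```python
def auc(pos_scores, neg_scores):
    """Area under the ROC curve computed as the Mann-Whitney statistic:
    the fraction of (positive, negative) pairs ranked correctly,
    with ties counted as half."""
    wins = 0.0
    for p in pos_scores:
        for q in neg_scores:
            wins += 1.0 if p > q else (0.5 if p == q else 0.0)
    return wins / (len(pos_scores) * len(neg_scores))

# Perfect separation gives 1.0; identical score distributions give 0.5.
print(auc([0.9, 0.8], [0.1, 0.2]))  # -> 1.0
```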

  17. Zero-power receiver

    Energy Technology Data Exchange (ETDEWEB)

    Brocato, Robert W.

    2016-10-04

    An unpowered signal receiver and a method for signal reception detect and respond to very weak signals using pyroelectric devices as impedance transformers and/or demodulators. In some embodiments, surface acoustic wave (SAW) devices are also used. Illustrative embodiments include satellite and long distance terrestrial communications applications.

  18. Sender-Receiver Games

    NARCIS (Netherlands)

    Peeters, R.J.A.P.; Potters, J.A.M.

    1999-01-01

Standard game-theoretic solution concepts do not guarantee meaningful communication in cheap-talk games. In this paper, we define a solution concept which guarantees communication for a large class of games by designing a behavior protocol which the receiver uses to judge messages sent by the

  19. Bayesian Calibration of Microsimulation Models.

    Science.gov (United States)

    Rutter, Carolyn M; Miglioretti, Diana L; Savarino, James E

    2009-12-01

Microsimulation models that describe disease processes synthesize information from multiple sources and can be used to estimate the effects of screening and treatment on cancer incidence and mortality at a population level. These models are characterized by simulation of individual event histories for an idealized population of interest. Microsimulation models are complex and invariably include parameters that are not well informed by existing data. Therefore, a key component of model development is the choice of parameter values. Microsimulation model parameter values are selected to reproduce expected or known results through the process of model calibration. Calibration may be done by perturbing model parameters one at a time or by using a search algorithm. As an alternative, we propose a Bayesian method to calibrate microsimulation models that uses Markov chain Monte Carlo. We show that this approach converges to the target distribution and use a simulation study to demonstrate its finite-sample performance. Although computationally intensive, this approach has several advantages over previously proposed methods, including the use of statistical criteria to select parameter values, simultaneous calibration of multiple parameters to multiple data sources, incorporation of information via prior distributions, description of parameter identifiability, and the ability to obtain interval estimates of model parameters. We develop a microsimulation model for colorectal cancer and use our proposed method to calibrate model parameters. The microsimulation model provides a good fit to the calibration data. We find evidence that some parameters are identified primarily through prior distributions. Our results underscore the need to incorporate multiple sources of variability (i.e., due to calibration data, unknown parameters, and estimated parameters and predicted values) when calibrating and applying microsimulation models.
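
    The Metropolis-Hastings machinery behind this kind of calibration can be illustrated with a toy stand-in model. Everything below is invented for illustration (the "microsimulation" is a closed-form 10-year incidence function, and the calibration target of 230/1000 cases, the flat prior, and the step size are not taken from the paper); only the MCMC calibration idea is the abstract's.

```python
import math
import random

random.seed(0)

# Toy stand-in for a microsimulation: given an annual onset risk p,
# the model predicts 10-year disease incidence in closed form.
def predicted_incidence(p):
    return 1.0 - (1.0 - p) ** 10

# Calibration target: 230 of 1000 subjects develop the disease in 10 years.
K, N = 230, 1000

def log_posterior(p):
    if not 0.0 < p < 1.0:
        return -math.inf
    q = predicted_incidence(p)
    # Binomial likelihood of the calibration data, flat prior on p.
    return K * math.log(q) + (N - K) * math.log(1.0 - q)

# Random-walk Metropolis-Hastings over the unknown model parameter p.
def calibrate(n_iter=20000, step=0.005):
    p, lp = 0.05, log_posterior(0.05)
    samples = []
    for _ in range(n_iter):
        cand = p + random.gauss(0.0, step)
        lp_cand = log_posterior(cand)
        if math.log(random.random()) < lp_cand - lp:  # accept/reject
            p, lp = cand, lp_cand
        samples.append(p)
    return samples[n_iter // 2:]  # discard the first half as burn-in

samples = calibrate()
post_mean = sum(samples) / len(samples)
print(round(post_mean, 3))
```

    Solving predicted_incidence(p) = 0.23 analytically gives p ≈ 0.026, so the chain should concentrate near that value; unlike one-at-a-time perturbation, the retained samples also yield the interval estimates (e.g. posterior quantiles) the abstract highlights.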

  20. Multi-Fraction Bayesian Sediment Transport Model

    Directory of Open Access Journals (Sweden)

    Mark L. Schmelter

    2015-09-01

Full Text Available A Bayesian approach to sediment transport modeling can provide a strong basis for evaluating and propagating model uncertainty, which can be useful in transport applications. Previous work in developing and applying Bayesian sediment transport models used a single grain size fraction or characterized the transport of mixed-size sediment with a single characteristic grain size. Although this approach is common in sediment transport modeling, it precludes the possibility of capturing processes that cause mixed-size sediments to sort and, thereby, alter the grain size available for transport and the transport rates themselves. This paper extends development of a Bayesian transport model from one to k fractional dimensions. The model uses an existing transport function as its deterministic core and is applied to the dataset used to originally develop the function. The Bayesian multi-fraction model is able to infer the posterior distributions for essential model parameters and replicates predictive distributions of both bulk and fractional transport. Further, the inferred posterior distributions are used to evaluate parametric and other sources of variability in relations representing mixed-size interactions in the original model. Successful development of the model demonstrates that Bayesian methods can be used to provide a robust and rigorous basis for quantifying uncertainty in mixed-size sediment transport. Such a method has heretofore been unavailable and allows for the propagation of uncertainty in sediment transport applications.

  1. Bayesian modeling of flexible cognitive control

    Science.gov (United States)

    Jiang, Jiefeng; Heller, Katherine; Egner, Tobias

    2014-01-01

    “Cognitive control” describes endogenous guidance of behavior in situations where routine stimulus-response associations are suboptimal for achieving a desired goal. The computational and neural mechanisms underlying this capacity remain poorly understood. We examine recent advances stemming from the application of a Bayesian learner perspective that provides optimal prediction for control processes. In reviewing the application of Bayesian models to cognitive control, we note that an important limitation in current models is a lack of a plausible mechanism for the flexible adjustment of control over conflict levels changing at varying temporal scales. We then show that flexible cognitive control can be achieved by a Bayesian model with a volatility-driven learning mechanism that modulates dynamically the relative dependence on recent and remote experiences in its prediction of future control demand. We conclude that the emergent Bayesian perspective on computational mechanisms of cognitive control holds considerable promise, especially if future studies can identify neural substrates of the variables encoded by these models, and determine the nature (Bayesian or otherwise) of their neural implementation. PMID:24929218

  2. Bayesian modeling of flexible cognitive control.

    Science.gov (United States)

    Jiang, Jiefeng; Heller, Katherine; Egner, Tobias

    2014-10-01

    "Cognitive control" describes endogenous guidance of behavior in situations where routine stimulus-response associations are suboptimal for achieving a desired goal. The computational and neural mechanisms underlying this capacity remain poorly understood. We examine recent advances stemming from the application of a Bayesian learner perspective that provides optimal prediction for control processes. In reviewing the application of Bayesian models to cognitive control, we note that an important limitation in current models is a lack of a plausible mechanism for the flexible adjustment of control over conflict levels changing at varying temporal scales. We then show that flexible cognitive control can be achieved by a Bayesian model with a volatility-driven learning mechanism that modulates dynamically the relative dependence on recent and remote experiences in its prediction of future control demand. We conclude that the emergent Bayesian perspective on computational mechanisms of cognitive control holds considerable promise, especially if future studies can identify neural substrates of the variables encoded by these models, and determine the nature (Bayesian or otherwise) of their neural implementation.

  3. Dimensionality reduction in Bayesian estimation algorithms

    Directory of Open Access Journals (Sweden)

    G. W. Petty

    2013-03-01

Full Text Available An idealized synthetic database loosely resembling 3-channel passive microwave observations of precipitation against a variable background is employed to examine the performance of a conventional Bayesian retrieval algorithm. For this dataset, algorithm performance is found to be poor owing to an irreconcilable conflict between the need to find matches in the dependent database versus the need to exclude inappropriate matches. It is argued that the likelihood of such conflicts increases sharply with the dimensionality of the observation space of real satellite sensors, which may utilize 9 to 13 channels to retrieve precipitation, for example. An objective method is described for distilling the relevant information content from N real channels into a much smaller number (M) of pseudochannels while also regularizing the background (geophysical plus instrument) noise component. The pseudochannels are linear combinations of the original N channels obtained via a two-stage principal component analysis of the dependent dataset. Bayesian retrievals based on a single pseudochannel applied to the independent dataset yield striking improvements in overall performance. The differences between the conventional Bayesian retrieval and reduced-dimensional Bayesian retrieval suggest that a major potential problem with conventional multichannel retrievals – whether Bayesian or not – lies in the common but often inappropriate assumption of diagonal error covariance. The dimensional reduction technique described herein avoids this problem by, in effect, recasting the retrieval problem in a coordinate system in which the desired covariance is lower-dimensional, diagonal, and unit magnitude.
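
    The two-stage reduction can be sketched on synthetic data. All numbers below (channel gains, noise factor, kernel width) are invented; the sketch keeps only the idea of whitening against the background noise covariance, projecting onto a leading principal component to form one pseudochannel, and then performing a Bayesian (kernel-weighted) retrieval in that reduced space.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic 3-channel sensor: response = gain * rain_rate + correlated
# background noise. The gains and the noise factor L are invented.
gains = np.array([2.0, 1.5, -1.0])
L = np.array([[1.0, 0.0, 0.0], [0.8, 0.6, 0.0], [0.5, 0.3, 0.4]])

def observe(r):
    return gains * r + L @ rng.standard_normal(3)

# Dependent (training) database of rain rates and observations.
r_train = rng.exponential(1.0, 5000)
y_train = np.array([observe(r) for r in r_train])

# Stage 1: whiten using the covariance of background-only (r = 0) scenes.
noise = np.array([observe(0.0) for _ in range(5000)])
W = np.linalg.inv(np.linalg.cholesky(np.cov(noise.T)))

# Stage 2: PCA of the whitened training data; keep M = 1 pseudochannel.
Zw = y_train @ W.T
mu = Zw.mean(axis=0)
_, _, Vt = np.linalg.svd(Zw - mu, full_matrices=False)
to_pseudo = lambda y: (y @ W.T - mu) @ Vt[0]

z_train = to_pseudo(y_train)  # projects the whole database at once

# Bayesian retrieval as a Gaussian-kernel-weighted average over the
# database, now in the 1-D pseudochannel space.
def retrieve(y, sigma=0.5):
    w = np.exp(-0.5 * ((z_train - to_pseudo(y)) / sigma) ** 2)
    return float(np.sum(w * r_train) / np.sum(w))

est = retrieve(observe(2.0))
print(round(est, 2))
```

    Because the whitening makes the background covariance the identity, matching in the single pseudochannel is equivalent to matching under the full (non-diagonal) noise covariance of the original channels, which is the point of the abstract's argument against naive diagonal-covariance retrievals.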

  4. Tactile length contraction as Bayesian inference.

    Science.gov (United States)

    Tong, Jonathan; Ngo, Vy; Goldreich, Daniel

    2016-08-01

    To perceive, the brain must interpret stimulus-evoked neural activity. This is challenging: The stochastic nature of the neural response renders its interpretation inherently uncertain. Perception would be optimized if the brain used Bayesian inference to interpret inputs in light of expectations derived from experience. Bayesian inference would improve perception on average but cause illusions when stimuli violate expectation. Intriguingly, tactile, auditory, and visual perception are all prone to length contraction illusions, characterized by the dramatic underestimation of the distance between punctate stimuli delivered in rapid succession; the origin of these illusions has been mysterious. We previously proposed that length contraction illusions occur because the brain interprets punctate stimulus sequences using Bayesian inference with a low-velocity expectation. A novel prediction of our Bayesian observer model is that length contraction should intensify if stimuli are made more difficult to localize. Here we report a tactile psychophysical study that tested this prediction. Twenty humans compared two distances on the forearm: a fixed reference distance defined by two taps with 1-s temporal separation and an adjustable comparison distance defined by two taps with temporal separation t ≤ 1 s. We observed significant length contraction: As t was decreased, participants perceived the two distances as equal only when the comparison distance was made progressively greater than the reference distance. Furthermore, the use of weaker taps significantly enhanced participants' length contraction. These findings confirm the model's predictions, supporting the view that the spatiotemporal percept is a best estimate resulting from a Bayesian inference process.
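
    The model's core prediction reduces to a Gaussian posterior-mean computation. This is a simplified sketch of the low-velocity-prior idea (two taps separated by time t, each localized with sensory noise sigma_s, and a zero-mean Gaussian prior on tap-to-tap velocity with spread sigma_v); the parameter values are invented, not the paper's fitted values.

```python
# Perceived length as the posterior mean of the true length given a
# measured length lm, under a zero-mean low-velocity prior.
def perceived_length(lm, t, sigma_s, sigma_v):
    prior_var = (sigma_v * t) ** 2   # prior on length: l = v*t, v ~ N(0, sigma_v^2)
    noise_var = 2 * sigma_s ** 2     # two taps, each localized with noise sigma_s
    shrink = prior_var / (prior_var + noise_var)
    return shrink * lm

base = perceived_length(10.0, 1.0, sigma_s=1.0, sigma_v=2.0)   # slow sequence
fast = perceived_length(10.0, 0.2, sigma_s=1.0, sigma_v=2.0)   # shorter t
weak = perceived_length(10.0, 1.0, sigma_s=2.0, sigma_v=2.0)   # weaker taps
print(round(base, 2), round(fast, 2), round(weak, 2))  # → 6.67 0.74 3.33
```

    Shorter temporal separation and weaker (noisier) taps both intensify the contraction, which is exactly the prediction the psychophysical study tested.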

  5. Bayesian model selection: Evidence estimation based on DREAM simulation and bridge sampling

    Science.gov (United States)

    Volpi, Elena; Schoups, Gerrit; Firmani, Giovanni; Vrugt, Jasper A.

    2017-04-01

Bayesian inference has found widespread application in Earth and Environmental Systems Modeling, providing an effective tool for prediction, data assimilation, parameter estimation, uncertainty analysis and hypothesis testing. Under multiple competing hypotheses, the Bayesian approach also provides an attractive alternative to traditional information criteria (e.g. AIC, BIC) for model selection. The key variable for Bayesian model selection is the evidence (or marginal likelihood) that is the normalizing constant in the denominator of Bayes theorem; while it is fundamental for model selection, the evidence is not required for Bayesian inference. It is computed for each hypothesis (model) by averaging the likelihood function over the prior parameter distribution, rather than maximizing it as by information criteria; the larger a model's evidence, the more support it receives among a collection of hypotheses, as the simulated values assign relatively high probability density to the observed data. Hence, the evidence naturally acts as an Occam's razor, preferring simpler and more constrained models against the selection of over-fitted ones by information criteria that incorporate only the likelihood maximum. Since it is not particularly easy to estimate the evidence in practice, Bayesian model selection via the marginal likelihood has not yet found mainstream use. We illustrate here the properties of a new estimator of the Bayesian model evidence, which provides robust and unbiased estimates of the marginal likelihood; the method is coined Gaussian Mixture Importance Sampling (GMIS). GMIS uses multidimensional numerical integration of the posterior parameter distribution via bridge sampling (a generalization of importance sampling) of a mixture distribution fitted to samples of the posterior distribution derived from the DREAM algorithm (Vrugt et al., 2008; 2009). Some illustrative examples are presented to show the robustness and superiority of the GMIS estimator with
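
    The identity underlying this family of estimators can be checked on a conjugate toy problem where the exact evidence is known. GMIS itself fits a Gaussian mixture and uses bridge sampling; the sketch below simplifies to plain importance sampling with a single moment-matched Gaussian proposal fitted to posterior samples, and all data values are invented.

```python
import math
import random

random.seed(3)

# Conjugate toy problem: y_i ~ N(theta, 1) with prior theta ~ N(0, 1),
# so the exact evidence p(y) is available in closed form for checking.
y = [0.8, 1.2, 0.4, 1.0, 0.6]
n, S, SS = len(y), sum(y), sum(v * v for v in y)

log_prior = lambda t: -0.5 * math.log(2 * math.pi) - 0.5 * t * t
log_lik = lambda t: (-0.5 * n * math.log(2 * math.pi)
                     - 0.5 * sum((v - t) ** 2 for v in y))

# Closed-form log evidence for this Gaussian-Gaussian model.
log_Z_exact = (-0.5 * n * math.log(2 * math.pi) - 0.5 * math.log(n + 1)
               - 0.5 * SS + S * S / (2 * (n + 1)))

# Stand-in for posterior samples from an MCMC sampler such as DREAM
# (here we can draw from the known posterior N(S/(n+1), 1/(n+1)) directly).
post_mu, post_sd = S / (n + 1), 1 / math.sqrt(n + 1)
samples = [random.gauss(post_mu, post_sd) for _ in range(4000)]

# Fit the importance proposal to the posterior samples; widen the tails
# slightly so the proposal dominates the posterior.
mu = sum(samples) / len(samples)
sd = 1.2 * math.sqrt(sum((s - mu) ** 2 for s in samples) / len(samples))
log_q = lambda t: (-0.5 * math.log(2 * math.pi * sd * sd)
                   - 0.5 * ((t - mu) / sd) ** 2)

# Importance sampling: Z = E_q[ p(y|t) p(t) / q(t) ].
draws = [random.gauss(mu, sd) for _ in range(20000)]
weights = [math.exp(log_lik(t) + log_prior(t) - log_q(t)) for t in draws]
log_Z_est = math.log(sum(weights) / len(weights))

print(round(log_Z_exact, 3), round(log_Z_est, 3))
```

    Because the proposal is fitted to the posterior (rather than sampling from the prior), the weights have low variance and the estimate is usable with modest sample sizes, which is the practical motivation for proposal-fitting schemes like GMIS.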

  6. Forecasting the 2012 and 2014 Elections Using Bayesian Prediction and Optimization

    Directory of Open Access Journals (Sweden)

    Steven E. Rigdon

    2015-04-01

Full Text Available This article presents a data-driven Bayesian model used to predict the state-by-state winners in the Senate and presidential elections in 2012 and 2014. The Bayesian model takes into account the proportions of polled subjects who favor each candidate and the proportion who are undecided, and produces a posterior probability that each candidate will win each state. From this, a dynamic programming algorithm is used to compute the probability mass functions for the number of electoral votes that each presidential candidate receives and the number of Senate seats that each party receives. On the final day before the 2012 election, the model gave a probability of (essentially) one that President Obama would be reelected, and that the Democrats would retain control of the U.S. Senate. In 2014, the model gave a final probability of .99 that the Republicans would take control of the Senate.
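
    The dynamic-programming step can be sketched directly: given a per-state win probability (treated as independent here, a simplification), convolving one state at a time yields the full probability mass function of the electoral-vote total. The states, vote counts, and probabilities below are invented toy values, not the article's forecasts.

```python
# Toy electoral college: (electoral votes, P(candidate wins the state)).
states = [
    (29, 0.90), (18, 0.75), (20, 0.60), (29, 0.55),
    (13, 0.45), (16, 0.30), (38, 0.10),
]

total = sum(ev for ev, _ in states)
pmf = [1.0] + [0.0] * total           # pmf[v] = P(v votes won so far)
for ev, p in states:                  # convolve in one state at a time
    new = [0.0] * (total + 1)
    for v, mass in enumerate(pmf):
        if mass:
            new[v] += mass * (1 - p)  # candidate loses this state
            new[v + ev] += mass * p   # candidate wins this state
    pmf = new

mean_ev = sum(v * m for v, m in enumerate(pmf))
threshold = total // 2 + 1            # majority of this toy college
p_win = sum(pmf[threshold:])
print(round(mean_ev, 1), round(p_win, 3))
```

    The same convolution works for Senate seat counts; in the article the per-state probabilities come from the Bayesian polling model rather than being fixed inputs.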

  7. The Psychology of Bayesian Reasoning

    Science.gov (United States)

    2014-10-21

the frequency tree in Figure 1 correspond to cells a-d, which have received much attention in the causal induction literature (Mandel and Lehman ...J. Exp. Psychol. Appl. 11, 277-288. Mandel, D. R., and Lehman, D. R. (1998). Integration of contingency information in judgments of cause

  8. Bayesian Fusion of Multi-Band Images

    CERN Document Server

    Wei, Qi; Tourneret, Jean-Yves

    2013-01-01

In this paper, a Bayesian fusion technique for remotely sensed multi-band images is presented. The observed images are related to the high spectral and high spatial resolution image to be recovered through physical degradations, e.g., spatial and spectral blurring and/or subsampling defined by the sensor characteristics. The fusion problem is formulated within a Bayesian estimation framework. An appropriate prior distribution exploiting geometrical consideration is introduced. To compute the Bayesian estimator of the scene of interest from its posterior distribution, a Markov chain Monte Carlo algorithm is designed to generate samples asymptotically distributed according to the target distribution. To efficiently sample from this high-dimensional distribution, a Hamiltonian Monte Carlo step is introduced in the Gibbs sampling strategy. The efficiency of the proposed fusion method is evaluated with respect to several state-of-the-art fusion techniques. In particular, low spatial resolution hyperspectral and mult...

  9. Distributed Bayesian Networks for User Modeling

    DEFF Research Database (Denmark)

    Tedesco, Roberto; Dolog, Peter; Nejdl, Wolfgang

    2006-01-01

by such adaptive applications are often partial fragments of an overall user model. The fragments then have to be collected and merged into a global user profile. In this paper we investigate and present algorithms able to cope with distributed, fragmented user models – based on Bayesian Networks – in the context of Web-based eLearning platforms. The scenario we are tackling assumes learners who use several systems over time, which are able to create partial Bayesian Networks for user models based on the local system context. In particular, we focus on how to merge these partial user models. Our merge mechanism efficiently combines distributed learner models without the need to exchange internal structure of local Bayesian networks, nor local evidence between the involved platforms.

  10. Hessian PDF reweighting meets the Bayesian methods

    CERN Document Server

    Paukkunen, Hannu

    2014-01-01

We discuss Hessian PDF reweighting - a technique intended to estimate the effects that new measurements have on a set of PDFs. The method stems straightforwardly from considering new data in a usual $\chi^2$-fit and it naturally also incorporates non-zero values for the tolerance, $\Delta\chi^2>1$. In comparison to the contemporary Bayesian reweighting techniques, there is no need to generate large ensembles of PDF Monte-Carlo replicas, and the observables need to be evaluated only with the central and the error sets of the original PDFs. In spite of the apparently rather different methodologies, we find that the Hessian and the Bayesian techniques are actually equivalent if the $\Delta\chi^2$ criterion is properly included in the Bayesian likelihood function, which is a simple exponential.

  11. Bayesian Inference Methods for Sparse Channel Estimation

    DEFF Research Database (Denmark)

    Pedersen, Niels Lovmand

    2013-01-01

This thesis deals with sparse Bayesian learning (SBL) with application to radio channel estimation. As opposed to the classical approach for sparse signal representation, we focus on the problem of inferring complex signals. Our investigations within SBL constitute the basis for the development of Bayesian inference algorithms for sparse channel estimation. Sparse inference methods aim at finding the sparse representation of a signal given in some overcomplete dictionary of basis vectors. Within this context, one of our main contributions to the field of SBL is a hierarchical representation...... analysis of the complex prior representation, where we show that the ability to induce sparse estimates of a given prior heavily depends on the inference method used and, interestingly, whether real or complex variables are inferred. We also show that the Bayesian estimators derived from the proposed...

  12. Bayesian Image Reconstruction Based on Voronoi Diagrams

    CERN Document Server

    Cabrera, G F; Hitschfeld, N

    2007-01-01

    We present a Bayesian Voronoi image reconstruction technique (VIR) for interferometric data. Bayesian analysis applied to the inverse problem allows us to derive the a-posteriori probability of a novel parameterization of interferometric images. We use a variable Voronoi diagram as our model in place of the usual fixed pixel grid. A quantization of the intensity field allows us to calculate the likelihood function and a-priori probabilities. The Voronoi image is optimized including the number of polygons as free parameters. We apply our algorithm to deconvolve simulated interferometric data. Residuals, restored images and chi^2 values are used to compare our reconstructions with fixed grid models. VIR has the advantage of modeling the image with few parameters, obtaining a better image from a Bayesian point of view.

  13. Bayesian modeling of unknown diseases for biosurveillance.

    Science.gov (United States)

    Shen, Yanna; Cooper, Gregory F

    2009-11-14

This paper investigates Bayesian modeling of unknown causes of events in the context of disease-outbreak detection. We introduce a Bayesian approach that models and detects both (1) known diseases (e.g., influenza and anthrax) by using informative prior probabilities and (2) unknown diseases (e.g., a new, highly contagious respiratory virus that has never been seen before) by using relatively non-informative prior probabilities. We report the results of simulation experiments which support that this modeling method can improve the detection of new disease outbreaks in a population. A key contribution of this paper is that it introduces a Bayesian approach for jointly modeling both known and unknown causes of events. Such modeling has broad applicability in medical informatics, where the space of known causes of outcomes of interest is seldom complete.
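
    The key idea, a vague likelihood for the "unknown" cause competing with informative likelihoods for known causes, can be sketched with invented numbers (the causes, priors, and likelihood shapes below are illustrative, not the paper's model):

```python
import math

def normal_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# Possible causes of an observed daily case count (all numbers invented).
# Known causes get informative likelihoods; the unknown-disease cause gets
# a small prior and a deliberately vague uniform likelihood over 0..1000.
causes = {
    "baseline":  (0.900, lambda x: normal_pdf(x, 100, 15)),
    "influenza": (0.099, lambda x: normal_pdf(x, 250, 40)),
    "unknown":   (0.001, lambda x: 1.0 / 1000),
}

def posterior(x):
    joint = {c: prior * lik(x) for c, (prior, lik) in causes.items()}
    z = sum(joint.values())
    return {c: v / z for c, v in joint.items()}

ordinary = posterior(105)   # a typical day
surge = posterior(600)      # a count no known cause explains
print({c: round(p, 3) for c, p in ordinary.items()})
print({c: round(p, 3) for c, p in surge.items()})
```

    Ordinary counts are absorbed by the informative known-cause likelihoods, but a count far outside every known model makes the flat "unknown" likelihood dominate despite its tiny prior, which is how such a model flags a novel outbreak.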

  14. Bayesian Methods for Radiation Detection and Dosimetry

    CERN Document Server

    Groer, Peter G

    2002-01-01

    We performed work in three areas: radiation detection, external and internal radiation dosimetry. In radiation detection we developed Bayesian techniques to estimate the net activity of high and low activity radioactive samples. These techniques have the advantage that the remaining uncertainty about the net activity is described by probability densities. Graphs of the densities show the uncertainty in pictorial form. Figure 1 below demonstrates this point. We applied stochastic processes for a method to obtain Bayesian estimates of 222Rn-daughter products from observed counting rates. In external radiation dosimetry we studied and developed Bayesian methods to estimate radiation doses to an individual with radiation induced chromosome aberrations. We analyzed chromosome aberrations after exposure to gammas and neutrons and developed a method for dose-estimation after criticality accidents. The research in internal radiation dosimetry focused on parameter estimation for compartmental models from observed comp...

  15. Variational Bayesian Inference of Line Spectra

    DEFF Research Database (Denmark)

    Badiu, Mihai Alin; Hansen, Thomas Lundgaard; Fleury, Bernard Henri

    2016-01-01

In this paper, we address the fundamental problem of line spectral estimation in a Bayesian framework. We target model order and parameter estimation via variational inference in a probabilistic model in which the frequencies are continuous-valued, i.e., not restricted to a grid; and the coefficients are governed by a Bernoulli-Gaussian prior model turning model order selection into binary sequence detection. Unlike earlier works which retain only point estimates of the frequencies, we undertake a more complete Bayesian treatment by estimating the posterior probability density functions (pdfs...

  16. Event generator tuning using Bayesian optimization

    CERN Document Server

    Ilten, Philip; Yang, Yunjie

    2016-01-01

    Monte Carlo event generators contain a large number of parameters that must be determined by comparing the output of the generator with experimental data. Generating enough events with a fixed set of parameter values to enable making such a comparison is extremely CPU intensive, which prohibits performing a simple brute-force grid-based tuning of the parameters. Bayesian optimization is a powerful method designed for such black-box tuning applications. In this article, we show that Monte Carlo event generator parameters can be accurately obtained using Bayesian optimization and minimal expert-level physics knowledge. A tune of the PYTHIA 8 event generator using $e^+e^-$ events, where 20 parameters are optimized, can be run on a modern laptop in just two days. Combining the Bayesian optimization approach with expert knowledge should enable producing better tunes in the future, by making it faster and easier to study discrepancies between Monte Carlo and experimental data.

  17. Learning Bayesian Networks from Correlated Data

    Science.gov (United States)

    Bae, Harold; Monti, Stefano; Montano, Monty; Steinberg, Martin H.; Perls, Thomas T.; Sebastiani, Paola

    2016-05-01

Bayesian networks are probabilistic models that represent complex distributions in a modular way and have become very popular in many fields. There are many methods to build Bayesian networks from a random sample of independent and identically distributed observations. However, many observational studies are designed using some form of clustered sampling that introduces correlations between observations within the same cluster; ignoring this correlation typically inflates the rate of false positive associations. We describe a novel parameterization of Bayesian networks that uses random effects to model the correlation within sample units and can be used for structure and parameter learning from correlated data without inflating the Type I error rate. We compare different learning metrics using simulations and illustrate the method in two real examples: an analysis of genetic and non-genetic factors associated with human longevity from a family-based study, and an example of risk factors for complications of sickle cell anemia from a longitudinal study with repeated measures.

  18. Learning Bayesian Networks from Correlated Data.

    Science.gov (United States)

    Bae, Harold; Monti, Stefano; Montano, Monty; Steinberg, Martin H; Perls, Thomas T; Sebastiani, Paola

    2016-05-05

Bayesian networks are probabilistic models that represent complex distributions in a modular way and have become very popular in many fields. There are many methods to build Bayesian networks from a random sample of independent and identically distributed observations. However, many observational studies are designed using some form of clustered sampling that introduces correlations between observations within the same cluster; ignoring this correlation typically inflates the rate of false positive associations. We describe a novel parameterization of Bayesian networks that uses random effects to model the correlation within sample units and can be used for structure and parameter learning from correlated data without inflating the Type I error rate. We compare different learning metrics using simulations and illustrate the method in two real examples: an analysis of genetic and non-genetic factors associated with human longevity from a family-based study, and an example of risk factors for complications of sickle cell anemia from a longitudinal study with repeated measures.

  19. Bayesian analysis of MEG visual evoked responses

    Energy Technology Data Exchange (ETDEWEB)

    Schmidt, D.M.; George, J.S.; Wood, C.C.

    1999-04-01

The authors developed a method for analyzing neural electromagnetic data that allows probabilistic inferences to be drawn about regions of activation. The method involves the generation of a large number of possible solutions which fit both the data and prior expectations about the nature of probable solutions, made explicit by a Bayesian formalism. In addition, they have introduced a model for the current distributions that produce MEG and EEG data that allows extended regions of activity, and can easily incorporate prior information such as anatomical constraints from MRI. To evaluate the feasibility and utility of the Bayesian approach with actual data, they analyzed MEG data from a visual evoked response experiment. They compared Bayesian analyses of MEG responses to visual stimuli in the left and right visual fields, in order to examine the sensitivity of the method to detect known features of human visual cortex organization. They also examined the changing pattern of cortical activation as a function of time.

  20. Bayesian Analysis of Perceived Eye Level

    Science.gov (United States)

    Orendorff, Elaine E.; Kalesinskas, Laurynas; Palumbo, Robert T.; Albert, Mark V.

    2016-01-01

    To accurately perceive the world, people must efficiently combine internal beliefs and external sensory cues. We introduce a Bayesian framework that explains the role of internal balance cues and visual stimuli on perceived eye level (PEL)—a self-reported measure of elevation angle. This framework provides a single, coherent model explaining a set of experimentally observed PEL over a range of experimental conditions. Further, it provides a parsimonious explanation for the additive effect of low fidelity cues as well as the averaging effect of high fidelity cues, as also found in other Bayesian cue combination psychophysical studies. Our model accurately estimates the PEL and explains the form of previous equations used in describing PEL behavior. Most importantly, the proposed Bayesian framework for PEL is more powerful than previous behavioral modeling; it permits behavioral estimation in a wider range of cue combination and perceptual studies than models previously reported. PMID:28018204

  1. Dynamic Bayesian Combination of Multiple Imperfect Classifiers

    CERN Document Server

    Simpson, Edwin; Psorakis, Ioannis; Smith, Arfon

    2012-01-01

    Classifier combination methods need to make best use of the outputs of multiple, imperfect classifiers to enable higher accuracy classifications. In many situations, such as when human decisions need to be combined, the base decisions can vary enormously in reliability. A Bayesian approach to such uncertain combination allows us to infer the differences in performance between individuals and to incorporate any available prior knowledge about their abilities when training data is sparse. In this paper we explore Bayesian classifier combination, using the computationally efficient framework of variational Bayesian inference. We apply the approach to real data from a large citizen science project, Galaxy Zoo Supernovae, and show that our method far outperforms other established approaches to imperfect decision combination. We go on to analyse the putative community structure of the decision makers, based on their inferred decision making strategies, and show that natural groupings are formed. Finally we present ...

  2. Bayesian Inference in Polling Technique: 1992 Presidential Polls.

    Science.gov (United States)

    Satake, Eiki

    1994-01-01

    Explores the potential utility of Bayesian statistical methods in determining the predictability of multiple polls. Compares Bayesian techniques to the classical statistical method employed by pollsters. Considers these questions in the context of the 1992 presidential elections. (HB)
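
    For a single two-candidate poll, the Bayesian treatment reduces to a beta-binomial update; the flat prior and poll numbers below are invented for illustration, not the 1992 data.

```python
import random

random.seed(7)

# Beta(1, 1) flat prior updated with a poll: 520 of 1000 respondents
# favor candidate A (invented numbers). Posterior is Beta(521, 481).
a, b = 1 + 520, 1 + 480

# Monte Carlo estimate of P(A leads), i.e. P(support > 0.5 | poll).
draws = [random.betavariate(a, b) for _ in range(20000)]
p_lead = sum(d > 0.5 for d in draws) / len(draws)

post_mean = a / (a + b)
print(round(post_mean, 3), round(p_lead, 2))
```

    Unlike the classical point estimate with a margin of error, the posterior directly answers "with what probability does A lead?", and updating it with each successive poll is what makes the Bayesian approach natural for tracking multiple polls over a campaign.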

  3. Bayesian mapping QTL for fruit and growth phenological traits in ...

    African Journals Online (AJOL)

    STORAGESEVER

    2009-01-19

Jan 19, 2009 ... for breeding purposes and scientific reasoning. ... these traits, but also is useful for marker-assisted selection ... Bayesian model selection within the framework of Bayesian ...... pattern of tomato carpel shape well before anthesis.

  4. Flood alert system based on bayesian techniques

    Science.gov (United States)

    Gulliver, Z.; Herrero, J.; Viesca, C.; Polo, M. J.

    2012-04-01

The problem of floods in the Mediterranean regions is closely linked to the occurrence of torrential storms in dry regions, where even the water supply relies on adequate water management. Like other Mediterranean basins in Southern Spain, the Guadalhorce River Basin is a medium sized watershed (3856 km2) where recurrent yearly floods occur, mainly in autumn and spring, driven by cold front phenomena. The torrential character of the precipitation in such small basins, with a concentration time of less than 12 hours, produces flash flood events with catastrophic effects on the city of Malaga (600000 inhabitants). From this fact arises the need for specific alert tools which can forecast these kinds of phenomena. Bayesian networks (BN) have emerged in the last decade as a very useful and reliable computational tool for water resources and for the decision making process. The joint use of Artificial Neural Networks (ANN) and BN has allowed us to recognize and simulate the two different types of hydrological behaviour in the basin: natural and regulated. This led to the establishment of causal relationships between precipitation, discharge from upstream reservoirs, and water levels at a gauging station. It was seen that a recurrent ANN model working at an hourly scale, considering daily precipitation and the two previous hourly values of reservoir discharge and water level, could provide R2 values of 0.86. The BN results slightly improve this fit, and additionally attach uncertainty estimates to the prediction. In our current work to design a weather warning service based on Bayesian techniques, the first steps were carried out through an analysis of the correlations between the water level and rainfall at certain representative points in the basin, along with the upstream reservoir discharge. The lower correlation found between precipitation and water level emphasizes the highly regulated condition of the stream. The autocorrelations of the variables were also

  5. Structure learning for Bayesian networks as models of biological networks.

    Science.gov (United States)

    Larjo, Antti; Shmulevich, Ilya; Lähdesmäki, Harri

    2013-01-01

    Bayesian networks are probabilistic graphical models suitable for modeling several kinds of biological systems. In many cases, the structure of a Bayesian network represents causal molecular mechanisms or statistical associations of the underlying system. Bayesian networks have been applied, for example, for inferring the structure of many biological networks from experimental data. We present some recent progress in learning the structure of static and dynamic Bayesian networks from data.

  6. A Bayesian Analysis of Spectral ARMA Model

    Directory of Open Access Journals (Sweden)

    Manoel I. Silvestre Bezerra

    2012-01-01

    Full Text Available Bezerra et al. (2008) proposed a new method, based on Yule-Walker equations, to estimate the ARMA spectral model. In this paper, a Bayesian approach is developed for this model by using the noninformative prior proposed by Jeffreys (1967). For the Bayesian computations, simulation via Markov chain Monte Carlo (MCMC) is carried out, and characteristics of the marginal posterior distributions, such as the Bayes estimator and confidence intervals for the parameters of the ARMA model, are derived. Both methods are also compared with the traditional least squares and maximum likelihood approaches, and a numerical illustration with two examples of the ARMA model is presented to evaluate the performance of the procedures.
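
    The flavour of such an MCMC computation can be conveyed with a minimal random-walk Metropolis sampler for a related toy model, an AR(1) process under a flat prior (illustrative only; the paper's ARMA spectral model, Jeffreys prior, and sampler are more elaborate):

```python
import math
import random

random.seed(0)

# Simulate a toy AR(1) series x_t = phi * x_{t-1} + e_t
phi_true, n = 0.6, 500
x = [0.0]
for _ in range(n):
    x.append(phi_true * x[-1] + random.gauss(0, 1))

def loglik(phi):
    # Conditional Gaussian log-likelihood with unit innovation variance
    return -0.5 * sum((x[t] - phi * x[t - 1]) ** 2 for t in range(1, len(x)))

# Random-walk Metropolis over phi under a flat (improper) prior
phi, ll = 0.0, loglik(0.0)
draws = []
for _ in range(5000):
    prop = phi + random.gauss(0, 0.05)
    ll_prop = loglik(prop)
    if math.log(random.random()) < ll_prop - ll:
        phi, ll = prop, ll_prop
    draws.append(phi)

post_mean = sum(draws[1000:]) / 4000   # discard burn-in
```

    The retained draws approximate the marginal posterior of the AR coefficient, from which point estimates and credible intervals can be read off, exactly as the abstract describes for the ARMA parameters.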

  7. A Bayesian Concept Learning Approach to Crowdsourcing

    DEFF Research Database (Denmark)

    Viappiani, Paolo Renato; Zilles, Sandra; Hamilton, Howard J.;

    2011-01-01

    We develop a Bayesian approach to concept learning for crowdsourcing applications. A probabilistic belief over possible concept definitions is maintained and updated according to (noisy) observations from experts, whose behaviors are modeled using discrete types. We propose recommendation techniques, inference methods, and query selection strategies to assist a user charged with choosing a configuration that satisfies some (partially known) concept. Our model is able to simultaneously learn the concept definition and the types of the experts. We evaluate our model with simulations, showing that our Bayesian strategies are effective even in large concept spaces with many uninformative experts.

  8. Length Scales in Bayesian Automatic Adaptive Quadrature

    Directory of Open Access Journals (Sweden)

    Adam Gh.

    2016-01-01

    Full Text Available Two conceptual developments in the Bayesian automatic adaptive quadrature approach to the numerical solution of one-dimensional Riemann integrals [Gh. Adam, S. Adam, Springer LNCS 7125, 1–16 (2012)] are reported. First, it is shown that the numerical quadrature which avoids overcomputing and minimizes the hidden floating point loss of precision asks for the consideration of three classes of integration domain lengths endowed with specific quadrature sums: microscopic (trapezoidal rule), mesoscopic (Simpson rule), and macroscopic (quadrature sums of high algebraic degrees of precision). Second, sensitive diagnostic tools for the Bayesian inference on macroscopic ranges, coming from the use of Clenshaw-Curtis quadrature, are derived.
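
    The microscopic and mesoscopic quadrature sums named above can be sketched as composite trapezoidal and Simpson rules (the integrand and mesh are illustrative, not part of the Bayesian quadrature machinery):

```python
import math

def trapezoid(f, a, b, n):
    # Composite trapezoidal rule on n uniform subintervals
    h = (b - a) / n
    s = 0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n))
    return h * s

def simpson(f, a, b, n):
    # Composite Simpson rule; n must be even
    if n % 2:
        raise ValueError("n must be even")
    h = (b - a) / n
    s = f(a) + f(b)
    s += 4 * sum(f(a + i * h) for i in range(1, n, 2))
    s += 2 * sum(f(a + i * h) for i in range(2, n, 2))
    return h * s / 3

# Example: integrate sin on [0, pi]; the exact value is 2
t = trapezoid(math.sin, 0, math.pi, 64)
s = simpson(math.sin, 0, math.pi, 64)
```

    On a smooth integrand, Simpson's higher algebraic degree of precision shows up as a much smaller error than the trapezoidal rule on the same mesh, which is why the two rules suit different domain-length classes.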

  9. Bayesian estimation and tracking a practical guide

    CERN Document Server

    Haug, Anton J

    2012-01-01

    A practical approach to estimating and tracking dynamic systems in real-world applications. Much of the literature on performing estimation for non-Gaussian systems is short on practical methodology, while Gaussian methods often lack a cohesive derivation. Bayesian Estimation and Tracking addresses the gap in the field on both accounts, providing readers with a comprehensive overview of methods for estimating both linear and nonlinear dynamic systems driven by Gaussian and non-Gaussian noises. Featuring a unified approach to Bayesian estimation and tracking, the book emphasizes the derivation

  10. Variational Bayesian Inference of Line Spectra

    DEFF Research Database (Denmark)

    Badiu, Mihai Alin; Hansen, Thomas Lundgaard; Fleury, Bernard Henri

    2017-01-01

    In this paper, we address the fundamental problem of line spectral estimation in a Bayesian framework. We target model order and parameter estimation via variational inference in a probabilistic model in which the frequencies are continuous-valued, i.e., not restricted to a grid; and the coeffici...

  11. Bayesian long branch attraction bias and corrections.

    Science.gov (United States)

    Susko, Edward

    2015-03-01

    Previous work on the star-tree paradox has shown that Bayesian methods suffer from a long branch attraction bias. That work is extended to settings involving more taxa and partially resolved trees. The long branch attraction bias is confirmed to arise more broadly and an additional source of bias is found. A by-product of the analysis is methods that correct for biases toward particular topologies. The corrections can be easily calculated using existing Bayesian software. Posterior support for a set of two or more trees can thus be supplemented with corrected versions to cross-check or replace results. Simulations show the corrections to be highly effective.

  12. From retrodiction to Bayesian quantum imaging

    Science.gov (United States)

    Speirits, Fiona C.; Sonnleitner, Matthias; Barnett, Stephen M.

    2017-04-01

    We employ quantum retrodiction to develop a robust Bayesian algorithm for reconstructing the intensity values of an image from sparse photocount data, while also accounting for detector noise in the form of dark counts. This method yields not only a reconstructed image but also provides the full probability distribution function for the intensity at each pixel. We use simulated as well as real data to illustrate both the applications of the algorithm and the analysis options that are only available when the full probability distribution functions are known. These include calculating Bayesian credible regions for each pixel intensity, allowing an objective assessment of the reliability of the reconstructed image intensity values.
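
    A minimal sketch of the per-pixel inference described here, assuming each photocount is Poisson with mean equal to the pixel intensity plus a known dark-count rate, with a flat prior on a discretised intensity grid (all numbers are made up; this is not the authors' retrodiction algorithm):

```python
import math

def pixel_posterior(counts, dark_rate, grid):
    # Unnormalised log-posterior over pixel intensity lam, flat prior:
    # each photocount k ~ Poisson(lam + dark_rate)
    logpost = []
    for lam in grid:
        mu = lam + dark_rate
        lp = sum(k * math.log(mu) - mu - math.lgamma(k + 1) for k in counts)
        logpost.append(lp)
    m = max(logpost)                      # shift for numerical stability
    w = [math.exp(lp - m) for lp in logpost]
    z = sum(w)
    return [v / z for v in w]

def credible_interval(grid, post, level=0.95):
    # Central credible interval from the discretised posterior CDF
    alpha = (1 - level) / 2
    cdf, lo, hi = 0.0, grid[0], grid[-1]
    for lam, p in zip(grid, post):
        cdf += p
        if cdf < alpha:
            lo = lam
        if cdf <= 1 - alpha:
            hi = lam
    return lo, hi

# Four simulated photocounts for one pixel, dark-count rate 0.5
grid = [i * 0.05 for i in range(1, 400)]
post = pixel_posterior([3, 5, 4, 6], dark_rate=0.5, grid=grid)
mean = sum(l * p for l, p in zip(grid, post))
```

    Because the full distribution is kept, any summary — posterior mean, credible region, or reliability assessment — comes for free, which is the point the abstract makes.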

  13. Bayesian Optimisation Algorithm for Nurse Scheduling

    CERN Document Server

    Li, Jingpeng

    2008-01-01

    Our research has shown that schedules can be built mimicking a human scheduler by using a set of rules that involve domain knowledge. This chapter presents a Bayesian Optimization Algorithm (BOA) for the nurse scheduling problem that chooses such suitable scheduling rules from a set for each nurse's assignment. Based on the idea of using probabilistic models, the BOA builds a Bayesian network for the set of promising solutions and samples these networks to generate new candidate solutions. Computational results from 52 real data instances demonstrate the success of this approach. It is also suggested that the learning mechanism in the proposed algorithm may be suitable for other scheduling problems.

  14. Bayesian signal processing techniques for GNSS receivers: from multipath mitigation to positioning

    OpenAIRE

    2009-01-01

    This thesis revolves around the design of receivers for Global Navigation Satellite Systems (GNSS). The term GNSS refers to all navigation systems based on a constellation of satellites that emit navigation signals useful for positioning. The most popular is the American GPS, used worldwide. Europe's efforts to field a similar system will bear fruit in the near future; that system is called Galileo. Other...

  15. Bayesian penalized log-likelihood ratio approach for dose response clinical trial studies.

    Science.gov (United States)

    Tang, Yuanyuan; Cai, Chunyan; Sun, Liangrui; He, Jianghua

    2017-02-13

    In the literature, there are a few unified approaches to test proof of concept and estimate a target dose, including the multiple comparison procedure using a modeling approach, and the permutation approach proposed by Klingenberg. We discuss and compare the operating characteristics of these unified approaches and further develop an alternative approach in a Bayesian framework based on the posterior distribution of a penalized log-likelihood ratio test statistic. Our Bayesian approach is much more flexible in handling linear or nonlinear dose-response relationships and is more efficient than the permutation approach. The operating characteristics of our Bayesian approach are comparable to, and sometimes better than, both approaches across a wide range of dose-response relationships. It yields credible intervals as well as a predictive distribution for the response rate at a specific dose level for target dose estimation. Our Bayesian approach can be easily extended to continuous, categorical, and time-to-event responses. We illustrate the performance of our proposed method with extensive simulations and Phase II clinical trial data examples.

  16. From arguments to constraints on a Bayesian network

    NARCIS (Netherlands)

    Bex, F.J.; Renooij, S.

    2016-01-01

    In this paper, we propose a way to derive constraints for a Bayesian Network from structured arguments. Argumentation and Bayesian networks can both be considered decision support techniques, but are typically used by experts with different backgrounds. Bayesian network experts have the mathematical

  17. A default Bayesian hypothesis test for ANOVA designs

    NARCIS (Netherlands)

    Wetzels, R.; Grasman, R.P.P.P.; Wagenmakers, E.J.

    2012-01-01

    This article presents a Bayesian hypothesis test for analysis of variance (ANOVA) designs. The test is an application of standard Bayesian methods for variable selection in regression models. We illustrate the effect of various g-priors on the ANOVA hypothesis test. The Bayesian test for ANOVA desig

  18. A Fast Iterative Bayesian Inference Algorithm for Sparse Channel Estimation

    DEFF Research Database (Denmark)

    Pedersen, Niels Lovmand; Manchón, Carles Navarro; Fleury, Bernard Henri

    2013-01-01

    representation of the Bessel K probability density function; a highly efficient, fast iterative Bayesian inference method is then applied to the proposed model. The resulting estimator outperforms other state-of-the-art Bayesian and non-Bayesian estimators, either by yielding lower mean squared estimation error...

  19. A Gentle Introduction to Bayesian Analysis : Applications to Developmental Research

    NARCIS (Netherlands)

    Van de Schoot, Rens; Kaplan, David; Denissen, Jaap; Asendorpf, Jens B.; Neyer, Franz J.; van Aken, Marcel A G

    2014-01-01

    Bayesian statistical methods are becoming ever more popular in applied and fundamental research. In this study a gentle introduction to Bayesian analysis is provided. It is shown under what circumstances it is attractive to use Bayesian estimation, and how to interpret properly the results. First, t

  20. Bayesian Just-So Stories in Psychology and Neuroscience

    Science.gov (United States)

    Bowers, Jeffrey S.; Davis, Colin J.

    2012-01-01

    According to Bayesian theories in psychology and neuroscience, minds and brains are (near) optimal in solving a wide range of tasks. We challenge this view and argue that more traditional, non-Bayesian approaches are more promising. We make 3 main arguments. First, we show that the empirical evidence for Bayesian theories in psychology is weak.…

  1. A SAS Interface for Bayesian Analysis with WinBUGS

    Science.gov (United States)

    Zhang, Zhiyong; McArdle, John J.; Wang, Lijuan; Hamagami, Fumiaki

    2008-01-01

    Bayesian methods are becoming very popular despite some practical difficulties in implementation. To assist in the practical application of Bayesian methods, we show how to implement Bayesian analysis with WinBUGS as part of a standard set of SAS routines. This implementation procedure is first illustrated by fitting a multiple regression model…

  2. Prior approval: the growth of Bayesian methods in psychology.

    Science.gov (United States)

    Andrews, Mark; Baguley, Thom

    2013-02-01

    Within the last few years, Bayesian methods of data analysis in psychology have proliferated. In this paper, we briefly review the history of the Bayesian approach to statistics, and consider the implications that Bayesian methods have for the theory and practice of data analysis in psychology.

  3. Inferring on the intentions of others by hierarchical Bayesian learning.

    Directory of Open Access Journals (Sweden)

    Andreea O Diaconescu

    2014-09-01

    Full Text Available Inferring on others' (potentially time-varying) intentions is a fundamental problem during many social transactions. To investigate the underlying mechanisms, we applied computational modeling to behavioral data from an economic game in which 16 pairs of volunteers (randomly assigned to "player" or "adviser" roles) interacted. The player performed a probabilistic reinforcement learning task, receiving information about a binary lottery from a visual pie chart. The adviser, who received more predictive information, issued an additional recommendation. Critically, the game was structured such that the adviser's incentives to provide helpful or misleading information varied in time. Using a meta-Bayesian modeling framework, we found that the players' behavior was best explained by the deployment of hierarchical learning: they inferred upon the volatility of the advisers' intentions in order to optimize their predictions about the validity of their advice. Beyond learning, volatility estimates also affected the trial-by-trial variability of decisions: participants were more likely to rely on their estimates of advice accuracy for making choices when they believed that the adviser's intentions were presently stable. Finally, our model of the players' inference predicted the players' interpersonal reactivity index (IRI) scores, explicit ratings of the advisers' helpfulness and the advisers' self-reports on their chosen strategy. Overall, our results suggest that humans (i) employ hierarchical generative models to infer on the changing intentions of others, (ii) use volatility estimates to inform decision-making in social interactions, and (iii) integrate estimates of advice accuracy with non-social sources of information. The Bayesian framework presented here can quantify individual differences in these mechanisms from simple behavioral readouts and may prove useful in future clinical studies of maladaptive social cognition.

  4. Inferring on the intentions of others by hierarchical Bayesian learning.

    Science.gov (United States)

    Diaconescu, Andreea O; Mathys, Christoph; Weber, Lilian A E; Daunizeau, Jean; Kasper, Lars; Lomakina, Ekaterina I; Fehr, Ernst; Stephan, Klaas E

    2014-09-01

    Inferring on others' (potentially time-varying) intentions is a fundamental problem during many social transactions. To investigate the underlying mechanisms, we applied computational modeling to behavioral data from an economic game in which 16 pairs of volunteers (randomly assigned to "player" or "adviser" roles) interacted. The player performed a probabilistic reinforcement learning task, receiving information about a binary lottery from a visual pie chart. The adviser, who received more predictive information, issued an additional recommendation. Critically, the game was structured such that the adviser's incentives to provide helpful or misleading information varied in time. Using a meta-Bayesian modeling framework, we found that the players' behavior was best explained by the deployment of hierarchical learning: they inferred upon the volatility of the advisers' intentions in order to optimize their predictions about the validity of their advice. Beyond learning, volatility estimates also affected the trial-by-trial variability of decisions: participants were more likely to rely on their estimates of advice accuracy for making choices when they believed that the adviser's intentions were presently stable. Finally, our model of the players' inference predicted the players' interpersonal reactivity index (IRI) scores, explicit ratings of the advisers' helpfulness and the advisers' self-reports on their chosen strategy. Overall, our results suggest that humans (i) employ hierarchical generative models to infer on the changing intentions of others, (ii) use volatility estimates to inform decision-making in social interactions, and (iii) integrate estimates of advice accuracy with non-social sources of information. The Bayesian framework presented here can quantify individual differences in these mechanisms from simple behavioral readouts and may prove useful in future clinical studies of maladaptive social cognition.

  5. Bayesian Inference for Time Trends in Parameter Values: Case Study for the Ageing PSA Network of the European Commission

    Energy Technology Data Exchange (ETDEWEB)

    Dana L. Kelly; Albert Malkhasyan

    2010-06-01

    There is a nearly ubiquitous assumption in PSA that parameter values are at least piecewise-constant in time. As a result, Bayesian inference tends to incorporate many years of plant operation, over which there have been significant changes in plant operational and maintenance practices, plant management, etc. These changes can cause significant changes in parameter values over time; however, failure to perform Bayesian inference in the proper time-dependent framework can mask these changes. Failure to question the assumption of constant parameter values, and failure to perform Bayesian inference in the proper time-dependent framework were noted as important issues in NUREG/CR-6813, performed for the U. S. Nuclear Regulatory Commission’s Advisory Committee on Reactor Safeguards in 2003. That report noted that “industry lacks tools to perform time-trend analysis with Bayesian updating.” This paper describes an application of time-dependent Bayesian inference methods developed for the European Commission Ageing PSA Network. These methods utilize open-source software, implementing Markov chain Monte Carlo sampling. The paper also illustrates the development of a generic prior distribution, which incorporates multiple sources of generic data via weighting factors that address differences in key influences, such as vendor, component boundaries, conditions of the operating environment, etc.

  6. Message-Passing Receiver for OFDM Systems over Highly Delay-Dispersive Channels

    DEFF Research Database (Denmark)

    Barbu, Oana-Elena; Manchón, Carles Navarro; Rom, Christian

    2017-01-01

    Propagation channels with maximum excess delay exceeding the duration of the cyclic prefix (CP) in OFDM systems cause intercarrier and intersymbol interference which, unless accounted for, degrade the receiver performance. Using tools from Bayesian inference and sparse signal reconstruction, we...... and future wireless communications systems. By enabling the OFDM receiver experiencing these harsh conditions to locally cancel the interference, our design circumvents the spectral efficiency loss incurred by extending the CP duration, otherwise a straightforward solution. Furthermore, it sets the premises...

  7. Digital Receiver Phase Meter

    Science.gov (United States)

    Marcin, Martin; Abramovici, Alexander

    2008-01-01

    The software of a commercially available digital radio receiver has been modified to make the receiver function as a two-channel low-noise phase meter. This phase meter is a prototype in the continuing development of a phase meter for a system in which radiofrequency (RF) signals in the two channels would be outputs of a spaceborne heterodyne laser interferometer for detecting gravitational waves. The frequencies of the signals could include a common Doppler-shift component of as much as 15 MHz. The phase meter is required to measure the relative phases of the signals in the two channels at a sampling rate of 10 Hz at a specified root power spectral density. In the digital receiver, the input RF signal is first fed to the input terminal of an analog-to-digital converter (ADC). To prevent aliasing errors in the ADC, the sampling rate must be at least twice the input signal frequency. The sampling rate of the ADC is governed by a sampling clock, which also drives a digital local oscillator (DLO), which is a direct digital frequency synthesizer. The DLO produces samples of sine and cosine signals at a programmed tuning frequency. The sine and cosine samples are mixed with (that is, multiplied by) the samples from the ADC, then low-pass filtered to obtain in-phase (I) and quadrature (Q) signal components. A digital signal processor (DSP) computes the ratio between the Q and I components, computes the phase of the RF signal (relative to that of the DLO signal) as the arctangent of this ratio, and then averages successive such phase values over a time interval specified by the user.
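
    The I/Q mixing and arctangent step can be illustrated with a toy single-channel version (a pure-Python sketch with made-up frequencies; the real instrument additionally low-pass filters and averages phase values at 10 Hz):

```python
import math

def measure_phase(samples, fs, f_lo):
    # Mix the input with a digital local oscillator at f_lo, accumulate
    # (i.e., crudely low-pass) the products to get I and Q, then take
    # the arctangent of Q/I. Illustrative sketch, not the flight code.
    i_acc = q_acc = 0.0
    for n, x in enumerate(samples):
        t = n / fs
        i_acc += x * math.cos(2 * math.pi * f_lo * t)
        q_acc += x * -math.sin(2 * math.pi * f_lo * t)
    return math.atan2(q_acc, i_acc)

fs, f_lo = 1_000_000, 50_000              # 1 MHz sampling, 50 kHz tone
true_phase = 0.7                          # radians, arbitrary test value
sig = [math.cos(2 * math.pi * f_lo * n / fs + true_phase)
       for n in range(20_000)]            # a whole number of cycles
est = measure_phase(sig, fs, f_lo)
```

    Averaging over a whole number of cycles cancels the double-frequency mixing products, so the recovered phase matches the phase of the input tone relative to the local oscillator.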

  8. A handbook for solar central receiver design

    Energy Technology Data Exchange (ETDEWEB)

    Falcone, P.K.

    1986-12-01

    This Handbook describes central receiver technology for solar thermal power plants. It contains a description and assessment of the major components in a central receiver system configured for utility-scale production of electricity using Rankine-cycle steam turbines. It also describes procedures to size and optimize a plant and discusses examples from recent system analyses. Information concerning site selection criteria, cost estimation, construction, and operation and maintenance is also included, which should enable readers to perform design analyses for specific applications.

  9. Pressure difference receiving ears

    DEFF Research Database (Denmark)

    Michelsen, Axel; Larsen, Ole Næsbye

    2007-01-01

    of such pressure difference receiving ears have been hampered by a lack of suitable experimental methods. In this review, we survey the methods for collecting reliable data on the binaural directional cues at the eardrums, on how the eardrum vibrations depend on the direction of sound incidence, and on how sound waves behave in the air spaces leading to the interior surfaces of the eardrums. A linear mathematical model with well-defined inputs is used for exploring how the directionality varies with the binaural directional cues and the amplitude and phase gain of the sound pathway to the inner surface

  10. Sparse kernel learning with LASSO and Bayesian inference algorithm.

    Science.gov (United States)

    Gao, Junbin; Kwan, Paul W; Shi, Daming

    2010-03-01

    Kernelized LASSO (Least Absolute Selection and Shrinkage Operator) has been investigated in two separate recent papers [Gao, J., Antolovich, M., & Kwan, P. H. (2008). L1 LASSO and its Bayesian inference. In W. Wobcke, & M. Zhang (Eds.), Lecture notes in computer science: Vol. 5360 (pp. 318-324); Wang, G., Yeung, D. Y., & Lochovsky, F. (2007). The kernel path in kernelized LASSO. In International conference on artificial intelligence and statistics (pp. 580-587). San Juan, Puerto Rico: MIT Press]. This paper is concerned with learning kernels under the LASSO formulation via adopting a generative Bayesian learning and inference approach. A new robust learning algorithm is proposed which produces a sparse kernel model with the capability of learning regularized parameters and kernel hyperparameters. A comparison with state-of-the-art methods for constructing sparse regression models such as the relevance vector machine (RVM) and the local regularization assisted orthogonal least squares regression (LROLS) is given. The new algorithm is also demonstrated to possess considerable computational advantages.
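
    For readers unfamiliar with the LASSO penalty underlying these kernelized variants, here is a plain coordinate-descent sketch on raw features (toy data; this is not the generative Bayesian kernel learner proposed in the paper):

```python
def soft_threshold(z, t):
    # Proximal operator of the L1 penalty: shrinks z toward zero by t
    if z > t:
        return z - t
    if z < -t:
        return z + t
    return 0.0

def lasso_cd(X, y, lam, iters=200):
    # Coordinate-descent LASSO: minimise 0.5*||y - Xw||^2 + lam*||w||_1
    n, p = len(X), len(X[0])
    w = [0.0] * p
    for _ in range(iters):
        for j in range(p):
            # Partial residual excluding feature j
            r = [y[i] - sum(w[k] * X[i][k] for k in range(p) if k != j)
                 for i in range(n)]
            rho = sum(X[i][j] * r[i] for i in range(n))
            norm = sum(X[i][j] ** 2 for i in range(n))
            w[j] = soft_threshold(rho, lam) / norm
    return w

X = [[1, 1], [2, -1], [3, 1], [4, -1]]
y = [2, 4, 6, 8]                 # depends only on the first feature
w = lasso_cd(X, y, lam=0.5)
```

    The soft-thresholding step is what drives irrelevant coefficients exactly to zero, producing the sparsity that the kernelized formulations inherit.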

  11. A Bayesian subgroup analysis using collections of ANOVA models.

    Science.gov (United States)

    Liu, Jinzhong; Sivaganesan, Siva; Laud, Purushottam W; Müller, Peter

    2017-03-20

    We develop a Bayesian approach to subgroup analysis using ANOVA models with multiple covariates, extending an earlier work. We assume a two-arm clinical trial with normally distributed response variable. We also assume that the covariates for subgroup finding are categorical and are a priori specified, and parsimonious easy-to-interpret subgroups are preferable. We represent the subgroups of interest by a collection of models and use a model selection approach to finding subgroups with heterogeneous effects. We develop suitable priors for the model space and use an objective Bayesian approach that yields multiplicity adjusted posterior probabilities for the models. We use a structured algorithm based on the posterior probabilities of the models to determine which subgroup effects to report. Frequentist operating characteristics of the approach are evaluated using simulation. While our approach is applicable in more general cases, we mainly focus on the 2 × 2 case of two covariates each at two levels for ease of presentation. The approach is illustrated using a real data example.

  12. Bayesian Approach for Reliability Assessment of Sunshield Deployment on JWST

    Science.gov (United States)

    Kaminskiy, Mark P.; Evans, John W.; Gallo, Luis D.

    2013-01-01

    Deployable subsystems are essential to mission success of most spacecraft. These subsystems enable critical functions including power, communications and thermal control. The loss of any of these functions will generally result in loss of the mission. These subsystems and their components often consist of unique designs and applications, for which various standardized data sources are not applicable for estimating reliability and for assessing risks. In this study, a Bayesian approach for reliability estimation of spacecraft deployment was developed for this purpose. This approach was then applied to the James Webb Space Telescope (JWST) Sunshield subsystem, a unique design intended for thermal control of the observatory's telescope and science instruments. In order to collect the prior information on deployable systems, detailed studies of "heritage information" were conducted extending over 45 years of spacecraft launches. The NASA Goddard Space Flight Center (GSFC) Spacecraft Operational Anomaly and Reporting System (SOARS) data were then used to estimate the parameters of the conjugate beta prior distribution for anomaly and failure occurrence, as the most consistent set of available data that could be matched to launch histories. This allows for an empirical Bayesian prediction of the risk of an anomaly occurrence during the complex Sunshield deployment, with credibility limits, using prior deployment data and test information.
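
    The conjugate beta update at the heart of such an approach can be sketched as follows (the prior and the heritage counts below are hypothetical placeholders, not the JWST figures):

```python
import math

def beta_posterior(alpha0, beta0, failures, trials):
    # Conjugate update: Beta prior on the per-deployment anomaly
    # probability, binomial heritage data
    return alpha0 + failures, beta0 + trials - failures

def beta_mean(a, b):
    return a / (a + b)

def beta_quantile(a, b, q, steps=20000):
    # Crude grid integration of the Beta(a, b) CDF to read off a
    # credibility limit (fine for a sketch)
    h = 1.0 / steps
    log_norm = math.lgamma(a + b) - math.lgamma(a) - math.lgamma(b)
    norm = math.exp(log_norm)
    cdf = 0.0
    for i in range(1, steps):
        x = i * h
        cdf += norm * x ** (a - 1) * (1 - x) ** (b - 1) * h
        if cdf >= q:
            return x
    return 1.0

# Hypothetical heritage data: 4 anomalies in 120 deployments,
# combined with a weakly informative Beta(0.5, 0.5) prior
a, b = beta_posterior(0.5, 0.5, 4, 120)
mean = beta_mean(a, b)            # posterior mean anomaly probability
upper95 = beta_quantile(a, b, 0.95)
```

    The posterior mean gives the risk estimate, and the upper quantile plays the role of the credibility limit mentioned in the abstract.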

  13. MODELING INFORMATION SYSTEM AVAILABILITY BY USING BAYESIAN BELIEF NETWORK APPROACH

    Directory of Open Access Journals (Sweden)

    Semir Ibrahimović

    2016-03-01

    Full Text Available Modern information systems are expected to be always-on, providing services to end-users regardless of time and location. This is particularly important for organizations and industries where information systems support real-time operations and mission-critical applications that need to be available on a 24/7/365 basis. Examples of such entities include process industries, telecommunications, healthcare, energy, banking, electronic commerce and a variety of cloud services. This article presents a modified Bayesian Belief Network model for predicting information system availability, introduced initially by Franke, U. and Johnson, P. (in the article "Availability of enterprise IT systems – an expert based Bayesian model", Software Quality Journal 20(2), 369-394, 2012). Based on a thorough review of several dimensions of information system availability, we proposed a modified set of determinants. The model is parameterized by using a probability elicitation process with the participation of experts from the financial sector of Bosnia and Herzegovina. The model validation was performed using Monte Carlo simulation.

  14. Bayesian analysis of censored response data in family-based genetic association studies.

    Science.gov (United States)

    Del Greco M, Fabiola; Pattaro, Cristian; Minelli, Cosetta; Thompson, John R

    2016-09-01

    Biomarkers are subject to censoring whenever some measurements are not quantifiable given a laboratory detection limit. Methods for handling censoring have received less attention in genetic epidemiology, and censored data are still often replaced with a fixed value. We compared different strategies for handling a left-censored continuous biomarker in a family-based study, where the biomarker is tested for association with a genetic variant, S, adjusting for a covariate, X. Allowing different correlations between X and S, we compared simple substitution of censored observations with the detection limit followed by a linear mixed effect model (LMM), a Bayesian model with noninformative priors, a Tobit model with robust standard errors, and multiple imputation (MI) with and without S in the imputation, followed by an LMM. Our comparison was based on real and simulated data in which 20% and 40% censoring were artificially induced. The complete data were also analyzed with an LMM. In the MICROS study, the Bayesian model gave results closer to those obtained with the complete data. In the simulations, simple substitution was always the most biased method, the Tobit approach gave the least biased estimates at all censoring levels and correlation values, and the Bayesian model and both MI approaches gave slightly biased estimates but smaller root mean square errors. On the basis of these results, the Bayesian approach is highly recommended for candidate gene studies; however, the computationally simpler Tobit and the MI without S are both good options for genome-wide studies.
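
    The bias of simple substitution is easy to reproduce in a toy simulation (a hypothetical detection limit and sample size; this is not the MICROS data):

```python
import random
import statistics

random.seed(1)
true_mu, sigma, lod = 0.0, 1.0, -0.5   # hypothetical biomarker scale

# Simulate the biomarker; values below the detection limit are censored
data = [random.gauss(true_mu, sigma) for _ in range(20000)]
censored_frac = sum(x < lod for x in data) / len(data)

# Simple substitution: replace each censored value with the limit itself
substituted = [x if x >= lod else lod for x in data]
bias = statistics.mean(substituted) - true_mu
```

    With roughly 30% censoring, the substituted mean overshoots the true mean by about 0.2 standard deviations in this setup, which is consistent with the abstract's finding that substitution is the most biased strategy.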

  15. Neural network classification - A Bayesian interpretation

    Science.gov (United States)

    Wan, Eric A.

    1990-01-01

    The relationship between minimizing a mean squared error and finding the optimal Bayesian classifier is reviewed. This provides a theoretical interpretation for the process by which neural networks are used in classification. A number of confidence measures are proposed to evaluate the performance of the neural network classifier within a statistical framework.

  16. On local optima in learning bayesian networks

    DEFF Research Database (Denmark)

    Dalgaard, Jens; Kocka, Tomas; Pena, Jose

    2003-01-01

    This paper proposes and evaluates the k-greedy equivalence search algorithm (KES) for learning Bayesian networks (BNs) from complete data. The main characteristic of KES is that it allows a trade-off between greediness and randomness, thus exploring different good local optima. When greediness...

  17. Automatic Thesaurus Construction Using Bayesian Networks.

    Science.gov (United States)

    Park, Young C.; Choi, Key-Sun

    1996-01-01

    Discusses automatic thesaurus construction and characterizes the statistical behavior of terms by using an inference network. Highlights include low-frequency terms and data sparseness, Bayesian networks, collocation maps and term similarity, constructing a thesaurus from a collocation map, and experiments with test collections. (Author/LRW)

  18. Diagnosis of Subtraction Bugs Using Bayesian Networks

    Science.gov (United States)

    Lee, Jihyun; Corter, James E.

    2011-01-01

    Diagnosis of misconceptions or "bugs" in procedural skills is difficult because of their unstable nature. This study addresses this problem by proposing and evaluating a probability-based approach to the diagnosis of bugs in children's multicolumn subtraction performance using Bayesian networks. This approach assumes a causal network relating…

  19. Face detection by aggregated Bayesian network classifiers

    NARCIS (Netherlands)

    Pham, T.V.; Worring, M.; Smeulders, A.W.M.

    2002-01-01

    A face detection system is presented. A new classification method using forest-structured Bayesian networks is used. The method is used in an aggregated classifier to discriminate face from non-face patterns. The process of generating non-face patterns is integrated with the construction of the aggregated classifier…

  20. Bayesian Estimation of Thermonuclear Reaction Rates

    CERN Document Server

    Iliadis, Christian; Coc, Alain; Timmes, Frank; Starrfield, Sumner

    2016-01-01

    The problem of estimating non-resonant astrophysical S-factors and thermonuclear reaction rates, based on measured nuclear cross sections, is of major interest for nuclear energy generation, neutrino physics, and element synthesis. Many different methods have been applied in the past to this problem, all of them based on traditional statistics. Bayesian methods, on the other hand, are now in widespread use in the physical sciences. In astronomy, for example, Bayesian statistics is applied to the observation of extra-solar planets, gravitational waves, and type Ia supernovae. However, nuclear physics, in particular, has been slow to adopt Bayesian methods. We present the first astrophysical S-factors and reaction rates based on Bayesian statistics. We develop a framework that incorporates robust parameter estimation, systematic effects, and non-Gaussian uncertainties in a consistent manner. The method is applied to the d(p,$\\gamma$)$^3$He, $^3$He($^3$He,2p)$^4$He, and $^3$He($\\alpha$,$\\gamma$)$^7$Be reactions,...

  1. Bayesian Estimation of Thermonuclear Reaction Rates

    Science.gov (United States)

    Iliadis, C.; Anderson, K. S.; Coc, A.; Timmes, F. X.; Starrfield, S.

    2016-11-01

    The problem of estimating non-resonant astrophysical S-factors and thermonuclear reaction rates, based on measured nuclear cross sections, is of major interest for nuclear energy generation, neutrino physics, and element synthesis. Many different methods have been applied to this problem in the past, almost all of them based on traditional statistics. Bayesian methods, on the other hand, are now in widespread use in the physical sciences. In astronomy, for example, Bayesian statistics is applied to the observation of extrasolar planets, gravitational waves, and Type Ia supernovae. However, nuclear physics, in particular, has been slow to adopt Bayesian methods. We present astrophysical S-factors and reaction rates based on Bayesian statistics. We develop a framework that incorporates robust parameter estimation, systematic effects, and non-Gaussian uncertainties in a consistent manner. The method is applied to the reactions d(p,γ)3He, 3He(3He,2p)4He, and 3He(α,γ)7Be, important for deuterium burning, solar neutrinos, and Big Bang nucleosynthesis.

  2. Non-Linear Approximation of Bayesian Update

    KAUST Repository

    Litvinenko, Alexander

    2016-06-23

    We develop a non-linear approximation of the expensive Bayesian update formula. This non-linear approximation is applied directly to the polynomial chaos coefficients. In this way, we avoid Monte Carlo sampling and its sampling error. We can show that the famous Kalman update formula is a particular case of this update.
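
    The Kalman update mentioned above is the best *linear* approximation of the conditional expectation E[parameter | observation]. A scalar Gaussian sketch (with illustrative numbers, not the paper's polynomial chaos formulation) shows the linear-MMSE gain recovering the exact Gaussian posterior mean:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n = 1_000_000
    theta = rng.normal(0.0, 2.0, n)          # prior samples of the parameter
    y = theta + rng.normal(0.0, 1.0, n)      # observation model: y = theta + noise

    # Linear MMSE gain K = Cov(theta, y) / Var(y) -- the Kalman gain.
    K = np.cov(theta, y)[0, 1] / np.var(y)
    y_obs = 3.0
    theta_updated = theta.mean() + K * (y_obs - y.mean())

    # Exact Gaussian posterior mean for comparison:
    # prior var 4, noise var 1  ->  posterior mean = 4 / (4 + 1) * y_obs = 2.4
    print(K, theta_updated)
    ```

    Because the example is jointly Gaussian, the linear update is exact here; the paper's point is that a non-linear update is needed once that assumption fails.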

  3. Inverse Problems in a Bayesian Setting

    KAUST Repository

    Matthies, Hermann G.

    2016-02-13

    In a Bayesian setting, inverse problems and uncertainty quantification (UQ)—the propagation of uncertainty through a computational (forward) model—are strongly connected. In the form of conditional expectation the Bayesian update becomes computationally attractive. We give a detailed account of this approach via conditional approximation, various approximations, and the construction of filters. Together with a functional or spectral approach for the forward UQ there is no need for time-consuming and slowly convergent Monte Carlo sampling. The developed sampling-free non-linear Bayesian update in form of a filter is derived from the variational problem associated with conditional expectation. This formulation in general calls for further discretisation to make the computation possible, and we choose a polynomial approximation. After giving details on the actual computation in the framework of functional or spectral approximations, we demonstrate the workings of the algorithm on a number of examples of increasing complexity. At last, we compare the linear and nonlinear Bayesian update in form of a filter on some examples.

  4. Bayesian Analyses of Nonhomogeneous Autoregressive Processes

    Science.gov (United States)

    1986-09-01

    Random coefficient autoregressive processes have a wide applicability in the analysis of economic, sociological, biological and industrial data…

  5. Most frugal explanations in Bayesian networks

    NARCIS (Netherlands)

    Kwisthout, J.H.P.

    2015-01-01

    Inferring the most probable explanation to a set of variables, given a partial observation of the remaining variables, is one of the canonical computational problems in Bayesian networks, with widespread applications in AI and beyond. This problem, known as MAP, is computationally intractable (NP-hard)…

  6. Bayesian Averaging is Well-Temperated

    DEFF Research Database (Denmark)

    Hansen, Lars Kai

    2000-01-01

    Bayesian predictions are stochastic, just like the predictions of any other inference scheme that generalizes from a finite sample. While a simple variational argument shows that Bayes averaging is generalization-optimal given that the prior matches the teacher parameter distribution, the situation…

  7. Bayesian regularization of diffusion tensor images

    DEFF Research Database (Denmark)

    Frandsen, Jesper; Hobolth, Asger; Østergaard, Leif;

    2007-01-01

    …several directions. The measured diffusion coefficients and thereby the diffusion tensors are subject to noise, leading to possibly flawed representations of the three dimensional fibre bundles. In this paper we develop a Bayesian procedure for regularizing the diffusion tensor field, fully utilizing…

  8. Incremental Bayesian Category Learning from Natural Language

    Science.gov (United States)

    Frermann, Lea; Lapata, Mirella

    2016-01-01

    Models of category learning have been extensively studied in cognitive science and primarily tested on perceptual abstractions or artificial stimuli. In this paper, we focus on categories acquired from natural language stimuli, that is, words (e.g., "chair" is a member of the furniture category). We present a Bayesian model that, unlike…

  9. Multisnapshot Sparse Bayesian Learning for DOA

    DEFF Research Database (Denmark)

    Gerstoft, Peter; Mecklenbrauker, Christoph F.; Xenaki, Angeliki

    2016-01-01

    The directions of arrival (DOA) of plane waves are estimated from multisnapshot sensor array data using sparse Bayesian learning (SBL). The prior for the source amplitudes is assumed independent zero-mean complex Gaussian distributed with hyperparameters, the unknown variances (i.e., the source powers). … The method is discussed and evaluated competitively against LASSO (l1-regularization), conventional beamforming, and MUSIC.
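
    One of the baselines the abstract names — conventional (Bartlett) beamforming on multisnapshot data — can be sketched directly. The array geometry (10-element half-wavelength ULA), source angle, and noise level below are illustrative assumptions, not the paper's setup.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    m, snapshots = 10, 50          # sensors in a half-wavelength ULA, snapshots
    true_doa = 20.0                # source direction in degrees

    def steering(deg):
        # Phase shift pi * k * sin(theta) per sensor for half-wavelength spacing.
        return np.exp(1j * np.pi * np.arange(m) * np.sin(np.radians(deg)))

    a = steering(true_doa)
    s = rng.normal(size=snapshots) + 1j * rng.normal(size=snapshots)
    noise = 0.1 * (rng.normal(size=(m, snapshots))
                   + 1j * rng.normal(size=(m, snapshots)))
    Y = np.outer(a, s) + noise                 # multisnapshot array data
    R = Y @ Y.conj().T / snapshots             # sample covariance matrix

    # Scan the Bartlett spectrum a(theta)^H R a(theta) over a DOA grid.
    grid = np.arange(-90, 90.5, 0.5)
    power = [np.real(steering(g).conj() @ R @ steering(g)) for g in grid]
    est = grid[int(np.argmax(power))]
    print("estimated DOA:", est)
    ```

    SBL improves on this baseline mainly for closely spaced or correlated sources, where the broad Bartlett beam fails to resolve separate peaks.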

  10. Bayesian calibration of car-following models

    NARCIS (Netherlands)

    Van Hinsbergen, C.P.IJ.; Van Lint, H.W.C.; Hoogendoorn, S.P.; Van Zuylen, H.J.

    2010-01-01

    Recent research has revealed that there exist large inter-driver differences in car-following behavior such that different car-following models may apply to different drivers. This study applies Bayesian techniques to the calibration of car-following models, where prior distributions on each model parameter…

  11. Decision generation tools and Bayesian inference

    Science.gov (United States)

    Jannson, Tomasz; Wang, Wenjian; Forrester, Thomas; Kostrzewski, Andrew; Veeris, Christian; Nielsen, Thomas

    2014-05-01

    Digital Decision Generation (DDG) tools are important software sub-systems of Command and Control (C2) systems and technologies. In this paper, we present a special type of DDGs based on Bayesian Inference, related to adverse (hostile) networks, including such important applications as terrorism-related networks and organized crime ones.

  12. Bayesian Networks: Aspects of Approximate Inference

    NARCIS (Netherlands)

    Bolt, J.H.

    2008-01-01

    A Bayesian network can be used to concisely model the probabilistic knowledge with respect to a given problem domain. Such a network consists of an acyclic directed graph in which the nodes represent stochastic variables, supplemented with probabilities indicating the strength of the influences between…

  13. Communication cost in Distributed Bayesian Belief Networks

    NARCIS (Netherlands)

    Gosliga, S.P. van; Maris, M.G.

    2005-01-01

    In this paper, two different methods for information fusion are compared with respect to communication cost. These are the lambda-pi and the junction tree approaches, used as the probability computing methods in Bayesian networks. The analysis is done within the scope of large distributed networks of computing…

  14. Bayesian Benefits for the Pragmatic Researcher

    NARCIS (Netherlands)

    Wagenmakers, E.-J.; Morey, R.D.; Lee, M.D.

    2016-01-01

    The practical advantages of Bayesian inference are demonstrated here through two concrete examples. In the first example, we wish to learn about a criminal’s IQ: a problem of parameter estimation. In the second example, we wish to quantify and track support in favor of the null hypothesis that Adam…

  15. Bayesian Vector Autoregressions with Stochastic Volatility

    NARCIS (Netherlands)

    Uhlig, H.F.H.V.S.

    1996-01-01

    This paper proposes a Bayesian approach to a vector autoregression with stochastic volatility, where the multiplicative evolution of the precision matrix is driven by a multivariate beta variate. Exact updating formulas are given for the nonlinear filtering of the precision matrix. Estimation of the autoregressive…

  16. Bayesian Estimation Supersedes the "t" Test

    Science.gov (United States)

    Kruschke, John K.

    2013-01-01

    Bayesian estimation for 2 groups provides complete distributions of credible values for the effect size, group means and their difference, standard deviations and their difference, and the normality of the data. The method handles outliers. The decision rule can accept the null value (unlike traditional "t" tests) when certainty in the estimate is…
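
    A minimal version of the two-group comparison described above can be sketched without MCMC. Unlike the paper's full model (t likelihood, outlier handling), this assumes normal likelihoods and flat priors, under which the posterior of each group mean is a scaled-and-shifted Student t; the simulated data are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    g1 = rng.normal(0.0, 1.0, 60)
    g2 = rng.normal(0.6, 1.0, 60)

    def posterior_mean_draws(x, size=100_000, rng=rng):
        n = len(x)
        # Under flat priors: mu | data ~ xbar + (s / sqrt(n)) * t_{n-1}
        return x.mean() + x.std(ddof=1) / np.sqrt(n) * rng.standard_t(n - 1, size)

    # Full posterior distribution of the difference in group means.
    diff = posterior_mean_draws(g2) - posterior_mean_draws(g1)
    lo, hi = np.percentile(diff, [2.5, 97.5])
    print(f"95% credible interval for the mean difference: [{lo:.2f}, {hi:.2f}]")
    ```

    The payoff the abstract emphasizes is visible even here: instead of a single p-value, the analyst gets a full distribution of credible values for the difference.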

  17. Bayesian Meta-Analysis of Coefficient Alpha

    Science.gov (United States)

    Brannick, Michael T.; Zhang, Nanhua

    2013-01-01

    The current paper describes and illustrates a Bayesian approach to the meta-analysis of coefficient alpha. Alpha is the most commonly used estimate of the reliability or consistency (freedom from measurement error) for educational and psychological measures. The conventional approach to meta-analysis uses inverse variance weights to combine…
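
    The conventional approach the abstract contrasts with — combining per-study estimates using inverse variance weights — is a one-liner worth seeing. The per-study alpha values and variances below are illustrative numbers, not data from the paper.

    ```python
    import numpy as np

    alphas = np.array([0.82, 0.78, 0.88, 0.85])       # per-study coefficient alpha
    variances = np.array([0.0010, 0.0025, 0.0008, 0.0015])

    # Fixed-effect pooling: weight each study by the inverse of its variance.
    weights = 1.0 / variances
    pooled = np.sum(weights * alphas) / np.sum(weights)
    pooled_se = np.sqrt(1.0 / np.sum(weights))
    print(f"pooled alpha: {pooled:.3f} (SE {pooled_se:.4f})")
    ```

    A Bayesian meta-analysis replaces this point estimate with a full posterior for the mean alpha and for between-study heterogeneity.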

  18. Comprehension and computation in Bayesian problem solving

    Directory of Open Access Journals (Sweden)

    Eric D. Johnson

    2015-07-01

    Full Text Available Humans have long been characterized as poor probabilistic reasoners when presented with explicit numerical information. Bayesian word problems provide a well-known example of this, where even highly educated and cognitively skilled individuals fail to adhere to mathematical norms. It is widely agreed that natural frequencies can facilitate Bayesian reasoning relative to normalized formats (e.g., probabilities, percentages), both by clarifying logical set-subset relations and by simplifying numerical calculations. Nevertheless, between-study performance on transparent Bayesian problems varies widely, and generally remains rather unimpressive. We suggest there has been an over-focus on this representational facilitator (i.e., transparent problem structures) at the expense of the specific logical and numerical processing requirements, and of the corresponding individual abilities and skills necessary for producing Bayesian-like output given specific verbal and numerical input. We further suggest that understanding this task-individual pairing could benefit from considerations from the literature on mathematical cognition, which emphasizes text comprehension and problem solving, along with contributions of online executive working memory, metacognitive regulation, and relevant stored knowledge and skills. We conclude by offering avenues for future research aimed at identifying the stages in problem solving at which correct and incorrect reasoners depart, and how individual differences might influence this time point.
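
    The probability-format versus natural-frequency contrast the abstract discusses is easy to make concrete. This sketch uses the classic mammography-style numbers (an illustrative choice, not taken from the paper) and computes the same posterior both ways:

    ```python
    # Probability format: base rate P(H), hit rate P(D|H), false-alarm rate P(D|~H).
    p_h, p_d_h, p_d_nh = 0.01, 0.8, 0.096
    posterior = (p_h * p_d_h) / (p_h * p_d_h + (1 - p_h) * p_d_nh)

    # Natural-frequency format: out of 1000 people, 10 have the condition and
    # 8 of them test positive; of the 990 without it, about 95 test positive.
    with_cond_pos = 8
    without_cond_pos = 95
    freq_answer = with_cond_pos / (with_cond_pos + without_cond_pos)

    print(round(posterior, 3), round(freq_answer, 3))
    ```

    Both routes give roughly 7.8%; the frequency version reduces the computation to one division over visible set sizes, which is the representational facilitation the literature debates.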

  19. Von Neumann Was Not a Quantum Bayesian

    CERN Document Server

    Stacey, Blake C

    2014-01-01

    Wikipedia has claimed for over two years now that John von Neumann was the "first quantum Bayesian." In context, this reads as stating that von Neumann inaugurated QBism, the approach to quantum theory promoted by Fuchs, Mermin and Schack. This essay explores how such a claim is, historically speaking, unsupported.

  20. Bayesian analysis of Markov point processes

    DEFF Research Database (Denmark)

    Berthelsen, Kasper Klitgaard; Møller, Jesper

    2006-01-01

    Recently Møller, Pettitt, Berthelsen and Reeves introduced a new MCMC methodology for drawing samples from a posterior distribution when the likelihood function is only specified up to a normalising constant. We illustrate the method in the setting of Bayesian inference for Markov point processes...