WorldWideScience

Sample records for bayesian receiver operating

  1. Bayesian receiver operating characteristic estimation of multiple tests for diagnosis of bovine tuberculosis in Chadian cattle.

    Directory of Open Access Journals (Sweden)

    Borna Müller

    Full Text Available BACKGROUND: Bovine tuberculosis (BTB) today primarily affects developing countries. In Africa, the disease is present on essentially the whole continent; however, little accurate information on its distribution and prevalence is available. Attempts to evaluate diagnostic tests for BTB in naturally infected cattle are also scarce and mostly complicated by the absence of knowledge of the true disease status of the tested animals. However, diagnostic test evaluation in a given setting is a prerequisite for the implementation of local surveillance schemes and control measures. METHODOLOGY/PRINCIPAL FINDINGS: We subjected a slaughterhouse population of 954 Chadian cattle to single intra-dermal comparative cervical tuberculin (SICCT) testing and two recently developed fluorescence polarization assays (FPA). Using a Bayesian modeling approach, we computed the receiver operating characteristic (ROC) curve of each diagnostic test, the true disease prevalence in the sampled population, and the disease status of all sampled animals, without knowledge of the animals' true disease status. In our Chadian setting, SICCT performed better when the cut-off for positive test interpretation was lowered from >4 mm (the OIE standard cut-off) to >2 mm. At this cut-off, SICCT showed a sensitivity and specificity of 66% and 89%, respectively. Both FPA tests showed sensitivities below 50% but specificities above 90%. The true disease prevalence was estimated at 8%. Altogether, 11% of the sampled animals showed gross visible tuberculous lesions. However, modeling of the BTB disease status of the sampled animals indicated that 72% of the suspected tuberculosis lesions detected during standard meat inspections were due to pathogens other than Mycobacterium bovis. CONCLUSIONS/SIGNIFICANCE: Our results have important implications for BTB diagnosis in a high-incidence sub-Saharan African setting and demonstrate the practicability of our Bayesian approach for
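
    The core inversion in this kind of study can be illustrated with a minimal sketch: given an imperfect test's sensitivity and specificity, a Bayesian grid posterior recovers the true prevalence from the apparent (test-positive) rate. The sensitivity and specificity below are the abstract's SICCT figures at the >2 mm cut-off; the positive count of 147 is an illustrative number, not from the paper.

```python
import math

def posterior_prevalence(n_pos, n, se, sp, grid_size=1000):
    """Posterior mean of true prevalence pi under a uniform prior.

    Apparent positive rate: p = pi*Se + (1 - pi)*(1 - Sp).
    Likelihood: Binomial(n_pos; n, p), evaluated on a grid over pi.
    """
    grid = [(i + 0.5) / grid_size for i in range(grid_size)]
    log_post = []
    for pi in grid:
        p = pi * se + (1.0 - pi) * (1.0 - sp)
        log_post.append(n_pos * math.log(p) + (n - n_pos) * math.log(1.0 - p))
    m = max(log_post)                      # stabilize before exponentiating
    w = [math.exp(lp - m) for lp in log_post]
    z = sum(w)
    return sum(pi * wi / z for pi, wi in zip(grid, w))

# 954 animals as in the study; 147 test positives is a made-up count.
est = posterior_prevalence(147, 954, se=0.66, sp=0.89)
```

    With an apparent positive rate of about 15%, the posterior mean lands near the 8% true prevalence reported in the abstract, because false positives at 89% specificity inflate the raw count.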

  2. Operational risk modelling and organizational learning in structured finance operations: a Bayesian network approach

    OpenAIRE

    Andrew Sanford; Imad Moosa

    2015-01-01

    This paper describes the development of a tool, based on a Bayesian network model, that provides posterior predictions of operational risk events, aggregate operational loss distributions, and Operational Value-at-Risk for a structured finance operations unit located within one of Australia's major banks. The Bayesian network, based on a previously developed causal framework, has been designed to model the smaller and more frequent, attritional operational loss events. Given the limited ava...

  3. Bayesian Methods for Measuring Operational Risk

    OpenAIRE

    Carol Alexander

    2000-01-01

    The likely imposition by regulators of minimum standards for capital to cover 'other risks' has been a driving force behind the recent interest in operational risk management. Much discussion has been centered on the form of capital charges for other risks. At the same time major banks are developing models to improve internal management of operational processes, new insurance products for operational risks are being designed and there is growing interest in alternative risk transfer, through...

  4. An Efficient Two-Fold Marginalized Bayesian Filter for Multipath Estimation in Satellite Navigation Receivers

    Directory of Open Access Journals (Sweden)

    Robertson Patrick

    2010-01-01

    Full Text Available Multipath remains one of the most critical problems in satellite navigation, particularly in urban environments, where the received navigation signals can be affected by blockage, shadowing, and multipath reception. The latest multipath mitigation algorithms are based on the concept of sequential Bayesian estimation and improve receiver performance by exploiting the temporal constraints of the channel dynamics. In this paper, we specifically address the problem of estimating and adjusting the number of multipath replicas considered by the receiver algorithm. An efficient implementation via a two-fold marginalized Bayesian filter is presented, in which a particle filter, grid-based filters, and Kalman filters are suitably combined in order to mitigate the multipath channel by efficiently estimating its time-variant parameters in a track-before-detect fashion. Results based on an experimentally derived set of channel data corresponding to a typical urban propagation environment are used to confirm the benefit of our novel approach.
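
    One ingredient of such a marginalized filter can be sketched in isolation: a grid-based (discrete) Bayesian filter tracking the number of active multipath replicas through predict and update steps. The transition matrix and likelihoods below are illustrative toy values, not the authors' channel model.

```python
def grid_filter(prior, transition, likelihoods):
    """One predict/update step of a discrete Bayesian filter.

    prior[i]         -- P(N_{t-1} = i | measurements up to t-1)
    transition[i][j] -- P(N_t = j | N_{t-1} = i)
    likelihoods[j]   -- p(measurement_t | N_t = j)
    """
    n = len(prior)
    predicted = [sum(prior[i] * transition[i][j] for i in range(n))
                 for j in range(n)]
    unnorm = [predicted[j] * likelihoods[j] for j in range(n)]
    z = sum(unnorm)
    return [u / z for u in unnorm]

# States: 0, 1, or 2 replicas; paths appear/disappear slowly (sticky rows).
T = [[0.90, 0.10, 0.00],
     [0.05, 0.90, 0.05],
     [0.00, 0.10, 0.90]]
belief = [1 / 3, 1 / 3, 1 / 3]
# Successive measurements repeatedly favour the two-replica state.
for lik in [[0.1, 0.3, 0.6]] * 5:
    belief = grid_filter(belief, T, lik)
```

    After a few consistent measurements the belief concentrates on the two-replica state, which is the track-before-detect behaviour the paper exploits: the replica count is inferred over time rather than decided from a single snapshot.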

  5. Receiver-based Recovery of Clipped OFDM Signals for PAPR Reduction: A Bayesian Approach

    OpenAIRE

    Ali, Anum; Al-Rabah, Abdullatif; Masood, Mudassir; Al-Naffouri, Tareq Y.

    2014-01-01

    Clipping is one of the simplest peak-to-average power ratio (PAPR) reduction schemes for orthogonal frequency division multiplexing (OFDM). Deliberately clipping the transmission signal degrades system performance, and clipping mitigation is required at the receiver for information restoration. In this work, we acknowledge the sparse nature of the clipping signal and propose a low-complexity Bayesian clipping estimation scheme. The proposed scheme utilizes a priori information about the spars...

  6. Modeling operational risks of the nuclear industry with Bayesian networks

    International Nuclear Information System (INIS)

    Basically, planning a new industrial plant requires information on industrial management, regulations, site selection, definition of initial and planned capacity, and estimation of the potential demand. However, this is far from enough to assure the success of an industrial enterprise. Unexpected and extremely damaging events may occur that deviate from the original plan. The so-called operational risks lie not only in system, equipment, process, or human (technical or managerial) failures. They also lie in intentional events such as fraud and sabotage, in extreme events like terrorist attacks or radiological accidents, and even in public reaction to perceived environmental or future-generation impacts. For the nuclear industry, it is a challenge to identify and assess the operational risks and their various sources. Early identification of operational risks can help in preparing contingency plans and in delaying the decision to invest in or approve a project that can, at an extreme, affect the public perception of nuclear energy. A major problem in modeling operational risk losses is the lack of internal data, which are essential, for example, to apply the loss distribution approach. As an alternative, methods that consider qualitative and subjective information can be applied, for example, fuzzy logic, neural networks, system dynamics, or Bayesian networks. An advantage of applying Bayesian networks to model operational risk is the possibility of including expert opinions and variables of interest, structuring the model via causal dependencies among these variables, and specifying subjective prior and conditional probability distributions at each network node. This paper suggests a classification of operational risks in industry and discusses the benefits and obstacles of the Bayesian network approach to modeling those risks. (author)
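
    The mechanism the abstract describes, expert-elicited conditional probabilities combined through causal structure, can be sketched with a three-node network queried by enumeration. The structure (Fraud and Sabotage as independent causes of a large Loss) and all numbers are hypothetical, purely to show how expert priors propagate.

```python
# Expert-elicited priors and CPT (all values hypothetical).
P_FRAUD = 0.02
P_SABOTAGE = 0.01
# P(Loss | Fraud, Sabotage), indexed by (fraud, sabotage):
P_LOSS = {(True, True): 0.95, (True, False): 0.60,
          (False, True): 0.70, (False, False): 0.05}

def p_loss():
    """Marginal P(Loss) by summing over all parent configurations."""
    total = 0.0
    for f in (True, False):
        for s in (True, False):
            pf = P_FRAUD if f else 1 - P_FRAUD
            ps = P_SABOTAGE if s else 1 - P_SABOTAGE
            total += pf * ps * P_LOSS[(f, s)]
    return total

def p_fraud_given_loss():
    """Diagnostic query P(Fraud | Loss) via Bayes' rule."""
    joint = sum(P_FRAUD * (P_SABOTAGE if s else 1 - P_SABOTAGE)
                * P_LOSS[(True, s)] for s in (True, False))
    return joint / p_loss()
```

    Observing a loss raises the posterior probability of fraud well above its 2% prior; this diagnostic (effect-to-cause) reasoning is what binary event/fault trees cannot express and is the main benefit the paper attributes to Bayesian networks.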

  7. Bayesian Recovery of Clipped OFDM Signals: A Receiver-based Approach

    KAUST Repository

    Al-Rabah, Abdullatif R.

    2013-05-01

    Recently, orthogonal frequency-division multiplexing (OFDM) has been adopted for high-speed wireless communications due to its robustness against multipath fading. However, one of the main fundamental drawbacks of OFDM systems is the high peak-to-average-power ratio (PAPR). Several techniques have been proposed for PAPR reduction. Most of these techniques require transmitter-based (pre-compensated) processing. On the other hand, receiver-based alternatives would save power and reduce transmitter complexity. With this in mind, a possible approach is to limit the amplitude of the OFDM signal to a predetermined threshold, which is equivalent to adding a sparse clipping signal; this clipping signal is then estimated at the receiver to recover the original signal. In this work, we propose a Bayesian receiver-based low-complexity clipping signal recovery method for PAPR reduction. The method is able to i) effectively reduce the PAPR via a simple clipping scheme at the transmitter side, ii) use a Bayesian recovery algorithm to reconstruct the clipping signal at the receiver side by measuring part of the subcarriers, iii) perform well in the absence of statistical information about the signal (e.g., clipping level) and the noise (e.g., noise variance), and at the same time iv) be energy-efficient due to its low complexity. Specifically, the proposed recovery technique is implemented in a data-aided fashion. The data-aided method collects clipping information by measuring reliable data subcarriers, thus making full use of the spectrum for data transmission without the need for tone reservation. The study is further extended to discuss how to improve the recovery of the clipping signal by utilizing features of practical OFDM systems, i.e., oversampling and the presence of multiple receivers. Simulation results demonstrate the superiority of the proposed technique over other recovery algorithms. The overall objective is to show that the receiver-based Bayesian technique is highly
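
    The transmitter-side model underlying this line of work, amplitude limiting that is equivalent to adding a sparse clipping signal, can be sketched directly (the Bayesian recovery step itself is not reproduced here). The subcarrier data and clipping threshold below are toy choices, not the thesis's simulation setup.

```python
import cmath
import random

def idft(X):
    """Inverse DFT: frequency-domain symbols -> time-domain OFDM signal."""
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * t / n)
                for k in range(n)) / n for t in range(n)]

def papr(x):
    """Peak-to-average power ratio of a complex signal."""
    powers = [abs(v) ** 2 for v in x]
    return max(powers) / (sum(powers) / len(powers))

def clip(x, threshold):
    """Limit |x[t]| to threshold, preserving phase; return (clipped, c)."""
    out, c = [], []
    for v in x:
        w = threshold * v / abs(v) if abs(v) > threshold else v
        out.append(w)
        c.append(w - v)          # the sparse clipping signal
    return out, c

# QPSK symbols on 64 subcarriers (seeded, so the run is reproducible).
random.seed(1)
X = [random.choice([1, -1, 1j, -1j]) for _ in range(64)]
x = idft(X)
threshold = 0.7 * max(abs(v) for v in x)   # illustrative clipping level
xc, c = clip(x, threshold)
```

    Only the few samples above the threshold are touched, so c is sparse, which is exactly the property the Bayesian (compressed-sensing style) recovery at the receiver exploits, while the PAPR of the transmitted signal drops.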

  8. Receiver-based recovery of clipped ofdm signals for papr reduction: A bayesian approach

    KAUST Repository

    Ali, Anum

    2014-01-01

    Clipping is one of the simplest peak-to-average power ratio reduction schemes for orthogonal frequency division multiplexing (OFDM). Deliberately clipping the transmission signal degrades system performance, and clipping mitigation is required at the receiver for information restoration. In this paper, we acknowledge the sparse nature of the clipping signal and propose a low-complexity Bayesian clipping estimation scheme. The proposed scheme utilizes a priori information about the sparsity rate and noise variance for enhanced recovery. At the same time, the proposed scheme is robust against inaccurate estimates of the clipping signal statistics. The undistorted phase property of the clipped signal, as well as the clipping likelihood, is utilized for enhanced reconstruction. Furthermore, motivated by the nature of modern OFDM-based communication systems, we extend our clipping reconstruction approach to multiple antenna receivers and multi-user OFDM. We also address the problem of channel estimation from pilots contaminated by the clipping distortion. Numerical findings are presented that depict favorable results for the proposed scheme compared to the established sparse reconstruction schemes.

  9. OFDM receiver for fast time-varying channels using block-sparse Bayesian learning

    DEFF Research Database (Denmark)

    Barbu, Oana-Elena; Manchón, Carles Navarro; Rom, Christian;

    2016-01-01

    We propose an iterative algorithm for OFDM receivers operating over fast time-varying channels. The design relies on the assumptions that the channel response can be characterized by a few non-negligible separable multipath components, and the temporal variation of each component gain can be well...

  10. Cost Analysis of different Operation strategies for falling particle receivers

    OpenAIRE

    Gobereit, Birgit; Amsbeck, Lars; Buck, Reiner; Singer, Csaba

    2015-01-01

    The potential for highly efficient and cost-competitive solar energy collection at high temperatures drives current research and development activities for particle tower systems. One promising concept for particle receivers is the falling particle receiver. This paper considers a particle receiver in which falling ceramic particles form a particle curtain that absorbs the concentrated solar radiation. Complex operation strategies will result in higher receiver costs, for both...

  11. Modelling macroeconomic effects and expert judgements in operational risk: a Bayesian approach

    OpenAIRE

    Capa Santos, Holger; Kratz, Marie; Mosquera Munoz, Franklin

    2012-01-01

    This work presents a contribution on operational risk in a general Bayesian context incorporating information on market risk profile, experts, and operational losses, taking into account the general macroeconomic environment as well. It aims at estimating a characteristic parameter of the distributions of the sources (market risk profile, experts, and operational losses), chosen here as a location parameter. It generalizes under more realistic conditions a study realized by Lambrigger, Shevchen...

  12. Dynamic Bayesian modeling for risk prediction in credit operations

    DEFF Research Database (Denmark)

    Borchani, Hanen; Martinez, Ana Maria; Masegosa, Andres;

    2015-01-01

    Our goal is to do risk prediction in credit operations, and as data is collected continuously and reported on a monthly basis, this gives rise to a streaming data classification problem. Our analysis reveals some practical problems that have not previously been thoroughly analyzed in the context ...

  13. Planetary micro-rover operations on Mars using a Bayesian framework for inference and control

    Science.gov (United States)

    Post, Mark A.; Li, Junquan; Quine, Brendan M.

    2016-03-01

    With the recent progress toward the application of commercially-available hardware to small-scale space missions, it is now becoming feasible for groups of small, efficient robots based on low-power embedded hardware to perform simple tasks on other planets in the place of large-scale, heavy and expensive robots. In this paper, we describe design and programming of the Beaver micro-rover developed for Northern Light, a Canadian initiative to send a small lander and rover to Mars to study the Martian surface and subsurface. For a small, hardware-limited rover to handle an uncertain and mostly unknown environment without constant management by human operators, we use a Bayesian network of discrete random variables as an abstraction of expert knowledge about the rover and its environment, and inference operations for control. A framework for efficient construction and inference into a Bayesian network using only the C language and fixed-point mathematics on embedded hardware has been developed for the Beaver to make intelligent decisions with minimal sensor data. We study the performance of the Beaver as it probabilistically maps a simple outdoor environment with sensor models that include uncertainty. Results indicate that the Beaver and other small and simple robotic platforms can make use of a Bayesian network to make intelligent decisions in uncertain planetary environments.
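
    The paper's point about doing Bayesian inference with only fixed-point arithmetic on embedded hardware can be illustrated with a minimal sketch of a Bayes update in Q16 format. The framework itself is in C; this Python version with a toy obstacle-sensor model (all numbers illustrative) just shows that the update needs nothing beyond integer multiply, shift, and divide.

```python
Q = 16                       # Q16 fixed point: value = raw / 2**Q
ONE = 1 << Q

def to_q(x):
    """Convert a float in [0, 1] to Q16."""
    return int(round(x * ONE))

def q_mul(a, b):
    """Fixed-point multiply: (a * b) rescaled back to Q16."""
    return (a * b) >> Q

def bayes_update(prior_q, lik_true_q, lik_false_q):
    """Posterior P(H | e) in Q16 from prior and the two likelihoods."""
    num = q_mul(prior_q, lik_true_q)
    den = num + q_mul(ONE - prior_q, lik_false_q)
    return (num << Q) // den     # fixed-point division

# Toy sensor model: P(obstacle)=0.2, P(hit|obstacle)=0.9, P(hit|free)=0.1.
post = bayes_update(to_q(0.2), to_q(0.9), to_q(0.1))
```

    The exact posterior is 0.18/0.26 ≈ 0.692; the Q16 result agrees to about four decimal places, which suggests why fixed-point precision can be adequate for discrete-variable rover decision making.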

  14. Receiver Operating Characteristic Analysis for Detecting Explosives-related Threats

    Energy Technology Data Exchange (ETDEWEB)

    Oxley, Mark E; Venzin, Alexander M

    2012-11-14

    The Department of Homeland Security (DHS) and the Transportation Security Administration (TSA) are interested in developing a standardized testing procedure for determining the performance of candidate detection systems. This document outlines a potential method for judging detection system performance, as well as for determining whether combining the information from a legacy system with a new system can significantly improve performance. In this document, performance corresponds to the Neyman-Pearson criterion applied to the Receiver Operating Characteristic (ROC) curves of the detection systems in question. A simulation was developed to investigate how the amount of data provided by the vendor in the form of the ROC curve affects the performance of the combined detection system. Furthermore, the simulation also takes into account the potential effects of correlation and how this information can impact the performance of the combined system.
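
    The Neyman-Pearson criterion on a ROC curve can be sketched concretely: sweep thresholds over detector scores to trace the empirical ROC, then pick the point with the highest detection probability whose false-alarm rate stays under a ceiling. The scores below are toy data, not anything from the DHS/TSA evaluation.

```python
def roc_points(threat_scores, benign_scores):
    """(FPR, TPR, threshold) triples, one per candidate threshold."""
    thresholds = sorted(set(threat_scores + benign_scores), reverse=True)
    pts = []
    for t in thresholds:
        tpr = sum(s >= t for s in threat_scores) / len(threat_scores)
        fpr = sum(s >= t for s in benign_scores) / len(benign_scores)
        pts.append((fpr, tpr, t))
    return pts

def neyman_pearson(pts, max_fpr):
    """Highest-TPR operating point whose FPR does not exceed max_fpr."""
    feasible = [p for p in pts if p[0] <= max_fpr]
    return max(feasible, key=lambda p: p[1])

threat = [0.9, 0.8, 0.75, 0.6, 0.4]   # scores on threat-present trials
benign = [0.7, 0.5, 0.3, 0.2, 0.1]    # scores on benign trials
fpr, tpr, thr = neyman_pearson(roc_points(threat, benign), max_fpr=0.2)
```

    With the false-alarm ceiling at 20%, the selected threshold of 0.6 detects 80% of threats; fusing a legacy system's scores would, in the document's framing, be judged by whether this constrained detection probability improves.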

  15. Receiver operating characteristic-curve limits of detection

    Energy Technology Data Exchange (ETDEWEB)

    Wysoczanski, Artur; Voigtman, Edward, E-mail: voigtman@chem.umass.edu

    2014-10-01

    Using a simple UV LED-excited ruby fluorescence measurement system, we demonstrate that it is easily possible to obtain unbiased detection limits, despite the system deliberately having a non-linear response function and non-Gaussian noise. Even when the noise precision model is heteroscedastic, but otherwise only roughly linear, the receiver operating characteristic (ROC) method readily yields results that are in accordance with a priori canonical specifications of false positives and false negatives at the detection limit. The present work demonstrates that obtaining unbiased detection limits is not abstruse and need not be mathematically complicated. Rather, detection limits continue to serve a useful purpose as part of the characterization of chemical measurement systems. - Highlights: • Robust limits of detection using ROC curves • Readily used even with less well-behaved systems • Easily extends to heteroscedastic systems.

  16. Using GOMS and Bayesian plan recognition to develop recognition models of operator behavior

    Science.gov (United States)

    Zaientz, Jack D.; DeKoven, Elyon; Piegdon, Nicholas; Wood, Scott D.; Huber, Marcus J.

    2006-05-01

    Trends in combat technology research point to an increasing role for uninhabited vehicles in modern warfare tactics. To support an increased span of control over these vehicles, human responsibilities need to be transformed from tedious, error-prone, and cognition-intensive operations into tasks that are more supervisory and manageable, even under intensely stressful conditions. The goal is to move away from only supporting human command of low-level system functions toward intention-level human-system dialogue about the operator's tasks and situation. A critical element of this process is developing the means to identify when human operators need automated assistance and what assistance they need. Toward this goal, we are developing an unmanned vehicle operator task recognition system that combines work in human behavior modeling and Bayesian plan recognition. Traditionally, human behavior models have been considered generative, meaning they describe all possible valid behaviors. Basing behavior recognition on models designed for behavior generation can offer advantages in improved model fidelity and reuse. It is not clear, however, how to reconcile the structural differences between behavior recognition and behavior modeling approaches. Our current work demonstrates that by pairing a cognitive-psychology-derived human behavior modeling approach, GOMS, with a Bayesian plan recognition engine, ASPRN, we can translate a behavior generation model into a recognition model. We discuss the implications of using human performance models in this manner and suggest how this kind of modeling may be used to support the real-time control of multiple uninhabited battlefield vehicles and other semi-autonomous systems.

  17. Application of Bayesian Belief networks to the human reliability analysis of an oil tanker operation focusing on collision accidents

    International Nuclear Information System (INIS)

    During the last three decades, several techniques have been developed for the quantitative study of human reliability. In the 1980s, techniques were developed to model systems by means of binary trees, which did not allow for the representation of the context in which human actions occur. Thus, these techniques cannot model the representation of individuals, their interrelationships, and the dynamics of a system. These issues make the improvement of methods for Human Reliability Analysis (HRA) a pressing need. To eliminate or at least attenuate these limitations, some authors have proposed modeling systems using Bayesian Belief Networks (BBNs). The application of these tools is expected to address many of the deficiencies in current approaches to modeling human actions with binary trees. This paper presents a methodology based on BBN for analyzing human reliability and applies this method to the operation of an oil tanker, focusing on the risk of collision accidents. The obtained model was used to determine the most likely sequence of hazardous events and thus isolate critical activities in the operation of the ship to study Internal Factors (IFs), Skills, and Management and Organizational Factors (MOFs) that should receive more attention for risk reduction.

  18. Posterior consistency and convergence rates for Bayesian inversion with hypoelliptic operators

    Science.gov (United States)

    Kekkonen, Hanne; Lassas, Matti; Siltanen, Samuli

    2016-08-01

    The Bayesian approach to inverse problems is studied in the case where the forward map is a linear hypoelliptic pseudodifferential operator and the measurement error is additive white Gaussian noise. The measurement model for an unknown Gaussian random variable U(x,ω) is M_δ(y,ω) = A(U(x,ω)) + δ ℰ(y,ω), where A is a finitely-many-orders smoothing linear hypoelliptic operator and δ > 0 is the noise magnitude. The covariance operator C_U of U is a smoothing of order 2r, self-adjoint, injective, and elliptic pseudodifferential operator. If ℰ took values in L², then in the Gaussian case solving for the conditional mean (and maximum a posteriori) estimate would be linked to solving the minimisation problem T_δ(m_δ) = arg min_{u ∈ H^r} { ‖Au − m_δ‖²_{L²} + δ² ‖C_U^{−1/2} u‖²_{L²} }. However, Gaussian white noise does not take values in L² but in H^{−s}, where s > 0 is big enough. A modification of the above approach to solve the inverse problem is presented, covering the case of white Gaussian measurement noise. Furthermore, the convergence of the conditional mean estimate to the correct solution as δ → 0 is proven in appropriate function spaces using microlocal analysis. The frequentist posterior contraction rates are also studied.
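
    The link between the conditional mean estimate and the minimisation problem above can be made concrete in a toy finite-dimensional setting where A and C_U are diagonal in the same basis: the normal equations of ‖Au − m‖² + δ²‖C_U^{−1/2}u‖² then decouple into u_i = a_i m_i / (a_i² + δ²/c_i). The spectra below are illustrative, not the paper's operators.

```python
def tikhonov_estimate(a, c, m, delta):
    """Componentwise minimiser of ||Au - m||^2 + delta^2 ||C^{-1/2} u||^2
    when A and the prior covariance C are diagonal with entries a_i, c_i."""
    return [ai * mi / (ai ** 2 + delta ** 2 / ci)
            for ai, ci, mi in zip(a, c, m)]

a = [1.0, 0.5, 0.25]       # eigenvalues of the smoothing operator A
c = [1.0, 1.0, 1.0]        # eigenvalues of the prior covariance C_U
u_true = [2.0, -1.0, 0.5]
m = [ai * ui for ai, ui in zip(a, u_true)]   # noiseless data m = A u

# As delta -> 0 the regularised estimate converges to the true solution,
# mirroring the paper's convergence result for the conditional mean.
u_est = tikhonov_estimate(a, c, m, delta=1e-4)
```

    Each component is shrunk by the factor 1/(1 + δ²/(a_i² c_i)), so small singular values of A (the smoothed-out directions) are regularised most, which is the mechanism the posterior contraction rates quantify.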

  19. 47 CFR 25.220 - Non-conforming transmit/receive earth station operations.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 2 2010-10-01 2010-10-01 false Non-conforming transmit/receive earth station... CARRIER SERVICES SATELLITE COMMUNICATIONS Technical Standards § 25.220 Non-conforming transmit/receive... operator acknowledging that the proposed operation of the subject non-conforming earth station with...

  20. Experiments for Evaluating Application of Bayesian Inference to Situation Awareness of Human Operators in NPPs

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Seongkeun; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of)

    2014-05-15

    The purpose of this paper is to confirm whether Bayesian inference can properly reflect the situation awareness of real human operators, to find the differences between ideal and practical operators, and to investigate the factors that contribute to those differences. As a result, humans cannot think like a computer. If humans could memorize all the information, and their thinking process were the same as a computer's CPU, the success rates in these two experiments would come out above 99%. However, the probability of humans finding the right malfunction was only 64.52% in the simple experiment and 51.61% in the complex experiment. Cognition is the mental processing that includes the attention of working memory, comprehending and producing language, calculating, reasoning, problem solving, and decision making. There are many reasons why the human thinking process differs from a computer's, but in this experiment we suggest that working memory is the most important factor. Humans have a limited working memory with a capacity of only seven chunks, the so-called magic number. If there are more than seven sequential pieces of information, people start to forget the earlier information because their working memory capacity is exceeded. We can check how much working memory affects the result through the simple experiment. What if we neglect the effect of working memory? The total number of subjects with incorrect memory is 7 (subjects 3, 5, 6, 7, 8, 15, 25). They could have found the right malfunction had their memory not been corrupted by the lack of working memory capacity. The probability of finding the correct malfunction would then increase from 64.52% to 87.10%. The complex experiment shows a similar result. In this case, eight subjects (1, 5, 8, 9, 15, 17, 18, 30) had altered memories, which affected their ability to find the right malfunction. Accounting for this, the probability would be (16+8)/31 = 77.42%.

  2. Epistemic-based investigation of the probability of hazard scenarios using Bayesian network for the lifting operation of floating objects

    Science.gov (United States)

    Toroody, Ahmad Bahoo; Abaiee, Mohammad Mahdi; Gholamnia, Reza; Ketabdari, Mohammad Javad

    2016-09-01

    Owing to the increase in unprecedented accidents with new root causes in almost all operational areas, the importance of risk management has dramatically risen. Risk assessment, one of the most significant aspects of risk management, has a substantial impact on the system-safety level of organizations, industries, and operations. If the causes of all kinds of failure and the interactions between them are considered, effective risk assessment can be highly accurate. A combination of traditional risk assessment approaches and modern scientific probability methods can help in realizing better quantitative risk assessment methods. Most researchers face the problem of minimal field data with respect to the probability and frequency of each failure. Because of this limitation in the availability of epistemic knowledge, it is important to conduct epistemic estimations by applying the Bayesian theory for identifying plausible outcomes. In this paper, we propose an algorithm and demonstrate its application in a case study for a light-weight lifting operation in the Persian Gulf of Iran. First, we identify potential accident scenarios and present them in an event tree format. Next, excluding human error, we use the event tree to roughly estimate the prior probability of other hazard-promoting factors using a minimal amount of field data. We then use the Success Likelihood Index Method (SLIM) to calculate the probability of human error. On the basis of the proposed event tree, we use the Bayesian network of the provided scenarios to compensate for the lack of data. Finally, we determine the resulting probability of each event based on its evidence in the epistemic estimation format by building on two Bayesian network types: the probability of hazard promotion factors and the Bayesian theory. The study results indicate that despite the lack of available information on the operation of floating objects, a satisfactory result can be achieved using epistemic data.
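
    The SLIM step in this workflow can be sketched on its own: a weighted sum of performance-shaping-factor ratings yields the Success Likelihood Index (SLI), and a log-linear calibration fitted to two anchor tasks with known error probabilities converts it to a human error probability (HEP). All weights, ratings, and anchor values below are illustrative, not the paper's elicited data.

```python
import math

def sli(weights, ratings):
    """Success Likelihood Index: weighted sum of PSF ratings in [0, 1]."""
    assert abs(sum(weights) - 1.0) < 1e-9
    return sum(w * r for w, r in zip(weights, ratings))

def slim_calibration(sli1, hep1, sli2, hep2):
    """Solve log10(HEP) = a*SLI + b from two anchor tasks."""
    a = (math.log10(hep1) - math.log10(hep2)) / (sli1 - sli2)
    b = math.log10(hep1) - a * sli1
    return a, b

def hep(sli_value, a, b):
    """Human error probability at a given SLI."""
    return 10 ** (a * sli_value + b)

# Hypothetical PSFs for the lifting task: training, stress, sea state.
weights = [0.5, 0.3, 0.2]
ratings = [0.8, 0.4, 0.6]
# Anchors: a near-perfect task (SLI=1, HEP=1e-4) and a poor one (SLI=0, HEP=0.1).
a, b = slim_calibration(sli1=1.0, hep1=1e-4, sli2=0.0, hep2=1e-1)
p_error = hep(sli(weights, ratings), a, b)
```

    The resulting HEP would feed the human-error branch of the event tree, alongside the Bayesian-network estimates for the other hazard-promoting factors.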

  3. Epistemic-Based Investigation of the Probability of Hazard Scenarios Using Bayesian Network for the Lifting Operation of Floating Objects

    Institute of Scientific and Technical Information of China (English)

    Ahmad Bahoo Toroody; Mohammad Mahdi Abaiee; Reza Gholamnia; Mohammad Javad Ketabdari

    2016-01-01

    Owing to the increase in unprecedented accidents with new root causes in almost all operational areas, the importance of risk management has dramatically risen. Risk assessment, one of the most significant aspects of risk management, has a substantial impact on the system-safety level of organizations, industries, and operations. If the causes of all kinds of failure and the interactions between them are considered, effective risk assessment can be highly accurate. A combination of traditional risk assessment approaches and modern scientific probability methods can help in realizing better quantitative risk assessment methods. Most researchers face the problem of minimal field data with respect to the probability and frequency of each failure. Because of this limitation in the availability of epistemic knowledge, it is important to conduct epistemic estimations by applying the Bayesian theory for identifying plausible outcomes. In this paper, we propose an algorithm and demonstrate its application in a case study for a light-weight lifting operation in the Persian Gulf of Iran. First, we identify potential accident scenarios and present them in an event tree format. Next, excluding human error, we use the event tree to roughly estimate the prior probability of other hazard-promoting factors using a minimal amount of field data. We then use the Success Likelihood Index Method (SLIM) to calculate the probability of human error. On the basis of the proposed event tree, we use the Bayesian network of the provided scenarios to compensate for the lack of data. Finally, we determine the resulting probability of each event based on its evidence in the epistemic estimation format by building on two Bayesian network types:the probability of hazard promotion factors and the Bayesian theory. The study results indicate that despite the lack of available information on the operation of floating objects, a satisfactory result can be achieved using epistemic data.

  4. Epistemic-based investigation of the probability of hazard scenarios using Bayesian network for the lifting operation of floating objects

    Science.gov (United States)

    Toroody, Ahmad Bahoo; Abaiee, Mohammad Mahdi; Gholamnia, Reza; Ketabdari, Mohammad Javad

    2016-07-01

    Owing to the increase in unprecedented accidents with new root causes in almost all operational areas, the importance of risk management has dramatically risen. Risk assessment, one of the most significant aspects of risk management, has a substantial impact on the system-safety level of organizations, industries, and operations. If the causes of all kinds of failure and the interactions between them are considered, effective risk assessment can be highly accurate. A combination of traditional risk assessment approaches and modern scientific probability methods can help in realizing better quantitative risk assessment methods. Most researchers face the problem of minimal field data with respect to the probability and frequency of each failure. Because of this limitation in the availability of epistemic knowledge, it is important to conduct epistemic estimations by applying the Bayesian theory for identifying plausible outcomes. In this paper, we propose an algorithm and demonstrate its application in a case study for a light-weight lifting operation in the Persian Gulf of Iran. First, we identify potential accident scenarios and present them in an event tree format. Next, excluding human error, we use the event tree to roughly estimate the prior probability of other hazard-promoting factors using a minimal amount of field data. We then use the Success Likelihood Index Method (SLIM) to calculate the probability of human error. On the basis of the proposed event tree, we use the Bayesian network of the provided scenarios to compensate for the lack of data. Finally, we determine the resulting probability of each event based on its evidence in the epistemic estimation format by building on two Bayesian network types: the probability of hazard promotion factors and the Bayesian theory. The study results indicate that despite the lack of available information on the operation of floating objects, a satisfactory result can be achieved using epistemic data.
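
    As a toy illustration of the epistemic updating step described above, the sketch below applies Bayes' rule to revise a sparse-data prior for a single hazard-promoting factor. The numbers (prior, detection rate, false-alarm rate) are hypothetical, not taken from the paper.

```python
# Illustrative sketch (not the paper's actual model): updating a prior failure
# probability for a lifting-operation hazard with new evidence via Bayes' rule.
# All numbers are hypothetical.

def bayes_update(prior, p_evidence_given_event, p_evidence_given_no_event):
    """Posterior P(event | evidence) from Bayes' theorem."""
    numerator = p_evidence_given_event * prior
    marginal = numerator + p_evidence_given_no_event * (1.0 - prior)
    return numerator / marginal

# Sparse field data suggests a 2% prior for a sling failure (hypothetical).
prior = 0.02
# An inspection flag is observed; assumed detection/false-alarm rates:
posterior = bayes_update(prior, p_evidence_given_event=0.9,
                         p_evidence_given_no_event=0.1)
print(round(posterior, 4))  # -> 0.1552
```

    In a full Bayesian network, the same update would propagate through every node of the event tree rather than a single factor.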

  5. Highly efficient Bayesian joint inversion for receiver-based data and its application to lithospheric structure beneath the southern Korean Peninsula

    Science.gov (United States)

    Kim, Seongryong; Dettmer, Jan; Rhie, Junkee; Tkalčić, Hrvoje

    2016-07-01

    With the deployment of extensive seismic arrays, systematic and efficient parameter and uncertainty estimation is of increasing importance and can provide reliable, regional models for crustal and upper-mantle structure. We present an efficient Bayesian method for the joint inversion of surface-wave dispersion and receiver-function data that combines trans-dimensional (trans-D) model selection in an optimization phase with subsequent rigorous parameter uncertainty estimation. Parameter and uncertainty estimation depend strongly on the chosen parametrization such that meaningful regional comparison requires quantitative model selection that can be carried out efficiently at several sites. While significant progress has been made for model selection (e.g. trans-D inference) at individual sites, the lack of efficiency can prohibit application to large data volumes or cause questionable results due to lack of convergence. Studies that address large numbers of data sets have mostly ignored model selection in favour of more efficient/simple estimation techniques (i.e. focusing on uncertainty estimation but employing ad-hoc model choices). Our approach consists of a two-phase inversion that combines trans-D optimization to select the most probable parametrization with subsequent Bayesian sampling for uncertainty estimation given that parametrization. The trans-D optimization is implemented here by replacing the likelihood function with the Bayesian information criterion (BIC). The BIC provides constraints on model complexity that facilitate the search for an optimal parametrization. Parallel tempering (PT) is applied as an optimization algorithm. After optimization, the optimal model choice is identified by the minimum BIC value from all PT chains. Uncertainty estimation is then carried out in fixed dimension. Data errors are estimated as part of the inference problem by a combination of empirical and hierarchical estimation. Data covariance matrices are estimated from
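
    The role of the BIC as a likelihood substitute during the trans-D optimization phase can be sketched as follows. This is our own minimal illustration assuming independent Gaussian data errors; the synthetic residuals and parameter counts are hypothetical.

```python
# Hedged sketch of BIC-based model selection (assumed Gaussian errors;
# synthetic data, hypothetical parameter counts).
import numpy as np

def bic(residuals, sigma, k):
    """Bayesian information criterion: -2 ln L + k ln n."""
    n = len(residuals)
    log_l = -0.5 * n * np.log(2 * np.pi * sigma**2) \
            - 0.5 * np.sum(residuals**2) / sigma**2
    return -2.0 * log_l + k * np.log(n)

rng = np.random.default_rng(0)
data = rng.normal(0.0, 0.1, size=200)
# With equal fit quality, the 5-parameter model beats the 20-parameter one
# on BIC because of the k*ln(n) complexity penalty:
print(bic(data, 0.1, k=5) < bic(data, 0.1, k=20))  # -> True
```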

  6. A brief history of free-response receiver operating characteristic paradigm data analysis.

    Science.gov (United States)

    Chakraborty, Dev P

    2013-07-01

    In the receiver operating characteristic paradigm the observer assigns a single rating to each image and the location of the perceived abnormality, if any, is ignored. In the free-response receiver operating characteristic paradigm the observer is free to mark and rate as many suspicious regions as are considered clinically reportable. Credit for a correct localization is given only if a mark is sufficiently close to an actual lesion; otherwise, the observer's mark is scored as a location-level false positive. Until fairly recently there existed no accepted method for analyzing the resulting relatively unstructured data containing random numbers of mark-rating pairs per image. This report reviews the history of work in this field, which has now spanned more than five decades. It introduces terminology used to describe the paradigm, proposed measures of performance (figures of merit), ways of visualizing the data (operating characteristics), and software for analyzing free-response receiver operating characteristic studies.

  7. Risk-Based Analysis of Drilling Waste Handling Operations. Bayesian Network, Cost-effectiveness, and Operational Conditions

    OpenAIRE

    Ayele, Yonas Zewdu

    2016-01-01

    The papers of this thesis are not available in Munin. Paper I. Ayele YZ, Barabadi A, Barabady J.: A methodology for identification of a suitable drilling waste handling system in the Arctic region. (Manuscript). Paper II. Ayele YZ, Barabady J, Droguett EL.: Dynamic Bayesian network based risk assessment for Arctic offshore drilling waste handling practices. (Manuscript). Published version available in Journal of Offshore Mechanics and Arctic Engineering 138(5), 051302 (Jun 17, 2016) ...

  8. Experiments for Evaluating Application of Bayesian Inference to Situation Awareness of Human Operators in NPPs

    International Nuclear Information System (INIS)

    Bayesian methodology has been widely used in various research fields. It is a method of inference that uses Bayes' rule to update the estimated probability of a hypothesis as additional evidence is acquired. According to current research, malfunctions of a nuclear power plant can be detected using Bayesian inference, which continually accumulates newly incoming data and updates its estimate. However, those studies assume that people perform this updating perfectly, like a computer, an assumption that can be criticized and may cause problems in real-world application. Studies in cognitive psychology indicate that when the amount of information grows, people cannot retain all of it, because memory capacity is limited (the well-known working memory constraint) and attention is limited as well. The purpose of this paper is to consider these psychological factors and determine how much working memory and attention affect the resulting estimate based on Bayesian inference. Confirming this requires experiments with human subjects, and the experimental tool is the Compact Nuclear Simulator (CNS)

  9. Experiments for Evaluating Application of Bayesian Inference to Situation Awareness of Human Operators in NPPs

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Seong Keun; Seong, Poong Hyun [KAIST, Daejeon (Korea, Republic of)

    2014-08-15

    Bayesian methodology has been widely used in various research fields. It is a method of inference that uses Bayes' rule to update the estimated probability of a hypothesis as additional evidence is acquired. According to current research, malfunctions of a nuclear power plant can be detected using Bayesian inference, which continually accumulates newly incoming data and updates its estimate. However, those studies assume that people perform this updating perfectly, like a computer, an assumption that can be criticized and may cause problems in real-world application. Studies in cognitive psychology indicate that when the amount of information grows, people cannot retain all of it, because memory capacity is limited (the well-known working memory constraint) and attention is limited as well. The purpose of this paper is to consider these psychological factors and determine how much working memory and attention affect the resulting estimate based on Bayesian inference. Confirming this requires experiments with human subjects, and the experimental tool is the Compact Nuclear Simulator (CNS)
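
    A minimal sketch of the contrast the paper investigates, an ideal sequential Bayesian observer versus one whose "working memory" retains only the last W observations, might look like this (the alarm stream and likelihoods are hypothetical, not taken from the CNS experiments):

```python
# Toy sketch (our illustration, not the paper's simulator): sequential Bayesian
# updating of P(malfunction) from a stream of alarms, comparing an ideal
# observer with one limited to the last W observations ("working memory").
def posterior(observations, prior=0.5, p_obs_fault=0.8, p_obs_ok=0.3):
    p = prior
    for obs in observations:
        like_f = p_obs_fault if obs else 1 - p_obs_fault
        like_ok = p_obs_ok if obs else 1 - p_obs_ok
        p = like_f * p / (like_f * p + like_ok * (1 - p))
    return p

stream = [1, 1, 0, 1, 1, 1, 0, 1]          # 1 = alarm present
ideal = posterior(stream)                   # uses all evidence
limited = posterior(stream[-3:])            # memory capacity W = 3
print(ideal > limited)  # the ideal observer accumulates more evidence -> True
```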

  10. GLASS 2.0: An Operational, Multimodal, Bayesian Earthquake Data Association Engine

    Science.gov (United States)

    Benz, H.; Johnson, C. E.; Patton, J. M.; McMahon, N. D.; Earle, P. S.

    2015-12-01

    The legacy approach to automated detection and determination of hypocenters is arrival-time stacking. Examples of such algorithms are the associator Binder, which has been in continuous use in many USGS-supported regional seismic networks since the 1980s, and its spherical-earth successor, GLASS 1.0, in service at the USGS National Earthquake Information Center for over 10 years. The principal shortcomings of the legacy approach are that 1) it can use only phase arrival times, 2) it does not adequately address the problem of extreme variations in station density worldwide, 3) it cannot incorporate multiple phase models or statistical attributes of phases with distance, and 4) it cannot incorporate noise-model attributes of individual stations. Previously we introduced a theoretical framework for a new associator using a Bayesian kernel stacking approach to approximate a joint probability density function for hypocenter localization. More recently we added station- and phase-specific Bayesian constraints to the association process. GLASS 2.0 incorporates a multiplicity of earthquake-related data, including phase arrival times, back-azimuth and slowness information from array beamforming, arrival times from waveform cross-correlation processing, and geographic constraints from real-time social media reports of ground shaking. We demonstrate its application by modeling an aftershock sequence using dozens of stations that recorded tens of thousands of earthquakes over a period of one month. We also demonstrate GLASS 2.0 performance regionally and teleseismically using the globally distributed real-time monitoring system at the NEIC.

  11. Bayesian derivation of plasma equilibrium distribution function for tokamak scenarios and the associated Landau collision operator

    CERN Document Server

    Di Troia, Claudio

    2015-01-01

    A class of parametric distribution functions was proposed in [C. Di Troia, Plasma Physics and Controlled Fusion, 54, 2012] as equilibrium distribution functions (EDFs) for charged particles in fusion plasmas, representing supra-thermal particles in anisotropic equilibria for Neutral Beam Injection and Ion Cyclotron Heating scenarios. Moreover, the EDFs can also represent nearly isotropic equilibria for slowing-down $\alpha$ particles and core thermal plasma populations. These EDFs depend on constants of motion (COMs). Assuming an axisymmetric system with no equilibrium electric field, the EDF depends on the toroidal canonical momentum $P_\phi$, the kinetic energy $w$ and the magnetic moment $\mu$. In the present work, the EDFs are obtained from first principles and general hypotheses. The derivation is probabilistic and makes use of Bayes' theorem. The Bayesian argument allows us to describe how far from the prior probability distribution function (pdf), e.g. Maxwellian, the plasma is, based on the information...

  12. Long Length Contaminated Equipment Retrieval System Receiver Trailer and Transport Trailer Operations and Maintenance Manual

    Energy Technology Data Exchange (ETDEWEB)

    DALE, R.N.

    2000-05-01

    A system to accommodate the removal of long-length contaminated equipment (LLCE) from Hanford underground radioactive waste storage tanks was designed, procured, and demonstrated via a project activity during the 1990s. The system is the Long Length Contaminated Equipment Removal System (LLCERS). LLCERS will be maintained and operated by Tank Farms Engineering and Operations organizations and other varied projects having a need for the system. The responsibility for the operation and maintenance of the LLCERS Receiver Trailer (RT) and Transport Trailer (TT) resides with the RPP Characterization Project Operations organization. The purpose of this document is to provide vendor-supplied operating and maintenance (O&M) information for the RT and TT in a readily retrievable form. This information is provided this way, instead of in a vendor information (VI) file, to maintain configuration control of the operations baseline as described in RPP-6085, ''Configuration Management Plan for Long Length Contaminated Equipment Receiver and Transport Trailers''. Additional Operations Baseline documents are identified in RPP-6085.

  13. Long Length Contaminated Equipment Retrieval System Receiver Trailer and Transport Trailer Operations and Maintenance Manual

    International Nuclear Information System (INIS)

    A system to accommodate the removal of long-length contaminated equipment (LLCE) from Hanford underground radioactive waste storage tanks was designed, procured, and demonstrated via a project activity during the 1990s. The system is the Long Length Contaminated Equipment Removal System (LLCERS). LLCERS will be maintained and operated by Tank Farms Engineering and Operations organizations and other varied projects having a need for the system. The responsibility for the operation and maintenance of the LLCERS Receiver Trailer (RT) and Transport Trailer (TT) resides with the RPP Characterization Project Operations organization. The purpose of this document is to provide vendor-supplied operating and maintenance (O and M) information for the RT and TT in a readily retrievable form. This information is provided this way, instead of in a vendor information (VI) file, to maintain configuration control of the operations baseline as described in RPP-6085, ''Configuration Management Plan for Long Length Contaminated Equipment Receiver and Transport Trailers''. Additional Operations Baseline documents are identified in RPP-6085

  14. Receiver Operating Characteristic Curve Analysis of Beach Water Quality Indicator Variables

    OpenAIRE

    Morrison, Ann Michelle; Coughlin, Kelly; Shine, James P.; Coull, Brent A.; Rex, Andrea C.

    2003-01-01

    Receiver operating characteristic (ROC) curve analysis is a simple and effective means to compare the accuracies of indicator variables of bacterial beach water quality. The indicator variables examined in this study were previous day's Enterococcus density and antecedent rainfall at 24, 48, and 96 h. Daily Enterococcus densities and 15-min rainfall values were collected during a 5-year (1996 to 2000) study of four Boston Harbor beaches. The indicator variables were assessed for their ability...
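
    The core ROC comparison of indicator variables can be sketched with the Mann-Whitney form of the area under the curve (AUC); the water-quality numbers below are synthetic, not the study's data.

```python
# Minimal sketch of comparing indicator variables by ROC AUC, in the spirit of
# the beach-water study (synthetic data; Mann-Whitney form of the AUC).
def auc(scores_pos, scores_neg):
    """P(score_pos > score_neg), with ties counted half."""
    wins = sum((p > n) + 0.5 * (p == n) for p in scores_pos for n in scores_neg)
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical exceedance days (positives) vs clean days (negatives):
entero_prev_day = auc([120, 80, 200, 60], [10, 35, 5, 50])        # strong
rain_24h = auc([0.2, 1.1, 0.0, 0.8], [0.1, 0.9, 0.0, 0.4])        # weaker
print(entero_prev_day > rain_24h)  # -> True
```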

  15. Using a Bayesian Probabilistic Forecasting Model to Analyze the Uncertainty in Real-Time Dynamic Control of the Flood Limiting Water Level for Reservoir Operation

    DEFF Research Database (Denmark)

    Liu, Dedi; Li, Xiang; Guo, Shenglian;

    2015-01-01

    Dynamic control of the flood limiting water level (FLWL) is a valuable and effective way to maximize the benefits from reservoir operation without exceeding the design risk. In order to analyze the impacts of input uncertainty, a Bayesian forecasting system (BFS) is adopted. Applying quantile wat...

  16. Structural Design Considerations for Tubular Power Tower Receivers Operating at 650 Degrees C: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Neises, T. W.; Wagner, M. J.; Gray, A. K.

    2014-04-01

    Research of advanced power cycles has shown supercritical carbon dioxide power cycles may have thermal efficiency benefits relative to steam cycles at temperatures around 500 - 700 degrees C. To realize these benefits for CSP, it is necessary to increase the maximum outlet temperature of current tower designs. Research at NREL is investigating a concept that uses high-pressure supercritical carbon dioxide as the heat transfer fluid to achieve a 650 degrees C receiver outlet temperature. At these operating conditions, creep becomes an important factor in the design of a tubular receiver and contemporary design assumptions for both solar and traditional boiler applications must be revisited and revised. This paper discusses lessons learned for high-pressure, high-temperature tubular receiver design. An analysis of a simplified receiver tube is discussed, and the results show the limiting stress mechanisms in the tube and the impact on the maximum allowable flux as design parameters vary. Results of this preliminary analysis indicate an underlying trade-off between tube thickness and the maximum allowable flux on the tube. Future work will expand the scope of design variables considered and attempt to optimize the design based on cost and performance metrics.

  17. Dual channel 115 and 230 GHz SIS receivers in operation at the Owens Valley Radio Observatory

    International Nuclear Information System (INIS)

    The Owens Valley Radio Observatory millimeter-wave interferometer array is presently operating with dual channel SIS tunnel junction receivers. The first channel covers the frequency range from 85 to 120 GHz and the second channel covers the frequency range from 200 to 300 GHz. The mixers consist of a corrugated feedhorn, single-stage circular to rectangular waveguide transition, reduced-height waveguide with an SIS junction mounted across the E-plane and a non-contacting backshort. The mixer block has a built-in RF choke for the IF signal path which is designed to present a short circuit to the junction at frequencies above the 2 GHz IF frequency. The small area (<1 μm²) PbInAu-native oxide-PbAu SIS tunnel junctions are fabricated using a bridge lift-off technique. The LO power is provided by Gunn oscillators followed by doublers or triplers. The receivers in the 85 to 120 GHz band have noise temperatures of <100 K, while the receivers in the 200 to 300 GHz band have noise temperatures in the range from 200 to 300 K. These dual channel receivers are mounted in 4.5 K closed cycle refrigerators. They are in continuous use on the three element millimeter-wavelength interferometer array

  18. Operator decision support system for integrated wastewater management including wastewater treatment plants and receiving water bodies.

    Science.gov (United States)

    Kim, Minsoo; Kim, Yejin; Kim, Hyosoo; Piao, Wenhua; Kim, Changwon

    2016-06-01

    An operator decision support system (ODSS) is proposed to support operators of wastewater treatment plants (WWTPs) in making appropriate decisions. This system accounts for water quality (WQ) variations in WWTP influent and effluent and in the receiving water body (RWB). The proposed system is comprised of two diagnosis modules, three prediction modules, and a scenario-based supporting module (SSM). In the diagnosis modules, the WQs of the influent and effluent WWTP and of the RWB are assessed via multivariate analysis. Three prediction modules based on the k-nearest neighbors (k-NN) method, activated sludge model no. 2d (ASM2d) model, and QUAL2E model are used to forecast WQs for 3 days in advance. To compare various operating alternatives, SSM is applied to test various predetermined operating conditions in terms of overall oxygen transfer coefficient (Kla), waste sludge flow rate (Qw), return sludge flow rate (Qr), and internal recycle flow rate (Qir). In the case of unacceptable total phosphorus (TP), SSM provides appropriate information for the chemical treatment. The constructed ODSS was tested using data collected from Geumho River, which was the RWB, and S WWTP in Daegu City, South Korea. The results demonstrate the capability of the proposed ODSS to provide WWTP operators with more objective qualitative and quantitative assessments of WWTP and RWB WQs. Moreover, the current study shows that ODSS, using data collected from the study area, can be used to identify operational alternatives through SSM at an integrated urban wastewater management level. PMID:26893178
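
    The basic idea of the k-NN prediction module, forecasting the next day's water quality from the most similar historical days, can be sketched as follows (the feature choices and values are hypothetical, not the study's data):

```python
# Hedged sketch of a k-nearest-neighbours (k-NN) forecast: predict tomorrow's
# influent quality as the mean outcome of the k most similar historical days.
def knn_forecast(history, today, k=3):
    """history: list of (feature_vector, next_day_value); today: features."""
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    nearest = sorted(history, key=lambda rec: dist(rec[0], today))[:k]
    return sum(value for _, value in nearest) / k

# Hypothetical (COD mg/L, temperature C) features and next-day COD outcomes:
history = [((200, 25), 210), ((180, 22), 185), ((400, 30), 390),
           ((210, 24), 205), ((390, 31), 400)]
print(knn_forecast(history, (205, 24)))  # -> 200.0 (low-load days dominate)
```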

  20. Combining classifiers using their receiver operating characteristics and maximum likelihood estimation.

    Science.gov (United States)

    Haker, Steven; Wells, William M; Warfield, Simon K; Talos, Ion-Florin; Bhagwat, Jui G; Goldberg-Zimring, Daniel; Mian, Asim; Ohno-Machado, Lucila; Zou, Kelly H

    2005-01-01

    In any medical domain, it is common to have more than one test (classifier) to diagnose a disease. In image analysis, for example, there is often more than one reader or more than one algorithm applied to a certain data set. Combining classifiers is often helpful, but determining the way in which classifiers should be combined is not trivial. Standard strategies are based on learning classifier combination functions from data. We describe a simple strategy to combine results from classifiers that have not been applied to a common data set, and therefore cannot undergo this type of joint training. The strategy, which assumes conditional independence of classifiers, is based on the calculation of a combined Receiver Operating Characteristic (ROC) curve, using maximum likelihood analysis to determine a combination rule for each ROC operating point. We offer some insights into the use of ROC analysis in the field of medical imaging. PMID:16685884
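
    Under the conditional-independence assumption the paper makes, the likelihood ratios of independent classifiers multiply, which is the essence of such a combination rule. A hedged sketch with hypothetical sensitivities, specificities, and prevalence:

```python
# Sketch of the conditional-independence combination idea: with independent
# classifiers, likelihood ratios multiply, so the combined evidence is the
# product of per-classifier LRs (all performance numbers are hypothetical).
def likelihood_ratio(result, sens, spec):
    """LR of a positive or negative call for one classifier."""
    return sens / (1 - spec) if result else (1 - sens) / spec

# Classifier A: sens 0.8, spec 0.9; classifier B: sens 0.7, spec 0.85.
lr_a = likelihood_ratio(True, 0.8, 0.9)    # ~8.0
lr_b = likelihood_ratio(True, 0.7, 0.85)   # ~4.67
combined = lr_a * lr_b                     # evidence from both positives
prior_odds = 0.1 / 0.9                     # 10% disease prevalence
post_odds = prior_odds * combined
print(post_odds / (1 + post_odds) > 0.8)   # -> True
```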

  2. Sources of expertise in transportation planning, management, and operations: Information received as of September 25, 1987

    International Nuclear Information System (INIS)

    The DOE Office of Storage and Transportation Systems is responsible for the development and management of a transportation system to provide all the necessary services for the transportation of the spent fuel and wastes from reactor sites to repositories. DOE/ORO has requested Oak Ridge Associated Universities (ORAU) to assist DOE in developing rosters of sources of transportation expertise in: (1) carrier operations; (2) transportation management, planning, and logistics; (3) transportation equipment; (4) transportation facilities design and operation; (5) vehicle safety; and (6) transportation operations quality assurance; as related to truck, rail, barge, and intermodal transportation. Persons or organizations with experience in shipping of non-hazardous materials, spent nuclear fuel, other radioactive materials, and/or other hazardous materials were included in the information system. A mailed inquiry was sent to over 2300 potential sources of transportation expertise. Responses were received from 207 persons and 254 organizations. Section 1 contains the identification numbers of the individuals and organizations that responded. Section 2 contains identification codes, names, addresses, and phone numbers of each of the individual and organization respondents. The reader can refer to Section 2 for the name and address of the respondents for the identification codes listed for each technical area/experience base in Section 1

  3. 25 CFR 47.3 - How does a Bureau-operated school find out how much funding it will receive?

    Science.gov (United States)

    2010-04-01

    ... 25 Indians 1 2010-04-01 2010-04-01 false How does a Bureau-operated school find out how much... EDUCATION UNIFORM DIRECT FUNDING AND SUPPORT FOR BUREAU-OPERATED SCHOOLS § 47.3 How does a Bureau-operated school find out how much funding it will receive? The Office of Indian Education Programs (OIEP)...

  4. Application of Receiver Operating Characteristic (ROC) Curves for Explosives Detection Using Different Sampling and Detection Techniques

    Directory of Open Access Journals (Sweden)

    Mimy Young

    2013-12-01

    Reported for the first time are receiver operating characteristic (ROC) curves constructed to describe the performance of a sorbent-coated disk, planar solid phase microextraction (PSPME) unit for non-contact sampling of a variety of volatiles. The PSPME is coupled to ion mobility spectrometers (IMSs) for the detection of volatile chemical markers associated with the presence of smokeless powders, model systems of explosives containing diphenylamine (DPA), 2,4-dinitrotoluene (2,4-DNT) and nitroglycerin (NG) as the target analytes. The performance of the PSPME-IMS was compared with the widely accepted solid-phase microextraction (SPME), coupled to a GC-MS. A set of optimized sampling conditions for different volume containers (1–45 L) with various sample amounts of explosives was studied in replicates (n = 30) to determine the true positive rates (TPR) and false positive detection rates (FPR) for the different scenarios. These studies were conducted in order to construct the ROC curves for two IMS instruments (a bench-top and a field-portable system) and a bench-top GC-MS system in low and high clutter environments. Both static and dynamic PSPME sampling were studied, in which 10–500 mg quantities of smokeless powders were detected within 10 min of static sampling and 1 min of dynamic sampling.

  5. Recollection is a continuous process: Evidence from plurality memory receiver operating characteristics.

    Science.gov (United States)

    Slotnick, Scott D; Jeye, Brittany M; Dodson, Chad S

    2016-01-01

    Is recollection a continuous/graded process or a threshold/all-or-none process? Receiver operating characteristic (ROC) analysis can answer this question as the continuous model and the threshold model predict curved and linear recollection ROCs, respectively. As memory for plurality, an item's previous singular or plural form, is assumed to rely on recollection, the nature of recollection can be investigated by evaluating plurality memory ROCs. The present study consisted of four experiments. During encoding, words (singular or plural) or objects (single/singular or duplicate/plural) were presented. During retrieval, old items with the same plurality or different plurality were presented. For each item, participants made a confidence rating ranging from "very sure old", which was correct for same plurality items, to "very sure new", which was correct for different plurality items. Each plurality memory ROC was the proportion of same versus different plurality items classified as "old" (i.e., hits versus false alarms). Chi-squared analysis revealed that all of the plurality memory ROCs were adequately fit by the continuous unequal variance model, whereas none of the ROCs were adequately fit by the two-high threshold model. These plurality memory ROC results indicate recollection is a continuous process, which complements previous source memory and associative memory ROC findings.
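
    The model comparison at the heart of the study, a curved ROC from a continuous unequal-variance model versus a linear ROC from a threshold model, can be illustrated numerically. The parameters below (d' = 1.5, old-item SD = 1.25) are hypothetical:

```python
# Sketch contrasting the two ROC models compared in the paper: a continuous
# unequal-variance Gaussian model predicts a curved ROC, while the two-high-
# threshold model predicts a straight line between its intercepts.
from statistics import NormalDist

def gaussian_roc(criterion, d_prime=1.5, sigma_old=1.25):
    """(false-alarm rate, hit rate) for an unequal-variance signal model."""
    fa = 1 - NormalDist(0, 1).cdf(criterion)
    hit = 1 - NormalDist(d_prime, sigma_old).cdf(criterion)
    return fa, hit

# Three points along the Gaussian ROC; the middle point lies strictly above
# the chord joining the outer two, i.e. the ROC is curved, not linear.
(f1, h1), (f2, h2), (f3, h3) = (gaussian_roc(c) for c in (2.0, 1.0, 0.0))
chord_h2 = h1 + (h3 - h1) * (f2 - f1) / (f3 - f1)
print(h2 > chord_h2)  # -> True
```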

  6. Design of a receiver operating characteristic (ROC) study of 10:1 lossy image compression

    Science.gov (United States)

    Collins, Cary A.; Lane, David; Frank, Mark S.; Hardy, Michael E.; Haynor, David R.; Smith, Donald V.; Parker, James E.; Bender, Gregory N.; Kim, Yongmin

    1994-04-01

    The digital archiving system at Madigan Army Medical Center (MAMC) uses a 10:1 lossy data compression algorithm for most forms of computed radiography. A systematic study on the potential effect of lossy image compression on patient care has been initiated with a series of studies focused on specific diagnostic tasks. The studies are based upon the receiver operating characteristic (ROC) method of analysis for diagnostic systems. The null hypothesis is that observer performance with approximately 10:1 compressed and decompressed images is not different from using original, uncompressed images for detecting subtle pathologic findings seen on computed radiographs of bone, chest, or abdomen, when viewed on a high-resolution monitor. Our design involves collecting cases from eight pathologic categories. Truth is determined by committee using confirmatory studies performed during routine clinical practice whenever possible. Software has been developed to aid in case collection and to allow reading of the cases for the study using stand-alone Siemens Litebox workstations. Data analysis uses two methods, ROC analysis and free-response ROC (FROC) methods. This study will be one of the largest ROC/FROC studies of its kind and could benefit clinical radiology practice using PACS technology. The study design and results from a pilot FROC study are presented.

  7. Visualization of the significance of Receiver Operating Characteristics based on confidence ellipses

    Science.gov (United States)

    Sarlis, Nicholas V.; Christopoulos, Stavros-Richard G.

    2014-03-01

    The Receiver Operating Characteristics (ROC) is used for the evaluation of prediction methods in various disciplines like meteorology, geophysics, complex system physics, medicine etc. The estimation of the significance of a binary prediction method, however, remains a cumbersome task and is usually done by repeating the calculations by Monte Carlo. The FORTRAN code provided here simplifies this problem by evaluating the significance of binary predictions for a family of ellipses which are based on confidence ellipses and cover the whole ROC space.
    Catalogue identifier: AERY_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AERY_v1_0.html
    Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 11511
    No. of bytes in distributed program, including test data, etc.: 72906
    Distribution format: tar.gz
    Programming language: FORTRAN
    Computer: Any computer supporting a GNU FORTRAN compiler
    Operating system: Linux, MacOS, Windows
    RAM: 1 Mbyte
    Classification: 4.13, 9, 14
    Nature of problem: The estimation of the significance of a binary prediction method remains a cumbersome task and is usually done by repeating the calculations by Monte Carlo.
    Solution method: Using the statistics of random binary predictions for a given value of the predictor threshold ɛt, one can construct the corresponding confidence ellipses. The envelope of these corresponding confidence ellipses is estimated when

  8. Receiver operating characteristic analysis for the detection of simulated microcalcifications on mammograms using hardcopy images

    Energy Technology Data Exchange (ETDEWEB)

    Lai, C J [Department of Imaging Physics, University of Texas M D Anderson Cancer Center, Houston, Texas (United States); Shaw, Chris C [Department of Imaging Physics, University of Texas M D Anderson Cancer Center, Houston, Texas (United States); Whitman, Gary J [Department of Diagnostic Radiology, University of Texas M D Anderson Cancer Center, Houston, Texas (United States); Yang, Wei T [Department of Diagnostic Radiology, University of Texas M D Anderson Cancer Center, Houston, Texas (United States); Dempsey, Peter J [Department of Diagnostic Radiology, University of Texas M D Anderson Cancer Center, Houston, Texas (United States); Nguyen, Victoria [Department of Diagnostic Radiology, University of Texas M D Anderson Cancer Center, Houston, Texas (United States); Ice, Mary F [Department of Diagnostic Radiology, University of Texas M D Anderson Cancer Center, Houston, Texas (United States)

    2006-08-21

    The aim of this study was to compare mammography systems based on three different detectors, a conventional screen-film (SF) combination, an a-Si/CsI flat-panel (FP) detector, and a charge-coupled device (CCD) based x-ray phosphor detector, for their performance in detecting simulated microcalcifications (MCs). Calcium carbonate grains of 112-150 μm were used to simulate MCs and were overlapped with a slab phantom of simulated 50% adipose/50% glandular breast tissue-equivalent material, referred to as the uniform background. For the tissue structure background, 200-250 μm calcium carbonate grains were used and overlapped with an anthropomorphic breast phantom. All MC phantom images were acquired with and without magnification (1.8X). The hardcopy images were reviewed by five mammographers. A five-point confidence level rating was used to score each detection task. Receiver operating characteristic (ROC) analysis was performed, and the areas under the ROC curves (A_z values) were used to compare the performances of the three mammography systems under various conditions. The results showed that, with a uniform background and contact images, the FP-based system performed significantly better than the SF and the CCD-based systems. For magnified images with a uniform background, the SF and the FP-based systems performed equally well and significantly better than the CCD-based system. With a tissue structure background and contact images, the SF system performed significantly better than the FP- and the CCD-based systems. With magnified images and a tissue structure background, the SF and the CCD-based systems performed equally well and significantly better than the FP-based system. In the detection of MCs in the fibroglandular and the heterogeneously dense regions, no significant differences were found, except that the SF system performed significantly better than the CCD-based system in the fibroglandular regions for the contact images.

  9. Full receiver operating characteristic curve estimation using two alternative forced choice studies.

    Science.gov (United States)

    Massanes, Francesc; Brankov, Jovan G

    2016-01-01

    Task-based medical image quality is typically measured by the degree to which a human observer can perform a diagnostic task in a psychophysical human observer study. During a typical study, an observer is asked to provide a numerical score quantifying his confidence as to whether an image contains a diagnostic marker or not. Such scores are then used to measure the observers' diagnostic accuracy, summarized by the receiver operating characteristic (ROC) curve and the area under ROC curve. These types of human studies are difficult to arrange, costly, and time consuming. In addition, human observers involved in this type of study should be experts on the image genre to avoid inconsistent scoring through the lengthy study. In two-alternative forced choice (2AFC) studies, known to be faster, two images are compared simultaneously and a single indicator is given. Unfortunately, the 2AFC approach cannot lead to a full ROC curve or a set of image scores. The aim of this work is to propose a methodology in which multiple rounds of the 2AFC studies are used to re-estimate an image confidence score (a.k.a. rating, ranking) and generate the full ROC curve. In the proposed approach, we treat image confidence score as an unknown rating that needs to be estimated and 2AFC as a two-player match game. To achieve this, we use the ELO rating system, which is used for calculating the relative skill levels of players in competitor-versus-competitor games such as chess. The proposed methodology is not limited to ELO, and other rating methods such as TrueSkill™, Chessmetrics, or Glicko can be also used. The presented results, using simulated data, indicate that a full ROC curve can be recovered using several rounds of 2AFC studies and that the best pairing strategy starts with the first round of pairing abnormal versus normal images (as in the classical 2AFC approach) followed by a number of rounds using random pairing. In addition, the proposed method was tested in a pilot human
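    The ELO update at the heart of the proposed method is standard and easy to sketch. The K-factor and starting rating below are conventional chess values, not parameters taken from the paper.

```python
def elo_update(r_winner, r_loser, k=32.0):
    """Standard ELO update after one 2AFC trial; the chosen image 'wins'."""
    expected_win = 1.0 / (1.0 + 10 ** ((r_loser - r_winner) / 400.0))
    delta = k * (1.0 - expected_win)
    return r_winner + delta, r_loser - delta

# Every image starts at the same rating; after several rounds of pairings the
# ratings serve as confidence scores from which a full ROC curve can be traced.
ratings = {"img_a": 1500.0, "img_b": 1500.0}
ratings["img_a"], ratings["img_b"] = elo_update(ratings["img_a"], ratings["img_b"])
```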

  10. Bayesian biostatistics

    CERN Document Server

    Lesaffre, Emmanuel

    2012-01-01

    The growth of biostatistics has been phenomenal in recent years and has been marked by considerable technical innovation in both methodology and computational practicality. One area that has experienced significant growth is Bayesian methods. The growing use of Bayesian methodology has taken place partly due to an increasing number of practitioners valuing the Bayesian paradigm as matching that of scientific discovery. In addition, computational advances have allowed for more complex models to be fitted routinely to realistic data sets. Through examples, exercises and a combination of introd

  11. Bayesian statistics

    OpenAIRE

    Draper, D.

    2001-01-01

    Article Outline: Glossary Definition of the Subject and Introduction The Bayesian Statistical Paradigm Three Examples Comparison with the Frequentist Statistical Paradigm Future Directions Bibliography

  12. The precision--recall curve overcame the optimism of the receiver operating characteristic curve in rare diseases

    DEFF Research Database (Denmark)

    Ozenne, Brice; Subtil, Fabien; Maucort-Boulch, Delphine

    2015-01-01

    OBJECTIVES: Compare the area under the receiver operating characteristic curve (AUC) vs. the area under the precision-recall curve (AUPRC) in summarizing the performance of a diagnostic biomarker according to the disease prevalence. STUDY DESIGN AND SETTING: A simulation study was performed...
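    The effect described above can be reproduced with a small simulation: hold the score distributions fixed and change only the prevalence. Everything below (the score model, sample sizes, prevalences) is illustrative, not taken from the study.

```python
import random

def auc_rank(pos, neg):
    """ROC AUC via the Mann-Whitney U statistic."""
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def precision_at_full_recall(pos, neg):
    """Precision once the threshold is lowered to capture every diseased case."""
    t = min(pos)
    fp = sum(n >= t for n in neg)
    return len(pos) / (len(pos) + fp)

random.seed(0)
pos = [random.gauss(1.0, 1.0) for _ in range(50)]          # diseased
neg_common = [random.gauss(0.0, 1.0) for _ in range(50)]   # ~50% prevalence
neg_rare = [random.gauss(0.0, 1.0) for _ in range(5000)]   # ~1% prevalence

auc_common, auc_rare = auc_rank(pos, neg_common), auc_rank(pos, neg_rare)
prec_common = precision_at_full_recall(pos, neg_common)
prec_rare = precision_at_full_recall(pos, neg_rare)
# The AUC barely moves with prevalence, while precision collapses for the
# rare disease: the optimism that the precision-recall curve exposes.
```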

  13. Receiver-operating characteristic curves for somatic cell scores and California mastitis test in Valle del Belice dairy sheep

    NARCIS (Netherlands)

    Riggio, V.; Pesce, L.L.; Morreale, S.; Portolano, B.

    2013-01-01

    Using receiver-operating characteristic (ROC) curve methodology this study was designed to assess the diagnostic effectiveness of somatic cell count (SCC) and the California mastitis test (CMT) in Valle del Belice sheep, and to propose and evaluate threshold values for those tests that would optimal

  14. Assessing the Classification Accuracy of Early Numeracy Curriculum-Based Measures Using Receiver Operating Characteristic Curve Analysis

    Science.gov (United States)

    Laracy, Seth D.; Hojnoski, Robin L.; Dever, Bridget V.

    2016-01-01

    Receiver operating characteristic curve (ROC) analysis was used to investigate the ability of early numeracy curriculum-based measures (EN-CBM) administered in preschool to predict performance below the 25th and 40th percentiles on a quantity discrimination measure in kindergarten. Areas under the curve derived from a sample of 279 students ranged…

  15. Confidence Intervals for the Probability of Superiority Effect Size Measure and the Area under a Receiver Operating Characteristic Curve

    Science.gov (United States)

    Ruscio, John; Mullen, Tara

    2012-01-01

    It is good scientific practice to report an appropriate estimate of effect size and a confidence interval (CI) to indicate the precision with which a population effect was estimated. For comparisons of 2 independent groups, a probability-based effect size estimator (A) that is equal to the area under a receiver operating characteristic curve…

  16. Determining Cutoff Scores on a Developmental Screening Measure: Use of Receiver Operating Characteristics and Item Response Theory

    Science.gov (United States)

    Yovanoff, P.; Squires, J.

    2006-01-01

    Two different theoretical approaches were compared to determine the optimal cutoff scores for the Ages and Stages Questionnaires: Social-Emotional (ASQ: SE), a social-emotional screening test. Cutoff scores based on statistical decision theory modeling, Receiver Operator Characteristics (ROC), were compared with cutoff scores obtained using Item…

  17. A Bayesian Network approach to the evaluation of building design and its consequences for employee performance and operational costs

    DEFF Research Database (Denmark)

    Jensen, Kasper Lynge; Toftum, Jørn; Friis-Hansen, Peter

    2009-01-01

    building design. In this paper, focus will be on the effects of temperature on mental performance and not on other indoor climate factors. A total economic comparison of six different building designs, four located in northern Europe and two in Los Angeles, USA, was performed. The results indicate...... that investments in improved indoor thermal conditions can be justified economically in most cases. The Bayesian Network provides a reliable platform using probabilities for modelling the complexity while estimating the effect of indoor climate factors on human beings, due to the different ways in which humans...

  18. Operating conditions of an open and direct solar thermal Brayton cycle with optimised cavity receiver and recuperator

    International Nuclear Information System (INIS)

    The small-scale open and direct solar thermal Brayton cycle with recuperator has several advantages, including low cost and low operation and maintenance costs, and is highly recommended. The main disadvantages of this cycle are the pressure losses in the recuperator and receiver, turbomachine efficiencies and recuperator effectiveness, which limit the net power output of such a system. The irreversibilities of the solar thermal Brayton cycle are mainly due to heat transfer across a finite temperature difference and fluid friction. In this paper, thermodynamic optimisation is applied to concentrate on these disadvantages in order to optimise the receiver and recuperator and to maximise the net power output of the system at various steady-state conditions, limited to various constraints. The effects of wind, receiver inclination, rim angle, atmospheric temperature and pressure, recuperator height, solar irradiance and concentration ratio on the optimum geometries and performance were investigated. The dynamic trajectory optimisation method was applied. Operating points of a standard micro-turbine operating at its highest compressor efficiency and a parabolic dish concentrator diameter of 16 m were considered. The optimum geometries, minimum irreversibility rates and maximum receiver surface temperatures of the optimised systems are shown. For an environment with specific conditions and constraints, there exists an optimum receiver and recuperator geometry so that the system produces maximum net power output.
    Highlights:
    → Optimum geometries exist such that the system produces maximum net power output.
    → Optimum operating conditions are shown.
    → Minimum irreversibility rates and minimum entropy generation rates are shown.
    → Net power output was described in terms of total entropy generation rate.
    → Effects such as wind, recuperator height and irradiance were investigated.

  19. 10-MWe solar-thermal central-receiver pilot plant. Operating and maintenance manual

    Energy Technology Data Exchange (ETDEWEB)

    1979-08-01

    Information required to perform the initial program loading and operation of the Heliostat Array Controller (HAC) is provided. Operating activities are described as required for heliostat control. All computer console command steps, from power up to power down, are described. Detailed steps are provided to wake up the system and direct heliostat beams to standby, on target, standby to stow and power down. Maintenance requirements (preventive and corrective), reparability (reparable - non-reparable decisions), spares identification, spares storage location, replacement levels, replacement location and repair location are established. Individual system breakdown block diagrams are provided for each system/assembly/subassembly. Maintenance and repair description sheets are provided for each maintenance significant item. The manual provides support of the following equipment: (a) heliostat assembly; (b) heliostat control assembly; and (c) maintenance and installation equipment. The safety requirements for the operating and maintenance functions are established. These procedures will assist in eliminating or controlling the accident potentials caused by human error, environment, or component malfunctions or interactions that could result in major injury or fatality to operating or visiting personnel, or damage to subsystem components or support equipment. These procedures are for normal and test operating conditions and emergency situations, and apply to all Martin Marietta Corporation, governmental, operating and visitor personnel. (LEW)

  20. Detector evaluation for improved situational awareness: Receiver operator characteristic curve based

    NARCIS (Netherlands)

    Wuijckhuijse, A.L. van; Nieuwenhuizen, M.S.

    2016-01-01

    In military and civilian operations good situational awareness is a prerequisite to make proper decisions. The situational awareness is among others based upon intelligence, threat analysis and detection, altogether element of the so-called DIM (detection, identification, monitoring) system. In case

  1. Final environmental statement related to the operation of the Barnwell Fuel Receiving and Storage Station (Docket No. 70-1729)

    International Nuclear Information System (INIS)

    The proposed action is to issue a materials license, pursuant to 10 CFR Parts 30, 40 and 70 of the Commission's regulations, authorizing Allied-General Nuclear Services to receive and handle fuel casks containing spent reactor fuel elements and to store spent reactor fuel at the Barnwell Nuclear Fuel Plant (BNFP), in the Barnwell Fuel Receiving and Storage Station (BFRSS). The BFRSS is a part of, and contiguous to, the BNFP-Separations Facility which is being constructed on a small portion of a 1700 acre site about six miles west of the city of Barnwell in Barnwell County, South Carolina. Construction of the BFRSS facility has been completed and the BNFP Separations Facility is more than 90% complete. A Uranium Hexafluoride Facility is being constructed on the same site, and a Plutonium Product Facility is proposed to be constructed adjacent to the Separations Facility. The license that is the subject of this action will, if issued, allow the use of the BFRSS separately from the operation of the Separations Facility. Impacts resulting from the construction of the BFRSS have already occurred and mitigating measures have been and are being implemented to offset any adverse impacts. Operation of the BFRSS will not interfere with water sources, and should cause no noticeable damage to the terrestrial or aquatic environments. Operating experience at other fuel receiving and storage facilities has shown that radioactive concentrations discharged to the environs (the more significant process effluents) have been well below applicable state and federal limits. The small quantities to be released during operation of the BFRSS will result in negligible environmental impact. 20 figs

  2. Impact of varied center volume categories on volume-outcome relationship in children receiving ECMO for heart operations.

    Science.gov (United States)

    Rettiganti, Mallikarjuna; Seib, Paul M; Robertson, Michael J; Wilcox, Andrew; Gupta, Punkaj

    2016-09-01

    To study the volume-outcome relationship among children receiving extracorporeal membrane oxygenation (ECMO), different studies from different databases use different volume categories. The objective of this study was to evaluate if different center volume categories impact the volume-outcome relationship among children receiving ECMO for heart operations. We performed a post hoc analysis of data from an existing national database, the Pediatric Health Information System. Centers were classified into five different volume categories using different cut-offs and different variables. Mortality rates were compared between the varied volume categories using a mixed effects logistic regression model after adjusting for patient- and center-level risk factors. Data collection included demographic information, baseline characteristics, pre-ECMO risk factors, operation details, patient diagnoses, and center data. In unadjusted analysis, there was a significant relationship between center volume and mortality, with low- and medium-volume centers associated with higher mortality rates compared to high-volume centers in all volume categories, except the hierarchical clustering volume category. In contrast, there was no significant association between center volume and mortality among all volume categories in adjusted analysis. We concluded that high-volume centers were not associated with improved outcomes for the majority of the categorization schemes despite using different cut-offs and different variables for volume categorization.

  3. Bayesian programming

    CERN Document Server

    Bessiere, Pierre; Ahuactzin, Juan Manuel; Mekhnacha, Kamel

    2013-01-01

    Probability as an Alternative to Boolean LogicWhile logic is the mathematical foundation of rational reasoning and the fundamental principle of computing, it is restricted to problems where information is both complete and certain. However, many real-world problems, from financial investments to email filtering, are incomplete or uncertain in nature. Probability theory and Bayesian computing together provide an alternative framework to deal with incomplete and uncertain data. Decision-Making Tools and Methods for Incomplete and Uncertain DataEmphasizing probability as an alternative to Boolean

  4. Receiver operating characteristic curve analysis of the performance of various radiographic protocols when screening dogs for pulmonary metastases

    International Nuclear Information System (INIS)

    Five radiographic protocols for detecting pulmonary metastases in dogs were compared by analyzing receiver operating characteristic curves for the protocols. Protocols compared were a right lateral view only, a left lateral view only, right lateral and dorsoventral views, both lateral views, and all 3 views. Three radiologists used each of the protocols to evaluate 99 sets of thoracic radiographs. Fifty-two sets of radiographs were from dogs confirmed histologically to have pulmonary metastases and 47 were from dogs proven at necropsy to be free of pulmonary metastases. Results of the 5 protocols were not statistically different. We concluded that a third view is not necessary when routinely screening dogs with cancer for pulmonary metastases and that the standard 2-view thoracic examination should be adequate. However, in individual cases, a third view may be the determining factor in establishing a radiographic diagnosis and should be obtained if any suspicious areas are seen

  5. Approaches for delineating landslide hazard areas using receiver operating characteristic in an advanced calibrating precision soil erosion model

    Directory of Open Access Journals (Sweden)

    P. T. Ghazvinei

    2015-10-01

    Full Text Available Soil erosion is an undesirable natural event that causes land degradation and desertification. Identifying erosion-prone areas is a major component of preventive measures. Recent landslide damage in different regions led us to develop a model of the erosion susceptibility map using an empirical method (RUSLE). A landslide-location map was established by interpreting satellite images. Field observation data were used to validate the intensity of soil erosion. Further, a correlation analysis was conducted to investigate the receiver operating characteristic and frequency ratio. Results showed a satisfactory correlation between the prepared RUSLE-based soil erosion map and the actual landslide distribution. The proposed model can effectively predict landslide events in soil-erosion areas. Such a reliable predictive model is an effective management facility for a regional landslide forecasting system.

  6. Receiver-Operating-Characteristic Analysis Reveals Superiority of Scale-Dependent Wavelet and Spectral Measures for Assessing Cardiac Dysfunction

    CERN Document Server

    Thurner, S; Lowen, S B; Teich, M C; Thurner, Stefan; Feurstein, Markus C.; Lowen, Steven B.; Teich, Malvin C.

    1998-01-01

    Receiver-operating-characteristic (ROC) analysis was used to assess the suitability of various heart rate variability (HRV) measures for correctly classifying electrocardiogram records of varying lengths as normal or revealing the presence of heart failure. Scale-dependent HRV measures were found to be substantially superior to scale-independent measures (scaling exponents) for discriminating the two classes of data over a broad range of record lengths. The wavelet-coefficient standard deviation at a scale near 32 heartbeat intervals, and its spectral counterpart near 1/32 cycles per interval, provide reliable results using record lengths just minutes long. A jittered integrate-and-fire model built around a fractal Gaussian-noise kernel provides a realistic, though not perfect, simulation of heartbeat sequences.
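    A minimal sketch of the scale-dependent measure the abstract favours: the standard deviation of Haar wavelet coefficients of the RR-interval series at a scale of 32 heartbeat intervals. The synthetic RR data and the normalisation details are assumptions; only the choice of scale follows the abstract.

```python
import numpy as np

def sigma_wav(rr, scale=32):
    """Std of non-overlapping Haar wavelet coefficients at the given scale."""
    rr = np.asarray(rr, dtype=float)
    n = (len(rr) // (2 * scale)) * 2 * scale   # trim to whole blocks
    blocks = rr[:n].reshape(-1, 2 * scale)
    # Haar coefficient: scaled difference of the two half-block means.
    coeffs = (blocks[:, :scale].mean(axis=1)
              - blocks[:, scale:].mean(axis=1)) * np.sqrt(scale / 2.0)
    return coeffs.std()

rng = np.random.default_rng(1)
rr = 0.8 + 0.05 * rng.standard_normal(4096)  # synthetic RR intervals (seconds)
sigma_32 = sigma_wav(rr)
```

    Thresholding a statistic like sigma_32 over many records, normal versus heart failure, is what produces the ROC curves compared in the study.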

  7. Estimation of doses received by operators in the 1958 RB reactor accident using the MCNP5 computer code simulation

    Directory of Open Access Journals (Sweden)

    Pešić Milan P.

    2012-01-01

    Full Text Available A numerical simulation of the radiological consequences of the RB reactor reactivity excursion accident, which occurred on October 15, 1958, and an estimation of the total doses received by the operators were performed with the MCNP5 computer code. The simulation was carried out under the same assumptions as those used in the 1960 IAEA-organized experimental simulation of the accident: a total fission energy of 80 MJ released in the accident and the frozen positions of the operators. The time interval of exposure to high doses received by the operators has been estimated. Data on the RB1/1958 reactor core relevant to the accident are given. A short summary of the accident scenario has been updated. A 3-D model of the reactor room and the RB reactor tank, with all the details of the core, was created. For dose determination, 3-D simplified, homogenised, sexless and faceless phantoms, placed inside the reactor room, were developed. The code was run for a number of neutron histories that gave a dose rate uncertainty of less than 2%. For the determination of the radiation spectra escaping the reactor core and the radiation interaction in the tissue of the phantoms, the MCNP5 code was run (in the KCODE option and “mode n p e”) with 55-group neutron spectra, 35-group gamma ray spectra and 10-group electron spectra. The doses were determined by converting the flux density in the phantoms (obtained by the F4 tally) to doses using factors taken from ICRP-74, and from the deposited energy of neutrons and gamma rays in the phantoms’ tissue (obtained by the F6 tally). The time at which the operators sensed the odour of ozone is roughly estimated, for the first time, in Appendix A.1. The calculated total absorbed and equivalent doses are compared to the previously reported ones, and an attempt has been made to understand and explain the reasons for the obtained differences. A Root Cause Analysis of the accident was done and

  8. EVALUATION OF SPRING OPERATED RELIEF VALVE MAINTENANCE INTERVALS AND EXTENSION OF MAINTENANCE TIMES USING A WEIBULL ANALYSIS WITH MODIFIED BAYESIAN UPDATING

    Energy Technology Data Exchange (ETDEWEB)

    Harris, S.; Gross, R.; Mitchell, E.

    2011-01-18

    The Savannah River Site (SRS) spring operated pressure relief valve (SORV) maintenance intervals were evaluated using an approach provided by the American Petroleum Institute (API RP 581) for risk-based inspection technology (RBI). In addition, the impact of extending the inspection schedule was evaluated using Monte Carlo Simulation (MCS). The API RP 581 approach is characterized as a Weibull analysis with modified Bayesian updating provided by SRS SORV proof testing experience. Initial Weibull parameter estimates were updated as per SRS's historical proof test records contained in the Center for Chemical Process Safety (CCPS) Process Equipment Reliability Database (PERD). The API RP 581 methodology was used to estimate the SORV's probability of failing on demand (PFD), and the annual expected risk. The API RP 581 methodology indicates that the current SRS maintenance plan is conservative. Cost savings may be attained in certain mild service applications that present low PFD and overall risk. Current practices are reviewed and recommendations are made for extending inspection intervals. The paper gives an illustration of the inspection costs versus the associated risks by using API RP 581 Risk Based Inspection (RBI) Technology. A cost effective maintenance frequency balancing both financial risk and inspection cost is demonstrated.
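    The Weibull piece of this calculation is simple to sketch: the probability of the valve failing on demand grows with time in service, which is the trade-off extended inspection intervals must balance. The shape and scale parameters below are purely illustrative; the actual values come from API RP 581 priors updated with SRS proof-test data, which is not reproduced here.

```python
import math

def weibull_pfd(t_years, beta, eta):
    """Probability of failure on demand after t_years, Weibull(shape=beta, scale=eta)."""
    return 1.0 - math.exp(-((t_years / eta) ** beta))

# Extending the inspection interval raises the accumulated PFD, which the
# risk-based inspection analysis weighs against inspection cost:
pfd_4yr = weibull_pfd(4.0, beta=1.8, eta=50.0)
pfd_8yr = weibull_pfd(8.0, beta=1.8, eta=50.0)
```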

  9. SYNTHESIZED EXPECTED BAYESIAN METHOD OF PARAMETRIC ESTIMATE

    Institute of Scientific and Technical Information of China (English)

    Ming HAN; Yuanyao DING

    2004-01-01

    This paper develops a new method of parametric estimate, which is named the "synthesized expected Bayesian method". When samples of products are tested and no failure events occur, the definition of the expected Bayesian estimate is introduced and the estimates of failure probability and failure rate are provided. After some failure information is introduced by making an extra test, a synthesized expected Bayesian method is defined and used to estimate failure probability, failure rate and some other parameters of the exponential and Weibull distributions of populations. Finally, calculations are performed on practical problems, which show that the synthesized expected Bayesian method is feasible and easy to operate.
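    The zero-failure setting above has a standard conjugate-prior building block worth making concrete: with a Beta(a, b) prior on the failure probability and n tests with no failures, the posterior mean is a/(a + b + n). This is only the Bayesian starting point, not the paper's full synthesized expected Bayesian derivation; the prior parameters are illustrative.

```python
def posterior_failure_prob(n_tests, a=1.0, b=1.0):
    """Posterior mean failure probability after n_tests zero-failure tests,
    under a Beta(a, b) prior (a = b = 1 is the uniform prior)."""
    return a / (a + b + n_tests)

p10 = posterior_failure_prob(10)  # 1/12 under the uniform prior
```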

  10. ROC [Receiver Operating Characteristics] study of maximum likelihood estimator human brain image reconstructions in PET [Positron Emission Tomography] clinical practice

    International Nuclear Information System (INIS)

    This paper will report on the progress to date in carrying out Receiver Operating Characteristics (ROC) studies comparing Maximum Likelihood Estimator (MLE) and Filtered Backprojection (FBP) reconstructions of normal and abnormal human brain PET data in a clinical setting. A previous statistical study of reconstructions of the Hoffman brain phantom with real data indicated that the pixel-to-pixel standard deviation in feasible MLE images is approximately proportional to the square root of the number of counts in a region, as opposed to a standard deviation which is high and largely independent of the number of counts in FBP. A preliminary ROC study carried out with 10 non-medical observers performing a relatively simple detectability task indicates that, for the majority of observers, lower standard deviation translates itself into a statistically significant detectability advantage in MLE reconstructions. The initial results of ongoing tests with four experienced neurologists/nuclear medicine physicians are presented. Normal cases of 18F-fluorodeoxyglucose (FDG) cerebral metabolism studies and abnormal cases in which a variety of lesions have been introduced into normal data sets have been evaluated. We report on the results of reading the reconstructions of 90 data sets, each corresponding to a single brain slice. It has become apparent that the design of the study based on reading single brain slices is too insensitive and we propose a variation based on reading three consecutive slices at a time, rating only the center slice. 9 refs., 2 figs., 1 tab

  11. Diagnostic accuracy of serum biochemical fibrosis markers in children with chronic hepatitis B evaluated by receiver operating characteristics analysis

    Institute of Scientific and Technical Information of China (English)

    Dariusz Marek Lebensztejn; Elżbieta Skiba; Jolanta Tobolczyk; Maria Elżbieta Sobaniec-Lotowska; Maciej Kaczmarski

    2005-01-01

    AIM: To investigate the diagnostic accuracy of potent serum biochemical fibrosis markers in children with chronic hepatitis B evaluated by receiver operating characteristics (ROC) analysis. METHODS: We determined the serum levels of apolipoprotein A-I (APO A-I), haptoglobin (HPT) and α-2-macroglobulin (A2M) with an automatic nephelometer in 63 children (age range 4-17 years, mean 10 years) with biopsy-verified chronic HBeAg-positive hepatitis B. Fibrosis stage and inflammation grade were assessed in a blinded fashion according to Batts and Ludwig. We defined mild liver fibrosis as a score ≤2 and advanced fibrosis as a score equal to 3. ROC analysis was used to calculate the power of the assays to detect advanced liver fibrosis (AccuROC, Canada). RESULTS: Serum concentrations of APO A-I, HPT and A2M were not significantly different in patients with chronic hepatitis B compared to controls. However, an APO A-I level of 1.19 ng/L had a sensitivity of 85.7% and a specificity of 60.7% (AUC = 0.7117, P = 0.035) to predict advanced fibrosis. All other serum biochemical markers and their combination did not allow a useful prediction. None of these markers was a good predictor of histologic inflammation. CONCLUSION: Apolipoprotein A-I may be a suitable serum marker to predict advanced liver fibrosis in children with chronic hepatitis B.

  12. ROC (Receiver Operating Characteristics) study of maximum likelihood estimator human brain image reconstructions in PET (Positron Emission Tomography) clinical practice

    Energy Technology Data Exchange (ETDEWEB)

    Llacer, J.; Veklerov, E.; Nolan, D. (Lawrence Berkeley Lab., CA (USA)); Grafton, S.T.; Mazziotta, J.C.; Hawkins, R.A.; Hoh, C.K.; Hoffman, E.J. (California Univ., Los Angeles, CA (USA))

    1990-10-01

This paper will report on the progress to date in carrying out Receiver Operating Characteristics (ROC) studies comparing Maximum Likelihood Estimator (MLE) and Filtered Backprojection (FBP) reconstructions of normal and abnormal human brain PET data in a clinical setting. A previous statistical study of reconstructions of the Hoffman brain phantom with real data indicated that the pixel-to-pixel standard deviation in feasible MLE images is approximately proportional to the square root of the number of counts in a region, as opposed to a standard deviation which is high and largely independent of the number of counts in FBP. A preliminary ROC study carried out with 10 non-medical observers performing a relatively simple detectability task indicates that, for the majority of observers, lower standard deviation translates into a statistically significant detectability advantage in MLE reconstructions. The initial results of ongoing tests with four experienced neurologists/nuclear medicine physicians are presented. Normal cases of ¹⁸F-fluorodeoxyglucose (FDG) cerebral metabolism studies and abnormal cases in which a variety of lesions have been introduced into normal data sets have been evaluated. We report on the results of reading the reconstructions of 90 data sets, each corresponding to a single brain slice. It has become apparent that the design of the study based on reading single brain slices is too insensitive and we propose a variation based on reading three consecutive slices at a time, rating only the center slice. 9 refs., 2 figs., 1 tab.

  13. Receiver-operating characteristic curves for somatic cell scores and California mastitis test in Valle del Belice dairy sheep.

    Science.gov (United States)

    Riggio, Valentina; Pesce, Lorenzo L; Morreale, Salvatore; Portolano, Baldassare

    2013-06-01

Using receiver-operating characteristic (ROC) curve methodology, this study was designed to assess the diagnostic effectiveness of somatic cell count (SCC) and the California mastitis test (CMT) in Valle del Belice sheep, and to propose and evaluate threshold values for those tests that would optimally discriminate between healthy and infected udders. Milk samples (n = 1357) were collected from 684 sheep in four flocks. The prevalence of infection, as determined by positive bacterial culture, was 0.36; 87.7% of isolates were minor and 12.3% major pathogens. Of the culture-negative samples, 83.7% had an SCC below the cut-off. When CMT results were evaluated, the estimated area under the ROC curve was greater for glands infected with major compared to minor pathogens (0.88 vs. 0.73), whereas the area under the curve considering all pathogens was similar to the one for minor pathogens (0.75). The estimated optimal thresholds were 3.00 (CMT), 2.81 (somatic cell score, SCS, for the whole sample), 2.81 (SCS for minor pathogens), and 3.33 (SCS for major pathogens). These correctly classified, respectively, 69.0%, 73.5%, 72.6% and 91.0% of infected udders in the samples. The CMT appeared only to discriminate udders infected with major pathogens. In this population, SCS appeared to be the best indirect test of the bacteriological status of the udder. PMID:23317658
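The "estimated optimal thresholds" reported in studies like this one are commonly chosen by maximising Youden's J statistic along the ROC curve. A generic, hedged sketch of that selection rule (the variable names and toy data are hypothetical, not the study's):

```python
import numpy as np

def youden_threshold(scores, labels):
    """Return the cut-off maximising Youden's J = sensitivity + specificity - 1.

    A generic stand-in for ROC-based threshold optimisation; here `scores`
    would be somatic cell scores and `labels` the bacteriological status
    (1 = infected), both purely illustrative.
    """
    best_t, best_j = None, -np.inf
    for t in np.unique(scores):
        pred = scores >= t
        sens = np.mean(pred[labels == 1])
        spec = np.mean(~pred[labels == 0])
        j = sens + spec - 1.0
        if j > best_j:
            best_t, best_j = float(t), float(j)
    return best_t, best_j
```

A threshold picked this way weights false positives and false negatives equally; a different trade-off (e.g. favouring sensitivity for a screening test) would use a weighted criterion instead.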

  14. Introduction to Bayesian statistics

    CERN Document Server

    Bolstad, William M

    2016-01-01

There is a strong upsurge in the use of Bayesian methods in applied statistical analysis, yet most introductory statistics texts only present frequentist methods. Bayesian statistics has many important advantages that students should learn about if they are going into fields where statistics will be used. In this Third Edition, four newly-added chapters address topics that reflect the rapid advances in the field of Bayesian statistics. The author continues to provide a Bayesian treatment of introductory statistical topics, such as scientific data gathering, discrete random variables, robust Bayesian methods, and Bayesian approaches to inference for discrete random variables, binomial proportion, Poisson, normal mean, and simple linear regression. In addition, newly-developing topics in the field are presented in four new chapters: Bayesian inference with unknown mean and variance; Bayesian inference for Multivariate Normal mean vector; Bayesian inference for Multiple Linear Regression Model; and Computati...

  15. Bayesian Agglomerative Clustering with Coalescents

    OpenAIRE

    Teh, Yee Whye; Daumé III, Hal; Roy, Daniel

    2009-01-01

    We introduce a new Bayesian model for hierarchical clustering based on a prior over trees called Kingman's coalescent. We develop novel greedy and sequential Monte Carlo inferences which operate in a bottom-up agglomerative fashion. We show experimentally the superiority of our algorithms over others, and demonstrate our approach in document clustering and phylolinguistics.

  16. Carcinoembryonic antigen (CEA) level, CEA ratio, and treatment outcome of rectal cancer patients receiving pre-operative chemoradiation and surgery

    International Nuclear Information System (INIS)

    To investigate serum carcinoembryonic antigen (CEA) as a prognostic factor for rectal cancer patients receiving pre-operative chemoradiotherapy (CRT). Between 2000 and 2009, 138 patients with advanced rectal cancer receiving CRT before surgery at our hospital were retrospectively classified into 3 groups: pre-CRT CEA <6 ng/ml (group L; n = 87); pre-CRT CEA ≥ 6 ng/ml and post-CRT CEA <6 ng/ml (group H-L; n = 32); and both pre- and post-CRT CEA ≥ 6 ng/ml (group H-H; n = 19). CEA ratio (defined as post-CRT CEA divided by pre-CRT CEA), post-CRT CEA level and other factors were reviewed for prediction of pathologic complete response (pCR). Five-year disease-free survival (DFS) was better in groups L (69.0%) and H-L (74.5%) than in group H-H (44.9%) (p = 0.024). Pathologic complete response was observed in 19.5%, 21.9% and 5.3% of groups L, H-L and H-H respectively (p = 0.281). Multivariate analysis showed that ypN stage and pCR were independent prognostic factors for DFS and that post-CRT CEA level was independently predictive of pCR. As a whole, post-CRT CEA <2.61 ng/ml predicted pCR (sensitivity 76.0%; specificity 58.4%). For those with pre-CRT CEA ≥6 ng/ml, post-CRT CEA and CEA ratio both predicted pCR (sensitivity 87.5%, specificity 76.7%). In patients with pre-CRT serum CEA ≥6 ng/ml, those with “normalized” CEA levels after CRT may have similar DFS to those with “normal” (<6 ng/ml) pre-CRT values. Post-CRT CEA level is a predictor for pCR, especially in those with pre-CRT CEA ≥6 ng/ml

  17. Bayesian artificial intelligence

    CERN Document Server

    Korb, Kevin B

    2003-01-01

As the power of Bayesian techniques has become more fully realized, the field of artificial intelligence has embraced Bayesian methodology and integrated it to the point where an introduction to Bayesian techniques is now a core course in many computer science programs. Unlike other books on the subject, Bayesian Artificial Intelligence keeps mathematical detail to a minimum and covers a broad range of topics. The authors integrate Bayesian network technology with Bayesian network learning and apply both to knowledge engineering. They emphasize understanding and intuition but also provide the algorithms and technical background needed for applications. Software, exercises, and solutions are available on the authors' website.

  18. Bayesian artificial intelligence

    CERN Document Server

    Korb, Kevin B

    2010-01-01

Updated and expanded, Bayesian Artificial Intelligence, Second Edition provides a practical and accessible introduction to the main concepts, foundation, and applications of Bayesian networks. It focuses on both the causal discovery of networks and Bayesian inference procedures. Adopting a causal interpretation of Bayesian networks, the authors discuss the use of Bayesian networks for causal modeling. They also draw on their own applied research to illustrate various applications of the technology. New to the Second Edition: a new chapter on Bayesian network classifiers and a new section on object-oriented...

  19. Bayesian Methods and Universal Darwinism

    CERN Document Server

    Campbell, John

    2010-01-01

    Bayesian methods since the time of Laplace have been understood by their practitioners as closely aligned to the scientific method. Indeed a recent champion of Bayesian methods, E. T. Jaynes, titled his textbook on the subject Probability Theory: the Logic of Science. Many philosophers of science including Karl Popper and Donald Campbell have interpreted the evolution of Science as a Darwinian process consisting of a 'copy with selective retention' algorithm abstracted from Darwin's theory of Natural Selection. Arguments are presented for an isomorphism between Bayesian Methods and Darwinian processes. Universal Darwinism, as the term has been developed by Richard Dawkins, Daniel Dennett and Susan Blackmore, is the collection of scientific theories which explain the creation and evolution of their subject matter as due to the operation of Darwinian processes. These subject matters span the fields of atomic physics, chemistry, biology and the social sciences. The principle of Maximum Entropy states that system...

  20. Summary receiver operating characteristics (SROC) and hierarchical SROC models for analysis of diagnostic test evaluations of antibody ELISAs for paratuberculosis.

    Science.gov (United States)

    Toft, Nils; Nielsen, Søren S

    2009-11-15

Critical, systematic reviews of available diagnostic test evaluations are a meticulous approach to synthesize evidence about a diagnostic test. However, often the review finds that data quality is poor due to deficiencies in design and reporting of the test evaluations and formal statistical comparisons are discouraged. Even when only simple summary measures are appropriate, the strong correlation between sensitivity and specificity and their dependence on differences in diagnostic threshold across studies, creates the need for tools to summarise properties of the diagnostic test under investigation. This study presents summary receiver operating characteristics (SROC) analysis as a means to synthesize information from diagnostic test evaluation studies. Using data from a review of diagnostic tests for ante mortem diagnosis of paratuberculosis as an illustration, SROC and hierarchical SROC (HSROC) analysis were used to estimate overall diagnostic accuracies of antibody ELISAs for bovine paratuberculosis while accounting for covariates: the target condition (infectious or infected) used in the test evaluation (one for the evaluation of Se and one for Sp); and the type of test (serum vs. milk). The methods gave comparable results (regarding the estimated diagnostic log odds ratio), considering the small sample size and the quality of data. The SROC analysis found a difference in the performance of tests when the target condition for evaluation of Se was infected rather than infectious, suggesting that ELISAs are not suitable for detecting infected cattle. However, the SROC model does not take differences in sample size between study units into account, whereas the HSROC allows for both between and within study variation. Considering the small sample size, more credibility should be given to the results of the HSROC. For both methods the area under the (H)SROC curve was calculated and results were comparable. The conclusion is that while the SROC is simpler and easier ...
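The classic (non-hierarchical) SROC model referred to above is the Moses-Littenberg regression: each study contributes a point (S, D), where D is the log diagnostic odds ratio and S is a threshold proxy, and a line D = a + b·S is fitted across studies. A hedged sketch under these standard definitions (toy data, not the paratuberculosis review's):

```python
import numpy as np

def logit(p):
    return np.log(p / (1.0 - p))

def sroc_fit(tpr, fpr):
    """Moses-Littenberg summary ROC: fit D = a + b*S across studies,
    where D = logit(TPR) - logit(FPR) is the log diagnostic odds ratio
    and S = logit(TPR) + logit(FPR) is a proxy for the threshold."""
    D = logit(tpr) - logit(fpr)
    S = logit(tpr) + logit(fpr)
    b, a = np.polyfit(S, D, 1)  # ordinary least squares, degree 1
    return a, b

def sroc_curve(a, b, fpr):
    """Expected TPR of the summary curve at each FPR, obtained by
    solving logit(TPR) - logit(FPR) = a + b*(logit(TPR) + logit(FPR))."""
    lf = logit(fpr)
    lt = (a + (1.0 + b) * lf) / (1.0 - b)
    return 1.0 / (1.0 + np.exp(-lt))
```

The weakness noted in the abstract is visible here: each study is one unweighted point, so study size and within-study variability never enter the fit, which is exactly what the HSROC model adds.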

  1. Comparison of the performance of two measures of central adiposity among apparently healthy Nigerians using the receiver operating characteristic analysis

    Directory of Open Access Journals (Sweden)

    Christian Ifedili Okafor

    2011-01-01

Objective: To compare the performance of waist circumference (WC) and waist-to-hip ratio (WHR) in predicting the presence of cardiovascular risk factors (hypertension and generalized obesity) in an apparently healthy population. Materials and Methods: We recruited 898 apparently healthy subjects (318 males and 580 females) of the Igbo ethnic group resident in Enugu (urban, Southeast Nigeria). Data collection was done using the World Health Organization Stepwise approach to Surveillance of risk factors (STEPS) instrument. Subjects had their weight, height, waist and hip circumferences, systolic and diastolic blood pressures measured according to the guidelines in step 2 of the STEPS instrument. Generalized obesity and hypertension were defined using body mass index (BMI) and JNC 7 classifications, respectively. Quantitative and qualitative variables were analyzed using t-test and Chi-square analysis, respectively, while the performance of WC and WHR was compared using receiver operating characteristic (ROC) analysis. The P value was set at <0.05. Results: The mean age of the subjects was 48.7 (12.9) years. Central obesity was found in 76.9% and 66.5% of subjects using WHR and WC, respectively. WC had a significantly higher area under the curve (AUC) than WHR in all the cardiovascular risk groups, namely, generalized obesity (AUC = 0.88 vs. 0.62), hypertension alone (AUC = 0.60 vs. 0.53), and both generalized obesity and hypertension (AUC = 0.86 vs. 0.57). Conclusion: WC performed better than WHR in predicting the presence of cardiovascular risk factors. Being a simple index, it can easily be measured in routine clinic settings without the need for calculations or use of cumbersome techniques.

  2. Bayesian data analysis

    CERN Document Server

    Gelman, Andrew; Stern, Hal S; Dunson, David B; Vehtari, Aki; Rubin, Donald B

    2013-01-01

FUNDAMENTALS OF BAYESIAN INFERENCE: Probability and Inference; Single-Parameter Models; Introduction to Multiparameter Models; Asymptotics and Connections to Non-Bayesian Approaches; Hierarchical Models. FUNDAMENTALS OF BAYESIAN DATA ANALYSIS: Model Checking; Evaluating, Comparing, and Expanding Models; Modeling Accounting for Data Collection; Decision Analysis. ADVANCED COMPUTATION: Introduction to Bayesian Computation; Basics of Markov Chain Simulation; Computationally Efficient Markov Chain Simulation; Modal and Distributional Approximations. REGRESSION MODELS: Introduction to Regression Models; Hierarchical Linear...

  3. Bayesian Mediation Analysis

    OpenAIRE

    Yuan, Ying; MacKinnon, David P.

    2009-01-01

    This article proposes Bayesian analysis of mediation effects. Compared to conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian mediation analysis, inference is straightforward and exact, which makes it appealing for studies with small samples. Third, the Bayesian approach is conceptua...

  4. Bayesian Games with Intentions

    OpenAIRE

    Bjorndahl, Adam; Halpern, Joseph Y.; Pass, Rafael

    2016-01-01

    We show that standard Bayesian games cannot represent the full spectrum of belief-dependent preferences. However, by introducing a fundamental distinction between intended and actual strategies, we remove this limitation. We define Bayesian games with intentions, generalizing both Bayesian games and psychological games, and prove that Nash equilibria in psychological games correspond to a special class of equilibria as defined in our setting.

  5. Machine learning-based receiver operating characteristic (ROC) curves for crisp and fuzzy classification of DNA microarrays in cancer research.

    Science.gov (United States)

    Peterson, Leif E; Coleman, Matthew A

    2008-01-01

Receiver operating characteristic (ROC) curves were generated to obtain classification area under the curve (AUC) as a function of feature standardization, fuzzification, and sample size from nine large sets of cancer-related DNA microarrays. Classifiers used included k nearest neighbor (kNN), naïve Bayes classifier (NBC), linear discriminant analysis (LDA), quadratic discriminant analysis (QDA), learning vector quantization (LVQ1), logistic regression (LOG), polytomous logistic regression (PLOG), artificial neural networks (ANN), particle swarm optimization (PSO), constricted particle swarm optimization (CPSO), kernel regression (RBF), radial basis function networks (RBFN), gradient descent support vector machines (SVMGD), and least squares support vector machines (SVMLS). For each data set, AUC was determined for a number of combinations of sample size, total sum[-log(p)] of feature t-tests, with and without feature standardization and with (fuzzy) and without (crisp) fuzzification of features. Altogether, a total of 2,123,530 classification runs were made. At the greatest level of sample size, ANN resulted in a fitted AUC of 90%, while PSO resulted in the lowest fitted AUC of 72.1%. AUC values derived from 4NN were the most dependent on sample size, while PSO was the least. ANN depended the most on total statistical significance of features used based on sum[-log(p)], whereas PSO was the least dependent. Standardization of features increased AUC by 8.1% for PSO and -0.2% for QDA, while fuzzification increased AUC by 9.4% for PSO and reduced AUC by 3.8% for QDA. AUC determination in planned microarray experiments without standardization and fuzzification of features will benefit the most if CPSO is used for lower levels of feature significance (i.e., sum[-log(p)] ~ 50) and ANN is used for greater levels of significance (i.e., sum[-log(p)] ~ 500). When only standardization of features is performed, studies are likely to benefit most by using CPSO for low levels

  6. Bayesian Graphical Models

    DEFF Research Database (Denmark)

    Jensen, Finn Verner; Nielsen, Thomas Dyhre

    2016-01-01

Mathematically, a Bayesian graphical model is a compact representation of the joint probability distribution for a set of variables. The most frequently used type of Bayesian graphical model is the Bayesian network. The structural part of a Bayesian graphical model is a graph consisting of nodes and edges. The nodes represent variables, which may be either discrete or continuous. An edge between two nodes A and B indicates a direct influence between the state of A and the state of B, which in some domains can also be interpreted as a causal relation. The wide-spread use of Bayesian networks is largely due to the availability of efficient inference algorithms for answering probabilistic queries about the states of the variables in the network. Furthermore, to support the construction of Bayesian network models, learning algorithms are also available. We give an overview of the Bayesian network...
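The compactness described above comes from factoring the joint distribution along the edges: P(all variables) is the product of each node's probability given its parents, and any query can then be answered by summing out the remaining variables. A minimal sketch on a toy two-parent network (all structure and numbers are invented for illustration):

```python
from itertools import product

# Toy network: Rain -> WetGrass <- Sprinkler (all probabilities invented).
P_rain = {True: 0.2, False: 0.8}
P_sprinkler = {True: 0.1, False: 0.9}
P_wet_given = {  # P(WetGrass=True | Rain, Sprinkler)
    (True, True): 0.99, (True, False): 0.90,
    (False, True): 0.80, (False, False): 0.05,
}

def joint(rain, sprinkler, wet):
    """Chain-rule factorisation encoded by the graph:
    P(R, S, W) = P(R) * P(S) * P(W | R, S)."""
    pw = P_wet_given[(rain, sprinkler)]
    return P_rain[rain] * P_sprinkler[sprinkler] * (pw if wet else 1.0 - pw)

def posterior_rain_given_wet():
    """Answer P(Rain=True | WetGrass=True) by summing out Sprinkler."""
    num = sum(joint(True, s, True) for s in (True, False))
    den = sum(joint(r, s, True) for r, s in product((True, False), repeat=2))
    return num / den
```

Enumeration like this is exponential in the number of variables; the "efficient inference algorithms" the abstract mentions (e.g. junction-tree propagation) exploit the graph structure to avoid that blow-up.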

  7. Learning Bayesian networks for discrete data

    KAUST Repository

    Liang, Faming

    2009-02-01

    Bayesian networks have received much attention in the recent literature. In this article, we propose an approach to learn Bayesian networks using the stochastic approximation Monte Carlo (SAMC) algorithm. Our approach has two nice features. Firstly, it possesses the self-adjusting mechanism and thus avoids essentially the local-trap problem suffered by conventional MCMC simulation-based approaches in learning Bayesian networks. Secondly, it falls into the class of dynamic importance sampling algorithms; the network features can be inferred by dynamically weighted averaging the samples generated in the learning process, and the resulting estimates can have much lower variation than the single model-based estimates. The numerical results indicate that our approach can mix much faster over the space of Bayesian networks than the conventional MCMC simulation-based approaches. © 2008 Elsevier B.V. All rights reserved.

  8. Space Shuttle RTOS Bayesian Network

    Science.gov (United States)

    Morris, A. Terry; Beling, Peter A.

    2001-01-01

With shrinking budgets and the requirements to increase reliability and operational life of the existing orbiter fleet, NASA has proposed various upgrades for the Space Shuttle that are consistent with national space policy. The cockpit avionics upgrade (CAU), a high priority item, has been selected as the next major upgrade. The primary functions of cockpit avionics include flight control, guidance and navigation, communication, and orbiter landing support. Secondary functions include the provision of operational services for non-avionics systems such as data handling for the payloads and caution and warning alerts to the crew. Recently, a process to select the optimal commercial-off-the-shelf (COTS) real-time operating system (RTOS) for the CAU was conducted by United Space Alliance (USA) Corporation, which is a joint venture between Boeing and Lockheed Martin, the prime contractor for space shuttle operations. In order to independently assess the RTOS selection, NASA has used the Bayesian network-based scoring methodology described in this paper. Our two-stage methodology addresses the issue of RTOS acceptability by incorporating functional, performance and non-functional software measures related to reliability, interoperability, certifiability, efficiency, correctness, business, legal, product history, cost and life cycle. The first stage of the methodology involves obtaining scores for the various measures using a Bayesian network. The Bayesian network incorporates the causal relationships between the various and often competing measures of interest while also assisting the inherently complex decision analysis process with its ability to reason under uncertainty. The structure and the prior probabilities for the network are extracted from experts in the field of real-time operating systems. Scores for the various measures are computed using Bayesian probability. In the second stage, multi-criteria trade-off analyses are performed between the scores ...

  9. Dynamic Bayesian diffusion estimation

    CERN Document Server

    Dedecius, K

    2012-01-01

    The rapidly increasing complexity of (mainly wireless) ad-hoc networks stresses the need of reliable distributed estimation of several variables of interest. The widely used centralized approach, in which the network nodes communicate their data with a single specialized point, suffers from high communication overheads and represents a potentially dangerous concept with a single point of failure needing special treatment. This paper's aim is to contribute to another quite recent method called diffusion estimation. By decentralizing the operating environment, the network nodes communicate just within a close neighbourhood. We adopt the Bayesian framework to modelling and estimation, which, unlike the traditional approaches, abstracts from a particular model case. This leads to a very scalable and universal method, applicable to a wide class of different models. A particularly interesting case - the Gaussian regressive model - is derived as an example.
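The Gaussian case the abstract singles out can be sketched in an "adapt-then-combine" style: each node performs a local conjugate Bayesian update of the unknown mean from its own noisy measurement, then averages estimates with its neighbours. This is a hedged illustration of the diffusion idea under standard Gaussian assumptions, not the paper's exact derivation; the topology and all numbers are made up.

```python
import numpy as np

# Hypothetical 3-node line network estimating a common mean in Gaussian noise.
rng = np.random.default_rng(0)
true_mean, noise_var = 5.0, 1.0
neighbours = {0: [0, 1], 1: [0, 1, 2], 2: [1, 2]}

mu = np.zeros(3)          # prior means, one per node
var = np.full(3, 100.0)   # vague prior variances

for _ in range(50):
    # Adapt: conjugate Gaussian update with one local measurement per node.
    y = true_mean + rng.normal(0.0, np.sqrt(noise_var), size=3)
    post_var = 1.0 / (1.0 / var + 1.0 / noise_var)
    mu = post_var * (mu / var + y / noise_var)
    var = post_var
    # Combine: precision-weighted average over each node's neighbourhood,
    # so information diffuses without any central fusion point.
    mu = np.array([
        sum(mu[j] / var[j] for j in nbrs) / sum(1.0 / var[j] for j in nbrs)
        for nbrs in (neighbours[i] for i in range(3))
    ])
```

After a few dozen steps every node's estimate sits close to the true mean, even though no node ever communicated beyond its immediate neighbourhood.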

  10. Bayesian Data-Model Fit Assessment for Structural Equation Modeling

    Science.gov (United States)

    Levy, Roy

    2011-01-01

    Bayesian approaches to modeling are receiving an increasing amount of attention in the areas of model construction and estimation in factor analysis, structural equation modeling (SEM), and related latent variable models. However, model diagnostics and model criticism remain relatively understudied aspects of Bayesian SEM. This article describes…

  11. New operating strategies for molten salt in line focusing solar fields - Daily drainage and solar receiver preheating

    Science.gov (United States)

    Eickhoff, Martin; Meyer-Grünefeldt, Mirko; Keller, Lothar

    2016-05-01

    Nowadays molten salt is efficiently used in point concentrating solar thermal power plants. Line focusing systems still have the disadvantage of elevated heat losses at night because of active freeze protection of the solar field piping system. In order to achieve an efficient operation of line focusing solar power plants using molten salt, a new plant design and a novel operating strategy is developed for Linear Fresnel- and Parabolic Trough power plants. Daily vespertine drainage of the solar field piping and daily matutinal refilling of the solar preheated absorber tubes eliminate the need of nocturnal heating of the solar field and reduce nocturnal heat losses to a minimum. The feasibility of this new operating strategy with all its sub-steps has been demonstrated experimentally.

  12. Understanding Computational Bayesian Statistics

    CERN Document Server

    Bolstad, William M

    2011-01-01

    A hands-on introduction to computational statistics from a Bayesian point of view Providing a solid grounding in statistics while uniquely covering the topics from a Bayesian perspective, Understanding Computational Bayesian Statistics successfully guides readers through this new, cutting-edge approach. With its hands-on treatment of the topic, the book shows how samples can be drawn from the posterior distribution when the formula giving its shape is all that is known, and how Bayesian inferences can be based on these samples from the posterior. These ideas are illustrated on common statistic

  13. Bayesian statistics an introduction

    CERN Document Server

    Lee, Peter M

    2012-01-01

Bayesian Statistics is the school of thought that combines prior beliefs with the likelihood of a hypothesis to arrive at posterior beliefs. The first edition of Peter Lee’s book appeared in 1989, but the subject has moved ever onwards, with increasing emphasis on Monte Carlo based techniques. This new fourth edition looks at recent techniques such as variational methods, Bayesian importance sampling, approximate Bayesian computation and Reversible Jump Markov Chain Monte Carlo (RJMCMC), providing a concise account of the way in which the Bayesian approach to statistics develops as well ...

  14. Receiver operating characteristic (ROC) curve analysis of the tumour markers CEA, CA 50 and CA 242 in pancreatic cancer; results from a prospective study.

    OpenAIRE

    Pasanen, P. A.; Eskelinen, M.; Partanen, K.; Pikkarainen, P; Penttilä, I.; Alhava, E

    1993-01-01

    The serum values of the tumour markers carcinoembryonic antigen (CEA), cancer-associated carboanhydrate antigens CA 50 and CA 242 were evaluated in 193 patients with hepatopancreato-biliary diseases by receiver operating characteristic (ROC) curve analysis in order to compare their diagnostic accuracy in pancreatic cancer (n = 26), and to define optimal cut-off levels for the serum values of these tumour markers in the diagnosis of pancreatic cancer. The ROC analysis showed that all marker te...

  15. From humans to rats and back again: Bridging the divide between human and animal studies of recognition memory with receiver operating characteristics

    OpenAIRE

    Koen, Joshua D.; Yonelinas, Andrew P.

    2011-01-01

    Receiver operating characteristics (ROCs) have been used extensively to study the processes underlying human recognition memory, and this method has recently been applied in studies of rats. However, the extent to which the results from human and animal studies converge is neither entirely clear, nor is it known how the different methods used to obtain ROCs in different species impact the results. A recent study used a response bias ROC manipulation with rats and demonstrated that speeding me...

  16. On Fuzzy Bayesian Inference

    OpenAIRE

    Frühwirth-Schnatter, Sylvia

    1990-01-01

In the paper at hand we apply fuzzy set theory to Bayesian statistics to obtain "Fuzzy Bayesian Inference". In the subsequent sections we will discuss a fuzzy-valued likelihood function, Bayes' theorem for both fuzzy data and fuzzy priors, a fuzzy Bayes' estimator, fuzzy predictive densities and distributions, and fuzzy H.P.D. regions. (author's abstract)

  17. Bayesian Mediation Analysis

    Science.gov (United States)

    Yuan, Ying; MacKinnon, David P.

    2009-01-01

    In this article, we propose Bayesian analysis of mediation effects. Compared with conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian…

  18. Bayesian Probability Theory

    Science.gov (United States)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  19. Coupled optical/thermal/fluid analysis and design requirements for operation and testing of a supercritical CO2 solar receiver.

    Energy Technology Data Exchange (ETDEWEB)

Khivsara, Sagar [Indian Institute of Science, Bangalore (India)]

    2015-01-01

Recent studies have evaluated closed-loop supercritical carbon dioxide (s-CO2) Brayton cycles to be a higher energy-density system in comparison to conventional superheated steam Rankine systems. At turbine inlet conditions of 923 K and 25 MPa, high thermal efficiency (~50%) can be achieved. Achieving these high efficiencies will make concentrating solar power (CSP) technologies a competitive alternative to current power generation methods. To incorporate an s-CO2 Brayton power cycle in a solar power tower system, the development of a solar receiver capable of providing an outlet temperature of 923 K (at 25 MPa) is necessary. To satisfy the temperature requirements of an s-CO2 Brayton cycle with recuperation and recompression, it is required to heat the s-CO2 by ~200 K as it passes through the solar receiver. Our objective was to develop an optical-thermal-fluid model to design and evaluate a tubular receiver that will receive a heat input of ~1 MWth from a heliostat field. We also undertook the documentation of design requirements for the development, testing and safe operation of a direct s-CO2 solar receiver. The main purpose of this document is to serve as a reference and guideline for design and testing requirements, as well as to address the technical challenges and provide initial parameters for the computational models that will be employed for the development of s-CO2 receivers.

  20. Comparison of a Bayesian Network with a Logistic Regression Model to Forecast IgA Nephropathy

    Directory of Open Access Journals (Sweden)

    Michel Ducher

    2013-01-01

Models are increasingly used in clinical practice to improve the accuracy of diagnosis. The aim of our work was to compare a Bayesian network to logistic regression to forecast IgA nephropathy (IgAN) from simple clinical and biological criteria. Retrospectively, we pooled the results of all biopsies (n = 155) performed by nephrologists in a specialist clinical facility between 2002 and 2009. Two groups were constituted at random. The first subgroup was used to determine the parameters of the models adjusted to data by logistic regression or Bayesian network, and the second was used to compare the performances of the models using receiver operating characteristics (ROC) curves. IgAN was found (on pathology) in 44 patients. Areas under the ROC curves provided by both methods were highly significant but not different from each other. Based on the highest Youden indices, sensitivity reached (100% versus 67%) and specificity (73% versus 95%) using the Bayesian network and logistic regression, respectively. A Bayesian network is at least as efficient as logistic regression to estimate the probability of a patient suffering IgAN, using simple clinical and biological data obtained during consultation.

  1. Comparison of a Bayesian network with a logistic regression model to forecast IgA nephropathy.

    Science.gov (United States)

    Ducher, Michel; Kalbacher, Emilie; Combarnous, François; Finaz de Vilaine, Jérome; McGregor, Brigitte; Fouque, Denis; Fauvel, Jean Pierre

    2013-01-01

    Models are increasingly used in clinical practice to improve the accuracy of diagnosis. The aim of our work was to compare a Bayesian network to logistic regression to forecast IgA nephropathy (IgAN) from simple clinical and biological criteria. Retrospectively, we pooled the results of all biopsies (n = 155) performed by nephrologists in a specialist clinical facility between 2002 and 2009. Two groups were constituted at random. The first subgroup was used to determine the parameters of the models adjusted to data by logistic regression or Bayesian network, and the second was used to compare the performances of the models using receiver operating characteristics (ROC) curves. IgAN was found (on pathology) in 44 patients. Areas under the ROC curves provided by both methods were highly significant but not different from each other. Based on the highest Youden indices, sensitivity reached (100% versus 67%) and specificity (73% versus 95%) using the Bayesian network and logistic regression, respectively. A Bayesian network is at least as efficient as logistic regression to estimate the probability of a patient suffering IgAN, using simple clinical and biological data obtained during consultation.
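    The validation step this study describes (ROC curve on a held-out split, cutoff chosen by the highest Youden index) can be sketched on synthetic data; the dataset below is generated, not the study's cohort, and the class balance merely mimics the reported 44/155 positives:

```python
# Sketch of ROC-based model comparison with a Youden-index cutoff,
# on synthetic data (NOT the study's cohort).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score, roc_curve
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=155, n_features=5, weights=[0.72],
                           random_state=0)   # ~28% positives, like 44/155
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)

model = LogisticRegression().fit(X_tr, y_tr)
scores = model.predict_proba(X_te)[:, 1]

fpr, tpr, thresholds = roc_curve(y_te, scores)
youden = tpr - fpr                    # Youden index J = Se + Sp - 1 at each cutoff
best = np.argmax(youden)
print(f"AUC = {roc_auc_score(y_te, scores):.2f}")
print(f"best cutoff = {thresholds[best]:.2f}, "
      f"sensitivity = {tpr[best]:.2f}, specificity = {1 - fpr[best]:.2f}")
```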

  2. Practical Bayesian Tomography

    CERN Document Server

    Granade, Christopher; Cory, D G

    2015-01-01

    In recent years, Bayesian methods have been proposed as a solution to a wide range of issues in quantum state and process tomography. State-of-the-art Bayesian tomography solutions suffer from three problems: numerical intractability, a lack of informative prior distributions, and an inability to track time-dependent processes. Here, we solve all three problems. First, we use modern statistical methods, as pioneered by Huszár and Houlsby and by Ferrie, to make Bayesian tomography numerically tractable. Our approach allows for practical computation of Bayesian point and region estimators for quantum states and channels. Second, we propose the first informative priors on quantum states and channels. Finally, we develop a method that allows online tracking of time-dependent states and estimates the drift and diffusion processes affecting a state. We provide source code and animated visual examples for our methods.

  3. Noncausal Bayesian Vector Autoregression

    DEFF Research Database (Denmark)

    Lanne, Markku; Luoto, Jani

    We propose a Bayesian inferential procedure for the noncausal vector autoregressive (VAR) model that is capable of capturing nonlinearities and incorporating effects of missing variables. In particular, we devise a fast and reliable posterior simulator that yields the predictive distribution...

  4. Bayesian Lensing Shear Measurement

    CERN Document Server

    Bernstein, Gary M

    2013-01-01

    We derive an estimator of weak gravitational lensing shear from background galaxy images that avoids noise-induced biases through a rigorous Bayesian treatment of the measurement. The Bayesian formalism requires a prior describing the (noiseless) distribution of the target galaxy population over some parameter space; this prior can be constructed from low-noise images of a subsample of the target population, attainable from long integrations of a fraction of the survey field. We find two ways to combine this exact treatment of noise with rigorous treatment of the effects of the instrumental point-spread function and sampling. The Bayesian model fitting (BMF) method assigns a likelihood of the pixel data to galaxy models (e.g. Sersic ellipses), and requires the unlensed distribution of galaxies over the model parameters as a prior. The Bayesian Fourier domain (BFD) method compresses galaxies to a small set of weighted moments calculated after PSF correction in Fourier space. It requires the unlensed distributi...

  5. Impact of interference on the receiving systems of the Deep-Space Network (DSN) Earth stations operated by NASA due to adjacent band emissions from Earth exploration satellites operating in the 8025-

    Science.gov (United States)

    Wang, Charles C.; Sue, Miles K.; Manshadi, Farzin; Kinman, Peter

    2005-01-01

    This paper will first describe the characteristics of interference from a typical EESS satellite, including the intensity, frequency, and duration of such interference. The paper will then discuss the DSN interference susceptibility, including the various components in the receiving systems that are susceptible to interference and the recovery time after a strong interference. Finally, the paper will discuss the impact of interference on science data and mission operations.

  6. Malicious Bayesian Congestion Games

    CERN Document Server

    Gairing, Martin

    2008-01-01

    In this paper, we introduce malicious Bayesian congestion games as an extension to congestion games where players might act in a malicious way. In such a game each player has two types. Either the player is a rational player seeking to minimize her own delay, or - with a certain probability - the player is malicious, in which case her only goal is to disturb the other players as much as possible. We show that such games in general do not possess a Bayesian Nash equilibrium in pure strategies (i.e. a pure Bayesian Nash equilibrium). Moreover, given a game, we show that it is NP-complete to decide whether it admits a pure Bayesian Nash equilibrium. This result even holds when resource latency functions are linear, each player is malicious with the same probability, and all strategy sets consist of singleton sets. For a slightly more restricted class of malicious Bayesian congestion games, we provide easily checkable properties that are necessary and sufficient for the existence of a pure Bayesian Nash equilibrium....

  7. Bayesian Methods and Universal Darwinism

    Science.gov (United States)

    Campbell, John

    2009-12-01

    Bayesian methods since the time of Laplace have been understood by their practitioners as closely aligned to the scientific method. Indeed a recent champion of Bayesian methods, E. T. Jaynes, titled his textbook on the subject Probability Theory: the Logic of Science. Many philosophers of science including Karl Popper and Donald Campbell have interpreted the evolution of Science as a Darwinian process consisting of a `copy with selective retention' algorithm abstracted from Darwin's theory of Natural Selection. Arguments are presented for an isomorphism between Bayesian methods and Darwinian processes. Universal Darwinism, as the term has been developed by Richard Dawkins, Daniel Dennett and Susan Blackmore, is the collection of scientific theories which explain the creation and evolution of their subject matter as due to the operation of Darwinian processes. These subject matters span the fields of atomic physics, chemistry, biology and the social sciences. The principle of Maximum Entropy states that systems will evolve to states of highest entropy subject to the constraints of scientific law. This principle may be inverted to provide illumination as to the nature of scientific law. Our best cosmological theories suggest the universe contained much less complexity during the period shortly after the Big Bang than it does at present. The scientific subject matter of atomic physics, chemistry, biology and the social sciences has been created since that time. An explanation is proposed for the existence of this subject matter as due to the evolution of constraints in the form of adaptations imposed on Maximum Entropy. It is argued these adaptations were discovered and instantiated through the operations of a succession of Darwinian processes.

  8. Universal Darwinism As a Process of Bayesian Inference.

    Science.gov (United States)

    Campbell, John O

    2016-01-01

    Many of the mathematical frameworks describing natural selection are equivalent to Bayes' Theorem, also known as Bayesian updating. By definition, a process of Bayesian Inference is one which involves a Bayesian update, so we may conclude that these frameworks describe natural selection as a process of Bayesian inference. Thus, natural selection serves as a counter example to a widely-held interpretation that restricts Bayesian Inference to human mental processes (including the endeavors of statisticians). As Bayesian inference can always be cast in terms of (variational) free energy minimization, natural selection can be viewed as comprising two components: a generative model of an "experiment" in the external world environment, and the results of that "experiment" or the "surprise" entailed by predicted and actual outcomes of the "experiment." Minimization of free energy implies that the implicit measure of "surprise" experienced serves to update the generative model in a Bayesian manner. This description closely accords with the mechanisms of generalized Darwinian process proposed both by Dawkins, in terms of replicators and vehicles, and Campbell, in terms of inferential systems. Bayesian inference is an algorithm for the accumulation of evidence-based knowledge. This algorithm is now seen to operate over a wide range of evolutionary processes, including natural selection, the evolution of mental models and cultural evolutionary processes, notably including science itself. The variational principle of free energy minimization may thus serve as a unifying mathematical framework for universal Darwinism, the study of evolutionary processes operating throughout nature. PMID:27375438

  10. Universal Darwinism as a process of Bayesian inference

    Directory of Open Access Journals (Sweden)

    John Oberon Campbell

    2016-06-01

    Full Text Available Many of the mathematical frameworks describing natural selection are equivalent to Bayes' Theorem, also known as Bayesian updating. By definition, a process of Bayesian Inference is one which involves a Bayesian update, so we may conclude that these frameworks describe natural selection as a process of Bayesian inference. Thus, natural selection serves as a counter example to a widely-held interpretation that restricts Bayesian Inference to human mental processes (including the endeavors of statisticians). As Bayesian inference can always be cast in terms of (variational) free energy minimization, natural selection can be viewed as comprising two components: a generative model of an 'experiment' in the external world environment, and the results of that 'experiment' or the 'surprise' entailed by predicted and actual outcomes of the 'experiment'. Minimization of free energy implies that the implicit measure of 'surprise' experienced serves to update the generative model in a Bayesian manner. This description closely accords with the mechanisms of generalized Darwinian process proposed both by Dawkins, in terms of replicators and vehicles, and Campbell, in terms of inferential systems. Bayesian inference is an algorithm for the accumulation of evidence-based knowledge. This algorithm is now seen to operate over a wide range of evolutionary processes, including natural selection, the evolution of mental models and cultural evolutionary processes, notably including science itself. The variational principle of free energy minimization may thus serve as a unifying mathematical framework for universal Darwinism, the study of evolutionary processes operating throughout nature.
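    The central identity these records assert - that selection proportional to fitness is algebraically a Bayesian update, with fitness playing the role of likelihood - can be checked in a few lines; the frequencies and fitnesses below are illustrative:

```python
# Discrete replicator dynamics vs. Bayes' rule: the same expression, read
# two ways. Numbers are illustrative, not from the paper.
import numpy as np

p = np.array([0.5, 0.3, 0.2])   # variant frequencies ("prior")
w = np.array([1.0, 2.0, 4.0])   # relative fitnesses ("likelihood")

# Selection: p'_i = p_i * w_i / (mean fitness)
p_next = p * w / np.dot(p, w)

# Bayes: posterior_i = prior_i * likelihood_i / (normalizing constant)
posterior = p * w / np.sum(p * w)

assert np.allclose(p_next, posterior)
print(p_next)   # frequencies shift toward the fittest variant
```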

  11. Computationally efficient Bayesian inference for inverse problems.

    Energy Technology Data Exchange (ETDEWEB)

    Marzouk, Youssef M.; Najm, Habib N.; Rahn, Larry A.

    2007-10-01

    Bayesian statistics provides a foundation for inference from noisy and incomplete data, a natural mechanism for regularization in the form of prior information, and a quantitative assessment of uncertainty in the inferred results. Inverse problems - representing indirect estimation of model parameters, inputs, or structural components - can be fruitfully cast in this framework. Complex and computationally intensive forward models arising in physical applications, however, can render a Bayesian approach prohibitive. This difficulty is compounded by high-dimensional model spaces, as when the unknown is a spatiotemporal field. We present new algorithmic developments for Bayesian inference in this context, showing strong connections with the forward propagation of uncertainty. In particular, we introduce a stochastic spectral formulation that dramatically accelerates the Bayesian solution of inverse problems via rapid evaluation of a surrogate posterior. We also explore dimensionality reduction for the inference of spatiotemporal fields, using truncated spectral representations of Gaussian process priors. These new approaches are demonstrated on scalar transport problems arising in contaminant source inversion and in the inference of inhomogeneous material or transport properties. We also present a Bayesian framework for parameter estimation in stochastic models, where intrinsic stochasticity may be intermingled with observational noise. Evaluation of a likelihood function may not be analytically tractable in these cases, and thus several alternative Markov chain Monte Carlo (MCMC) schemes, operating on the product space of the observations and the parameters, are introduced.
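    The core idea of accelerating Bayesian inversion with a surrogate posterior can be sketched in miniature. The block below is not the paper's stochastic spectral method: it fits a simple polynomial surrogate to a stand-in "expensive" forward model and runs random-walk Metropolis against the surrogate posterior; all models and numbers are illustrative:

```python
# Surrogate-accelerated Bayesian inversion, in miniature: MCMC evaluates
# only a cheap polynomial fit of the forward model. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)

def forward(x):                        # stand-in "expensive" forward model
    return x**3 + x

# Surrogate from a handful of forward evaluations (here an exact cubic fit)
xs = np.linspace(-2.0, 2.0, 9)
coeffs = np.polyfit(xs, forward(xs), deg=3)

def surrogate(x):
    return np.polyval(coeffs, x)

x_true = 0.7
y_obs, sigma = forward(x_true) + 0.05, 0.1    # synthetic noisy observation

def log_post(x):                       # Gaussian likelihood, flat prior on [-2, 2]
    if not -2.0 <= x <= 2.0:
        return -np.inf
    return -0.5 * ((y_obs - surrogate(x)) / sigma) ** 2

# Random-walk Metropolis, touching only the surrogate
x, chain = 0.0, []
for _ in range(5000):
    prop = x + 0.3 * rng.standard_normal()
    if np.log(rng.uniform()) < log_post(prop) - log_post(x):
        x = prop
    chain.append(x)

est = np.mean(chain[1000:])
print(f"posterior mean ~ {est:.2f} (true value {x_true})")
```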

  12. Usefulness of pinhole collimator in differential diagnosis of metastatic disease and degenerative joint disease in the vertebrae; Evaluation by receiver operating characteristics (ROC) analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kosuda, Shigeru; Kawahara, Syunji; Ishibashi, Akihiko; Tamura, Kohei; Tsukatani, Yasushi; Fujii, Hiroshi (Okura National Hospital, Tokyo (Japan)); Kubo, Atsushi; Hashimoto, Shozo

    1989-11-01

    In order to evaluate the diagnostic efficacy of pinhole collimator (PHC) imaging combined with an X-ray for vertebral metastasis, our prospective study employed receiver operating characteristics (ROC) analysis in 21 patients, 11 with osseous metastasis and 15 with degenerative joint disease in the lumbar vertebrae. PHC imaging provided better anatomic information on the extent of 99mTc-MDP accumulation. PHC vertebral scintigraphy had a considerable impact on the decision-making process, although results varied and were less satisfactory among physicians with little experience. Our study suggests that PHC imaging and X-ray film are useful in differentiating between osseous metastasis and degenerative joint disease in the vertebra. (author).

  13. Receiver-operating characteristic curves and likelihood ratios: improvements over traditional methods for the evaluation and application of veterinary clinical pathology tests

    DEFF Research Database (Denmark)

    Gardner, Ian A.; Greiner, Matthias

    2006-01-01

    Receiver-operating characteristic (ROC) curves provide a cutoff-independent method for the evaluation of continuous or ordinal tests used in clinical pathology laboratories. The area under the curve is a useful overall measure of test accuracy and can be used to compare different tests (or different equipment) used by the same tester, as well as the accuracy of different diagnosticians that use the same test material. To date, ROC analysis has not been widely used in veterinary clinical pathology studies, although it should be considered a useful complement to estimates of sensitivity and specificity in test evaluation studies. In addition, calculation of likelihood ratios can potentially improve the clinical utility of such studies because likelihood ratios provide an indication of how the post-test probability changes as a function of the magnitude of the test results. For ordinal test...
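    The likelihood-ratio arithmetic this record describes is compact: LR+ = Se / (1 - Sp), and the post-test probability comes from multiplying pre-test odds by the LR. The sensitivity, specificity, and pre-test probability below are illustrative, not from the paper:

```python
# Post-test probability from a likelihood ratio; values are illustrative.

def post_test_probability(pretest_p, sensitivity, specificity, positive=True):
    """Convert a pre-test probability to a post-test probability via the LR."""
    lr = (sensitivity / (1 - specificity) if positive       # LR+ for a positive result
          else (1 - sensitivity) / specificity)             # LR- for a negative result
    pre_odds = pretest_p / (1 - pretest_p)
    post_odds = pre_odds * lr
    return post_odds / (1 + post_odds)

# e.g. a test with Se = 0.9, Sp = 0.8 applied at a 10% pre-test probability
p_pos = post_test_probability(0.10, 0.9, 0.8, positive=True)
print(f"post-test probability after a positive result: {p_pos:.2f}")  # -> 0.33
```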

  14. Diagnostic sensitivity of serum carcinoembryonic antigen, carbohydrate antigen 19-9, alpha-fetoprotein, and beta-human chorionic gonadotropin in esophageal carcinoma (receiver operating characteristic curve analysis)

    Directory of Open Access Journals (Sweden)

    Bhawna Bagaria

    2015-01-01

    Full Text Available Background: Esophageal carcinoma is a very lethal disease, relatively unresponsive to therapy. The continued development of new and more effective chemotherapeutic agents and regimens offers hope that, in the future, this carcinoma may be amenable to either more effective palliative treatment or possibly increased cure. We, therefore, aimed to evaluate the marker with the best diagnostic sensitivity in esophageal carcinoma. Materials and Methods: Serum carcinoembryonic antigen (CEA), carbohydrate antigen 19-9 (CA19-9), alpha-fetoprotein (AFP), and beta-human chorionic gonadotropin (β-HCG) levels were assessed in healthy subjects (n = 50) and patients (n = 50) initially diagnosed with esophageal carcinoma by endoscopic examination and biopsy, before receiving any therapy. The data were analyzed using SPSS software version 10.0 (SPSS Inc., USA) and MedCalc to estimate mean ± standard deviation and the significance of the observed differences (P value), to calculate sensitivity, and to plot receiver operating characteristic curves. Results: The sensitivity of CEA, CA19-9, AFP, and β-HCG in esophageal cancer was 38%, 18%, 10%, and 26%, respectively. Conclusion: Of the markers studied, CEA has the highest sensitivity, followed by β-HCG, CA19-9, and AFP. Although the sensitivity of tumor markers in esophageal cancer is low, they may be a useful additional parameter in the prediction of neoplasms at an early stage of tumor growth.

  15. Radio receivers

    Science.gov (United States)

    Bankov, V. N.; Barulin, L. G.; Zhodzishskii, M. I.; Malyshev, I. V.; Petrusinskii, V. V.

    The book is concerned with the design of microelectronic radio receivers and their components based on semiconductor and hybrid integrated circuits. Topics discussed include the hierarchical structure of radio receivers, the synthesis of structural schemes, the design of the principal functional units, and the design of radio receiver systems with digital signal processing. The discussion also covers the integrated circuits of multifunctional amplifiers, analog multipliers, charge-transfer devices, frequency filters, piezoelectronic devices, and microwave amplifiers, filters, and mixers.

  16. Bayesian Fundamentalism or Enlightenment? On the explanatory status and theoretical contributions of Bayesian models of cognition.

    Science.gov (United States)

    Jones, Matt; Love, Bradley C

    2011-08-01

    The prominence of Bayesian modeling of cognition has increased recently largely because of mathematical advances in specifying and deriving predictions from complex probabilistic models. Much of this research aims to demonstrate that cognitive behavior can be explained from rational principles alone, without recourse to psychological or neurological processes and representations. We note commonalities between this rational approach and other movements in psychology - namely, Behaviorism and evolutionary psychology - that set aside mechanistic explanations or make use of optimality assumptions. Through these comparisons, we identify a number of challenges that limit the rational program's potential contribution to psychological theory. Specifically, rational Bayesian models are significantly unconstrained, both because they are uninformed by a wide range of process-level data and because their assumptions about the environment are generally not grounded in empirical measurement. The psychological implications of most Bayesian models are also unclear. Bayesian inference itself is conceptually trivial, but strong assumptions are often embedded in the hypothesis sets and the approximation algorithms used to derive model predictions, without a clear delineation between psychological commitments and implementational details. Comparing multiple Bayesian models of the same task is rare, as is the realization that many Bayesian models recapitulate existing (mechanistic level) theories. Despite the expressive power of current Bayesian models, we argue they must be developed in conjunction with mechanistic considerations to offer substantive explanations of cognition. We lay out several means for such an integration, which take into account the representations on which Bayesian inference operates, as well as the algorithms and heuristics that carry it out. We argue this unification will better facilitate lasting contributions to psychological theory, avoiding the pitfalls

  17. Bayesian variable selection for detecting adaptive genomic differences among populations.

    Science.gov (United States)

    Riebler, Andrea; Held, Leonhard; Stephan, Wolfgang

    2008-03-01

    We extend an F(st)-based Bayesian hierarchical model, implemented via Markov chain Monte Carlo, for the detection of loci that might be subject to positive selection. This model divides the F(st)-influencing factors into locus-specific effects, population-specific effects, and effects that are specific for the locus in combination with the population. We introduce a Bayesian auxiliary variable for each locus effect to automatically select nonneutral locus effects. As a by-product, the efficiency of the original approach is improved by using a reparameterization of the model. The statistical power of the extended algorithm is assessed with simulated data sets from a Wright-Fisher model with migration. We find that the inclusion of model selection suggests a clear improvement in discrimination as measured by the area under the receiver operating characteristic (ROC) curve. Additionally, we illustrate and discuss the quality of the newly developed method on the basis of an allozyme data set of the fruit fly Drosophila melanogaster and a sequence data set of the wild tomato Solanum chilense. For data sets with small sample sizes, high mutation rates, and/or long sequences, however, methods based on nucleotide statistics should be preferred. PMID:18245358

  18. DPpackage: Bayesian Semi- and Nonparametric Modeling in R

    Directory of Open Access Journals (Sweden)

    Alejandro Jara

    2011-04-01

    Full Text Available Data analysis sometimes requires the relaxation of parametric assumptions in order to gain modeling flexibility and robustness against mis-specification of the probability model. In the Bayesian context, this is accomplished by placing a prior distribution on a function space, such as the space of all probability distributions or the space of all regression functions. Unfortunately, posterior distributions ranging over function spaces are highly complex and hence sampling methods play a key role. This paper provides an introduction to a simple, yet comprehensive, set of programs for the implementation of some Bayesian nonparametric and semiparametric models in R, DPpackage. Currently, DPpackage includes models for marginal and conditional density estimation, receiver operating characteristic curve analysis, interval-censored data, binary regression data, item response data, longitudinal and clustered data using generalized linear mixed models, and regression data using generalized additive models. The package also contains functions to compute pseudo-Bayes factors for model comparison and for eliciting the precision parameter of the Dirichlet process prior, and a general purpose Metropolis sampling algorithm. To maximize computational efficiency, the actual sampling for each model is carried out using compiled C, C++ or Fortran code.
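    DPpackage itself is R; as a language-neutral illustration of the Dirichlet process prior that underlies several of its density-estimation models, here is a stick-breaking draw of a random discrete distribution G ~ DP(alpha, N(0, 1)), truncated at a finite number of atoms:

```python
# Truncated stick-breaking construction of a Dirichlet process draw.
# This illustrates the DP prior conceptually; it is not DPpackage code.
import numpy as np

rng = np.random.default_rng(1)

def dp_stick_breaking(alpha, n_atoms=100):
    betas = rng.beta(1.0, alpha, size=n_atoms)            # stick-breaking fractions
    remaining = np.concatenate([[1.0], np.cumprod(1.0 - betas[:-1])])
    weights = betas * remaining                           # w_k = b_k * prod_{j<k}(1 - b_j)
    atoms = rng.standard_normal(n_atoms)                  # draws from the N(0,1) base measure
    return atoms, weights

atoms, weights = dp_stick_breaking(alpha=2.0)
print(f"weights sum to {weights.sum():.3f} (approaches 1 as n_atoms grows)")
```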

  19. Bayesian least squares deconvolution

    CERN Document Server

    Ramos, A Asensio

    2015-01-01

    Aims. To develop a fully Bayesian least squares deconvolution (LSD) that can be applied to the reliable detection of magnetic signals in noise-limited stellar spectropolarimetric observations using multiline techniques. Methods. We consider LSD under the Bayesian framework and we introduce a flexible Gaussian Process (GP) prior for the LSD profile. This prior allows the result to automatically adapt to the presence of signal. We exploit several linear algebra identities to accelerate the calculations. The final algorithm can deal with thousands of spectral lines in a few seconds. Results. We demonstrate the reliability of the method with synthetic experiments and we apply it to real spectropolarimetric observations of magnetic stars. We are able to recover the magnetic signals using a small number of spectral lines, together with the uncertainty at each velocity bin. This allows the user to consider if the detected signal is reliable. The code to compute the Bayesian LSD profile is freely available.

  20. Bayesian least squares deconvolution

    Science.gov (United States)

    Asensio Ramos, A.; Petit, P.

    2015-11-01

    Aims: We develop a fully Bayesian least squares deconvolution (LSD) that can be applied to the reliable detection of magnetic signals in noise-limited stellar spectropolarimetric observations using multiline techniques. Methods: We consider LSD under the Bayesian framework and we introduce a flexible Gaussian process (GP) prior for the LSD profile. This prior allows the result to automatically adapt to the presence of signal. We exploit several linear algebra identities to accelerate the calculations. The final algorithm can deal with thousands of spectral lines in a few seconds. Results: We demonstrate the reliability of the method with synthetic experiments and we apply it to real spectropolarimetric observations of magnetic stars. We are able to recover the magnetic signals using a small number of spectral lines, together with the uncertainty at each velocity bin. This allows the user to consider if the detected signal is reliable. The code to compute the Bayesian LSD profile is freely available.

  1. Hybrid Batch Bayesian Optimization

    CERN Document Server

    Azimi, Javad; Fern, Xiaoli

    2012-01-01

    Bayesian Optimization aims at optimizing an unknown non-convex/concave function that is costly to evaluate. We are interested in application scenarios where concurrent function evaluations are possible. Under such a setting, BO could choose to either sequentially evaluate the function, one input at a time and wait for the output of the function before making the next selection, or evaluate the function at a batch of multiple inputs at once. These two different settings are commonly referred to as the sequential and batch settings of Bayesian Optimization. In general, the sequential setting leads to better optimization performance as each function evaluation is selected with more information, whereas the batch setting has an advantage in terms of the total experimental time (the number of iterations). In this work, our goal is to combine the strength of both settings. Specifically, we systematically analyze Bayesian optimization using Gaussian process as the posterior estimator and provide a hybrid algorithm t...
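    The sequential setting described above can be sketched in miniature: a GP posterior over a 1-D objective and an expected-improvement rule for choosing each next evaluation. This is not the paper's hybrid algorithm; the objective, kernel, and budget are illustrative:

```python
# Minimal sequential Bayesian optimization: GP posterior + expected
# improvement on a 1-D toy objective. Illustrative only.
import numpy as np
from scipy.stats import norm

f = lambda x: -(x - 0.3) ** 2            # "unknown" objective (maximize)
grid = np.linspace(0.0, 1.0, 201)

def gp_posterior(X, y, Xs, ell=0.2, s2=1e-6):
    """Zero-mean GP regression with an RBF kernel; returns mean and variance."""
    k = lambda a, b: np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell**2)
    K = k(X, X) + s2 * np.eye(len(X))
    Ks = k(X, Xs)
    mu = Ks.T @ np.linalg.solve(K, y)
    var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks), axis=0)
    return mu, np.maximum(var, 1e-12)

X = np.array([0.0, 1.0])                 # two initial evaluations
y = f(X)
for _ in range(6):                       # sequential setting: one point at a time
    mu, var = gp_posterior(X, y, grid)
    sd = np.sqrt(var)
    z = (mu - y.max()) / sd
    ei = (mu - y.max()) * norm.cdf(z) + sd * norm.pdf(z)   # expected improvement
    x_next = grid[np.argmax(ei)]
    X, y = np.append(X, x_next), np.append(y, f(x_next))

print(f"best x found: {X[np.argmax(y)]:.2f} (optimum at 0.30)")
```

A batch variant would instead select several maximizers of the acquisition function per iteration before evaluating any of them, trading per-evaluation information for wall-clock time, which is exactly the tension the paper's hybrid algorithm addresses.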

  2. Bayesian Adaptive Exploration

    CERN Document Server

    Loredo, T J

    2004-01-01

    I describe a framework for adaptive scientific exploration based on iterating an Observation-Inference-Design cycle that allows adjustment of hypotheses and observing protocols in response to the results of observation on the fly, as data are gathered. The framework uses a unified Bayesian methodology for the inference and design stages: Bayesian inference to quantify what we have learned from the available data and predict future data, and Bayesian decision theory to identify which new observations would teach us the most. When the goal of the experiment is simply to make inferences, the framework identifies a computationally efficient iterative "maximum entropy sampling" strategy as the optimal strategy in settings where the noise statistics are independent of signal properties. Results of applying the method to two "toy" problems with simulated data (measuring the orbit of an extrasolar planet, and locating a hidden one-dimensional object) show the approach can significantly improve observational eff...

  3. Bayesian Exploratory Factor Analysis

    DEFF Research Database (Denmark)

    Conti, Gabriella; Frühwirth-Schnatter, Sylvia; Heckman, James J.;

    2014-01-01

    This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study confirms the validity of the approach. The method is used to produce interpretable low dimensional aggregates...

  4. Bayesian multiple target tracking

    CERN Document Server

    Streit, Roy L

    2013-01-01

    This second edition has undergone substantial revision from the 1999 first edition, recognizing that a lot has changed in the multiple target tracking field. One of the most dramatic changes is in the widespread use of particle filters to implement nonlinear, non-Gaussian Bayesian trackers. This book views multiple target tracking as a Bayesian inference problem. Within this framework it develops the theory of single target tracking, multiple target tracking, and likelihood ratio detection and tracking. In addition to providing a detailed description of a basic particle filter that implements
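    The particle-filter machinery the book highlights can be illustrated with a minimal bootstrap filter for a single 1-D target under a constant-velocity model; the noise levels and particle count below are illustrative, not from the book:

```python
# Bootstrap particle filter for a 1-D constant-velocity target.
# Parameters are illustrative; this sketches the book's topic, not its code.
import numpy as np

rng = np.random.default_rng(0)
n_particles, n_steps, dt = 2000, 30, 1.0
q, r = 0.05, 0.5                                   # process / measurement noise std

# Truth and measurements: target moving at unit velocity
truth = np.cumsum(np.full(n_steps, 1.0 * dt))
meas = truth + r * rng.standard_normal(n_steps)

# Particles: state = (position, velocity), vague prior on velocity
x = np.zeros((n_particles, 2))
x[:, 1] = rng.normal(0.0, 2.0, n_particles)

for zk in meas:
    # Predict: constant-velocity motion plus process noise on velocity
    x[:, 0] += x[:, 1] * dt
    x[:, 1] += q * rng.standard_normal(n_particles)
    # Update: weight by the Gaussian measurement likelihood, then resample
    w = np.exp(-0.5 * ((zk - x[:, 0]) / r) ** 2)
    w /= w.sum()
    x = x[rng.choice(n_particles, n_particles, p=w)]

print(f"estimated position {x[:, 0].mean():.1f}, truth {truth[-1]:.1f}")
```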

  5. Bayesian and frequentist inequality tests

    OpenAIRE

    David M. Kaplan; Zhuo, Longhao

    2016-01-01

Bayesian and frequentist criteria are fundamentally different, but often posterior and sampling distributions are asymptotically equivalent (and normal). We compare Bayesian and frequentist hypothesis tests of inequality restrictions in such cases. For finite-dimensional parameters, if the null hypothesis is that the parameter vector lies in a certain half-space, then the Bayesian test has (frequentist) size $\alpha$; if the null hypothesis is any other convex subspace, then the Bayesian test...

  6. Accelerated aging tests on ENEA-ASE solar coating for receiver tube suitable to operate up to 550 °C

    Science.gov (United States)

    Antonaia, A.; D'Angelo, A.; Esposito, S.; Addonizio, M. L.; Castaldo, A.; Ferrara, M.; Guglielmo, A.; Maccari, A.

    2016-05-01

A patented solar coating for evacuated receivers, based on an innovative graded WN-AlN cermet layer, has been optically designed and optimized to operate at high temperature with high performance and high thermal stability. This solar coating, being designed to operate in a solar field with molten salt as heat transfer fluid, has to be thermally stable up to a maximum temperature of 550 °C. With the aim of determining the degradation behaviour and predicting the lifetime of the solar coating, we chose to monitor the variation of the solar absorptance αs after each thermal annealing cycle carried out at accelerated temperatures under vacuum. This prediction method was coupled with a preliminary Differential Thermal Analysis (DTA) in order to reveal any chemical-physical coating modification in the temperature range of interest before performing the accelerated aging tests. In the accelerated aging tests we assumed that the temperature dependence of the degradation processes could be described by Arrhenius behaviour, and we hypothesized a linear correlation between the optical parameter variation rate (specifically, Δαs/Δt) and the degradation process rate. Starting from Δαs/Δt values evaluated at 650 and 690 °C, the Arrhenius plot gave an activation energy of 325 kJ mol-1 for the degradation phenomenon, and the resulting prediction was a solar absorptance decrease of only 1.65% after 25 years at 550 °C. This very low αs decrease indicates excellent stability of our solar coating, even when employed at the maximum temperature (550 °C) of a solar field operating with molten salt as heat transfer fluid.
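The Arrhenius extrapolation described in this abstract can be sketched numerically. Only the activation energy (325 kJ mol-1) and the temperatures (650, 690 and 550 °C) come from the abstract; the pre-exponential factor and the resulting rate values below are hypothetical placeholders for illustration.

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def arrhenius_rate(A, Ea, T_kelvin):
    """Degradation rate k = A * exp(-Ea / (R*T))."""
    return A * math.exp(-Ea / (R * T_kelvin))

def activation_energy(k1, T1, k2, T2):
    """Recover Ea from two rates measured at two temperatures
    (the slope of the Arrhenius plot of ln k versus 1/T)."""
    return R * math.log(k1 / k2) / (1.0 / T2 - 1.0 / T1)

Ea = 325e3                 # J/mol, value reported in the abstract
T_650 = 650 + 273.15       # accelerated test temperatures, K
T_690 = 690 + 273.15
T_service = 550 + 273.15   # molten-salt service temperature, K

# With a hypothetical pre-exponential factor A = 1, the degradation
# rates at the two accelerated test temperatures are:
k_650 = arrhenius_rate(1.0, Ea, T_650)
k_690 = arrhenius_rate(1.0, Ea, T_690)

# Acceleration factor of testing at 650 C relative to 550 C service;
# this ratio is what lets a short test predict 25-year behaviour.
accel = k_650 / arrhenius_rate(1.0, Ea, T_service)
```

Note that the acceleration factor depends only on Ea and the two temperatures, not on the unknown prefactor, which is why the paper's Δαs/Δt measurements at 650 and 690 °C suffice for the lifetime prediction.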

  7. Bayesian Dark Knowledge

    NARCIS (Netherlands)

    A. Korattikara; V. Rathod; K. Murphy; M. Welling

    2015-01-01

We consider the problem of Bayesian parameter estimation for deep neural networks, which is important in problem settings where we may have little data, and/or where we need accurate posterior predictive densities p(y|x, D), e.g., for applications involving bandits or active learning. One simple ap

  8. Bayesian logistic regression analysis

    NARCIS (Netherlands)

    Van Erp, H.R.N.; Van Gelder, P.H.A.J.M.

    2012-01-01

In this paper we present a Bayesian logistic regression analysis. It is found that if one wishes to derive the posterior distribution of the probability of some event, then, together with the traditional Bayes Theorem and the integrating out of nuisance parameters, the Jacobian transformation is an

  9. Bayesian Adaptive Exploration

    Science.gov (United States)

    Loredo, Thomas J.

    2004-04-01

I describe a framework for adaptive scientific exploration based on iterating an Observation-Inference-Design cycle that allows adjustment of hypotheses and observing protocols in response to the results of observation on-the-fly, as data are gathered. The framework uses a unified Bayesian methodology for the inference and design stages: Bayesian inference to quantify what we have learned from the available data and predict future data, and Bayesian decision theory to identify which new observations would teach us the most. When the goal of the experiment is simply to make inferences, the framework identifies a computationally efficient iterative ``maximum entropy sampling'' strategy as the optimal strategy in settings where the noise statistics are independent of signal properties. Results of applying the method to two ``toy'' problems with simulated data (measuring the orbit of an extrasolar planet, and locating a hidden one-dimensional object) show the approach can significantly improve observational efficiency in settings that have well-defined nonlinear models. I conclude with a list of open issues that must be addressed to make Bayesian adaptive exploration a practical and reliable tool for optimizing scientific exploration.

  10. Subjective Bayesian Beliefs

    DEFF Research Database (Denmark)

    Antoniou, Constantinos; Harrison, Glenn W.; Lau, Morten I.;

    2015-01-01

    A large literature suggests that many individuals do not apply Bayes’ Rule when making decisions that depend on them correctly pooling prior information and sample data. We replicate and extend a classic experimental study of Bayesian updating from psychology, employing the methods of experimental...

  11. Bayesian Independent Component Analysis

    DEFF Research Database (Denmark)

    Winther, Ole; Petersen, Kaare Brandt

    2007-01-01

    In this paper we present an empirical Bayesian framework for independent component analysis. The framework provides estimates of the sources, the mixing matrix and the noise parameters, and is flexible with respect to choice of source prior and the number of sources and sensors. Inside the engine...

  12. Bayesian grid matching

    DEFF Research Database (Denmark)

    Hartelius, Karsten; Carstensen, Jens Michael

    2003-01-01

    A method for locating distorted grid structures in images is presented. The method is based on the theories of template matching and Bayesian image restoration. The grid is modeled as a deformable template. Prior knowledge of the grid is described through a Markov random field (MRF) model which...

  13. [Comparison of LCD and CRT monitors for detection of pulmonary nodules and interstitial lung diseases on digital chest radiographs by using receiver operating characteristic analysis].

    Science.gov (United States)

    Ikeda, Ryuji; Katsuragawa, Shigehiko; Shimonobou, Toshiaki; Hiai, Yasuhiro; Hashida, Masahiro; Awai, Kazuo; Yamashita, Yasuyuki; Doi, Kunio

    2006-05-20

Soft copy reading of digital images has become common practice in the PACS environment. In this study, we compared liquid-crystal display (LCD) and cathode-ray tube (CRT) monitors for detection of pulmonary nodules and interstitial lung diseases on digital chest radiographs by using receiver operating characteristic (ROC) analysis. Digital chest images with a 1000×1000 matrix size and an 8-bit grayscale were displayed on an LCD or CRT monitor with 2M pixels in each observer test. Eight and ten radiologists participated in the observer tests for detection of nodules and interstitial diseases, respectively. In each observer test, radiologists marked their confidence levels for diagnosis of pulmonary nodules or interstitial diseases. The detection performance of radiologists was evaluated by ROC analyses. The average Az values (area under the ROC curve) in detecting pulmonary nodules with LCD and CRT monitors were 0.792 and 0.814, respectively. In addition, the average Az values in detecting interstitial diseases with LCD and CRT monitors were 0.951 and 0.953, respectively. There was no statistically significant difference between LCD and CRT for detection of either pulmonary nodules (P=0.522) or interstitial lung diseases (P=0.869). Therefore, we believe that the LCD monitor can be used instead of the CRT monitor for the diagnosis of pulmonary nodules and interstitial lung diseases in digital chest images.

  14. Prediction of Abdominal Visceral Obesity From Body Mass Index,Waist Circumference and Waist-hip Ratio in Chinese Adults:Receiver Operating Characteristic Curves Analysis

    Institute of Scientific and Technical Information of China (English)

    WEI-PING JIA; JUN-XI LU; KUN-SAN XIANG; YU-QIAN BAO; HUI-JUAN LU; LEI CHEN

    2003-01-01

Objective: To evaluate the sensitivity and specificity of body mass index (BMI), waist circumference (WC), and waist-to-hip ratio (WHR) measurements in diagnosing abdominal visceral obesity. Methods: BMI, WC, and WHR were assessed in 690 Chinese adults (305 men and 385 women) and compared with magnetic resonance imaging (MRI) measurements of abdominal visceral adipose tissue (VA). Receiver operating characteristic (ROC) curves were generated and used to determine the threshold point for each anthropometric parameter. Results: 1) MRI showed that 61.7% of overweight/obese individuals (BMI≥25 kg/m2) and 14.2% of normal weight (BMI<25 kg/m2) individuals had abdominal visceral obesity (VA≥100 cm2). 2) VA was positively correlated with each anthropometric variable, of which WC showed the highest correlation (r=0.73-0.77, P<0.001). 3) The best cut-off points for assessing abdominal visceral obesity were as follows: BMI of 26 kg/m2, WC of 90 cm, and WHR of 0.93, with WC being the most sensitive and specific factor. 4) Among subjects with BMI≥28 kg/m2 or WC≥95 cm, 95% of men and 90% of women appeared to have abdominal visceral obesity. Conclusion: Measurements of BMI, WC, and WHR can be used in the prediction of abdominal visceral obesity, with WC having the best accuracy.
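A standard way to pick such cut-off points from a ROC curve is to maximize Youden's J (sensitivity + specificity - 1). The sketch below is a minimal pure-Python illustration on made-up waist-circumference data; the 90 cm threshold in the abstract comes from the study's actual ROC analysis, not from this toy sample.

```python
def youden_cutoff(values, labels):
    """Return (threshold, J) maximizing Youden's J = sensitivity + specificity - 1,
    scoring v >= threshold as test-positive."""
    pos = sum(labels)
    neg = len(labels) - pos
    best_t, best_j = None, -1.0
    for t in sorted(set(values)):
        tp = sum(1 for v, y in zip(values, labels) if v >= t and y == 1)
        tn = sum(1 for v, y in zip(values, labels) if v < t and y == 0)
        j = tp / pos + tn / neg - 1.0
        if j > best_j:
            best_t, best_j = t, j
    return best_t, best_j

# Made-up waist circumferences (cm) and visceral-obesity labels (1 = VA >= 100 cm^2)
wc = [78, 82, 85, 88, 91, 93, 96, 99, 102, 105]
obese = [0, 0, 0, 0, 1, 0, 1, 1, 1, 1]
cutoff, j = youden_cutoff(wc, obese)  # -> (91, 0.8) on this toy sample
```

The same routine applies unchanged to BMI or WHR values; only the `values` vector changes, which is how the study can compare thresholds across anthropometric parameters.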

  15. Disadvantages of using the area under the receiver operating characteristic curve to assess imaging tests: A discussion and proposal for an alternative approach

    Energy Technology Data Exchange (ETDEWEB)

    Halligan, Steve [University College London, Centre for Medical Imaging, University College Hospital, London (United Kingdom); Altman, Douglas G. [University of Oxford, Centre for Statistics in Medicine, Oxford (United Kingdom); Mallett, Susan [University of Oxford, Department of Primary Care Health Sciences, Oxford (United Kingdom)

    2015-04-01

    The objectives are to describe the disadvantages of the area under the receiver operating characteristic curve (ROC AUC) to measure diagnostic test performance and to propose an alternative based on net benefit. We use a narrative review supplemented by data from a study of computer-assisted detection for CT colonography. We identified problems with ROC AUC. Confidence scoring by readers was highly non-normal, and score distribution was bimodal. Consequently, ROC curves were highly extrapolated with AUC mostly dependent on areas without patient data. AUC depended on the method used for curve fitting. ROC AUC does not account for prevalence or different misclassification costs arising from false-negative and false-positive diagnoses. Change in ROC AUC has little direct clinical meaning for clinicians. An alternative analysis based on net benefit is proposed, based on the change in sensitivity and specificity at clinically relevant thresholds. Net benefit incorporates estimates of prevalence and misclassification costs, and it is clinically interpretable since it reflects changes in correct and incorrect diagnoses when a new diagnostic test is introduced. ROC AUC is most useful in the early stages of test assessment whereas methods based on net benefit are more useful to assess radiological tests where the clinical context is known. Net benefit is more useful for assessing clinical impact. (orig.)

  16. Perspective Biological Markers for Autism Spectrum Disorders: Advantages of the Use of Receiver Operating Characteristic Curves in Evaluating Marker Sensitivity and Specificity

    Directory of Open Access Journals (Sweden)

    Provvidenza M. Abruzzo

    2015-01-01

Autism Spectrum Disorders (ASD) are a heterogeneous group of neurodevelopmental disorders. Recognized causes of ASD include genetic factors, metabolic diseases, toxic and environmental factors, and a combination of these. Available tests fail to recognize genetic abnormalities in about 70% of ASD children, where diagnosis is solely based on behavioral signs and symptoms, which are difficult to evaluate in very young children. Although it is advisable that specific psychotherapeutic and pedagogic interventions are initiated as early as possible, early diagnosis is hampered by the lack of nongenetic specific biological markers. In the past ten years, the scientific literature has reported dozens of neurophysiological and biochemical alterations in ASD children; however, no real biomarker has emerged. Such literature is here reviewed in the light of Receiver Operating Characteristic (ROC) analysis, a very valuable statistical tool, which evaluates the sensitivity and the specificity of biomarkers to be used in diagnostic decision making. We also apply ROC analysis to some of our previously published data and discuss the increased diagnostic value of combining more variables in one ROC curve analysis. We also discuss the use of biomarkers as a tool for advancing our understanding of nonsyndromic ASD.

  17. Disadvantages of using the area under the receiver operating characteristic curve to assess imaging tests: A discussion and proposal for an alternative approach

    International Nuclear Information System (INIS)

    The objectives are to describe the disadvantages of the area under the receiver operating characteristic curve (ROC AUC) to measure diagnostic test performance and to propose an alternative based on net benefit. We use a narrative review supplemented by data from a study of computer-assisted detection for CT colonography. We identified problems with ROC AUC. Confidence scoring by readers was highly non-normal, and score distribution was bimodal. Consequently, ROC curves were highly extrapolated with AUC mostly dependent on areas without patient data. AUC depended on the method used for curve fitting. ROC AUC does not account for prevalence or different misclassification costs arising from false-negative and false-positive diagnoses. Change in ROC AUC has little direct clinical meaning for clinicians. An alternative analysis based on net benefit is proposed, based on the change in sensitivity and specificity at clinically relevant thresholds. Net benefit incorporates estimates of prevalence and misclassification costs, and it is clinically interpretable since it reflects changes in correct and incorrect diagnoses when a new diagnostic test is introduced. ROC AUC is most useful in the early stages of test assessment whereas methods based on net benefit are more useful to assess radiological tests where the clinical context is known. Net benefit is more useful for assessing clinical impact. (orig.)

  18. Quantum Bayesianism at the Perimeter

    CERN Document Server

    Fuchs, Christopher A

    2010-01-01

The author summarizes the Quantum Bayesian viewpoint of quantum mechanics, developed originally by C. M. Caves, R. Schack, and himself. It is a view crucially dependent upon the tools of quantum information theory. Work at the Perimeter Institute for Theoretical Physics continues the development and is focused on the hard technical problem of finding a good representation of quantum mechanics purely in terms of probabilities, without amplitudes or Hilbert-space operators. The best candidate representation involves a mysterious entity called a symmetric informationally complete quantum measurement. Contemplation of it gives a way of thinking of the Born Rule as an addition to the rules of probability theory, applicable when one gambles on the consequences of interactions with physical systems. The article ends by outlining some directions for future work.

  19. Narrowband interference parameterization for sparse Bayesian recovery

    KAUST Repository

    Ali, Anum

    2015-09-11

    This paper addresses the problem of narrowband interference (NBI) in SC-FDMA systems by using tools from compressed sensing and stochastic geometry. The proposed NBI cancellation scheme exploits the frequency domain sparsity of the unknown signal and adopts a Bayesian sparse recovery procedure. This is done by keeping a few randomly chosen sub-carriers data free to sense the NBI signal at the receiver. As Bayesian recovery requires knowledge of some NBI parameters (i.e., mean, variance and sparsity rate), we use tools from stochastic geometry to obtain analytical expressions for the required parameters. Our simulation results validate the analysis and depict suitability of the proposed recovery method for NBI mitigation. © 2015 IEEE.

  20. Bayesian Methods for Analyzing Structural Equation Models with Covariates, Interaction, and Quadratic Latent Variables

    Science.gov (United States)

    Lee, Sik-Yum; Song, Xin-Yuan; Tang, Nian-Sheng

    2007-01-01

    The analysis of interaction among latent variables has received much attention. This article introduces a Bayesian approach to analyze a general structural equation model that accommodates the general nonlinear terms of latent variables and covariates. This approach produces a Bayesian estimate that has the same statistical optimal properties as a…

  1. Bayesian analysis of longitudinal Johne's disease diagnostic data without a gold standard test.

    Science.gov (United States)

    Wang, C; Turnbull, B W; Nielsen, S S; Gröhn, Y T

    2011-05-01

    A Bayesian methodology was developed based on a latent change-point model to evaluate the performance of milk ELISA and fecal culture tests for longitudinal Johne's disease diagnostic data. The situation of no perfect reference test was considered; that is, no "gold standard." A change-point process with a Weibull survival hazard function was used to model the progression of the hidden disease status. The model adjusted for the fixed effects of covariate variables and random effects of subject on the diagnostic testing procedure. Markov chain Monte Carlo methods were used to compute the posterior estimates of the model parameters that provide the basis for inference concerning the accuracy of the diagnostic procedure. Based on the Bayesian approach, the posterior probability distribution of the change-point onset time can be obtained and used as a criterion for infection diagnosis. An application is presented to an analysis of ELISA and fecal culture test outcomes in the diagnostic testing of paratuberculosis (Johne's disease) for a Danish longitudinal study from January 2000 to March 2003. The posterior probability criterion based on the Bayesian model with 4 repeated observations has an area under the receiver operating characteristic curve (AUC) of 0.984, and is superior to the raw ELISA (AUC=0.911) and fecal culture (sensitivity=0.358, specificity=0.980) tests for Johne's disease diagnosis. PMID:21524521
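The AUC values compared in this abstract (0.984 for the Bayesian posterior criterion versus 0.911 for raw ELISA) can be computed from scores via the Mann-Whitney interpretation: AUC is the probability that a randomly chosen infected animal receives a higher score than a randomly chosen uninfected one. A minimal sketch with made-up posterior probabilities (the study's real scores are not reproduced here):

```python
def auc(pos_scores, neg_scores):
    """AUC via the Mann-Whitney statistic: the probability that a
    positive case outscores a negative one, counting ties as one half."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

# Made-up posterior probabilities of infection, for illustration only
infected = [0.95, 0.90, 0.80, 0.60]
healthy = [0.40, 0.30, 0.20, 0.10, 0.60]
example_auc = auc(infected, healthy)  # -> 0.975
```

This rank-based formula needs no curve fitting or distributional assumptions, which makes it a convenient check on AUCs reported from fitted ROC curves.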

  2. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985 the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are included especially for their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure theoretical considerations, contributions to theoretical statistics an...

  3. Bayesian community detection

    DEFF Research Database (Denmark)

    Mørup, Morten; Schmidt, Mikkel N

    2012-01-01

Many networks of scientific interest naturally decompose into clusters or communities with comparatively fewer external than internal links; however, current Bayesian models of network communities do not exert this intuitive notion of communities. We formulate a nonparametric Bayesian model for community detection consistent with an intuitive definition of communities and present a Markov chain Monte Carlo procedure for inferring the community structure. A Matlab toolbox with the proposed inference procedure is available for download. On synthetic and real networks, our model detects communities consistent with ground truth, and on real networks, it outperforms existing approaches in predicting missing links. This suggests that community structure is an important structural property of networks that should be explicitly modeled.

  4. Bayesian Word Sense Induction

    OpenAIRE

    Brody, Samuel; Lapata, Mirella

    2009-01-01

    Sense induction seeks to automatically identify word senses directly from a corpus. A key assumption underlying previous work is that the context surrounding an ambiguous word is indicative of its meaning. Sense induction is thus typically viewed as an unsupervised clustering problem where the aim is to partition a word’s contexts into different classes, each representing a word sense. Our work places sense induction in a Bayesian context by modeling the contexts of the ambiguous word as samp...

  5. Bayesian Generalized Rating Curves

    OpenAIRE

    Helgi Sigurðarson 1985

    2014-01-01

    A rating curve is a curve or a model that describes the relationship between water elevation, or stage, and discharge in an observation site in a river. The rating curve is fit from paired observations of stage and discharge. The rating curve then predicts discharge given observations of stage and this methodology is applied as stage is substantially easier to directly observe than discharge. In this thesis a statistical rating curve model is proposed working within the framework of Bayesian...

  6. Efficient Bayesian Phase Estimation

    Science.gov (United States)

    Wiebe, Nathan; Granade, Chris

    2016-07-01

    We introduce a new method called rejection filtering that we use to perform adaptive Bayesian phase estimation. Our approach has several advantages: it is classically efficient, easy to implement, achieves Heisenberg limited scaling, resists depolarizing noise, tracks time-dependent eigenstates, recovers from failures, and can be run on a field programmable gate array. It also outperforms existing iterative phase estimation algorithms such as Kitaev's method.

  7. Bayesian theory and applications

    CERN Document Server

    Dellaportas, Petros; Polson, Nicholas G; Stephens, David A

    2013-01-01

    The development of hierarchical models and Markov chain Monte Carlo (MCMC) techniques forms one of the most profound advances in Bayesian analysis since the 1970s and provides the basis for advances in virtually all areas of applied and theoretical Bayesian statistics. This volume guides the reader along a statistical journey that begins with the basic structure of Bayesian theory, and then provides details on most of the past and present advances in this field. The book has a unique format. There is an explanatory chapter devoted to each conceptual advance followed by journal-style chapters that provide applications or further advances on the concept. Thus, the volume is both a textbook and a compendium of papers covering a vast range of topics. It is appropriate for a well-informed novice interested in understanding the basic approach, methods and recent applications. Because of its advanced chapters and recent work, it is also appropriate for a more mature reader interested in recent applications and devel...

  8. Bayesian Attractor Learning

    Science.gov (United States)

    Wiegerinck, Wim; Schoenaker, Christiaan; Duane, Gregory

    2016-04-01

Recently, methods for model fusion by dynamically combining model components in an interactive ensemble have been proposed. In these proposals, fusion parameters have to be learned from data. One can view these systems as parametrized dynamical systems. We address the question of learnability of dynamical systems with respect to both short-term (vector field) and long-term (attractor) behavior. In particular we are interested in learning in the imperfect model class setting, in which the ground truth has a higher complexity than the models, e.g. due to unresolved scales. We take a Bayesian point of view and define a joint log-likelihood that consists of two terms: one is the vector field error and the other is the attractor error, for which we take the L1 distance between the stationary distributions of the model and the assumed ground truth. In the context of linear models (like so-called weighted supermodels), and assuming a Gaussian error model in the vector fields, vector field learning leads to a tractable Gaussian solution. This solution can then be used as a prior for the next step, Bayesian attractor learning, in which the attractor error is used as a log-likelihood term. Bayesian attractor learning is implemented by elliptical slice sampling, a sampling method for systems with a Gaussian prior and a non-Gaussian likelihood. Simulations with a partially observed driven Lorenz 63 system illustrate the approach.

  9. Evaluation of the image quality of ink-jet printed paper copies of digital chest radiographs as compared with film: a receiver operating characteristic study.

    Science.gov (United States)

    Lyttkens, K; Kirkhorn, T; Kehler, M; Andersson, B; Ebbesen, A; Hochbergs, P; Jarlman, O; Lindberg, C G; Holmer, N G

    1994-05-01

Paper copies of digital radiographs printed with the continuous ink-jet technique have proved to be of high enough quality for demonstration purposes. We present a study on the image quality of ink-jet printed paper copies of digital chest radiographs, based on receiver operating characteristic (ROC) analysis. Eighty-three digital radiographs of a chest phantom with simulated tumors in the mediastinum and right lung, derived from a computed radiography (CR) system, were presented in two series of hard copies: as ink-jet printed paper copies and as laser recorded film. The images, with a matrix of 1,760 x 2,140 pixels, were printed with a spatial resolution of 10 pixels/mm in the CR film recorder as well as in the ink-jet printer. On film, every image was recorded in two versions, one optimized for the mediastinum and one for the lungs. On paper, only one image was printed, in an effort to optimize both the mediastinum and the lungs. The ink-jet printed images, printed on a matt coated paper, were viewed as on-sight images with reflected light. The examinations were reviewed by six radiologists, and ROC curves were constructed. No significant difference was found between the performance of film and that of ink-jet paper prints. Because the cost of a paper copy is only a tenth of that of film, considerable cost reductions can be achieved by using the ink-jet technique instead. Our results show that further quality studies of ink-jet printed images are worthwhile.

  10. Delineating a Retesting Zone Using Receiver Operating Characteristic Analysis on Serial QuantiFERON Tuberculosis Test Results in US Healthcare Workers

    Directory of Open Access Journals (Sweden)

    Wendy Thanassi

    2012-01-01

Objective. To find a statistically significant separation point for the QuantiFERON Gold In-Tube (QFT) interferon gamma release assay that could define an optimal “retesting zone” for use in serially tested low-risk populations who have test “reversions” from initially positive to subsequently negative results. Method. Using receiver operating characteristic (ROC) analysis to analyze retrospective data collected from 3 major hospitals, we searched for predictors of reversion until statistically significant separation points were revealed. A confirmatory regression analysis was performed on an additional sample. Results. In 575 initially positive US healthcare workers (HCWs), 300 (52.2%) had reversions, while 275 (47.8%) had two sequential positive tests. The most statistically significant (Kappa = 0.48, chi-square = 131.0, P<0.001) separation point identified by the ROC for predicting reversion was the tuberculosis antigen minus-nil (TBag-nil) value at 1.11 International Units per milliliter (IU/mL). The second separation point was found at TBag-nil at 0.72 IU/mL (Kappa = 0.16, chi-square = 8.2, P<0.01). The model was validated by the regression analysis of 287 HCWs. Conclusion. Reversion likelihood increases as the TBag-nil value approaches the manufacturer's cut-point of 0.35 IU/mL. The most statistically significant separation point between those who test repeatedly positive and those who revert is 1.11 IU/mL. Clinicians should retest low-risk individuals with initial QFT results < 1.11 IU/mL.

  11. A receiver operated curve-based evaluation of change in sensitivity and specificity of cotinine urinalysis for detecting active tobacco use

    Directory of Open Access Journals (Sweden)

    Yatan Pal Singh Balhara

    2013-01-01

Background: Tobacco use has been associated with various carcinomas including lung, esophagus, larynx, mouth, throat, kidney, bladder, pancreas, stomach, and cervix. Biomarkers such as the concentration of cotinine in the blood, urine, or saliva have been used as objective measures to distinguish nonusers and users of tobacco products. A change in the cut-off value of urinary cotinine to detect active tobacco use is associated with a change in the sensitivity and specificity of detection. Aim: The current study aimed at assessing the impact of using different cut-off thresholds of urinary cotinine on the sensitivity and specificity of detection of smoking and smokeless tobacco product use among psychiatric patients. Settings and Design: All the male subjects attending the psychiatry out-patient department of a tertiary care multispecialty teaching hospital constituted the sample frame for this cross-sectional study. Materials and Methods: Quantitative urinary cotinine assay was done by using ELISA kits of Calbiotech Inc., USA. We used the receiver operating characteristic (ROC) curve to assess the sensitivity and specificity of various cut-off values of urinary cotinine to identify active smokers and users of smokeless tobacco products. Results: ROC analysis of urinary cotinine levels in detection of self-reported smoking provided an area under the curve (AUC) of 0.434. Similarly, the ROC analysis of urinary cotinine levels in detection of self-reported smokeless tobacco use revealed an AUC of 0.44. The highest sensitivity and specificity of 100% for smoking were detected at a urinary cut-off value greater than or equal to 2.47 ng/ml. Conclusions: The choice of cut-off value of urinary cotinine used to distinguish nonusers from active users of tobacco products impacts the sensitivity as well as the specificity of detection.

  12. Bayesian-based localization in inhomogeneous transmission media

    DEFF Research Database (Denmark)

    Nadimi, E. S.; Blanes-Vidal, V.; Johansen, P. M.

    2013-01-01

In this paper, we propose a novel robust probabilistic approach based on Bayesian inference using received-signal-strength (RSS) measurements with varying path-loss exponent. We derived the probability density function (pdf) of the distance between any two sensors in the network with heterogeneous transmission medium as a function of the given RSS measurements and the characteristics of the heterogeneous medium. The results of this study show that the localization mean square error (MSE) of the Bayesian-based method outperformed all other existing localization approaches. © 2013 ACM.

  13. Bayesian optimization for materials design

    OpenAIRE

    Frazier, Peter I.; Wang, Jialei

    2015-01-01

    We introduce Bayesian optimization, a technique developed for optimizing time-consuming engineering simulations and for fitting machine learning models on large datasets. Bayesian optimization guides the choice of experiments during materials design and discovery to find good material designs in as few experiments as possible. We focus on the case when materials designs are parameterized by a low-dimensional vector. Bayesian optimization is built on a statistical technique called Gaussian pro...
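
The loop described above can be sketched as follows (an illustrative 1-D toy, not the authors' implementation): a Gaussian-process surrogate with an RBF kernel and an expected-improvement acquisition chooses each next "experiment".

```python
import math

# Minimal 1-D Bayesian optimization sketch (illustrative toy): GP surrogate
# with RBF kernel + expected-improvement acquisition, in plain Python.

def k(a, b, ls=0.3):
    return math.exp(-0.5 * ((a - b) / ls) ** 2)

def solve(A, b):
    """Naive Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for j in range(c, n + 1):
                M[r][j] -= f * M[c][j]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][j] * x[j] for j in range(r + 1, n))) / M[r][r]
    return x

def gp_posterior(X, y, xs):
    """Posterior mean and std dev at xs under a zero-mean GP prior."""
    K = [[k(a, b) + (1e-6 if i == j else 0.0) for j, b in enumerate(X)]
         for i, a in enumerate(X)]
    alpha = solve(K, y)
    kv = [k(xs, xi) for xi in X]
    mu = sum(kvi * ai for kvi, ai in zip(kv, alpha))
    v = solve(K, kv)
    var = max(k(xs, xs) - sum(a * b for a, b in zip(kv, v)), 1e-12)
    return mu, math.sqrt(var)

def expected_improvement(mu, sd, best):
    z = (best - mu) / sd                       # we are minimizing
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)
    cdf = 0.5 * (1 + math.erf(z / math.sqrt(2)))
    return (best - mu) * cdf + sd * pdf

def objective(x):                  # stand-in for an expensive simulation
    return (x - 0.7) ** 2

X = [0.1, 0.5, 0.9]                # initial designs (low-dimensional vector)
y = [objective(x) for x in X]
for _ in range(5):
    best = min(y)
    cand = [i / 100 for i in range(101)]
    nxt = max(cand, key=lambda c: expected_improvement(*gp_posterior(X, y, c), best))
    X.append(nxt)
    y.append(objective(nxt))
print(f"best design found: x = {X[y.index(min(y))]:.2f}")
```

The acquisition balances exploiting regions the surrogate predicts to be good against exploring regions where its uncertainty is high, which is what keeps the number of expensive evaluations small.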

  14. Bayesian Posteriors Without Bayes' Theorem

    CERN Document Server

    Hill, Theodore P

    2012-01-01

    The classical Bayesian posterior arises naturally as the unique solution of several different optimization problems, without the necessity of interpreting data as conditional probabilities and then using Bayes' Theorem. For example, the classical Bayesian posterior is the unique posterior that minimizes the loss of Shannon information in combining the prior and the likelihood distributions. These results, direct corollaries of recent results about conflations of probability distributions, reinforce the use of Bayesian posteriors, and may help partially reconcile some of the differences between classical and Bayesian statistics.

  15. An Application of Bayesian Approach in Modeling Risk of Death in an Intensive Care Unit.

    Directory of Open Access Journals (Sweden)

    Rowena Syn Yin Wong

    Full Text Available There are not many studies that attempt to model intensive care unit (ICU) risk of death in developing countries, especially in South East Asia. The aim of this study was to propose and describe the application of a Bayesian approach in modeling in-ICU deaths in a Malaysian ICU. This was a prospective study in a mixed medical-surgery ICU in a multidisciplinary tertiary referral hospital in Malaysia. Data collection included variables that were defined in the Acute Physiology and Chronic Health Evaluation IV (APACHE IV) model. A Bayesian Markov Chain Monte Carlo (MCMC) simulation approach was applied in the development of four multivariate logistic regression predictive models for the ICU, where the main outcome measure was in-ICU mortality risk. The performance of the models was assessed through overall model fit, discrimination and calibration measures. Results from the Bayesian models were also compared against results obtained using the frequentist maximum likelihood method. The study involved 1,286 consecutive ICU admissions between January 1, 2009 and June 30, 2010, of which 1,111 met the inclusion criteria. Patients who were admitted to the ICU were generally younger, predominantly male, with a low co-morbidity load and mostly under mechanical ventilation. The overall in-ICU mortality rate was 18.5% and the overall mean Acute Physiology Score (APS) was 68.5. All four models exhibited good discrimination, with area under the receiver operating characteristic curve (AUC) values of approximately 0.8. Calibration was acceptable (Hosmer-Lemeshow p-values > 0.05) for all models, except for model M3. Model M1 was identified as the model with the best overall performance in this study. Four prediction models were proposed, where the best model was chosen based on its overall performance in this study. This study has also demonstrated the promising potential of the Bayesian MCMC approach as an alternative in the analysis and modeling of in-ICU mortality outcomes.
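
A minimal illustration of the modeling approach (synthetic data and a random-walk Metropolis sampler, far simpler than the APACHE-IV-based models in the study):

```python
import math, random

# Illustrative sketch (synthetic data, not the study's): Bayesian logistic
# regression for a binary mortality-style outcome, fitted with a
# random-walk Metropolis sampler under a vague N(0, 10^2) prior.

random.seed(1)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical data: intercept plus one severity-score-like covariate.
X = [[1.0, random.gauss(0, 1)] for _ in range(200)]
true_beta = [-1.5, 1.0]
Y = [1 if random.random() < sigmoid(sum(b * xi for b, xi in zip(true_beta, x)))
     else 0 for x in X]

def log_post(beta):
    lp = sum(-0.5 * (b / 10.0) ** 2 for b in beta)      # Gaussian prior
    for x, y in zip(X, Y):
        p = sigmoid(sum(b * xi for b, xi in zip(beta, x)))
        p = min(max(p, 1e-12), 1 - 1e-12)
        lp += y * math.log(p) + (1 - y) * math.log(1 - p)
    return lp

beta = [0.0, 0.0]
lp = log_post(beta)
samples = []
for it in range(4000):
    prop = [b + random.gauss(0, 0.15) for b in beta]
    lp_prop = log_post(prop)
    if math.log(random.random()) < lp_prop - lp:        # Metropolis accept
        beta, lp = prop, lp_prop
    if it >= 2000:                                      # discard burn-in
        samples.append(beta)

post_mean = [sum(s[i] for s in samples) / len(samples) for i in range(2)]
print("posterior mean coefficients:", [round(b, 2) for b in post_mean])
```

The MCMC output is a set of posterior draws rather than a point estimate, so credible intervals and other summaries come directly from the samples.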

  16. Implementation of an Adaptive Learning System Using a Bayesian Network

    Science.gov (United States)

    Yasuda, Keiji; Kawashima, Hiroyuki; Hata, Yoko; Kimura, Hiroaki

    2015-01-01

    An adaptive learning system is proposed that incorporates a Bayesian network to efficiently gauge learners' understanding at the course-unit level. Also, learners receive content that is adapted to their measured level of understanding. The system works on an iPad via the Edmodo platform. A field experiment using the system in an elementary school…

  17. Wideband CMOS receivers

    CERN Document Server

    Oliveira, Luis

    2015-01-01

    This book demonstrates how to design a wideband receiver operating in current mode, in which the noise and non-linearity are reduced, implemented in a low cost single chip, using standard CMOS technology.  The authors present a solution to remove the transimpedance amplifier (TIA) block and connect directly the mixer’s output to a passive second-order continuous-time Σ∆ analog to digital converter (ADC), which operates in current-mode. These techniques enable the reduction of area, power consumption, and cost in modern CMOS receivers.

  18. Applied Bayesian modelling

    CERN Document Server

    Congdon, Peter

    2014-01-01

    This book provides an accessible approach to Bayesian computing and data analysis, with an emphasis on the interpretation of real data sets. Following in the tradition of the successful first edition, this book aims to make a wide range of statistical modeling applications accessible using tested code that can be readily adapted to the reader's own applications. The second edition has been thoroughly reworked and updated to take account of advances in the field. A new set of worked examples is included. The novel aspect of the first edition was the coverage of statistical modeling using WinBU

  19. Computationally efficient Bayesian tracking

    Science.gov (United States)

    Aughenbaugh, Jason; La Cour, Brian

    2012-06-01

    In this paper, we describe the progress we have achieved in developing a computationally efficient, grid-based Bayesian fusion tracking system. In our approach, the probability surface is represented by a collection of multidimensional polynomials, each computed adaptively on a grid of cells representing state space. Time evolution is performed using a hybrid particle/grid approach and knowledge of the grid structure, while sensor updates use a measurement-based sampling method with a Delaunay triangulation. We present an application of this system to the problem of tracking a submarine target using a field of active and passive sonar buoys.
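
A stripped-down 1-D version of grid-based Bayesian tracking (uniform grid and hypothetical motion/sensor models, far simpler than the paper's adaptive polynomial grids):

```python
import math

# Stripped-down 1-D grid-based Bayes filter (uniform grid and hypothetical
# motion/sensor models; the paper uses adaptive multidimensional grids).

N = 50                                  # grid cells over positions 0..49
belief = [1.0 / N] * N                  # uniform prior over target position

def predict(bel, p_stay=0.3, p_right=0.7):
    """Time evolution: target drifts one cell to the right with prob 0.7."""
    out = [0.0] * N
    for i, b in enumerate(bel):
        out[i] += p_stay * b
        out[min(i + 1, N - 1)] += p_right * b
    return out

def update(bel, z, sigma=2.0):
    """Sensor update with a Gaussian likelihood around the reading z."""
    w = [b * math.exp(-0.5 * ((i - z) / sigma) ** 2) for i, b in enumerate(bel)]
    s = sum(w)
    return [x / s for x in w]

for z in [10, 11, 11, 12, 13, 14]:      # simulated sonar-like readings
    belief = update(predict(belief), z)

est = max(range(N), key=lambda i: belief[i])
print("MAP position estimate:", est)
```

Alternating predict and update steps is the generic Bayes-filter recursion; the paper's contribution is making the grid representation of the probability surface computationally efficient.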

  20. Bayesian nonparametric data analysis

    CERN Document Server

    Müller, Peter; Jara, Alejandro; Hanson, Tim

    2015-01-01

    This book reviews nonparametric Bayesian methods and models that have proven useful in the context of data analysis. Rather than providing an encyclopedic review of probability models, the book’s structure follows a data analysis perspective. As such, the chapters are organized by traditional data analysis problems. In selecting specific nonparametric models, simpler and more traditional models are favored over specialized ones. The discussed methods are illustrated with a wealth of examples, including applications ranging from stylized examples to case studies from recent literature. The book also includes an extensive discussion of computational methods and details on their implementation. R code for many examples is included in on-line software pages.

  1. Bayesian Geostatistical Design

    DEFF Research Database (Denmark)

    Diggle, Peter; Lophaven, Søren Nymand

    2006-01-01

    This paper describes the use of model-based geostatistics for choosing the set of sampling locations, collectively called the design, to be used in a geostatistical analysis. Two types of design situation are considered. These are retrospective design, which concerns the addition of sampling locations to, or deletion of locations from, an existing design, and prospective design, which consists of choosing positions for a new set of sampling locations. We propose a Bayesian design criterion which focuses on the goal of efficient spatial prediction whilst allowing for the fact that model...

  2. Highly Sensitive Photonic Crystal Cavity Laser Noise Measurements using Bayesian Filtering

    DEFF Research Database (Denmark)

    Piels, Molly; Xue, Weiqi; Schäffer, Christian G.;

    2015-01-01

    We measure for the first time the frequency noise spectrum of a photonic crystal cavity laser with less than 20 nW of fiber-coupled output power using a coherent receiver and Bayesian filtering.

  3. Bayesian regression of piecewise homogeneous Poisson processes

    Directory of Open Access Journals (Sweden)

    Diego Sevilla

    2015-12-01

    Full Text Available In this paper, a Bayesian method for piecewise regression is adapted to handle counting-process data distributed as Poisson. A numerical code in Mathematica is developed and tested by analyzing simulated data. The resulting method is valuable for detecting breaking points in the count rate of time series for Poisson processes. Received: 2 November 2015, Accepted: 27 November 2015; Edited by: R. Dickman; Reviewed by: M. Hutter, Australian National University, Canberra, Australia; DOI: http://dx.doi.org/10.4279/PIP.070018 Cite as: D J R Sevilla, Papers in Physics 7, 070018 (2015)
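
The break-point detection idea can be sketched in a few lines (single break and conjugate Gamma priors; the paper's method is more general and implemented in Mathematica):

```python
import math

# Single-break sketch (conjugate Gamma(a, b) priors on the two Poisson
# rates; the paper's method handles general piecewise regression).

def log_marginal(counts, a=1.0, b=1.0):
    """log of the Gamma-Poisson marginal likelihood, dropping the 1/c!
    factors, which are identical for every break position."""
    n, s = len(counts), sum(counts)
    return (a * math.log(b) - math.lgamma(a)
            + math.lgamma(a + s) - (a + s) * math.log(b + n))

def break_posterior(counts):
    """Posterior over the break position under a uniform prior."""
    logs = [log_marginal(counts[:tau]) + log_marginal(counts[tau:])
            for tau in range(1, len(counts))]
    m = max(logs)
    w = [math.exp(l - m) for l in logs]
    z = sum(w)
    return [x / z for x in w]

# Simulated counts: rate ~2 for the first 10 bins, then rate ~8.
counts = [1, 3, 2, 2, 1, 3, 2, 1, 2, 3, 9, 7, 8, 10, 6, 9, 8, 7]
post = break_posterior(counts)
tau_map = post.index(max(post)) + 1     # break after this many bins
print("most probable break after bin", tau_map)
```

Because the Gamma prior is conjugate to the Poisson likelihood, each segment's marginal likelihood has a closed form, so the posterior over the break position is computed exactly by enumeration.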

  4. Bayesian networks for enterprise risk assessment

    CERN Document Server

    Bonafede, C E

    2006-01-01

    According to different typologies of activity and priority, risks can assume diverse meanings and can be assessed in different ways. In general, risk is measured as a combination of the probability of an event (frequency) and its consequence (impact). To estimate the frequency and the impact (severity), historical data or expert opinions (either qualitative or quantitative data) are used. Moreover, qualitative data must be converted into numerical values to be used in the model. In the case of enterprise risk assessment the considered risks are, for instance, strategic, operational, legal and image risks, which are often difficult to quantify. So in most cases only expert data, gathered by scorecard approaches, are available for risk analysis. The Bayesian network is a useful tool to integrate different information and in particular to study the risk's joint distribution by using data collected from experts. In this paper we want to show a possible approach for building a Bayesian network in the parti...

  5. QBism, the Perimeter of Quantum Bayesianism

    CERN Document Server

    Fuchs, Christopher A

    2010-01-01

    This article summarizes the Quantum Bayesian point of view of quantum mechanics, with special emphasis on the view's outer edges---dubbed QBism. QBism has its roots in personalist Bayesian probability theory, is crucially dependent upon the tools of quantum information theory, and most recently, has set out to investigate whether the physical world might be of a type sketched by some false-started philosophies of 100 years ago (pragmatism, pluralism, nonreductionism, and meliorism). Beyond conceptual issues, work at Perimeter Institute is focused on the hard technical problem of finding a good representation of quantum mechanics purely in terms of probabilities, without amplitudes or Hilbert-space operators. The best candidate representation involves a mysterious entity called a symmetric informationally complete quantum measurement. Contemplation of it gives a way of thinking of the Born Rule as an addition to the rules of probability theory, applicable when an agent considers gambling on the consequences of...

  6. Software Health Management with Bayesian Networks

    Science.gov (United States)

    Mengshoel, Ole; Schumann, JOhann

    2011-01-01

    Most modern aircraft, as well as other complex machinery, are equipped with diagnostic systems for their major subsystems. During operation, sensors provide important information about the subsystem (e.g., the engine) and that information is used to detect and diagnose faults. Most of these systems focus on the monitoring of a mechanical, hydraulic, or electromechanical subsystem of the vehicle or machinery. Only recently have health management systems that monitor software been developed. In this paper, we will discuss our approach of using Bayesian networks for Software Health Management (SWHM). We will discuss SWHM requirements, which make advanced reasoning capabilities for detection and diagnosis important. Then we will present our approach to using Bayesian networks for the construction of health models that dynamically monitor a software system and are capable of detecting and diagnosing faults.
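
The core inference pattern, reduced to a two-node network with hypothetical conditional probabilities (the paper's health models are much larger):

```python
# Two-node Bayesian network sketch (hypothetical CPTs): Fault -> Symptom,
# queried with Bayes' rule by enumerating the joint distribution.

p_fault = 0.01           # prior probability of a software fault
p_sym_given_fault = 0.9  # P(symptom | fault), e.g. a stale output value
p_sym_given_ok = 0.05    # P(symptom | no fault), the false-alarm rate

joint_fault = p_fault * p_sym_given_fault          # P(fault, symptom)
joint_ok = (1 - p_fault) * p_sym_given_ok          # P(no fault, symptom)
posterior = joint_fault / (joint_fault + joint_ok) # P(fault | symptom)
print(f"P(fault | symptom) = {posterior:.3f}")
```

Even with a strong symptom, the low prior keeps the posterior fault probability modest, which is exactly the kind of calibrated reasoning a health-management system needs to avoid false alarms.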

  7. Distributed Detection via Bayesian Updates and Consensus

    CERN Document Server

    Liu, Qipeng; Wang, Xiaofan

    2014-01-01

    In this paper, we discuss a class of distributed detection algorithms which can be viewed as implementations of Bayes' law in distributed settings. Some of the algorithms were proposed in the recent literature, and others are first developed in this paper. The common feature of these algorithms is that they all combine (i) certain kinds of consensus protocols with (ii) Bayesian updates. They differ mainly in the type of consensus protocol and the order of the two operations. After discussing their similarities and differences, we compare these distributed algorithms by numerical examples. We focus on the rate at which these algorithms detect the underlying true state of an object. We find that (a) algorithms with consensus via geometric averaging are more efficient than those via arithmetic averaging; (b) the order of consensus aggregation and Bayesian update does not appreciably influence the performance of the algorithms; (c) the existence of communication delay dramatically slows do...
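
The two consensus flavors compared in finding (a) can be illustrated directly (hypothetical beliefs over two states; one averaging round, no communication graph):

```python
import math

# One round of belief averaging over two states (hypothetical numbers):
# arithmetic vs geometric (log-linear) consensus.

beliefs = [[0.9, 0.1], [0.6, 0.4], [0.2, 0.8]]   # three agents

def arithmetic(bs):
    n = len(bs)
    return [sum(b[i] for b in bs) / n for i in range(2)]

def geometric(bs):
    n = len(bs)
    g = [math.exp(sum(math.log(b[i]) for b in bs) / n) for i in range(2)]
    z = sum(g)                                    # renormalize
    return [x / z for x in g]

print("arithmetic:", [round(x, 3) for x in arithmetic(beliefs)])
print("geometric: ", [round(x, 3) for x in geometric(beliefs)])
```

Geometric averaging multiplies beliefs before renormalizing, so it behaves like pooling independent likelihoods, which is one intuition for why it detects the true state faster in the paper's experiments.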

  8. Inference in hybrid Bayesian networks

    DEFF Research Database (Denmark)

    Langseth, Helge; Nielsen, Thomas Dyhre; Rumí, Rafael;

    2009-01-01

    Since the 1980s, Bayesian Networks (BNs) have become increasingly popular for building statistical models of complex systems. This is particularly true for boolean systems, where BNs often prove to be a more efficient modelling framework than traditional reliability techniques (like fault trees...). ... decade's research on inference in hybrid Bayesian networks. The discussions are linked to an example model for estimating human reliability.

  9. Bayesian Inference on Gravitational Waves

    Directory of Open Access Journals (Sweden)

    Asad Ali

    2015-12-01

    Full Text Available The Bayesian approach is increasingly becoming popular among the astrophysics data analysis communities. However, the Pakistan statistics communities are unaware of this fertile interaction between the two disciplines. Bayesian methods have been in use to address astronomical problems since the very birth of Bayes probability in the eighteenth century. Today the Bayesian methods for the detection and parameter estimation of gravitational waves have solid theoretical grounds with a strong promise for realistic applications. This article aims to introduce the Pakistan statistics communities to the applications of Bayesian Monte Carlo methods in the analysis of gravitational wave data, with an overview of the Bayesian signal detection and estimation methods and a demonstration with a couple of simplified examples.

  10. Application of Bayesian Hierarchical Prior Modeling to Sparse Channel Estimation

    DEFF Research Database (Denmark)

    Pedersen, Niels Lovmand; Manchón, Carles Navarro; Shutin, Dmitriy;

    2012-01-01

    Existing methods for sparse channel estimation typically provide an estimate computed as the solution maximizing an objective function defined as the sum of the log-likelihood function and a penalization term proportional to the l1-norm of the parameter of interest. However, other penalization terms have proven to have strong sparsity-inducing properties. In this work, we design pilot-assisted channel estimators for OFDM wireless receivers within the framework of sparse Bayesian learning by defining hierarchical Bayesian prior models that lead to sparsity-inducing penalization terms...

  11. Implementing Bayesian Vector Autoregressions

    Directory of Open Access Journals (Sweden)

    Richard M. Todd

    1988-03-01

    Full Text Available This paper discusses how the Bayesian approach can be used to construct a type of multivariate forecasting model known as a Bayesian vector autoregression (BVAR). In doing so, we mainly explain Doan, Litterman, and Sims' (1984) propositions on how to estimate a BVAR based on a certain family of prior probability distributions, indexed by a fairly small set of hyperparameters. There is also a discussion on how to specify a BVAR and set up a BVAR database. A 4-variable model is used to illustrate the BVAR approach.
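
The shrinkage idea behind the prior family can be sketched in the simplest case (one equation with one own lag; centering the own-lag coefficient on 1 mirrors the random-walk centering of the Minnesota prior; all hyperparameter values are illustrative):

```python
import random

# One-equation, one-lag sketch of BVAR-style shrinkage: the posterior mean
# of the own-lag coefficient is a precision-weighted average of the prior
# mean (1, a random walk) and the OLS estimate. Hyperparameters are
# illustrative, not the Doan-Litterman-Sims values.

random.seed(2)
phi_true = 0.8
y = [0.0]
for _ in range(100):                       # simulate an AR(1) series
    y.append(phi_true * y[-1] + random.gauss(0, 1))

x, z = y[:-1], y[1:]
sxx = sum(a * a for a in x)
sxy = sum(a * b for a, b in zip(x, z))
ols = sxy / sxx

prior_mean, prior_var, noise_var = 1.0, 0.2 ** 2, 1.0
post_prec = 1 / prior_var + sxx / noise_var
post_mean = (prior_mean / prior_var + sxy / noise_var) / post_prec

print(f"OLS estimate:   {ols:.3f}")
print(f"posterior mean: {post_mean:.3f}")
```

With plenty of data the posterior mean sits close to OLS; with little data it is pulled toward the random-walk prior, which is how a BVAR tames overparameterized multivariate systems.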

  12. Book review: Bayesian analysis for population ecology

    Science.gov (United States)

    Link, William A.

    2011-01-01

    Brian Dennis described the field of ecology as “fertile, uncolonized ground for Bayesian ideas.” He continued: “The Bayesian propagule has arrived at the shore. Ecologists need to think long and hard about the consequences of a Bayesian ecology. The Bayesian outlook is a successful competitor, but is it a weed? I think so.” (Dennis 2004)

  13. Current trends in Bayesian methodology with applications

    CERN Document Server

    Upadhyay, Satyanshu K; Dey, Dipak K; Loganathan, Appaia

    2015-01-01

    Collecting Bayesian material scattered throughout the literature, Current Trends in Bayesian Methodology with Applications examines the latest methodological and applied aspects of Bayesian statistics. The book covers biostatistics, econometrics, reliability and risk analysis, spatial statistics, image analysis, shape analysis, Bayesian computation, clustering, uncertainty assessment, high-energy astrophysics, neural networking, fuzzy information, objective Bayesian methodologies, empirical Bayes methods, small area estimation, and many more topics.Each chapter is self-contained and focuses on

  14. Irregular-Time Bayesian Networks

    CERN Document Server

    Ramati, Michael

    2012-01-01

    In many fields observations are performed irregularly along time, due to either measurement limitations or lack of a constant immanent rate. While discrete-time Markov models (as Dynamic Bayesian Networks) introduce either inefficient computation or an information loss to reasoning about such processes, continuous-time Markov models assume either a discrete state space (as Continuous-Time Bayesian Networks), or a flat continuous state space (as stochastic differential equations). To address these problems, we present a new modeling class called Irregular-Time Bayesian Networks (ITBNs), generalizing Dynamic Bayesian Networks, allowing substantially more compact representations, and increasing the expressivity of the temporal dynamics. In addition, a globally optimal solution is guaranteed when learning temporal systems, provided that they are fully observed at the same irregularly spaced time-points, and a semiparametric subclass of ITBNs is introduced to allow further adaptation to the irregular nature of t...

  15. Neuroanatomy, neurology and Bayesian networks

    OpenAIRE

    Bielza Lozoya, Maria Concepcion

    2014-01-01

    Bayesian networks are data mining models with clear semantics and a sound theoretical foundation. In this keynote talk we will pinpoint a number of neuroscience problems that can be addressed using Bayesian networks. In neuroanatomy, we will show computer simulation models of dendritic trees and classification of neuron types, both based on morphological features. In neurology, we will present the search for genetic biomarkers in Alzheimer's disease and the prediction of health-related qualit...

  16. Stepping up Accounts Receivable Management to Prevent Operational Risks

    Institute of Scientific and Technical Information of China (English)

    余志勇

    2012-01-01

    To ensure the safety of funds and prevent the loss of assets, accounts receivable must be managed carefully at three stages. In the early-warning stage, management should pay close attention, responsibilities should be clearly divided, and institutional controls should be complete. In the in-process control stage, firms should deliver goods within credit limits, apply credit-rating evaluation, and strengthen post-sale risk confirmation. In the after-the-fact control stage, managers should carefully verify creditors' rights, strengthen the reconciliation of receivable accounts, intensify the collection of bad debts to minimize losses, and check acceptance bills to ensure the safety of funds.

  17. Bayesian tomographic reconstruction of microsystems

    Science.gov (United States)

    Salem, Sofia Fekih; Vabre, Alexandre; Mohammad-Djafari, Ali

    2007-11-01

    Microtomography by X-ray transmission plays an increasingly dominant role in the study and understanding of microsystems. Within this framework, an experimental setup for high-resolution X-ray microtomography was developed at CEA-List to quantify the physical parameters related to fluid flow in microsystems. Several difficulties arise from the nature of the experimental data collected on this setup: increased measurement errors due to various physical phenomena occurring during image formation (diffusion, beam hardening), and specificities of the setup (limited angle, partial view of the object, weak contrast). To reconstruct the object we must solve an inverse problem. This inverse problem is known to be ill-posed. It therefore needs to be regularized by introducing prior information. The main prior information we account for is that the object is composed of a finite known number of different materials distributed in compact regions. This a priori information is introduced via a Gauss-Markov field for the contrast distributions with a hidden Potts-Markov field for the material classes in the Bayesian estimation framework. The computations are done by using an appropriate Markov Chain Monte Carlo (MCMC) technique. In this paper, we first present the basic steps of the proposed algorithms. Then we focus on one of the main steps in any iterative reconstruction method, which is the computation of the forward and adjoint operators (projection and backprojection). A fast implementation of these two operators is crucial for the real application of the method. We give some details on the fast computation of these steps and show some preliminary results of simulations.

  18. Bayesian Diagnostic Network: A Powerful Model for Representation and Reasoning of Engineering Diagnostic Knowledge

    Institute of Scientific and Technical Information of China (English)

    HU Zhao-yong

    2005-01-01

    Engineering diagnosis is essential to the operation of industrial equipment. The key to successful diagnosis is correct knowledge representation and reasoning, and the Bayesian network is a powerful tool for both. This paper utilizes the Bayesian network to represent and reason about diagnostic knowledge, in what is named a Bayesian diagnostic network. It provides a three-layer topological structure based on operating conditions, possible faults and corresponding symptoms. The paper also discusses an approximate stochastic sampling algorithm. A practical Bayesian network for gas turbine diagnosis is then constructed on a platform developed under a Visual C++ environment. This shows that the Bayesian network is a powerful model for the representation of, and reasoning about, diagnostic knowledge. The three-layer structure and the approximate algorithm are also effective.

  19. Bayesian nonparametric adaptive control using Gaussian processes.

    Science.gov (United States)

    Chowdhary, Girish; Kingravi, Hassan A; How, Jonathan P; Vela, Patricio A

    2015-03-01

    Most current model reference adaptive control (MRAC) methods rely on parametric adaptive elements, in which the number of parameters of the adaptive element is fixed a priori, often through expert judgment. An example of such an adaptive element is the radial basis function network (RBFN), with RBF centers preallocated based on the expected operating domain. If the system operates outside of the expected operating domain, this adaptive element can become ineffective in capturing and canceling the uncertainty, thus rendering the adaptive controller only semiglobal in nature. This paper investigates a Gaussian process-based Bayesian MRAC architecture (GP-MRAC), which leverages the power and flexibility of GP Bayesian nonparametric models of uncertainty. GP-MRAC does not require the centers to be preallocated, can inherently handle measurement noise, and enables MRAC to handle a broader set of uncertainties, including those that are defined as distributions over functions. We use stochastic stability arguments to show that GP-MRAC guarantees good closed-loop performance with no prior domain knowledge of the uncertainty. Online-implementable GP inference methods are compared in numerical simulations against RBFN-MRAC with preallocated centers and are shown to provide better tracking and improved long-term learning.

  20. Bayesian Interpretations of Heteroskedastic Consistent Covariance Estimators Using the Informed Bayesian Bootstrap

    OpenAIRE

    Dale Poirier

    2008-01-01

    This paper provides Bayesian rationalizations for White’s heteroskedastic consistent (HC) covariance estimator and various modifications of it. An informed Bayesian bootstrap provides the statistical framework.

  1. Dynamic Batch Bayesian Optimization

    CERN Document Server

    Azimi, Javad; Fern, Xiaoli

    2011-01-01

    Bayesian optimization (BO) algorithms try to optimize an unknown function that is expensive to evaluate using a minimum number of evaluations/experiments. Most of the proposed algorithms in BO are sequential, where only one experiment is selected at each iteration. This can be time-inefficient when each experiment takes a long time and more than one experiment can be run concurrently. On the other hand, requesting a fixed-size batch of experiments at each iteration causes performance inefficiency in BO compared to sequential policies. In this paper, we present an algorithm that requests a batch of experiments at each time step t, where the batch size p_t is dynamically determined at each step. Our algorithm is based on the observation that the sequence of experiments selected by the sequential policy can sometimes be almost independent of each other. Our algorithm identifies such scenarios and requests those experiments at the same time without degrading the performance. We evaluate our proposed method us...

  2. Nonparametric Bayesian Classification

    CERN Document Server

    Coram, M A

    2002-01-01

    A Bayesian approach to the classification problem is proposed in which random partitions play a central role. It is argued that the partitioning approach has the capacity to take advantage of a variety of large-scale spatial structures, if they are present in the unknown regression function $f_0$. An idealized one-dimensional problem is considered in detail. The proposed nonparametric prior uses random split points to partition the unit interval into a random number of pieces. This prior is found to provide a consistent estimate of the regression function in the $L^p$ topology, for any $1 \leq p < \infty$, and for arbitrary measurable $f_0:[0,1] \rightarrow [0,1]$. A Markov chain Monte Carlo (MCMC) implementation is outlined and analyzed. Simulation experiments are conducted to show that the proposed estimate compares favorably with a variety of conventional estimators. A striking resemblance between the posterior mean estimate and the bagged CART estimate is noted and discussed. For higher dimensions, a ...

  3. Advanced solar thermal receiver technology

    Science.gov (United States)

    Kudirka, A. A.; Leibowitz, L. P.

    1980-01-01

    Development of advanced receiver technology for solar thermal receivers designed for electric power generation or for industrial applications, such as fuels and chemical production or industrial process heat, is described. The development of this technology is focused on receivers that operate from 1000°F to 3000°F and above. Development strategy is mapped in terms of application requirements, and the related system and technical requirements. Receiver performance requirements and current development efforts are covered for five classes of receiver applications: high temperature, advanced Brayton, Stirling, and Rankine cycle engines, and fuels and chemicals.

  4. Bayesian quantum frequency estimation in presence of collective dephasing

    International Nuclear Information System (INIS)

    We advocate a Bayesian approach to optimal quantum frequency estimation—an important issue for future quantum enhanced atomic clock operation. The approach provides a clear insight into the interplay between decoherence and the extent of prior knowledge in determining the optimal interrogation times and optimal estimation strategies. We propose a general framework capable of describing local oscillator noise as well as additional collective atomic dephasing effects. For a Gaussian noise, the average Bayesian cost can be expressed using the quantum Fisher information. Thus we establish a direct link between the two, often competing, approaches to quantum estimation theory. (paper)

  5. Bayesian Discovery of Linear Acyclic Causal Models

    CERN Document Server

    Hoyer, Patrik O

    2012-01-01

    Methods for automated discovery of causal relationships from non-interventional data have received much attention recently. A widely used and well understood model family is given by linear acyclic causal models (recursive structural equation models). For Gaussian data both constraint-based methods (Spirtes et al., 1993; Pearl, 2000) (which output a single equivalence class) and Bayesian score-based methods (Geiger and Heckerman, 1994) (which assign relative scores to the equivalence classes) are available. On the contrary, all current methods able to utilize non-Gaussianity in the data (Shimizu et al., 2006; Hoyer et al., 2008) always return only a single graph or a single equivalence class, and so are fundamentally unable to express the degree of certainty attached to that output. In this paper we develop a Bayesian score-based approach able to take advantage of non-Gaussianity when estimating linear acyclic causal models, and we empirically demonstrate that, at least on very modest size networks, its accur...

  6. Bayesian seismic AVO inversion

    Energy Technology Data Exchange (ETDEWEB)

    Buland, Arild

    2002-07-01

    A new linearized AVO inversion technique is developed in a Bayesian framework. The objective is to obtain posterior distributions for P-wave velocity, S-wave velocity and density. Distributions for other elastic parameters can also be assessed, for example acoustic impedance, shear impedance and P-wave to S-wave velocity ratio. The inversion algorithm is based on the convolutional model and a linearized weak contrast approximation of the Zoeppritz equation. The solution is represented by a Gaussian posterior distribution with explicit expressions for the posterior expectation and covariance, hence exact prediction intervals for the inverted parameters can be computed under the specified model. The explicit analytical form of the posterior distribution provides a computationally fast inversion method. Tests on synthetic data show that all inverted parameters were almost perfectly retrieved when the noise approached zero. With realistic noise levels, acoustic impedance was the best determined parameter, while the inversion provided practically no information about the density. The inversion algorithm has also been tested on a real 3-D dataset from the Sleipner Field. The results show good agreement with well logs but the uncertainty is high. The stochastic model includes uncertainties of both the elastic parameters, the wavelet and the seismic and well log data. The posterior distribution is explored by Markov chain Monte Carlo simulation using the Gibbs sampler algorithm. The inversion algorithm has been tested on a seismic line from the Heidrun Field with two wells located on the line. The uncertainty of the estimated wavelet is low. In the Heidrun examples the effect of including uncertainty of the wavelet and the noise level was marginal with respect to the AVO inversion results. We have developed a 3-D linearized AVO inversion method with spatially coupled model parameters where the objective is to obtain posterior distributions for P-wave velocity, S
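
The key analytical property above, an explicit Gaussian posterior for a linearized forward model with Gaussian prior and noise, can be illustrated with a two-parameter toy (the matrices here are illustrative, not the AVO operator):

```python
# Linear-Gaussian toy: with d = G m + e, prior m ~ N(0, Cm) and noise
# e ~ N(0, Ce), the posterior is Gaussian with an explicit mean
#   m_post = Cm G^T (G Cm G^T + Ce)^{-1} d,
# so no iterative optimization is needed. Matrices are illustrative.

def mat_vec(A, v):
    return [sum(a * b for a, b in zip(row, v)) for row in A]

def inv2(A):
    (a, b), (c, d) = A
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

G = [[1.0, 0.5], [0.2, 1.0]]          # toy linearized forward operator
Ce = [[0.1, 0.0], [0.0, 0.1]]         # noise covariance
d = [1.2, 0.9]                        # observed data; prior: m ~ N(0, I)

# S = G Cm G^T + Ce, with Cm = I here
S = [[sum(G[i][k] * G[j][k] for k in range(2)) + Ce[i][j] for j in range(2)]
     for i in range(2)]
Sinv = inv2(S)
# posterior mean = Cm G^T S^{-1} d = G^T S^{-1} d with Cm = I
GT = [[G[0][0], G[1][0]], [G[0][1], G[1][1]]]
m_post = mat_vec(GT, mat_vec(Sinv, d))
print("posterior mean:", [round(x, 3) for x in m_post])
```

Because the posterior mean and covariance are closed-form, exact prediction intervals follow directly, which is what makes this style of inversion computationally fast compared to sampling-based alternatives.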

  7. RFI receiver. [deep space network

    Science.gov (United States)

    Lay, R.

    1980-01-01

    An S-band radio frequency interference (RFI) receiver to analyze and identify sources of RFI problems in Deep Space Network (DSN) tracking stations is described. The RFI receiver is a constant-gain, double-conversion, open-loop receiver with dual sine/cosine channel outputs, providing a total of 20 MHz monitoring capability. This receiver is computer controlled using a MODCOMP II miniprocessor. The RFI receiver has been designed to operate at a 150 Kelvin system noise temperature, accomplished by cascading two low-noise field-effect transistor (FET) amplifiers for the receiver front end. The first-stage low-noise FET amplifier is mounted at the feed horn to minimize cable losses and achieve a lower system noise temperature. The receiver is tunable over the frequency range of 2150 to 2450 MHz in both sine/cosine output channels with a resolution of 100 kHz.

  8. Olympus beacon receiver

    Science.gov (United States)

    Ostergaard, Jens

    1988-01-01

    A medium-size Beacon Receiving System for reception and processing of the B1 (20 GHz) and B2 (30 GHz) beacons from Olympus has been developed. Integration of the B1 and B2 receiving equipment into one system, using one antenna and a common computer for control and data processing, provides the advantages of a compact configuration and synchronization of the two receiver chains. The range for co-polar signal attenuation measurement is about 30 dB for both beacons, increasing to 40 dB for B2 if the receivers are synchronized to B1. The accuracy is better than 0.5 dB. Cross-polarization discriminations of the order of 10 to 30 dB may be determined with an accuracy of 1 to 2 dB. A number of radiometers for complementary measurements of atmospheric attenuation at 13 to 30 GHz has also been constructed. A small multi-frequency system for operation around 22 GHz and 31 GHz is presently under development.

  9. Attention in a bayesian framework

    DEFF Research Database (Denmark)

    Whiteley, Louise Emma; Sahani, Maneesh

    2012-01-01

    The behavioral phenomena of sensory attention are thought to reflect the allocation of a limited processing resource, but there is little consensus on the nature of the resource or why it should be limited. Here we argue that a fundamental bottleneck emerges naturally within Bayesian models...... of perception, and use this observation to frame a new computational account of the need for, and action of, attention - unifying diverse attentional phenomena in a way that goes beyond previous inferential, probabilistic and Bayesian models. Attentional effects are most evident in cluttered environments......, and include both selective phenomena, where attention is invoked by cues that point to particular stimuli, and integrative phenomena, where attention is invoked dynamically by endogenous processing. However, most previous Bayesian accounts of attention have focused on describing relatively simple experimental...

  10. Probability biases as Bayesian inference

    Directory of Open Access Journals (Sweden)

    Andre; C. R. Martins

    2006-11-01

    Full Text Available In this article, I will show how several observed biases in human probabilistic reasoning can be partially explained as good heuristics for making inferences in an environment where probabilities have uncertainties associated with them. Previous results show that the weight functions and the observed violations of coalescing and stochastic dominance can be understood from a Bayesian point of view. We will review those results and see that Bayesian methods should also be used as part of the explanation behind other known biases. That means that, although the observed errors are still errors, they can be understood as adaptations to the solution of real-life problems. Heuristics that allow fast evaluations and mimic a Bayesian inference would be an evolutionary advantage, since they would give us an efficient way of making decisions. In that sense, it should be no surprise that humans reason with probability as it has been observed.

  11. Bayesian multivariate mixed-scale density estimation

    CERN Document Server

    Canale, Antonio

    2011-01-01

    Although univariate continuous density estimation has received abundant attention in the Bayesian nonparametrics literature, there is essentially no theory on multivariate mixed scale density estimation. In this article, we consider a general framework to jointly model continuous, count and categorical variables under a nonparametric prior, which is induced through rounding latent variables having an unknown density with respect to Lebesgue measure. For the proposed class of priors, we provide sufficient conditions for large support, strong consistency and rates of posterior contraction. These conditions, which primarily relate to the prior on the latent variable density and heaviness of the tails for the observed continuous variables, allow one to convert sufficient conditions obtained in the setting of multivariate continuous density estimation to the mixed scale case. We provide new results in the multivariate continuous density estimation case, showing the Kullback-Leibler property and strong consistency...
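
    The rounding construction described above can be illustrated with a toy sketch (the latent density and its parameters below are assumed, not from the article): a latent variable with a continuous density with respect to Lebesgue measure is rounded, inducing a distribution on a count-valued observable.

```python
import numpy as np

# Toy rounding construction (assumed latent density): latent Gaussian draws
# are floored and clipped at zero to induce a distribution on counts.
rng = np.random.default_rng(5)

latent = rng.normal(1.5, 1.0, size=100_000)            # latent continuous draws
counts = np.maximum(np.floor(latent), 0).astype(int)   # rounded to counts
print(counts.min(), counts.max(), counts.mean())
```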

  12. Bayesian modeling using WinBUGS

    CERN Document Server

    Ntzoufras, Ioannis

    2009-01-01

    A hands-on introduction to the principles of Bayesian modeling using WinBUGS Bayesian Modeling Using WinBUGS provides an easily accessible introduction to the use of WinBUGS programming techniques in a variety of Bayesian modeling settings. The author provides an accessible treatment of the topic, offering readers a smooth introduction to the principles of Bayesian modeling with detailed guidance on the practical implementation of key principles. The book begins with a basic introduction to Bayesian inference and the WinBUGS software and goes on to cover key topics, including: Markov Chain Monte Carlo algorithms in Bayesian inference Generalized linear models Bayesian hierarchical models Predictive distribution and model checking Bayesian model and variable evaluation Computational notes and screen captures illustrate the use of both WinBUGS and R software to apply the discussed techniques. Exercises at the end of each chapter allow readers to test their understanding of the presented concepts and all ...

  13. A Bayesian Game-Theoretic Approach for Distributed Resource Allocation in Fading Multiple Access Channels

    Directory of Open Access Journals (Sweden)

    Gaoning He

    2010-01-01

    Full Text Available A Bayesian game-theoretic model is developed to design and analyze the resource allocation problem in K-user fading multiple access channels (MACs), where the users are assumed to selfishly maximize their average achievable rates with incomplete information about the fading channel gains. In such a game-theoretic study, the central question is whether a Bayesian equilibrium exists, and if so, whether the network operates efficiently at the equilibrium point. We prove that there exists exactly one Bayesian equilibrium in our game. Furthermore, we study the network sum-rate maximization problem by assuming that the users coordinate according to a symmetric strategy profile. This result also serves as an upper bound for the Bayesian equilibrium. Finally, simulation results are provided to show the network efficiency at the unique Bayesian equilibrium and to compare it with other strategies.

  14. Bayesian test and Kuhn's paradigm

    Institute of Scientific and Technical Information of China (English)

    Chen Xiaoping

    2006-01-01

    Kuhn's theory of paradigm reveals a pattern of scientific progress in which normal science alternates with scientific revolution. But Kuhn greatly underrated the function of scientific testing in this pattern, because he focused all his attention on the hypothetico-deductive schema instead of the Bayesian schema. This paper employs the Bayesian schema to re-examine Kuhn's theory of paradigm, to uncover its logical and rational components, and to illustrate the tensional structure of logic and belief, rationality and irrationality, in the process of scientific revolution.

  15. Perception, illusions and Bayesian inference.

    Science.gov (United States)

    Nour, Matthew M; Nour, Joseph M

    2015-01-01

    Descriptive psychopathology makes a distinction between veridical perception and illusory perception. In both cases a perception is tied to a sensory stimulus, but in illusions the perception is of a false object. This article re-examines this distinction in light of new work in theoretical and computational neurobiology, which views all perception as a form of Bayesian statistical inference that combines sensory signals with prior expectations. Bayesian perceptual inference can solve the 'inverse optics' problem of veridical perception and provides a biologically plausible account of a number of illusory phenomena, suggesting that veridical and illusory perceptions are generated by precisely the same inferential mechanisms.
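
    The combination of sensory signals with prior expectations can be sketched in the simplest Gaussian case (this example is not from the article; the numbers are assumed): the percept is a precision-weighted compromise between the prior and the sensory evidence, and a strong prior paired with a noisy signal pulls the percept away from the stimulus, one way an illusion can arise from correct inference.

```python
# Minimal Gaussian cue-combination sketch (assumed values): the posterior mean
# is a precision-weighted average of prior expectation and sensory signal.
def percept(prior_mean, prior_var, signal, signal_var):
    w = (1.0 / signal_var) / (1.0 / signal_var + 1.0 / prior_var)
    post_mean = w * signal + (1.0 - w) * prior_mean
    post_var = 1.0 / (1.0 / signal_var + 1.0 / prior_var)
    return post_mean, post_var

# Strong prior + noisy signal -> percept pulled toward the prior.
mean, var = percept(prior_mean=0.0, prior_var=1.0, signal=2.0, signal_var=4.0)
print(mean, var)  # -> 0.4 0.8
```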

  16. 3D Bayesian contextual classifiers

    DEFF Research Database (Denmark)

    Larsen, Rasmus

    2000-01-01

    We extend a series of multivariate Bayesian 2-D contextual classifiers to 3-D by specifying a simultaneous Gaussian distribution for the feature vectors as well as a prior distribution of the class variables of a pixel and its 6 nearest 3-D neighbours.

  17. Bayesian methods for proteomic biomarker development

    Directory of Open Access Journals (Sweden)

    Belinda Hernández

    2015-12-01

    In this review we provide an introduction to Bayesian inference and demonstrate some of the advantages of using a Bayesian framework. We summarize how Bayesian methods have been used previously in proteomics and other areas of bioinformatics. Finally, we describe some popular and emerging Bayesian models from the statistical literature and provide a worked tutorial including code snippets to show how these methods may be applied for the evaluation of proteomic biomarkers.
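
    As a flavor of the kind of worked example such a tutorial might contain (this one is not from the review; the counts and the flat Beta prior are assumed), a biomarker's sensitivity can be evaluated with a conjugate Beta-Binomial model:

```python
import random

# Beta(1, 1) prior on sensitivity, updated with assumed counts of diseased
# samples the marker did / did not flag; the posterior is Beta(1+tp, 1+fn).
random.seed(4)
tp, fn = 42, 8                   # assumed true positives / false negatives
alpha, beta = 1 + tp, 1 + fn     # Beta posterior parameters

# Monte Carlo summary of the posterior:
draws = [random.betavariate(alpha, beta) for _ in range(100_000)]
post_mean = sum(draws) / len(draws)
p_above_0_7 = sum(d > 0.7 for d in draws) / len(draws)
print(post_mean, p_above_0_7)
```

    The posterior mean here is (1+tp)/(2+tp+fn) = 43/52, and the Monte Carlo draws additionally give the probability that sensitivity exceeds any clinically relevant threshold.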

  18. Bayesian variable order Markov models: Towards Bayesian predictive state representations

    NARCIS (Netherlands)

    C. Dimitrakakis

    2009-01-01

    We present a Bayesian variable order Markov model that shares many similarities with predictive state representations. The resulting models are compact and much easier to specify and learn than classical predictive state representations. Moreover, we show that they significantly outperform a more st

  19. Bayesian networks and food security - An introduction

    NARCIS (Netherlands)

    Stein, A.

    2004-01-01

    This paper gives an introduction to Bayesian networks. Networks are defined and put into a Bayesian context. Directed acyclical graphs play a crucial role here. Two simple examples from food security are addressed. Possible uses of Bayesian networks for implementation and further use in decision sup

  20. Bayesian Model Averaging for Propensity Score Analysis

    Science.gov (United States)

    Kaplan, David; Chen, Jianshen

    2013-01-01

    The purpose of this study is to explore Bayesian model averaging in the propensity score context. Previous research on Bayesian propensity score analysis does not take into account model uncertainty. In this regard, an internally consistent Bayesian framework for model building and estimation must also account for model uncertainty. The…

  1. A Bayesian Nonparametric Approach to Test Equating

    Science.gov (United States)

    Karabatsos, George; Walker, Stephen G.

    2009-01-01

    A Bayesian nonparametric model is introduced for score equating. It is applicable to all major equating designs, and has advantages over previous equating models. Unlike the previous models, the Bayesian model accounts for positive dependence between distributions of scores from two tests. The Bayesian model and the previous equating models are…

  2. Bayesian Inference and Online Learning in Poisson Neuronal Networks.

    Science.gov (United States)

    Huang, Yanping; Rao, Rajesh P N

    2016-08-01

    Motivated by the growing evidence for Bayesian computation in the brain, we show how a two-layer recurrent network of Poisson neurons can perform both approximate Bayesian inference and learning for any hidden Markov model. The lower-layer sensory neurons receive noisy measurements of hidden world states. The higher-layer neurons infer a posterior distribution over world states via Bayesian inference from inputs generated by sensory neurons. We demonstrate how such a neuronal network with synaptic plasticity can implement a form of Bayesian inference similar to Monte Carlo methods such as particle filtering. Each spike in a higher-layer neuron represents a sample of a particular hidden world state. The spiking activity across the neural population approximates the posterior distribution over hidden states. In this model, variability in spiking is regarded not as a nuisance but as an integral feature that provides the variability necessary for sampling during inference. We demonstrate how the network can learn the likelihood model, as well as the transition probabilities underlying the dynamics, using a Hebbian learning rule. We present results illustrating the ability of the network to perform inference and learning for arbitrary hidden Markov models.
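
    The particle-filter view of posterior inference in a hidden Markov model can be sketched as follows (a generic two-state HMM with assumed parameters, not the neuronal model itself): each particle plays the role of a spike representing a sample of the hidden state, and the population of particles approximates the posterior.

```python
import numpy as np

# Bootstrap particle filter for a two-state HMM (assumed parameters).
rng = np.random.default_rng(1)

A = np.array([[0.9, 0.1],      # state transition probabilities
              [0.2, 0.8]])
means = np.array([0.0, 3.0])   # observation model: Gaussian around state mean

def step(particles, obs):
    # propagate each particle through the transition model
    particles = np.array([rng.choice(2, p=A[s]) for s in particles])
    # weight by the Gaussian likelihood of the observation
    w = np.exp(-0.5 * (obs - means[particles]) ** 2)
    w /= w.sum()
    # resample: the surviving population approximates the posterior
    return rng.choice(particles, size=len(particles), p=w)

particles = np.zeros(2000, dtype=int)
for obs in [0.1, 2.9, 3.2, 3.1]:
    particles = step(particles, obs)

posterior_state1 = particles.mean()   # fraction of particles in state 1
print(posterior_state1)
```

    After several observations near 3.0, most particles occupy state 1, mirroring how spiking activity across the population would concentrate on the likely world state.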

  3. Electronic warfare receivers and receiving systems

    CERN Document Server

    Poisel, Richard A

    2014-01-01

    Receivers systems are considered the core of electronic warfare (EW) intercept systems. Without them, the fundamental purpose of such systems is null and void. This book considers the major elements that make up receiver systems and the receivers that go in them.This resource provides system design engineers with techniques for design and development of EW receivers for modern modulations (spread spectrum) in addition to receivers for older, common modulation formats. Each major module in these receivers is considered in detail. Design information is included as well as performance tradeoffs o

  4. Bayesian Analysis of High Dimensional Classification

    Science.gov (United States)

    Mukhopadhyay, Subhadeep; Liang, Faming

    2009-12-01

    Modern data mining and bioinformatics present an important playground for statistical learning techniques, where the number of input variables may be much larger than the sample size of the training data. In supervised learning, logistic or probit regression can be used to model a binary output and form perceptron classification rules based on Bayesian inference. In these cases there is considerable interest in finding sparse models in the high-dimensional regression/classification setup. We first discuss two common challenges for analyzing high-dimensional data. The first is the curse of dimensionality: the complexity of many existing algorithms scales exponentially with the dimensionality of the space, so the algorithms soon become computationally intractable and therefore inapplicable in many real applications. The second is multicollinearity among the predictors, which severely slows down the algorithms. To make Bayesian analysis operational in high dimensions we propose a novel Hierarchical Stochastic Approximation Monte Carlo (HSAMC) algorithm, which overcomes the curse of dimensionality and the multicollinearity of predictors, and also possesses a self-adjusting mechanism to avoid local minima separated by high energy barriers. Models and methods are illustrated by simulations inspired by the field of genomics. Numerical results indicate that HSAMC can work as a general model selection sampler in high-dimensional complex model spaces.

  5. Bayesian Classification of Image Structures

    DEFF Research Database (Denmark)

    Goswami, Dibyendu; Kalkan, Sinan; Krüger, Norbert

    2009-01-01

    In this paper, we describe work on Bayesian classifiers for distinguishing between homogeneous structures, textures, edges and junctions. We build semi-local classifiers from hand-labeled images to distinguish between these four different kinds of structures based on the concept of intrinsic dimensi...

  6. Bayesian NL interpretation and learning

    NARCIS (Netherlands)

    H. Zeevat

    2011-01-01

    Everyday natural language communication is normally successful, even though contemporary computational linguistics has shown that NL is characterised by a very high degree of ambiguity and that the results of stochastic methods are not good enough to explain the high success rate. Bayesian natural language

  7. Differentiated Bayesian Conjoint Choice Designs

    NARCIS (Netherlands)

    Z. Sándor (Zsolt); M. Wedel (Michel)

    2003-01-01

    textabstractPrevious conjoint choice design construction procedures have produced a single design that is administered to all subjects. This paper proposes to construct a limited set of different designs. The designs are constructed in a Bayesian fashion, taking into account prior uncertainty about

  8. Bayesian inference for Hawkes processes

    DEFF Research Database (Denmark)

    Rasmussen, Jakob Gulddahl

    The Hawkes process is a practically and theoretically important class of point processes, but parameter-estimation for such a process can pose various problems. In this paper we explore and compare two approaches to Bayesian inference. The first approach is based on the so-called conditional...

  9. Bayesian stable isotope mixing models

    Science.gov (United States)

    In this paper we review recent advances in Stable Isotope Mixing Models (SIMMs) and place them into an over-arching Bayesian statistical framework which allows for several useful extensions. SIMMs are used to quantify the proportional contributions of various sources to a mixtur...
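
    The core of a SIMM is the constraint that a mixture's isotopic value is a proportion-weighted average of its sources. A minimal two-source sketch with a uniform prior on the proportion (all signatures and noise levels below are assumed, and a full SIMM is far richer than this) can be computed on a grid:

```python
import numpy as np

# Two-source mixing sketch (assumed values): mixture = p*s1 + (1-p)*s2 + noise,
# uniform prior on p, posterior evaluated on a grid.
s1, s2 = -25.0, -12.0           # assumed source isotope signatures (e.g. d13C)
mix_obs, sigma = -20.0, 1.0     # observed mixture value and noise sd

p = np.linspace(0.0, 1.0, 1001)                    # grid over the proportion
pred = p * s1 + (1 - p) * s2                       # predicted mixture value
post = np.exp(-0.5 * ((mix_obs - pred) / sigma) ** 2)
post /= post.sum()                                 # normalized posterior on grid
p_mean = (p * post).sum()
print(p_mean)  # ~0.615, i.e. about 8/13 of the mixture from source 1
```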

  10. Bayesian inference for Hawkes processes

    DEFF Research Database (Denmark)

    Rasmussen, Jakob Gulddahl

    2013-01-01

    The Hawkes process is a practically and theoretically important class of point processes, but parameter-estimation for such a process can pose various problems. In this paper we explore and compare two approaches to Bayesian inference. The first approach is based on the so-called conditional...

  11. 3-D contextual Bayesian classifiers

    DEFF Research Database (Denmark)

    Larsen, Rasmus

    In this paper we will consider extensions of a series of Bayesian 2-D contextual classification procedures proposed by Owen (1984), Hjort & Mohn (1984), Welch & Salter (1971) and Haslett (1985) to 3 spatial dimensions. It is evident that compared to classical pixelwise classification further...

  12. Bayesian image restoration, using configurations

    DEFF Research Database (Denmark)

    Thorarinsdottir, Thordis

    configurations are expressed in terms of the mean normal measure of the random set. These probabilities are used as prior probabilities in a Bayesian image restoration approach. Estimation of the remaining parameters in the model is outlined for salt and pepper noise. The inference in the model is discussed...

  13. Bayesian image restoration, using configurations

    DEFF Research Database (Denmark)

    Thorarinsdottir, Thordis Linda

    2006-01-01

    configurations are expressed in terms of the mean normal measure of the random set. These probabilities are used as prior probabilities in a Bayesian image restoration approach. Estimation of the remaining parameters in the model is outlined for the salt and pepper noise. The inference in the model is discussed...

  14. Bayesian Analysis of Experimental Data

    Directory of Open Access Journals (Sweden)

    Lalmohan Bhar

    2013-10-01

    Full Text Available Analysis of experimental data from a Bayesian point of view has been considered. Appropriate methodology has been developed for application to designed experiments. A Normal-Gamma distribution has been used as the prior distribution. The developed methodology has been applied to real experimental data taken from long-term fertilizer experiments.
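
    The Normal-Gamma prior is conjugate for a normal model with unknown mean and precision, so the update has a closed form. A sketch with assumed hyperparameters and made-up yield data (not from the article) follows:

```python
# Conjugate Normal-Gamma update: prior NG(mu0, kappa0, alpha0, beta0) on the
# (mean, precision) of a normal model, updated with sample statistics.
def normal_gamma_update(mu0, kappa0, alpha0, beta0, data):
    n = len(data)
    xbar = sum(data) / n
    ss = sum((x - xbar) ** 2 for x in data)
    kappa_n = kappa0 + n
    mu_n = (kappa0 * mu0 + n * xbar) / kappa_n
    alpha_n = alpha0 + n / 2
    beta_n = beta0 + 0.5 * ss + kappa0 * n * (xbar - mu0) ** 2 / (2 * kappa_n)
    return mu_n, kappa_n, alpha_n, beta_n

# Assumed yields from one treatment of a fertilizer trial:
yields = [4.1, 3.8, 4.5, 4.0, 4.2]
mu_n, kappa_n, alpha_n, beta_n = normal_gamma_update(0.0, 0.01, 1.0, 1.0, yields)
print(mu_n)  # posterior mean, dominated by the data under this weak prior
```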

  15. Topics in Bayesian statistics and maximum entropy

    International Nuclear Information System (INIS)

    Notions of Bayesian decision theory and maximum entropy methods are reviewed with particular emphasis on probabilistic inference and Bayesian modeling. The axiomatic approach is considered as the best justification of Bayesian analysis and maximum entropy principle applied in natural sciences. Particular emphasis is put on solving the inverse problem in digital image restoration and Bayesian modeling of neural networks. Further topics addressed briefly include language modeling, neutron scattering, multiuser detection and channel equalization in digital communications, genetic information, and Bayesian court decision-making. (author)

  16. Bayesian analysis of rare events

    Science.gov (United States)

    Straub, Daniel; Papaioannou, Iason; Betz, Wolfgang

    2016-06-01

    In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
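
    The rejection-sampling reinterpretation at the heart of BUS can be sketched in its most stripped-down form (the model and data below are assumed; BUS itself pairs this idea with FORM, IS, or SuS rather than brute-force sampling): draw from the prior, accept each sample with probability proportional to its likelihood, and the accepted samples are distributed as the posterior.

```python
import numpy as np

# Brute-force rejection-sampling Bayesian update (assumed toy model).
rng = np.random.default_rng(2)

prior = rng.normal(0.0, 1.0, size=200_000)   # prior samples of a parameter theta
obs, sigma = 1.5, 0.5                        # one noisy observation of theta

like = np.exp(-0.5 * ((obs - prior) / sigma) ** 2)
accept = rng.uniform(size=prior.size) < like / like.max()
posterior = prior[accept]

print(posterior.mean(), posterior.std())  # ~1.2 and ~0.447 (conjugate result)
```

    The acceptance step is exactly a rare-event probability estimate, which is why established reliability methods can be drawn upon to make it efficient.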

  17. Bayesian methods for measures of agreement

    CERN Document Server

    Broemeling, Lyle D

    2009-01-01

    Using WinBUGS to implement Bayesian inferences of estimation and testing hypotheses, Bayesian Methods for Measures of Agreement presents useful methods for the design and analysis of agreement studies. It focuses on agreement among the various players in the diagnostic process.The author employs a Bayesian approach to provide statistical inferences based on various models of intra- and interrater agreement. He presents many examples that illustrate the Bayesian mode of reasoning and explains elements of a Bayesian application, including prior information, experimental information, the likelihood function, posterior distribution, and predictive distribution. The appendices provide the necessary theoretical foundation to understand Bayesian methods as well as introduce the fundamentals of programming and executing the WinBUGS software.Taking a Bayesian approach to inference, this hands-on book explores numerous measures of agreement, including the Kappa coefficient, the G coefficient, and intraclass correlation...

  18. Plug & Play object oriented Bayesian networks

    DEFF Research Database (Denmark)

    Bangsø, Olav; Flores, J.; Jensen, Finn Verner

    2003-01-01

    Object oriented Bayesian networks have proven themselves useful in recent years. The idea of applying an object oriented approach to Bayesian networks has extended their scope to larger domains that can be divided into autonomous but interrelated entities. Object oriented Bayesian networks have...... been shown to be quite suitable for dynamic domains as well. However, processing object oriented Bayesian networks in practice does not take advantage of their modular structure. Normally the object oriented Bayesian network is transformed into a Bayesian network and, inference is performed...... by constructing a junction tree from this network. In this paper we propose a method for translating directly from object oriented Bayesian networks to junction trees, avoiding the intermediate translation. We pursue two main purposes: firstly, to maintain the original structure organized in an instance tree...

  19. Bayesian Networks as a Decision Tool for O&M of Offshore Wind Turbines

    DEFF Research Database (Denmark)

    Nielsen, Jannie Jessen; Sørensen, John Dalsgaard

    2010-01-01

    Costs of operation and maintenance (O&M) of offshore wind turbines are large. This paper presents how influence diagrams can be used to assist in rational decision making for O&M. An influence diagram is a graphical representation of a decision tree based on Bayesian Networks. Bayesian Networks...... offer efficient Bayesian updating of a damage model when imperfect information from inspections/monitoring is available. The extension to an influence diagram offers the calculation of expected utilities for decision alternatives, and can be used to find the optimal strategy among different alternatives...

  20. Bayesian Ranging for Radio Localization with and without Line-of-Sight Detection

    DEFF Research Database (Denmark)

    Jing, Lishuai; Pedersen, Troels; Fleury, Bernard Henri

    2015-01-01

    We consider Bayesian ranging methods for localization in wireless communication systems. Based on a channel model and given priors for the range and the line-of-sight (LOS) condition, we propose range estimators with and without LOS detection. Since the pdf of the received frequency-domain sign...

  1. A Probability-based Evolutionary Algorithm with Mutations to Learn Bayesian Networks

    Directory of Open Access Journals (Sweden)

    Sho Fukuda

    2014-12-01

    Full Text Available Bayesian networks are regarded as one of the essential tools to analyze causal relationships between events from data. Learning the structure of highly reliable Bayesian networks from data as quickly as possible is one of the important problems that several studies have tried to solve. In recent years, probability-based evolutionary algorithms have been proposed as a new efficient approach to learning Bayesian networks. In this paper, we target one of the probability-based evolutionary algorithms, called PBIL (Probability-Based Incremental Learning), and propose a new mutation operator. Through performance evaluation, we found that the proposed mutation operator performs well in learning Bayesian networks.
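
    The PBIL algorithm class the paper builds on can be sketched on a toy problem (the fitness here is a stand-in bit-counting objective, not the Bayesian-network score used in the paper, and the mutation operator below is a generic perturbation, not the one the paper proposes):

```python
import random

# Toy PBIL with a simple mutation operator on the probability vector.
random.seed(3)
N_BITS, POP, GENS, LR, MUT = 16, 30, 60, 0.1, 0.02

p = [0.5] * N_BITS                          # probability vector over bits
for _ in range(GENS):
    # sample a population from the probability vector
    pop = [[1 if random.random() < pi else 0 for pi in p] for _ in range(POP)]
    best = max(pop, key=sum)                # select the fittest individual
    # shift the probability vector toward the best individual
    p = [(1 - LR) * pi + LR * bi for pi, bi in zip(p, best)]
    # mutation: occasionally perturb entries to keep exploring
    p = [min(1.0, max(0.0, pi + random.uniform(-0.1, 0.1)))
         if random.random() < MUT else pi for pi in p]

solution = [1 if pi > 0.5 else 0 for pi in p]
print(sum(solution))
```

    For structure learning, the bitstring would instead encode the presence or absence of candidate edges, and the fitness would be a network score on the data.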

  2. Flexible Bayesian Nonparametric Priors and Bayesian Computational Methods

    OpenAIRE

    Zhu, Weixuan

    2016-01-01

    The definition of vectors of dependent random probability measures is a topic of interest in Bayesian nonparametrics. They represent dependent nonparametric prior distributions that are useful for modelling observables for which specific covariate values are known. Our first contribution is the introduction of novel multivariate vectors of two-parameter Poisson-Dirichlet process. The dependence is induced by applying a L´evy copula to the marginal L´evy intensities. Our attenti...

  3. Customizable Digital Receivers for Radar

    Science.gov (United States)

    Moller, Delwyn; Heavey, Brandon; Sadowy, Gregory

    2008-01-01

    Compact, highly customizable digital receivers are being developed for the system described in 'Radar Interferometer for Topographic Mapping of Glaciers and Ice Sheets' (NPO-43962), NASA Tech Briefs, Vol. 31, No. 7 (August 2007), page 72. The receivers are required to operate in unison, sampling radar returns received by the antenna elements in a digital beam-forming (DBF) mode. The design of these receivers could also be adapted to commercial radar systems. At the time of reporting the information for this article, there were no commercially available digital receivers capable of satisfying all of the operational requirements and compact enough to be mounted directly on the antenna elements. A provided figure depicts the overall system of which the digital receivers are parts. Each digital receiver includes an analog-to-digital converter (ADC), a demultiplexer (DMUX), and a field-programmable gate array (FPGA). The ADC effects 10-bit band-pass sampling of input signals having frequencies up to 3.5 GHz. The input samples are demultiplexed at a user-selectable rate of 1:2 or 1:4, then buffered in part of the FPGA that functions as a first-in/first-out (FIFO) memory. Another part of the FPGA serves as a controller for the ADC, DMUX, and FIFO memory and as an interface between (1) the rest of the receiver and (2) a front-panel data port (FPDP) bus, which is an industry-standard parallel data bus that has a high data-rate capability and multichannel configuration suitable for DBF. Still other parts of the FPGA in each receiver perform signal-processing functions. The digital receivers can be configured to operate in a stand-alone mode, or in a multichannel mode as needed for DBF. The customizability of the receiver makes it applicable to a broad range of system architectures. The capability for operation of receivers in either a stand-alone or a DBF mode enables the use of the receivers in an unprecedentedly wide variety of radar systems.
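
    The 1:4 demultiplexing step can be illustrated with a trivial sketch (illustrative values only; the real DMUX is hardware in front of the FPGA): high-rate ADC samples are fanned out across four parallel lanes so downstream logic can run at a quarter of the sample clock.

```python
# 1:4 demultiplex of a stand-in ADC sample stream into four parallel lanes.
samples = list(range(16))                    # stand-in for ADC samples
lanes = [samples[i::4] for i in range(4)]    # user-selectable 1:4 demultiplex
print(lanes[0])  # -> [0, 4, 8, 12]
```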

  4. Bayesian versus 'plain-vanilla Bayesian' multitarget statistics

    Science.gov (United States)

    Mahler, Ronald P. S.

    2004-08-01

    Finite-set statistics (FISST) is a direct generalization of single-sensor, single-target Bayes statistics to the multisensor-multitarget realm, based on random set theory. Various aspects of FISST are being investigated by several research teams around the world. In recent years, however, a few partisans have claimed that a "plain-vanilla Bayesian approach" suffices as down-to-earth, "straightforward," and general "first principles" for multitarget problems, and that FISST is therefore mere mathematical "obfuscation." In this and a companion paper I demonstrate the speciousness of these claims. In this paper I summarize general Bayes statistics, what is required to use it in multisensor-multitarget problems, and why FISST is necessary to make it practical. Then I demonstrate that the "plain-vanilla Bayesian approach" is so heedlessly formulated that it is erroneous and not even Bayesian, that it denigrates FISST concepts while unwittingly assuming them, and that it has resulted in a succession of algorithms afflicted by inherent, but less than candidly acknowledged, computational "logjams."

  5. Bayesian phylogeny analysis via stochastic approximation Monte Carlo

    KAUST Repository

    Cheon, Sooyoung

    2009-11-01

    Monte Carlo methods have received much attention in the recent literature on phylogeny analysis. However, conventional Markov chain Monte Carlo algorithms, such as the Metropolis-Hastings algorithm, tend to get trapped in a local mode when simulating from the posterior distribution of phylogenetic trees, rendering the inference ineffective. In this paper, we apply an advanced Monte Carlo algorithm, the stochastic approximation Monte Carlo (SAMC) algorithm, to Bayesian phylogeny analysis. Our method is compared with two popular Bayesian phylogeny software packages, BAMBE and MrBayes, on simulated and real datasets. The numerical results indicate that our method outperforms BAMBE and MrBayes. Among the three methods, SAMC produces the consensus trees with the highest similarity to the true trees and the model parameter estimates with the smallest mean square errors, while requiring the least CPU time. © 2009 Elsevier Inc. All rights reserved.


  6. Exploiting Agent and Type Independence in Collaborative Graphical Bayesian Games

    CERN Document Server

    Oliehoek, Frans A; Spaan, Matthijs T J

    2011-01-01

    Efficient collaborative decision making is an important challenge for multiagent systems. Finding optimal joint actions is especially challenging when each agent has only imperfect information about the state of its environment. Such problems can be modeled as collaborative Bayesian games in which each agent receives private information in the form of its type. However, representing and solving such games requires space and computation time exponential in the number of agents. This article introduces collaborative graphical Bayesian games (CGBGs), which facilitate more efficient collaborative decision making by decomposing the global payoff function as the sum of local payoff functions that depend on only a few agents. We propose a framework for the efficient solution of CGBGs based on the insight that they possess two different types of independence, which we call agent independence and type independence. In particular, we present a factor graph representation that captures both forms of independence and thus...

  7. Bayesian inference of chemical kinetic models from proposed reactions

    KAUST Repository

    Galagali, Nikhil

    2015-02-01

    © 2014 Elsevier Ltd. Bayesian inference provides a natural framework for combining experimental data with prior knowledge to develop chemical kinetic models and quantify the associated uncertainties, not only in parameter values but also in model structure. Most existing applications of Bayesian model selection methods to chemical kinetics have been limited to comparisons among a small set of models, however. The significant computational cost of evaluating posterior model probabilities renders traditional Bayesian methods infeasible when the model space becomes large. We present a new framework for tractable Bayesian model inference and uncertainty quantification using a large number of systematically generated model hypotheses. The approach involves imposing point-mass mixture priors over rate constants and exploring the resulting posterior distribution using an adaptive Markov chain Monte Carlo method. The posterior samples are used to identify plausible models, to quantify rate constant uncertainties, and to extract key diagnostic information about model structure, such as the reactions and operating pathways most strongly supported by the data. We provide numerical demonstrations of the proposed framework by inferring kinetic models for catalytic steam and dry reforming of methane using available experimental data.
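The core idea of scoring a systematically generated model space can be illustrated on a toy problem. The paper samples point-mass mixture posteriors with adaptive MCMC; the sketch below, as a hedged stand-in, enumerates a tiny linear-model space and approximates posterior model probabilities with the BIC, which is not the authors' method but conveys the model-comparison principle.

```python
# Toy illustration of Bayesian model selection over an enumerated model
# space, using a BIC approximation to posterior model probabilities.
# The data-generating setup (three candidate predictors, one active) is
# an illustrative assumption, not the chemical-kinetics application.
import itertools
import numpy as np

rng = np.random.default_rng(0)
n = 200
X = rng.normal(size=(n, 3))                  # three candidate "reactions"
y = 2.0 * X[:, 0] + rng.normal(0.0, 1.0, n)  # only predictor 0 is active

def bic(cols):
    """BIC of a least-squares fit using the given predictor columns."""
    if cols:
        A = np.column_stack([np.ones(n)] + [X[:, j] for j in cols])
    else:
        A = np.ones((n, 1))                  # intercept-only model
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    rss = np.sum((y - A @ beta) ** 2)
    return n * np.log(rss / n) + A.shape[1] * np.log(n)

models = [tuple(c) for r in range(4) for c in itertools.combinations(range(3), r)]
scores = np.array([bic(m) for m in models])
probs = np.exp(-0.5 * (scores - scores.min()))
probs /= probs.sum()                         # approximate posterior model probs
best = models[int(np.argmax(probs))]         # most probable model
```

Summing `probs` over models containing a given predictor yields an approximate posterior inclusion probability for that "reaction".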

  8. Bayesian Grammar Induction for Language Modeling

    CERN Document Server

    Chen, S F

    1995-01-01

    We describe a corpus-based induction algorithm for probabilistic context-free grammars. The algorithm employs a greedy heuristic search within a Bayesian framework, and a post-pass using the Inside-Outside algorithm. We compare the performance of our algorithm to n-gram models and the Inside-Outside algorithm in three language modeling tasks. In two of the tasks, the training data is generated by a probabilistic context-free grammar and in both tasks our algorithm outperforms the other techniques. The third task involves naturally-occurring data, and in this task our algorithm does not perform as well as n-gram models but vastly outperforms the Inside-Outside algorithm.

  9. Bayesian inference on proportional elections.

    Science.gov (United States)

    Brunello, Gabriel Hideki Vatanabe; Nakano, Eduardo Yoshio

    2015-01-01

    Polls for majoritarian voting systems usually show estimates of the percentage of votes for each candidate. However, proportional voting systems do not necessarily guarantee that the candidate with the highest percentage of votes will be elected. Thus, traditional methods used in majoritarian elections cannot be applied to proportional elections. In this context, the purpose of this paper was to perform Bayesian inference on proportional elections considering the Brazilian system of seat distribution. More specifically, we developed a methodology to estimate the probability that a given party will have representation in the Chamber of Deputies. Inferences were made in a Bayesian framework using the Monte Carlo simulation technique, and the developed methodology was applied to data from the 2010 Brazilian elections for Members of the Legislative Assembly and Federal Chamber of Deputies. A performance rate was also presented to evaluate the efficiency of the methodology. Calculations and simulations were carried out using the free R statistical software. PMID:25786259
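The kind of Monte Carlo computation described above can be sketched as follows: simulate vote shares around poll estimates, allocate seats by a highest-averages rule, and count how often each party wins at least one seat. The poll shares, Dirichlet concentration, and seat count below are illustrative assumptions, and the d'Hondt rule stands in for the full Brazilian seat-distribution system.

```python
# Hedged sketch: probability that a party obtains at least one seat under
# proportional (d'Hondt highest-averages) allocation, estimated by Monte
# Carlo over uncertain vote shares. All numbers are illustrative.
import numpy as np

def dhondt(votes, seats):
    """Allocate seats by the d'Hondt highest-averages method."""
    alloc = [0] * len(votes)
    for _ in range(seats):
        quotients = [v / (a + 1) for v, a in zip(votes, alloc)]
        alloc[quotients.index(max(quotients))] += 1
    return alloc

rng = np.random.default_rng(1)
poll = np.array([0.42, 0.33, 0.15, 0.10])   # hypothetical poll shares
n_sims, seats = 5000, 8
hits = np.zeros(len(poll))
for _ in range(n_sims):
    shares = rng.dirichlet(poll * 300)      # sampled uncertainty around the poll
    alloc = dhondt(shares, seats)
    hits += np.array(alloc) > 0
prob_repr = hits / n_sims   # estimated P(party has at least one seat)
```

The deterministic `dhondt` helper is easy to verify by hand: with votes [100, 80, 30, 20] and 8 seats, the sorted quotients give the allocation [4, 3, 1, 0].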

  10. Bayesian approach to rough set

    CERN Document Server

    Marwala, Tshilidzi

    2007-01-01

    This paper proposes an approach to training rough set models within a Bayesian framework, using the Markov chain Monte Carlo (MCMC) method. The prior probabilities are constructed from the prior knowledge that good rough set models have fewer rules. MCMC sampling is conducted in the rough set granule space, with the Metropolis algorithm used as the acceptance criterion. The proposed method is tested on estimating the risk of HIV given demographic data. The results obtained show that the proposed approach achieves an average accuracy of 58%, with the accuracy varying up to 66%. In addition, the Bayesian rough set gives the probabilities of the estimated HIV status as well as the linguistic rules describing how the demographic parameters drive the risk of HIV.
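The Metropolis acceptance step at the heart of such a sampler is compact enough to show in full. The sketch below samples a single probability with a binomial likelihood and uniform prior, an illustrative stand-in for the paper's rough-set granule space; the counts are arbitrary.

```python
# Minimal Metropolis sampler with the standard acceptance criterion:
# accept a proposal with probability min(1, posterior ratio).
# The binomial target here is an illustrative stand-in, not the
# rough-set model from the record above.
import math
import random

random.seed(0)

def log_post(p, successes=58, trials=100):
    """Unnormalized log posterior: binomial likelihood, uniform prior."""
    if not 0.0 < p < 1.0:
        return -math.inf
    return successes * math.log(p) + (trials - successes) * math.log(1 - p)

p, samples = 0.5, []
for _ in range(20000):
    prop = p + random.gauss(0, 0.05)          # random-walk proposal
    # Metropolis acceptance: log U < log posterior ratio
    if math.log(random.random()) < log_post(prop) - log_post(p):
        p = prop
    samples.append(p)
post_mean = sum(samples[5000:]) / len(samples[5000:])  # discard burn-in
```

For 58 successes in 100 trials under a uniform prior, the exact posterior is Beta(59, 43) with mean about 0.578, which the chain's post-burn-in average should approximate.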

  11. Bayesian priors for transiting planets

    CERN Document Server

    Kipping, David M

    2016-01-01

    As astronomers push towards discovering ever-smaller transiting planets, it is increasingly common to deal with low signal-to-noise ratio (SNR) events, where the choice of priors plays an influential role in Bayesian inference. In the analysis of exoplanet data, the selection of priors is often treated as a nuisance, with observers typically defaulting to uninformative distributions. Such treatments miss a key strength of the Bayesian framework, especially in the low SNR regime, where even weak a priori information is valuable. When estimating the parameters of a low-SNR transit, two key pieces of information are known: (i) the planet has the correct geometric alignment to transit and (ii) the transit event exhibits sufficient signal-to-noise to have been detected. These represent two forms of observational bias. Accordingly, when fitting transits, the model parameter priors should not follow the intrinsic distributions of said terms, but rather those of both the intrinsic distributions and the observational ...

  12. Bayesian Source Separation and Localization

    CERN Document Server

    Knuth, K H

    1998-01-01

    The problem of mixed signals occurs in many different contexts; one of the most familiar being acoustics. The forward problem in acoustics consists of finding the sound pressure levels at various detectors resulting from sound signals emanating from the active acoustic sources. The inverse problem consists of using the sound recorded by the detectors to separate the signals and recover the original source waveforms. In general, the inverse problem is unsolvable without additional information. This general problem is called source separation, and several techniques have been developed that utilize maximum entropy, minimum mutual information, and maximum likelihood. In previous work, it has been demonstrated that these techniques can be recast in a Bayesian framework. This paper demonstrates the power of the Bayesian approach, which provides a natural means for incorporating prior information into a source model. An algorithm is developed that utilizes information regarding both the statistics of the amplitudes...

  13. Bayesian Inference for Radio Observations

    CERN Document Server

    Lochner, Michelle; Zwart, Jonathan T L; Smirnov, Oleg; Bassett, Bruce A; Oozeer, Nadeem; Kunz, Martin

    2015-01-01

    (Abridged) New telescopes like the Square Kilometre Array (SKA) will push into a new sensitivity regime and expose systematics, such as direction-dependent effects, that could previously be ignored. Current methods for handling such systematics rely on alternating best estimates of instrumental calibration and models of the underlying sky, which can lead to inaccurate uncertainty estimates and biased results because such methods ignore any correlations between parameters. These deconvolution algorithms produce a single image that is assumed to be a true representation of the sky, when in fact it is just one realisation of an infinite ensemble of images compatible with the noise in the data. In contrast, here we report a Bayesian formalism that simultaneously infers both systematics and science. Our technique, Bayesian Inference for Radio Observations (BIRO), determines all parameters directly from the raw data, bypassing image-making entirely, by sampling from the joint posterior probability distribution. Thi...

  14. A Bayesian Nonparametric IRT Model

    OpenAIRE

    Karabatsos, George

    2015-01-01

    This paper introduces a flexible Bayesian nonparametric Item Response Theory (IRT) model, which applies to dichotomous or polytomous item responses, and which can apply to either unidimensional or multidimensional scaling. This is an infinite-mixture IRT model, with person ability and item difficulty parameters, and with a random intercept parameter that is assigned a mixing distribution, with mixing weights a probit function of other person and item parameters. As a result of its flexibility...

  15. Elements of Bayesian experimental design

    Energy Technology Data Exchange (ETDEWEB)

    Sivia, D.S. [Rutherford Appleton Lab., Oxon (United Kingdom)

    1997-09-01

    We consider some elements of the Bayesian approach that are important for optimal experimental design. While the underlying principles used are very general, and are explained in detail in a recent tutorial text, they are applied here to the specific case of characterising the inferential value of different resolution peakshapes. This particular issue was considered earlier by Silver, Sivia and Pynn (1989, 1990a, 1990b), and the following presentation confirms and extends the conclusions of their analysis.

  16. Bayesian kinematic earthquake source models

    Science.gov (United States)

    Minson, S. E.; Simons, M.; Beck, J. L.; Genrich, J. F.; Galetzka, J. E.; Chowdhury, F.; Owen, S. E.; Webb, F.; Comte, D.; Glass, B.; Leiva, C.; Ortega, F. H.

    2009-12-01

    Most coseismic, postseismic, and interseismic slip models are based on highly regularized optimizations which yield one solution which satisfies the data given a particular set of regularizing constraints. This regularization hampers our ability to answer basic questions such as whether seismic and aseismic slip overlap or instead rupture separate portions of the fault zone. We present a Bayesian methodology for generating kinematic earthquake source models with a focus on large subduction zone earthquakes. Unlike classical optimization approaches, Bayesian techniques sample the ensemble of all acceptable models presented as an a posteriori probability density function (PDF), and thus we can explore the entire solution space to determine, for example, which model parameters are well determined and which are not, or what is the likelihood that two slip distributions overlap in space. Bayesian sampling also has the advantage that all a priori knowledge of the source process can be used to mold the a posteriori ensemble of models. Although very powerful, Bayesian methods have up to now been of limited use in geophysical modeling because they are only computationally feasible for problems with a small number of free parameters due to what is called the "curse of dimensionality." However, our methodology can successfully sample solution spaces of many hundreds of parameters, which is sufficient to produce finite fault kinematic earthquake models. Our algorithm is a modification of the tempered Markov chain Monte Carlo (tempered MCMC or TMCMC) method. In our algorithm, we sample a "tempered" a posteriori PDF using many MCMC simulations running in parallel and evolutionary computation in which models which fit the data poorly are preferentially eliminated in favor of models which better predict the data. We present results for both synthetic test problems as well as for the 2007 Mw 7.8 Tocopilla, Chile earthquake, the latter of which is constrained by InSAR, local high
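The tempering idea can be sketched with standard parallel tempering, a simpler relative of the authors' tempered MCMC with evolutionary resampling: chains at several temperatures explore a multimodal target, and swap moves let the cold chain cross between modes. The bimodal target and temperature ladder below are illustrative assumptions, not the earthquake-source posterior.

```python
# Hedged sketch of tempered MCMC via standard parallel tempering.
# Several chains run at different temperatures; adjacent chains
# occasionally swap states, so the cold chain inherits mode jumps
# discovered by the hot chains.
import math
import random

random.seed(2)

def log_target(x):
    """Bimodal toy target with modes near -3 and +3."""
    return math.log(math.exp(-0.5 * (x - 3) ** 2) + math.exp(-0.5 * (x + 3) ** 2))

temps = [1.0, 2.0, 4.0, 8.0]                 # temps[0] is the "cold" chain
xs = [0.0] * len(temps)
cold = []
for step in range(20000):
    for i, T in enumerate(temps):            # within-chain Metropolis update
        prop = xs[i] + random.gauss(0, 1.0)
        if math.log(random.random()) < (log_target(prop) - log_target(xs[i])) / T:
            xs[i] = prop
    i = random.randrange(len(temps) - 1)     # propose swapping adjacent chains
    d = (log_target(xs[i]) - log_target(xs[i + 1])) * (1 / temps[i + 1] - 1 / temps[i])
    if math.log(random.random()) < d:
        xs[i], xs[i + 1] = xs[i + 1], xs[i]
    cold.append(xs[0])
visited_both = any(x < -2 for x in cold) and any(x > 2 for x in cold)
```

A plain single-temperature Metropolis chain started at one mode would rarely cross the barrier; the tempering ladder is what makes `visited_both` come out true.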

  17. Bayesian Stable Isotope Mixing Models

    OpenAIRE

    Parnell, Andrew C.; Phillips, Donald L.; Bearhop, Stuart; Semmens, Brice X.; Ward, Eric J.; Moore, Jonathan W.; Andrew L Jackson; Inger, Richard

    2012-01-01

    In this paper we review recent advances in Stable Isotope Mixing Models (SIMMs) and place them into an over-arching Bayesian statistical framework which allows for several useful extensions. SIMMs are used to quantify the proportional contributions of various sources to a mixture. The most widely used application is quantifying the diet of organisms based on the food sources they have been observed to consume. At the centre of the multivariate statistical model we propose is a compositional m...

  18. Bayesian Network--Response Regression

    OpenAIRE

    WANG, LU; Durante, Daniele; Dunson, David B.

    2016-01-01

    There is an increasing interest in learning how human brain networks vary with continuous traits (e.g., personality, cognitive abilities, neurological disorders), but flexible procedures to accomplish this goal are limited. We develop a Bayesian semiparametric model, which combines low-rank factorizations and Gaussian process priors to allow flexible shifts of the conditional expectation for a network-valued random variable across the feature space, while including subject-specific random eff...

  19. Bayesian segmentation of hyperspectral images

    CERN Document Server

    Mohammadpour, Adel; Mohammad-Djafari, Ali

    2007-01-01

    In this paper we consider the problem of joint segmentation of hyperspectral images in the Bayesian framework. The proposed approach is based on a Hidden Markov Modeling (HMM) of the images with common segmentation, or equivalently with common hidden classification label variables which is modeled by a Potts Markov Random Field. We introduce an appropriate Markov Chain Monte Carlo (MCMC) algorithm to implement the method and show some simulation results.

  20. Bayesian segmentation of hyperspectral images

    Science.gov (United States)

    Mohammadpour, Adel; Féron, Olivier; Mohammad-Djafari, Ali

    2004-11-01

    In this paper we consider the problem of joint segmentation of hyperspectral images in the Bayesian framework. The proposed approach is based on a Hidden Markov Modeling (HMM) of the images with common segmentation, or equivalently with common hidden classification label variables which is modeled by a Potts Markov Random Field. We introduce an appropriate Markov Chain Monte Carlo (MCMC) algorithm to implement the method and show some simulation results.
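The Gibbs step of such an MCMC scheme, sampling hidden Potts labels given per-class Gaussian likelihoods, can be sketched on a small lattice. The class means, noise level, and Potts coupling below are illustrative assumptions, and a single image channel stands in for the hyperspectral cube.

```python
# Hedged sketch of Gibbs sampling for a Potts-MRF segmentation: each pixel's
# label is resampled from its conditional given its neighbours (Potts prior)
# and the pixel intensity (Gaussian likelihood). All parameters illustrative.
import numpy as np

rng = np.random.default_rng(3)
H, W, K, beta, sigma = 16, 16, 2, 1.0, 0.7
true = np.zeros((H, W), dtype=int)
true[:, W // 2:] = 1                          # ground truth: two half-planes
means = np.array([0.0, 2.0])
img = means[true] + rng.normal(0, sigma, (H, W))

labels = rng.integers(K, size=(H, W))
for sweep in range(30):                       # Gibbs sweeps over the lattice
    for i in range(H):
        for j in range(W):
            nb = [labels[i + di, j + dj]
                  for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))
                  if 0 <= i + di < H and 0 <= j + dj < W]
            # Potts prior term + Gaussian log-likelihood term per class
            logp = np.array([beta * nb.count(k)
                             - 0.5 * (img[i, j] - means[k]) ** 2 / sigma ** 2
                             for k in range(K)])
            p = np.exp(logp - logp.max())
            p /= p.sum()
            labels[i, j] = rng.choice(K, p=p)
accuracy = (labels == true).mean()            # high on this easy synthetic image
```

In the full model the class means and variances would themselves be sampled, and the likelihood would be vector-valued across spectral bands.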

  1. Bayesian analysis of contingency tables

    OpenAIRE

    Gómez Villegas, Miguel A.; González Pérez, Beatriz

    2005-01-01

    The display of the data by means of contingency tables is used in different approaches to statistical inference, for example, to broach the test of homogeneity of independent multinomial distributions. We develop a Bayesian procedure to test simple null hypotheses versus bilateral alternatives in contingency tables. Given independent samples of two binomial distributions and taking a mixed prior distribution, we calculate the posterior probability that the proportion of successes in the first...

  2. Bayesian estimation of turbulent motion

    OpenAIRE

    Héas, P.; Herzet, C.; Mémin, E.; Heitz, D.; P. D. Mininni

    2013-01-01

    Based on physical laws describing the multi-scale structure of turbulent flows, this article proposes a regularizer for fluid motion estimation from an image sequence. Regularization is achieved by imposing some scale invariance property between histograms of motion increments computed at different scales. By reformulating this problem from a Bayesian perspective, an algorithm is proposed to jointly estimate motion, regularization hyper-parameters, and to select the ...

  3. Bayesian Kernel Mixtures for Counts

    OpenAIRE

    Canale, Antonio; David B Dunson

    2011-01-01

    Although Bayesian nonparametric mixture models for continuous data are well developed, there is a limited literature on related approaches for count data. A common strategy is to use a mixture of Poissons, which unfortunately is quite restrictive in not accounting for distributions having variance less than the mean. Other approaches include mixing multinomials, which requires finite support, and using a Dirichlet process prior with a Poisson base measure, which does not allow smooth deviatio...

  4. Bayesian second law of thermodynamics.

    Science.gov (United States)

    Bartolotta, Anthony; Carroll, Sean M; Leichenauer, Stefan; Pollack, Jason

    2016-08-01

    We derive a generalization of the second law of thermodynamics that uses Bayesian updates to explicitly incorporate the effects of a measurement of a system at some point in its evolution. By allowing an experimenter's knowledge to be updated by the measurement process, this formulation resolves a tension between the fact that the entropy of a statistical system can sometimes fluctuate downward and the information-theoretic idea that knowledge of a stochastically evolving system degrades over time. The Bayesian second law can be written as ΔH(ρ_{m},ρ)+〈Q〉_{F|m}≥0, where ΔH(ρ_{m},ρ) is the change in the cross entropy between the original phase-space probability distribution ρ and the measurement-updated distribution ρ_{m} and 〈Q〉_{F|m} is the expectation value of a generalized heat flow out of the system. We also derive refined versions of the second law that bound the entropy increase from below by a non-negative number, as well as Bayesian versions of integral fluctuation theorems. We demonstrate the formalism using simple analytical and numerical examples. PMID:27627241

  5. Bayesian second law of thermodynamics

    Science.gov (United States)

    Bartolotta, Anthony; Carroll, Sean M.; Leichenauer, Stefan; Pollack, Jason

    2016-08-01

    We derive a generalization of the second law of thermodynamics that uses Bayesian updates to explicitly incorporate the effects of a measurement of a system at some point in its evolution. By allowing an experimenter's knowledge to be updated by the measurement process, this formulation resolves a tension between the fact that the entropy of a statistical system can sometimes fluctuate downward and the information-theoretic idea that knowledge of a stochastically evolving system degrades over time. The Bayesian second law can be written as ΔH(ρm, ρ) + ⟨Q⟩F|m ≥ 0, where ΔH(ρm, ρ) is the change in the cross entropy between the original phase-space probability distribution ρ and the measurement-updated distribution ρm and ⟨Q⟩F|m is the expectation value of a generalized heat flow out of the system. We also derive refined versions of the second law that bound the entropy increase from below by a non-negative number, as well as Bayesian versions of integral fluctuation theorems. We demonstrate the formalism using simple analytical and numerical examples.

  6. Target distribution in cooperative combat based on Bayesian optimization algorithm

    Institute of Scientific and Technical Information of China (English)

    Shi Zhifu; Zhang An; Wang Anli

    2006-01-01

    Target distribution in cooperative combat is a difficult and important problem. We build an optimization model according to the rules of fire distribution and investigate it with the Bayesian optimization algorithm (BOA). The BOA estimates the joint probability distribution of the variables with a Bayesian network, and new candidate solutions are generated from that joint distribution. A simulation example verified that the method can solve this complex problem, that the computation is fast, and that the solution obtained is optimal.

  7. Bayesian quantum frequency estimation in presence of collective dephasing

    OpenAIRE

    Macieszczak, Katarzyna; Fraas, Martin; Demkowicz-Dobrzanski, Rafal

    2013-01-01

    We advocate a Bayesian approach to optimal quantum frequency estimation - an important issue for future quantum enhanced atomic clock operation. The approach provides a clear insight into the interplay between decoherence and the extent of the prior knowledge in determining the optimal interrogation times and optimal estimation strategies. We propose a general framework capable of describing local oscillator noise as well as additional collective atomic dephasing effects. For a Gaussian noise...

  8. 12th Brazilian Meeting on Bayesian Statistics

    CERN Document Server

    Louzada, Francisco; Rifo, Laura; Stern, Julio; Lauretto, Marcelo

    2015-01-01

    Through refereed papers, this volume focuses on the foundations of the Bayesian paradigm; their comparison to objectivistic or frequentist Statistics counterparts; and the appropriate application of Bayesian foundations. This research in Bayesian Statistics is applicable to data analysis in biostatistics, clinical trials, law, engineering, and the social sciences. EBEB, the Brazilian Meeting on Bayesian Statistics, is held every two years by the ISBrA, the International Society for Bayesian Analysis, one of the most active chapters of the ISBA. The 12th meeting took place March 10-14, 2014 in Atibaia. Interest in foundations of inductive Statistics has grown recently in accordance with the increasing availability of Bayesian methodological alternatives. Scientists need to deal with the ever more difficult choice of the optimal method to apply to their problem. This volume shows how Bayes can be the answer. The examination and discussion on the foundations work towards the goal of proper application of Bayesia...

  9. Compiling Relational Bayesian Networks for Exact Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred; Chavira, Mark; Darwiche, Adnan

    2004-01-01

    We describe a system for exact inference with relational Bayesian networks as defined in the publicly available Primula tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference by evaluating and differentiating these circuits in time linear in their size. We report on experimental results showing the successful compilation, and efficient inference, on relational Bayesian networks whose Primula-generated propositional instances have thousands of variables, and whose jointrees have clusters...

  10. Bayesian Posterior Distributions Without Markov Chains

    OpenAIRE

    Cole, Stephen R.; Chu, Haitao; Greenland, Sander; Hamra, Ghassan; Richardson, David B.

    2012-01-01

    Bayesian posterior parameter distributions are often simulated using Markov chain Monte Carlo (MCMC) methods. However, MCMC methods are not always necessary and do not help the uninitiated understand Bayesian inference. As a bridge to understanding Bayesian inference, the authors illustrate a transparent rejection sampling method. In example 1, they illustrate rejection sampling using 36 cases and 198 controls from a case-control study (1976–1983) assessing the relation between residential ex...
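The rejection-sampling bridge described above works in miniature as follows: draw a parameter from the prior, accept it with probability proportional to its likelihood, and the accepted draws are posterior samples. The binomial framing below, using a count of 36 out of 234 as a nod to the 36 cases and 198 controls mentioned in the abstract, is an illustrative simplification, not the article's actual case-control analysis.

```python
# Hedged sketch of rejection sampling for a Bayesian posterior:
# uniform prior on a proportion p, binomial likelihood, acceptance
# probability likelihood(p)/L_max. Counts are illustrative.
import random

random.seed(4)
x, n = 36, 234                        # "successes" out of n trials

def likelihood(p):
    return p ** x * (1 - p) ** (n - x)

# scale so the acceptance probability is at most 1 (max at the MLE p = x/n)
L_max = likelihood(x / n)
posterior = []
while len(posterior) < 2000:
    p = random.random()               # draw from the Uniform(0,1) prior
    if random.random() < likelihood(p) / L_max:
        posterior.append(p)           # accepted draw = posterior sample
post_mean = sum(posterior) / len(posterior)
```

For a uniform prior, the exact posterior is Beta(x+1, n-x+1) with mean (x+1)/(n+2) ≈ 0.157, so the transparency of the method can be checked directly against the closed form.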

  11. Bayesian calibration of simultaneity in audiovisual temporal order judgments.

    Directory of Open Access Journals (Sweden)

    Shinya Yamamoto

    After repeated exposures to two successive audiovisual stimuli presented in one frequent order, participants eventually perceive a pair separated by some lag time in the same order as occurring simultaneously (lag adaptation). In contrast, we previously found that perceptual changes occurred in the opposite direction in response to tactile stimuli, conforming to Bayesian integration theory (Bayesian calibration). We further showed, in theory, that the effect of Bayesian calibration cannot be observed when lag adaptation is fully operational. This led to the hypothesis that Bayesian calibration affects judgments regarding the order of audiovisual stimuli, but that this effect is concealed behind the lag adaptation mechanism. In the present study, we showed that lag adaptation is pitch-insensitive using two sounds at 1046 and 1480 Hz. This enabled us to cancel lag adaptation by associating one pitch with sound-first stimuli and the other with light-first stimuli. When we presented each type of stimulus (high- or low-tone) in a different block, the point of simultaneity shifted to "sound-first" for the pitch associated with sound-first stimuli, and to "light-first" for the pitch associated with light-first stimuli. These results are consistent with lag adaptation. In contrast, when we delivered each type of stimulus in a randomized order, the point of simultaneity shifted to "light-first" for the pitch associated with sound-first stimuli, and to "sound-first" for the pitch associated with light-first stimuli. The results clearly show that Bayesian calibration is pitch-specific and is at work behind pitch-insensitive lag adaptation during temporal order judgment of audiovisual stimuli.

  12. Variational bayesian method of estimating variance components.

    Science.gov (United States)

    Arakawa, Aisaku; Taniguchi, Masaaki; Hayashi, Takeshi; Mikawa, Satoshi

    2016-07-01

    We developed a Bayesian analysis approach by using a variational inference method, a so-called variational Bayesian method, to determine the posterior distributions of variance components. This variational Bayesian method and an alternative Bayesian method using Gibbs sampling were compared in estimating genetic and residual variance components from both simulated data and publicly available real pig data. In the simulated data set, we observed strong bias toward overestimation of genetic variance for the variational Bayesian method in the case of low heritability and low population size, and less bias was detected with larger population sizes in both methods examined. The differences in the estimates of variance components between the variational Bayesian and the Gibbs sampling were not found in the real pig data. However, the posterior distributions of the variance components obtained with the variational Bayesian method had shorter tails than those obtained with the Gibbs sampling. Consequently, the posterior standard deviations of the genetic and residual variances of the variational Bayesian method were lower than those of the method using Gibbs sampling.
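The coordinate-ascent mechanics of a variational Bayesian method can be sketched on the classic conjugate example: a factorized approximation q(mu)q(tau) for a Gaussian with unknown mean and precision. This is an illustrative stand-in, not the genetic variance-component model of the paper; the prior hyperparameters below are arbitrary vague choices.

```python
# Hedged sketch of coordinate-ascent variational inference (CAVI) for a
# Normal-Gamma model: q(mu) = N(muN, 1/lamN), q(tau) = Gamma(aN, bN),
# updated in turn until the factors stop changing.
import numpy as np

rng = np.random.default_rng(5)
x = rng.normal(2.0, 1.5, size=500)           # data with mean 2, sd 1.5
N, xbar = len(x), x.mean()
mu0, lam0, a0, b0 = 0.0, 1e-3, 1e-3, 1e-3    # vague prior hyperparameters

E_tau = 1.0                                  # initial guess for E[tau]
for _ in range(50):                          # coordinate ascent updates
    lamN = (lam0 + N) * E_tau                # precision of q(mu)
    muN = (lam0 * mu0 + N * xbar) / (lam0 + N)
    aN = a0 + 0.5 * (N + 1)                  # shape of q(tau)
    E_sq = np.sum((x - muN) ** 2) + N / lamN # E_q[sum (x_i - mu)^2]
    bN = b0 + 0.5 * (E_sq + lam0 * ((muN - mu0) ** 2 + 1.0 / lamN))
    E_tau = aN / bN
post_mean, post_prec = muN, E_tau            # ≈ 2.0 and ≈ 1/1.5**2
```

The shorter-tailed posteriors the abstract mentions are visible even here: q(mu) is a Gaussian whose variance 1/lamN underestimates the exact Student-t posterior spread, the usual behavior of factorized variational approximations.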

  13. Evaluation of the accuracy of positioning a GPS receiver operating under different vegetation covers

    Directory of Open Access Journals (Sweden)

    Rubens Angulo Filho

    2002-01-01

    To evaluate the planimetric positioning accuracy of a GPS receiver (Trimble Pro-XL) operating under different vegetation covers (pasture, rubber trees, eucalyptus and pine trees), the receiver was positioned alternately over 6 control points located randomly in the study areas, varying the occupation time (1, 5 and 10 min) while keeping the same data acquisition rate (1 s); post-processed differential correction (DGPS) was then applied to the data. For comparison, the coordinates of the points were also obtained by a conventional surveying method, according to NBR 13133 (Execution of Topographic Surveys) of the Brazilian surveying standards. According to the methodology applied and the results obtained, it was possible to separate the planimetric positioning accuracies, by type of vegetation cover, into two groups: without and with tree cover, confirming the interference of the canopy in the reception of the signals emitted by the GPS satellites. Increasing the occupation time improved the planimetric positioning accuracy, confirming that the choice of survey methodology is fundamental for obtaining good positioning results.

  14. 49 CFR 393.88 - Television receivers.

    Science.gov (United States)

    2010-10-01

    49 CFR Part 393 (Parts and Accessories Necessary for Safe Operation), Miscellaneous Parts and Accessories, § 393.88 Television receivers (2010-10-01): Any motor vehicle equipped with a television viewer, screen or other means of visually receiving a...

  15. Exemplar models as a mechanism for performing Bayesian inference.

    Science.gov (United States)

    Shi, Lei; Griffiths, Thomas L; Feldman, Naomi H; Sanborn, Adam N

    2010-08-01

    Probabilistic models have recently received much attention as accounts of human cognition. However, most research in which probabilistic models have been used has been focused on formulating the abstract problems behind cognitive tasks and their optimal solutions, rather than on mechanisms that could implement these solutions. Exemplar models are a successful class of psychological process models in which an inventory of stored examples is used to solve problems such as identification, categorization, and function learning. We show that exemplar models can be used to perform a sophisticated form of Monte Carlo approximation known as importance sampling and thus provide a way to perform approximate Bayesian inference. Simulations of Bayesian inference in speech perception, generalization along a single dimension, making predictions about everyday events, concept learning, and reconstruction from memory show that exemplar models can often account for human performance with only a few exemplars, for both simple and relatively complex prior distributions. These results suggest that exemplar models provide a possible mechanism for implementing at least some forms of Bayesian inference. PMID:20702863
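The exemplar-as-importance-sampler idea works in miniature as follows: stored exemplars are draws from the prior, their likelihoods act as importance weights, and a posterior expectation is a weighted average over a small inventory of exemplars. The Gaussian model and numbers below are illustrative assumptions, not the paper's speech-perception or memory simulations.

```python
# Hedged sketch of importance sampling with exemplars: a posterior mean
# estimated as a likelihood-weighted average of a few prior draws.
import numpy as np

rng = np.random.default_rng(6)
exemplars = rng.normal(0.0, 3.0, size=20)        # 20 stored examples ~ prior N(0, 3^2)
obs = 1.8                                        # noisy observation, noise sd 1.0
weights = np.exp(-0.5 * (obs - exemplars) ** 2)  # likelihood of each exemplar
weights /= weights.sum()                         # normalized importance weights
post_mean = np.sum(weights * exemplars)          # importance-sampling estimate
# exact posterior mean for this conjugate model: obs * 9 / (9 + 1) = 1.62
```

Even with only 20 exemplars the weighted average lands near the exact conjugate posterior mean, mirroring the paper's finding that a handful of exemplars often suffices.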

  16. Soft-In Soft-Output Detection in the Presence of Parametric Uncertainty via the Bayesian EM Algorithm

    Directory of Open Access Journals (Sweden)

    Gallo A. S.

    2005-01-01

    Full Text Available We investigate the application of the Bayesian expectation-maximization (BEM) technique to the design of soft-in soft-out (SISO) detection algorithms for wireless communication systems operating over channels affected by parametric uncertainty. First, the BEM algorithm is described in detail and its relationship with the well-known expectation-maximization (EM) technique is explained. Then, some of its applications are illustrated. In particular, the problems of SISO detection of spread spectrum, single-carrier and multicarrier space-time block coded signals are analyzed. Numerical results show that BEM-based detectors perform closely to the maximum likelihood (ML) receivers endowed with perfect channel state information as long as channel variations are not too fast.

  17. A Bayesian Approach for Nonlinear Structural Equation Models with Dichotomous Variables Using Logit and Probit Links

    Science.gov (United States)

    Lee, Sik-Yum; Song, Xin-Yuan; Cai, Jing-Heng

    2010-01-01

    Analysis of ordered binary and unordered binary data has received considerable attention in social and psychological research. This article introduces a Bayesian approach, which has several nice features in practical applications, for analyzing nonlinear structural equation models with dichotomous data. We demonstrate how to use the software…

  18. Bayesian phylogeography finds its roots.

    Directory of Open Access Journals (Sweden)

    Philippe Lemey

    2009-09-01

    Full Text Available As a key factor in endemic and epidemic dynamics, the geographical distribution of viruses has been frequently interpreted in the light of their genetic histories. Unfortunately, inference of historical dispersal or migration patterns of viruses has mainly been restricted to model-free heuristic approaches that provide little insight into the temporal setting of the spatial dynamics. The introduction of probabilistic models of evolution, however, offers unique opportunities to engage in this statistical endeavor. Here we introduce a Bayesian framework for inference, visualization and hypothesis testing of phylogeographic history. By implementing character mapping in a Bayesian software that samples time-scaled phylogenies, we enable the reconstruction of timed viral dispersal patterns while accommodating phylogenetic uncertainty. Standard Markov model inference is extended with a stochastic search variable selection procedure that identifies the parsimonious descriptions of the diffusion process. In addition, we propose priors that can incorporate geographical sampling distributions or characterize alternative hypotheses about the spatial dynamics. To visualize the spatial and temporal information, we summarize inferences using virtual globe software. We describe how Bayesian phylogeography compares with previous parsimony analysis in the investigation of the influenza A H5N1 origin and H5N1 epidemiological linkage among sampling localities. Analysis of rabies in West African dog populations reveals how virus diffusion may enable endemic maintenance through continuous epidemic cycles. From these analyses, we conclude that our phylogeographic framework will be an important asset in molecular epidemiology that can be easily generalized to infer biogeography from genetic data for many organisms.

  19. Numeracy, frequency, and Bayesian reasoning

    Directory of Open Access Journals (Sweden)

    Gretchen B. Chapman

    2009-02-01

    Full Text Available Previous research has demonstrated that Bayesian reasoning performance is improved if uncertainty information is presented as natural frequencies rather than single-event probabilities. A questionnaire study of 342 college students replicated this effect but also found that the performance-boosting benefits of the natural frequency presentation occurred primarily for participants who scored high in numeracy. This finding suggests that even comprehension and manipulation of natural frequencies requires a certain threshold of numeracy abilities, and that the beneficial effects of natural frequency presentation may not be as general as previously believed.
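    The natural-frequency effect above is easiest to see in a worked example. The numbers below are the standard textbook mammography values (Gigerenzer and Hoffrage's classic problem), used purely for illustration, not data from the questionnaire study:

```python
# Classic mammography problem restated as natural frequencies for a notional
# population of 1000 people.
population = 1000
base_rate = 0.01        # 1% prevalence
sensitivity = 0.80      # P(positive | disease)
false_pos_rate = 0.096  # P(positive | no disease)

with_disease = population * base_rate               # 10 people
true_positives = with_disease * sensitivity         # 8 people
without_disease = population - with_disease         # 990 people
false_positives = without_disease * false_pos_rate  # ~95 people

# The Bayesian answer is read straight off the frequency tree: of everyone who
# tests positive, what fraction actually has the disease?
p_disease_given_positive = true_positives / (true_positives + false_positives)
print(round(p_disease_given_positive, 3))  # ≈ 0.078
```

    Reasoning over the counts (8 of roughly 103 positives) is the frequency format whose benefit the study found to depend on numeracy.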

  20. Bayesian Query-Focused Summarization

    CERN Document Server

    Daumé, Hal

    2009-01-01

    We present BayeSum (for ``Bayesian summarization''), a model for sentence extraction in query-focused summarization. BayeSum leverages the common case in which multiple documents are relevant to a single query. Using these documents as reinforcement for query terms, BayeSum is not afflicted by the paucity of information in short queries. We show that approximate inference in BayeSum is possible on large data sets and results in a state-of-the-art summarization system. Furthermore, we show how BayeSum can be understood as a justified query expansion technique in the language modeling for IR framework.

  1. Bayesian Sampling using Condition Indicators

    DEFF Research Database (Denmark)

    Faber, Michael H.; Sørensen, John Dalsgaard

    2002-01-01

    The problem of control quality of components is considered for the special case where the acceptable failure rate is low, the test costs are high and where it may be difficult or impossible to test the condition of interest directly. Based on the classical control theory and the concept...... of condition indicators introduced by Benjamin and Cornell (1970) a Bayesian approach to quality control is formulated. The formulation is then extended to the case where the quality control is based on sampling of indirect information about the condition of the components, i.e. condition indicators...

  2. BaTMAn: Bayesian Technique for Multi-image Analysis

    CERN Document Server

    Casado, J; García-Benito, R; Guidi, G; Choudhury, O S; Bellocchi, E; Sánchez, S; Díaz, A I

    2016-01-01

    This paper describes the Bayesian Technique for Multi-image Analysis (BaTMAn), a novel image segmentation technique based on Bayesian statistics, whose main purpose is to characterize an astronomical dataset containing spatial information and perform a tessellation based on the measurements and errors provided as input. The algorithm will iteratively merge spatial elements as long as they are statistically consistent with carrying the same information (i.e. signal compatible with being identical within the errors). We illustrate its operation and performance with a set of test cases that comprises both synthetic and real Integral-Field Spectroscopic (IFS) data. Our results show that the segmentations obtained by BaTMAn adapt to the underlying structure of the data, regardless of the precise details of their morphology and the statistical properties of the noise. The quality of the recovered signal represents an improvement with respect to the input, especially in those regions where the signal is actually con...

  3. Thermal resistance model for CSP central receivers

    Science.gov (United States)

    de Meyer, O. A. J.; Dinter, F.; Govender, S.

    2016-05-01

    The receiver design and heliostat field aiming strategy play a vital role in the heat transfer efficiency of the receiver. In molten salt external receivers, the common operating temperature of the heat transfer fluid or molten salt ranges from 285°C to 565°C. The optimum output temperature of 565°C is achieved by adjusting the mass flow rate of the molten salt through the receiver. The reflected solar radiation onto the receiver contributes to the temperature rise in the molten salt by means of heat transfer. By investigating published work on molten salt external receiver operating temperatures, corresponding receiver tube surface temperatures and heat losses, a model has been developed to obtain a detailed thermographic representation of the receiver. The steady state model uses a receiver flux map as input to determine: i) heat transfer fluid mass flow rate through the receiver to obtain the desired molten salt output temperature of 565°C, ii) receiver surface temperatures, iii) receiver tube temperatures, iv) receiver efficiency, v) pressure drop across the receiver and vi) corresponding tube strain per panel.
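    The mass-flow adjustment described above follows from a steady-state energy balance. A minimal sketch, in which the absorbed power (100 MW) and the salt heat capacity are round-number assumptions for illustration, not values from the paper:

```python
# Steady-state energy balance for a molten-salt central receiver: the absorbed
# flux fixes the mass flow needed to hit the 565 °C outlet target.
def salt_mass_flow(q_absorbed_w, t_in_c=285.0, t_out_c=565.0, cp_j_per_kg_k=1520.0):
    """Mass flow rate (kg/s) from Q = m_dot * cp * (T_out - T_in)."""
    return q_absorbed_w / (cp_j_per_kg_k * (t_out_c - t_in_c))

# e.g. 100 MW absorbed on the receiver surface:
m_dot = salt_mass_flow(100e6)
print(round(m_dot, 1))  # ≈ 235.0 kg/s
```

    In the full model this balance is solved per panel against the flux map rather than for the receiver as a whole.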

  4. Survey on post traumatic growth of malignant tumor patients receiving post-operative chemotherapy

    Institute of Scientific and Technical Information of China (English)

    汪娟; 张平; 宋旭红; 李晓燕

    2012-01-01

    Objective: To understand post traumatic growth of malignant tumor patients receiving post-operative chemotherapy, and provide evidence for nurses to deliver psychological care to patients with malignant tumors. Methods: A total of 230 malignant tumor patients receiving post-operative chemotherapy were investigated in terms of their demographic data and post traumatic growth. Results: The total score on the Post Traumatic Growth Inventory (PTGI) of malignant tumor patients was 67.33 ± 14.17, with the score of Appreciation of Life being the highest, followed by Spiritual Change, Relating to Others, Personal Strength, and New Possibilities. PTGI scores differed significantly between genders, age groups, marital status, residential places, disease conditions, and among varied monthly incomes and courses of disease (P<0.05, P<0.01). Conclusion: Post traumatic growth in malignant tumor patients receiving post-operative chemotherapy is at a moderate level. Gender, age, social support and condition of illness are precipitating factors of PTG. Nurses should recognize positive mental changes among malignant tumor patients and take individualized psychological measures.

  5. Using Bayesian Networks to Improve Knowledge Assessment

    Science.gov (United States)

    Millan, Eva; Descalco, Luis; Castillo, Gladys; Oliveira, Paula; Diogo, Sandra

    2013-01-01

    In this paper, we describe the integration and evaluation of an existing generic Bayesian student model (GBSM) into an existing computerized testing system within the Mathematics Education Project (PmatE--Projecto Matematica Ensino) of the University of Aveiro. This generic Bayesian student model had been previously evaluated with simulated…

  6. Bayesian analysis of exoplanet and binary orbits

    OpenAIRE

    Schulze-Hartung, Tim; Launhardt, Ralf; Henning, Thomas

    2012-01-01

    We introduce BASE (Bayesian astrometric and spectroscopic exoplanet detection and characterisation tool), a novel program for the combined or separate Bayesian analysis of astrometric and radial-velocity measurements of potential exoplanet hosts and binary stars. The capabilities of BASE are demonstrated using all publicly available data of the binary Mizar A.

  7. Bayesian credible interval construction for Poisson statistics

    Institute of Scientific and Technical Information of China (English)

    ZHU Yong-Sheng

    2008-01-01

    The construction of the Bayesian credible (confidence) interval for a Poisson observable including both the signal and background with and without systematic uncertainties is presented. Introducing the conditional probability satisfying the requirement of the background not larger than the observed events to construct the Bayesian credible interval is also discussed. A Fortran routine, BPOCI, has been developed to implement the calculation.
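    The construction can be sketched numerically. This is a minimal grid-based sketch of the standard flat-prior case (known background, no systematics), not a reimplementation of the BPOCI routine:

```python
import math

def poisson_pmf(n, mu):
    """P(n | mu) for a Poisson observable."""
    return math.exp(-mu) * mu**n / math.factorial(n)

def credible_upper_limit(n_obs, background, cl=0.90, s_max=50.0, steps=20000):
    """Bayesian upper limit on a Poisson signal s with known background b and a
    flat prior on s >= 0: posterior(s) is proportional to Pois(n_obs | s + b)."""
    ds = s_max / steps
    grid = [i * ds for i in range(steps + 1)]
    post = [poisson_pmf(n_obs, s + background) for s in grid]
    total = sum(post)
    acc = 0.0
    for s, p in zip(grid, post):
        acc += p
        if acc >= cl * total:
            return s
    return s_max

# 3 events observed with an expected background of 1.0:
print(round(credible_upper_limit(3, 1.0), 2))
```

    With zero background this reproduces the familiar 90% flat-prior limit of about 6.68 events for n = 3; a nonzero known background tightens the limit.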

  8. Modeling Diagnostic Assessments with Bayesian Networks

    Science.gov (United States)

    Almond, Russell G.; DiBello, Louis V.; Moulder, Brad; Zapata-Rivera, Juan-Diego

    2007-01-01

    This paper defines Bayesian network models and examines their applications to IRT-based cognitive diagnostic modeling. These models are especially suited to building inference engines designed to be synchronous with the finer grained student models that arise in skills diagnostic assessment. Aspects of the theory and use of Bayesian network models…

  9. Advances in Bayesian Modeling in Educational Research

    Science.gov (United States)

    Levy, Roy

    2016-01-01

    In this article, I provide a conceptually oriented overview of Bayesian approaches to statistical inference and contrast them with frequentist approaches that currently dominate conventional practice in educational research. The features and advantages of Bayesian approaches are illustrated with examples spanning several statistical modeling…

  10. Learning dynamic Bayesian networks with mixed variables

    DEFF Research Database (Denmark)

    Bøttcher, Susanne Gammelgaard

    This paper considers dynamic Bayesian networks for discrete and continuous variables. We only treat the case, where the distribution of the variables is conditional Gaussian. We show how to learn the parameters and structure of a dynamic Bayesian network and also how the Markov order can be learned...

  11. The Bayesian Revolution Approaches Psychological Development

    Science.gov (United States)

    Shultz, Thomas R.

    2007-01-01

    This commentary reviews five articles that apply Bayesian ideas to psychological development, some with psychology experiments, some with computational modeling, and some with both experiments and modeling. The reviewed work extends the current Bayesian revolution into tasks often studied in children, such as causal learning and word learning, and…

  12. Bayesian Network for multiple hypothesis tracking

    NARCIS (Netherlands)

    W.P. Zajdel; B.J.A. Kröse

    2002-01-01

    For a flexible camera-to-camera tracking of multiple objects we model the objects' behavior with a Bayesian network and combine it with the multiple hypothesis framework that associates observations with objects. Bayesian networks offer a possibility to factor complex, joint distributions into a product...

  13. Ensemble Bayesian forecasting system Part I: Theory and algorithms

    Science.gov (United States)

    Herr, Henry D.; Krzysztofowicz, Roman

    2015-05-01

    The ensemble Bayesian forecasting system (EBFS), whose theory was published in 2001, is developed for the purpose of quantifying the total uncertainty about a discrete-time, continuous-state, non-stationary stochastic process such as a time series of stages, discharges, or volumes at a river gauge. The EBFS is built of three components: an input ensemble forecaster (IEF), which simulates the uncertainty associated with random inputs; a deterministic hydrologic model (of any complexity), which simulates physical processes within a river basin; and a hydrologic uncertainty processor (HUP), which simulates the hydrologic uncertainty (an aggregate of all uncertainties except input). It works as a Monte Carlo simulator: an ensemble of time series of inputs (e.g., precipitation amounts) generated by the IEF is transformed deterministically through a hydrologic model into an ensemble of time series of outputs, which is next transformed stochastically by the HUP into an ensemble of time series of predictands (e.g., river stages). Previous research indicated that in order to attain an acceptable sampling error, the ensemble size must be on the order of hundreds (for probabilistic river stage forecasts and probabilistic flood forecasts) or even thousands (for probabilistic stage transition forecasts). The computing time needed to run the hydrologic model this many times renders the straightforward simulations operationally infeasible. This motivates the development of the ensemble Bayesian forecasting system with randomization (EBFSR), which takes full advantage of the analytic meta-Gaussian HUP and generates multiple ensemble members after each run of the hydrologic model; this auxiliary randomization reduces the required size of the meteorological input ensemble and makes it operationally feasible to generate a Bayesian ensemble forecast of large size. Such a forecast quantifies the total uncertainty, is well calibrated against the prior (climatic) distribution of
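    The EBFSR pipeline described above can be sketched structurally. This is an illustrative stand-in, not the meta-Gaussian HUP of the paper: the toy model and noise parameters below are invented, and the point is only the flow (input ensemble → one deterministic model run each → several randomized members per run):

```python
import random

random.seed(1)

def hydrologic_model(precip_mm):
    """Stand-in deterministic rainfall-to-stage model (illustrative only)."""
    return 1.2 + 0.05 * precip_mm

def hup_randomize(stage_m, members_per_run=10, sigma=0.15):
    """Auxiliary randomization: several stochastic members per model run, so the
    output ensemble is larger than the number of (expensive) model runs."""
    return [random.gauss(stage_m, sigma) for _ in range(members_per_run)]

input_ensemble = [random.uniform(0.0, 40.0) for _ in range(30)]  # precip amounts
forecast_ensemble = []
for precip in input_ensemble:
    forecast_ensemble.extend(hup_randomize(hydrologic_model(precip)))

# 30 model runs yield a 300-member ensemble forecast.
print(len(forecast_ensemble))  # 300
```

    This multiplication of members per model run is what makes a large Bayesian ensemble operationally affordable.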

  14. 2nd Bayesian Young Statisticians Meeting

    CERN Document Server

    Bitto, Angela; Kastner, Gregor; Posekany, Alexandra

    2015-01-01

    The Second Bayesian Young Statisticians Meeting (BAYSM 2014) and the research presented here facilitate connections among researchers using Bayesian Statistics by providing a forum for the development and exchange of ideas. WU Vienna University of Business and Economics hosted BAYSM 2014 from September 18th to 19th. The guidance of renowned plenary lecturers and senior discussants is a critical part of the meeting and this volume, which follows publication of contributions from BAYSM 2013. The meeting's scientific program reflected the variety of fields in which Bayesian methods are currently employed or could be introduced in the future. Three brilliant keynote lectures by Chris Holmes (University of Oxford), Christian Robert (Université Paris-Dauphine), and Mike West (Duke University), were complemented by 24 plenary talks covering the major topics Dynamic Models, Applications, Bayesian Nonparametrics, Biostatistics, Bayesian Methods in Economics, and Models and Methods, as well as a lively poster session ...

  15. BAYESIAN BICLUSTERING FOR PATIENT STRATIFICATION.

    Science.gov (United States)

    Khakabimamaghani, Sahand; Ester, Martin

    2016-01-01

    The move from Empirical Medicine towards Personalized Medicine has attracted attention to Stratified Medicine (SM). Some methods are provided in the literature for patient stratification, which is the central task of SM, however, there are still significant open issues. First, it is still unclear if integrating different datatypes will help in detecting disease subtypes more accurately, and, if not, which datatype(s) are most useful for this task. Second, it is not clear how we can compare different methods of patient stratification. Third, as most of the proposed stratification methods are deterministic, there is a need for investigating the potential benefits of applying probabilistic methods. To address these issues, we introduce a novel integrative Bayesian biclustering method, called B2PS, for patient stratification and propose methods for evaluating the results. Our experimental results demonstrate the superiority of B2PS over a popular state-of-the-art method and the benefits of Bayesian approaches. Our results agree with the intuition that transcriptomic data forms a better basis for patient stratification than genomic data. PMID:26776199

  17. Advances in SIS receiver technology

    Science.gov (United States)

    Frerking, M. A.

    1988-01-01

    Significant advances in SIS receiver technology since the last Asilomar meeting include: superconductor materials, integrated inductive tuning elements, and planar mounting structures. The effect of these advances is to push the upper frequency operating limit from about 600 to 1500 GHz, and to enhance the feasibility of focal plane arrays of heterodyne receivers. A fundamental high frequency operating limit of SIS mixers is set by the superconducting energy gap. A practical limitation for high frequency operation of SIS junctions is their parasitic capacitance and resistance. The performance of the mixer will be degraded by the Resistor-Capacitor rolloff. Several designs were reported for inductive elements integrated on the same substrate as the SIS junctions to tune out the bulk junction capacitance. Most millimeter SIS-based heterodyne receivers have used waveguide coupling structures. Technology has advanced to the state where programs that have a high probability of success can be defined to produce arrays of SIS receivers for frequencies as high as 1500 GHz.
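    The parasitic RC rolloff mentioned above sets the frequency limit that the integrated inductive tuning elements are designed to overcome. A quick estimate of the corner frequency, with round-number component values assumed for illustration rather than measured junction parameters:

```python
import math

def rc_corner_ghz(resistance_ohm, capacitance_ff):
    """3 dB corner frequency f = 1 / (2*pi*R*C), returned in GHz."""
    c_farad = capacitance_ff * 1e-15
    return 1.0 / (2.0 * math.pi * resistance_ohm * c_farad) / 1e9

# e.g. a 50-ohm junction with 50 fF of parasitic capacitance:
print(round(rc_corner_ghz(50.0, 50.0)))  # ≈ 64 GHz
```

    An untuned junction with these values would roll off far below the 600-1500 GHz targets, which is why tuning out the bulk junction capacitance matters.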

  18. Sparse-grid, reduced-basis Bayesian inversion: Nonaffine-parametric nonlinear equations

    Science.gov (United States)

    Chen, Peng; Schwab, Christoph

    2016-07-01

    We extend the reduced basis (RB) accelerated Bayesian inversion methods for affine-parametric, linear operator equations which are considered in [16,17] to non-affine, nonlinear parametric operator equations. We generalize the analysis of sparsity of parametric forward solution maps in [20] and of Bayesian inversion in [48,49] to the fully discrete setting, including Petrov-Galerkin high-fidelity ("HiFi") discretization of the forward maps. We develop adaptive, stochastic collocation based reduction methods for the efficient computation of reduced bases on the parametric solution manifold. The nonaffinity and nonlinearity with respect to (w.r.t.) the distributed, uncertain parameters and the unknown solution is collocated; specifically, by the so-called Empirical Interpolation Method (EIM). For the corresponding Bayesian inversion problems, computational efficiency is enhanced in two ways: first, expectations w.r.t. the posterior are computed by adaptive quadratures with dimension-independent convergence rates proposed in [49]; the present work generalizes [49] to account for the impact of the PG discretization in the forward maps on the convergence rates of the Quantities of Interest (QoI for short). Second, we propose to perform the Bayesian estimation only w.r.t. a parsimonious, RB approximation of the posterior density. Based on the approximation results in [49], the infinite-dimensional parametric, deterministic forward map and operator admit N-term RB and EIM approximations which converge at rates which depend only on the sparsity of the parametric forward map. In several numerical experiments, the proposed algorithms exhibit dimension-independent convergence rates which equal, at least, the currently known rate estimates for N-term approximation. We propose to accelerate Bayesian estimation by first offline construction of reduced basis surrogates of the Bayesian posterior density. 
The parsimonious surrogates can then be employed for online data assimilation.

  19. Solar dynamic heat receiver technology

    Science.gov (United States)

    Sedgwick, Leigh M.

    1991-01-01

    A full-size, solar dynamic heat receiver was designed to meet the requirements specified for electrical power modules on the U.S. Space Station, Freedom. The heat receiver supplies thermal energy to power a heat engine in a closed Brayton cycle using a mixture of helium-xenon gas as the working fluid. The electrical power output of the engine, 25 kW, requires a 100 kW thermal input throughout a 90 minute orbit, including when the spacecraft is eclipsed for up to 36 minutes from the sun. The heat receiver employs an integral thermal energy storage system utilizing the latent heat available through the phase change of a high-temperature salt mixture. A near eutectic mixture of lithium fluoride and calcium difluoride is used as the phase change material. The salt is contained within a felt metal matrix which enhances heat transfer and controls the salt void distribution during solidification. Fabrication of the receiver is complete and it was delivered to NASA for verification testing in a simulated low-Earth-orbit environment. This document reviews the receiver design and describes its fabrication history. The major elements required to operate the receiver during testing are also described.
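    A back-of-envelope check of the latent-heat storage sizing implied above: the receiver must keep supplying roughly 100 kW of thermal power through an eclipse of up to 36 minutes. The latent heat assumed for the LiF-CaF2 near-eutectic (~790 kJ/kg) is a rough literature figure, not a value from this report:

```python
def salt_mass_kg(power_kw=100.0, eclipse_min=36.0, latent_kj_per_kg=790.0):
    """Minimum phase-change-salt mass to ride through the eclipse on latent
    heat alone (sensible heat of the salt and structure is ignored)."""
    energy_kj = power_kw * eclipse_min * 60.0  # kW * s = kJ
    return energy_kj / latent_kj_per_kg

print(round(salt_mass_kg()))  # ≈ 273 kg
```

    The actual receiver carries more salt than this lower bound, since the felt-metal matrix, heat-transfer limits and void management all add margin.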

  20. A unifying probabilistic Bayesian approach to derive electron density from MRI for radiation therapy treatment planning

    Science.gov (United States)

    Sudhan Reddy Gudur, Madhu; Hara, Wendy; Le, Quynh-Thu; Wang, Lei; Xing, Lei; Li, Ruijiang

    2014-11-01

    MRI significantly improves the accuracy and reliability of target delineation in radiation therapy for certain tumors due to its superior soft tissue contrast compared to CT. A treatment planning process with MRI as the sole imaging modality will eliminate systematic CT/MRI co-registration errors, reduce cost and radiation exposure, and simplify clinical workflow. However, MRI lacks the key electron density information necessary for accurate dose calculation and generating reference images for patient setup. The purpose of this work is to develop a unifying method to derive electron density from standard T1-weighted MRI. We propose to combine both intensity and geometry information into a unifying probabilistic Bayesian framework for electron density mapping. For each voxel, we compute two conditional probability density functions (PDFs) of electron density given its: (1) T1-weighted MRI intensity, and (2) geometry in a reference anatomy, obtained by deformable image registration between the MRI of the atlas and test patient. The two conditional PDFs containing intensity and geometry information are combined into a unifying posterior PDF, whose mean value corresponds to the optimal electron density value under the mean-square error criterion. We evaluated the algorithm’s accuracy of electron density mapping and its ability to detect bone in the head for eight patients, using an additional patient as the atlas or template. Mean absolute HU error between the estimated and true CT, as well as receiver operating characteristics for bone detection (HU > 200) were calculated. The performance was compared with a global intensity approach based on T1 and no density correction (set whole head to water). The proposed technique significantly reduced the errors in electron density estimation, with a mean absolute HU error of 126, compared with 139 for deformable registration (p = 2 × 10⁻⁴), 283 for the intensity approach (p = 2 × 10⁻⁶) and 282 without density
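    The fusion step can be sketched under the simplifying assumption that both conditional PDFs (intensity-based and atlas-geometry-based) are Gaussian and conditionally independent; the paper's PDFs need not be Gaussian, and the numbers below are invented:

```python
def fuse_gaussians(mu_intensity, sd_intensity, mu_atlas, sd_atlas):
    """The product of two Gaussian PDFs is Gaussian; its mean (the MMSE
    estimate) is the precision-weighted average of the two means."""
    w1 = 1.0 / sd_intensity**2
    w2 = 1.0 / sd_atlas**2
    mean = (w1 * mu_intensity + w2 * mu_atlas) / (w1 + w2)
    sd = (w1 + w2) ** -0.5
    return mean, sd

# e.g. intensity suggests 300 HU (broad), the deformed atlas suggests 700 HU (tight):
mean, sd = fuse_gaussians(300.0, 200.0, 700.0, 100.0)
print(round(mean, 1), round(sd, 1))  # → 620.0 89.4
```

    The posterior mean leans toward whichever information source is more certain, which is how the geometry prior rescues voxels whose T1 intensity is ambiguous (e.g. bone vs. air).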

  1. Likelihood-free inference of population structure and local adaptation in a Bayesian hierarchical model.

    Science.gov (United States)

    Bazin, Eric; Dawson, Kevin J; Beaumont, Mark A

    2010-06-01

    ABC to Bayesian hierarchical models, and we apply it to detect microsatellite loci influenced by local selection. We demonstrate using receiver operating characteristic (ROC) analysis that this approach has comparable performance to a full-likelihood method and outperforms it when mutation rates are variable across loci. PMID:20382835
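    Receiver operating characteristic comparison of the kind used above reduces to a simple computation. A minimal sketch with invented scores (the AUC via the Mann-Whitney U statistic), not the paper's analysis:

```python
def roc_auc(scores_pos, scores_neg):
    """Area under the ROC curve via the Mann-Whitney U statistic: the
    probability that a positive case scores higher than a negative one
    (ties count half)."""
    wins = 0.0
    for sp in scores_pos:
        for sn in scores_neg:
            if sp > sn:
                wins += 1.0
            elif sp == sn:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical detection scores for loci under selection vs. neutral loci:
under_selection = [0.9, 0.8, 0.75, 0.6]
neutral = [0.7, 0.5, 0.4, 0.3, 0.2]
print(roc_auc(under_selection, neutral))  # 0.95
```

    An AUC of 0.5 is chance; comparing AUCs is how the ABC approach is benchmarked against the full-likelihood method.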

  2. A Comparison of Hierarchical and Non-Hierarchical Bayesian Approaches for Fitting Allometric Larch (Larix spp.) Biomass Equations

    Directory of Open Access Journals (Sweden)

    Dongsheng Chen

    2016-01-01

    Full Text Available Accurate biomass estimations are important for assessing and monitoring forest carbon storage. Bayesian theory has been widely applied to tree biomass models. Recently, a hierarchical Bayesian approach has received increasing attention for improving biomass models. In this study, tree biomass data were obtained by sampling 310 trees from 209 permanent sample plots from larch plantations in six regions across China. Non-hierarchical and hierarchical Bayesian approaches were used to model allometric biomass equations. We found that the total, root, stem wood, stem bark, branch and foliage biomass model relationships were statistically significant (p-values < 0.001) for both the non-hierarchical and hierarchical Bayesian approaches, but the hierarchical Bayesian approach increased the goodness-of-fit statistics over the non-hierarchical Bayesian approach. The R2 values of the hierarchical approach were higher than those of the non-hierarchical approach by 0.008, 0.018, 0.020, 0.003, 0.088 and 0.116 for the total tree, root, stem wood, stem bark, branch and foliage models, respectively. The hierarchical Bayesian approach significantly improved the accuracy of the biomass model (except for the stem bark) and can reflect regional differences by using random parameters to improve the regional scale model accuracy.
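    The allometric form behind biomass equations of this kind is B = a·D^b, which is linear in logs: ln(B) = ln(a) + b·ln(D). A minimal non-Bayesian sketch of that fit; the diameter/biomass pairs are synthetic (generated noise-free from assumed coefficients), not the paper's larch data:

```python
import math

def fit_allometric(diameters_cm, biomasses_kg):
    """Ordinary least squares in log space: returns (a, b) for B = a * D^b."""
    xs = [math.log(d) for d in diameters_cm]
    ys = [math.log(b) for b in biomasses_kg]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = my - slope * mx
    return math.exp(intercept), slope

# Synthetic data from B = 0.05 * D^2.4, so the fit should recover those values:
d = [10, 15, 20, 25, 30]
b = [0.05 * x ** 2.4 for x in d]
a_hat, b_hat = fit_allometric(d, b)
print(round(a_hat, 3), round(b_hat, 3))  # 0.05 2.4
```

    The hierarchical Bayesian version of the paper replaces the single (a, b) with region-level random parameters drawn around population-level means, which is what lets it absorb regional differences.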

  3. Bayesian conformity assessment in presence of systematic measurement errors

    Science.gov (United States)

    Carobbi, Carlo; Pennecchi, Francesca

    2016-04-01

    Conformity assessment of the distribution of the values of a quantity is investigated by using a Bayesian approach. The effect of systematic, non-negligible measurement errors is taken into account. The analysis is general, in the sense that the probability distribution of the quantity can be of any kind, that is even different from the ubiquitous normal distribution, and the measurement model function, linking the measurand with the observable and non-observable influence quantities, can be non-linear. Further, any joint probability density function can be used to model the available knowledge about the systematic errors. It is demonstrated that the result of the Bayesian analysis here developed reduces to the standard result (obtained through a frequentistic approach) when the systematic measurement errors are negligible. A consolidated frequentistic extension of such standard result, aimed at including the effect of a systematic measurement error, is directly compared with the Bayesian result, whose superiority is demonstrated. Application of the results here obtained to the derivation of the operating characteristic curves used for sampling plans for inspection by variables is also introduced.

  4. Bayesian Action&Perception: Representing the World in the Brain

    Directory of Open Access Journals (Sweden)

    Gerald E. Loeb

    2014-10-01

    Full Text Available Theories of perception seek to explain how sensory data are processed to identify previously experienced objects, but they usually do not consider the decisions and effort that go into acquiring the sensory data. Identification of objects according to their tactile properties requires active exploratory movements. The sensory data thereby obtained depend on the details of those movements, which human subjects change rapidly and seemingly capriciously. Bayesian Exploration is an algorithm that uses prior experience to decide which next exploratory movement should provide the most useful data to disambiguate the most likely possibilities. In previous studies, a simple robot equipped with a biomimetic tactile sensor and operated according to Bayesian Exploration performed in a manner similar to and actually better than humans on a texture identification task. Expanding on this, Bayesian Action&Perception refers to the construction and querying of an associative memory of previously experienced entities containing both sensory data and the motor programs that elicited them. We hypothesize that this memory can be queried (i) to identify useful next exploratory movements during identification of an unknown entity (action for perception), or (ii) to characterize whether an unknown entity is fit for purpose (perception for action), or (iii) to recall what actions might be feasible for a known entity (Gibsonian affordance). The biomimetic design of this mechatronic system may provide insights into the neuronal basis of biological action and perception.

  5. A Bayesian Reflection on Surfaces

    Directory of Open Access Journals (Sweden)

    David R. Wolf

    1999-10-01

    Full Text Available Abstract: The topic of this paper is a novel Bayesian continuous-basis field representation and inference framework. Within this paper several problems are solved: the maximally informative inference of continuous-basis fields, that is, where the basis for the field is itself a continuous object and not representable in a finite manner; the tradeoff between accuracy of representation in terms of information learned, and memory or storage capacity in bits; the approximation of probability distributions so that a maximal amount of information about the object being inferred is preserved; and an information-theoretic justification for multigrid methodology. The maximally informative field inference framework is described in full generality and denoted the Generalized Kalman Filter. The Generalized Kalman Filter allows the update of field knowledge from previous knowledge at any scale, and new data, to new knowledge at any other scale. An example application, the inference of continuous surfaces from measurements (for example, camera image data), is presented.

  6. Hedging Strategies for Bayesian Optimization

    CERN Document Server

    Brochu, Eric; de Freitas, Nando

    2010-01-01

    Bayesian optimization with Gaussian processes has become an increasingly popular tool in the machine learning community. It is efficient and can be used when very little is known about the objective function, making it popular in expensive black-box optimization scenarios. It is able to do this by sampling the objective using an acquisition function which incorporates the model's estimate of the objective and the uncertainty at any given point. However, there are several different parameterized acquisition functions in the literature, and it is often unclear which one to use. Instead of using a single acquisition function, we adopt a portfolio of acquisition functions governed by an online multi-armed bandit strategy. We describe the method, which we call GP-Hedge, and show that this method almost always outperforms the best individual acquisition function.
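
    The portfolio mechanism can be illustrated with a minimal sketch of the Hedge weight update over a set of acquisition functions. This is not the authors' implementation: the reward function, the learning rate `eta`, and the rescaling step are illustrative assumptions, and a real GP-Hedge run would reward each arm with the GP posterior mean at the point it nominated.

```python
import math
import random

def hedge_select(weights, rng):
    """Sample an arm index with probability proportional to its weight."""
    total = sum(weights)
    r = rng.random() * total
    acc = 0.0
    for i, w in enumerate(weights):
        acc += w
        if r <= acc:
            return i
    return len(weights) - 1

def gp_hedge(reward_fn, n_arms, n_rounds, eta=0.5, seed=0):
    """Hedge over a portfolio of acquisition functions.

    reward_fn(arm, t) stands in for the reward GP-Hedge assigns to the
    point nominated by acquisition function `arm` at round t (in the
    real algorithm: the updated GP posterior mean at that point).
    """
    rng = random.Random(seed)
    weights = [1.0] * n_arms
    choices = []
    for t in range(n_rounds):
        choices.append(hedge_select(weights, rng))
        # multiplicative weight update for every arm in the portfolio
        weights = [w * math.exp(eta * reward_fn(a, t))
                   for a, w in enumerate(weights)]
        # rescale by the max weight to avoid floating-point overflow
        top = max(weights)
        weights = [w / top for w in weights]
    return choices, weights

# toy portfolio of 3 acquisition functions; arm 2 keeps earning more
choices, weights = gp_hedge(lambda a, t: 1.0 if a == 2 else 0.1,
                            n_arms=3, n_rounds=50)
```

    After a few rounds the consistently rewarded arm dominates the selections, which is the bandit behaviour the abstract describes.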

  7. Bayesian Networks and Influence Diagrams

    DEFF Research Database (Denmark)

    Kjærulff, Uffe Bro; Madsen, Anders Læsø

    Bayesian Networks and Influence Diagrams: A Guide to Construction and Analysis provides a comprehensive guide for practitioners who wish to understand, construct, and analyze intelligent systems for decision support based on probabilistic networks, with applications such as troubleshooting and data mining under uncertainty. Intended primarily for practitioners, this book does not require sophisticated mathematical skills or deep understanding of the underlying theory and methods, nor does it discuss alternative technologies for reasoning under uncertainty. The theory and methods presented are illustrated through more than 140 examples, and exercises are included for the reader to check his/her level of understanding. The techniques and methods presented for knowledge elicitation, model construction and verification, modeling techniques and tricks, learning models from data, and analyses of models have all been developed and refined…

  8. Bayesian Networks and Influence Diagrams

    DEFF Research Database (Denmark)

    Kjærulff, Uffe Bro; Madsen, Anders Læsø

    Bayesian Networks and Influence Diagrams: A Guide to Construction and Analysis, Second Edition, provides a comprehensive guide for practitioners who wish to understand, construct, and analyze intelligent systems for decision support based on probabilistic networks. This new edition contains six new sections, in addition to fully-updated examples, tables, figures, and a revised appendix. Intended primarily for practitioners, this book does not require sophisticated mathematical skills or deep understanding of the underlying theory and methods, nor does it discuss alternative technologies for reasoning under uncertainty. The theory and methods presented are illustrated through more than 140 examples, and exercises are included for the reader to check his or her level of understanding. The techniques and methods presented on model construction and verification, modeling techniques and tricks, learning…

  9. State Information in Bayesian Games

    CERN Document Server

    Cuff, Paul

    2009-01-01

    Two-player zero-sum repeated games are well understood. Computing the value of such a game is straightforward. Additionally, if the payoffs are dependent on a random state of the game known to one, both, or neither of the players, the resulting value of the game has been analyzed under the framework of Bayesian games. This investigation considers the optimal performance in a game when a helper is transmitting state information to one of the players. Encoding information for an adversarial setting (game) requires a different result than rate-distortion theory provides. Game theory has accentuated the importance of randomization (mixed strategy), which does not find a significant role in most communication modems and source coding codecs. Higher rates of communication, used in the right way, allow the message to include the necessary random component useful in games.

  10. Multiview Bayesian Correlated Component Analysis

    DEFF Research Database (Denmark)

    Kamronn, Simon Due; Poulsen, Andreas Trier; Hansen, Lars Kai

    2015-01-01

    Correlated component analysis as proposed by Dmochowski, Sajda, Dias, and Parra (2012) is a tool for investigating brain process similarity in the responses to multiple views of a given stimulus. Correlated components are identified under the assumption that the involved spatial networks are identical. Here we propose a hierarchical probabilistic model that can infer the level of universality in such multiview data, from completely unrelated representations, corresponding to canonical correlation analysis, to identical representations as in correlated component analysis. This new model, which we denote Bayesian correlated component analysis, evaluates favorably against three relevant algorithms in simulated data. A well-established benchmark EEG data set is used to further validate the new model and infer the variability of spatial representations across multiple subjects.

  11. Bayesian anti-sparse coding

    CERN Document Server

    Elvira, Clément; Dobigeon, Nicolas

    2015-01-01

    Sparse representations have proven their efficiency in solving a wide class of inverse problems encountered in signal and image processing. Conversely, enforcing the information to be spread uniformly over representation coefficients exhibits relevant properties in various applications such as digital communications. Anti-sparse regularization can be naturally expressed through an $\\ell_{\\infty}$-norm penalty. This paper derives a probabilistic formulation of such representations. A new probability distribution, referred to as the democratic prior, is first introduced. Its main properties as well as three random variate generators for this distribution are derived. Then this probability distribution is used as a prior to promote anti-sparsity in a Gaussian linear inverse problem, yielding a fully Bayesian formulation of anti-sparse coding. Two Markov chain Monte Carlo (MCMC) algorithms are proposed to generate samples according to the posterior distribution. The first one is a standard Gibbs sampler. The seco...

  12. Nonparametric Bayesian inference in biostatistics

    CERN Document Server

    Müller, Peter

    2015-01-01

    Nonparametric Bayesian (BNP) approaches play an ever-expanding role in biostatistical inference, from use in proteomics to clinical trials. As chapters in this book demonstrate, BNP has important uses in the clinical sciences and in inference for issues like unknown partitions in genomics. Many research problems involve an abundance of data and require flexible and complex probability models beyond the traditional parametric approaches. As this book's expert contributors show, BNP approaches can be the answer. Survival analysis, in particular survival regression, has traditionally used BNP, but BNP's potential is now very broad. This applies to important tasks like the arrangement of patients into clinically meaningful subpopulations and segmenting the genome into functionally distinct regions. This book is designed both to review and to introduce application areas for BNP. While existing books provide theoretical foundations, this book connects theory to practice through engaging examples and research questions. Chapters c...

  13. Bayesian Kernel Mixtures for Counts.

    Science.gov (United States)

    Canale, Antonio; Dunson, David B

    2011-12-01

    Although Bayesian nonparametric mixture models for continuous data are well developed, there is a limited literature on related approaches for count data. A common strategy is to use a mixture of Poissons, which unfortunately is quite restrictive in not accounting for distributions having variance less than the mean. Other approaches include mixing multinomials, which requires finite support, and using a Dirichlet process prior with a Poisson base measure, which does not allow smooth deviations from the Poisson. As a broad class of alternative models, we propose to use nonparametric mixtures of rounded continuous kernels. An efficient Gibbs sampler is developed for posterior computation, and a simulation study is performed to assess performance. Focusing on the rounded Gaussian case, we generalize the modeling framework to account for multivariate count data, joint modeling with continuous and categorical variables, and other complications. The methods are illustrated through applications to a developmental toxicity study and marketing data. This article has supplementary material online. PMID:22523437
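
    The rounded-kernel idea can be sketched for the Gaussian case: the Gaussian mass on [j, j+1) becomes the probability of count j. This is an illustrative sketch, not the paper's model; the thresholding choice and all parameter values below are my assumptions.

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def rounded_gaussian_pmf(j, mu, sigma):
    """P(Y = j): the N(mu, sigma^2) mass on [j, j+1), with all mass
    below 0 folded into j = 0 (one common thresholding choice)."""
    hi = norm_cdf((j + 1 - mu) / sigma)
    lo = 0.0 if j == 0 else norm_cdf((j - mu) / sigma)
    return hi - lo

def mixture_pmf(j, weights, mus, sigmas):
    """Mixture of rounded Gaussian kernels, the building block of the
    nonparametric mixtures described above."""
    return sum(w * rounded_gaussian_pmf(j, m, s)
               for w, m, s in zip(weights, mus, sigmas))

# a single rounded Gaussian with mu = 3.5, sigma = 0.3 is underdispersed:
pmf = [rounded_gaussian_pmf(j, 3.5, 0.3) for j in range(20)]
mean = sum(j * p for j, p in enumerate(pmf))
var = sum((j - mean) ** 2 * p for j, p in enumerate(pmf))
# var < mean here, which no mixture of Poissons can achieve
```

    The final comment is the point of the example: the rounded kernel reaches variance-below-mean count distributions that Poisson mixtures exclude.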

  14. Bayesian networks in educational assessment

    CERN Document Server

    Almond, Russell G; Steinberg, Linda S; Yan, Duanli; Williamson, David M

    2015-01-01

    Bayesian inference networks, a synthesis of statistics and expert systems, have advanced reasoning under uncertainty in medicine, business, and social sciences. This innovative volume is the first comprehensive treatment exploring how they can be applied to design and analyze innovative educational assessments. Part I develops Bayes nets’ foundations in assessment, statistics, and graph theory, and works through the real-time updating algorithm. Part II addresses parametric forms for use with assessment, model-checking techniques, and estimation with the EM algorithm and Markov chain Monte Carlo (MCMC). A unique feature is the volume’s grounding in Evidence-Centered Design (ECD) framework for assessment design. This “design forward” approach enables designers to take full advantage of Bayes nets’ modularity and ability to model complex evidentiary relationships that arise from performance in interactive, technology-rich assessments such as simulations. Part III describes ECD, situates Bayes nets as ...

  15. Bayesian models a statistical primer for ecologists

    CERN Document Server

    Hobbs, N Thompson

    2015-01-01

    Bayesian modeling has become an indispensable tool for ecological research because it is uniquely suited to deal with complexity in a statistically coherent way. This textbook provides a comprehensive and accessible introduction to the latest Bayesian methods-in language ecologists can understand. Unlike other books on the subject, this one emphasizes the principles behind the computations, giving ecologists a big-picture understanding of how to implement this powerful statistical approach. Bayesian Models is an essential primer for non-statisticians. It begins with a definition of probabili

  16. Compiling Relational Bayesian Networks for Exact Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred; Darwiche, Adnan; Chavira, Mark

    2006-01-01

    We describe in this paper a system for exact inference with relational Bayesian networks as defined in the publicly available PRIMULA tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference by evaluating and differentiating these circuits in time linear in their size. We report on experimental results showing successful compilation and efficient inference on relational Bayesian networks, whose PRIMULA-generated propositional instances have thousands of variables, and whose jointrees have clusters...

  17. Low complexity MIMO receivers

    CERN Document Server

    Bai, Lin; Yu, Quan

    2014-01-01

    Multiple-input multiple-output (MIMO) systems can increase the spectral efficiency in wireless communications. However, interference becomes the major drawback, leading to high computational complexity at both transmitter and receiver. In particular, the complexity of MIMO receivers can be prohibitively high. As an efficient mathematical tool for devising low-complexity approaches that mitigate the interference in MIMO systems, lattice reduction (LR) has been widely studied and employed over the last decade. The co-authors of this book are among the world's leading experts on MIMO receivers, and here they share the key findings of their research over the years. They detail a range of key techniques for receiver design when multiple transmitted and received signals are available. The authors first introduce the principles of signal detection and of LR in mathematical terms. They then move on to discuss the use of LR in low-complexity MIMO receiver design with respect to different aspects, including uncoded MIMO detection...

  18. Delphi Accounts Receivable Module

    Data.gov (United States)

    Department of Transportation — Delphi accounts receivable module contains data elements including, but not limited to, customer information, cash receipts, line of accounting details, bill...

  19. Uncertainty Management for Diagnostics and Prognostics of Batteries using Bayesian Techniques

    Science.gov (United States)

    Saha, Bhaskar; Goebel, Kai

    2007-01-01

    Uncertainty management has always been the key hurdle faced by diagnostics and prognostics algorithms. A Bayesian treatment of this problem provides an elegant and theoretically sound approach to the modern Condition-Based Maintenance (CBM)/Prognostic Health Management (PHM) paradigm. The application of Bayesian techniques to regression and classification in the form of the Relevance Vector Machine (RVM), and to state estimation as in Particle Filters (PF), provides a powerful tool to integrate the diagnosis and prognosis of battery health. The RVM, which is a Bayesian treatment of the Support Vector Machine (SVM), is used for model identification, while the PF framework uses the learnt model, statistical estimates of noise, and anticipated operational conditions to provide estimates of remaining useful life (RUL) in the form of a probability density function (PDF). This type of prognostics generates significant value addition to the management of any operation involving electrical systems.

  20. A Bayesian Networks in Intrusion Detection Systems

    Directory of Open Access Journals (Sweden)

    M. Mehdi

    2007-01-01

    Full Text Available Intrusion detection systems (IDSs) have been widely used to overcome security threats in computer networks. Anomaly-based approaches have the advantage of being able to detect previously unknown attacks, but they suffer from the difficulty of building robust models of acceptable behaviour, which may result in a large number of false alarms caused by incorrect classification of events in current systems. We propose a new approach to anomaly intrusion detection. It consists of building a reference behaviour model and using a Bayesian classification procedure associated with an unsupervised learning algorithm to evaluate the deviation between current and reference behaviour. Continuous re-estimation of model parameters allows for real-time operation. The use of recursive log-likelihood and entropy estimation as measures for monitoring model degradation related to behaviour changes, together with the associated model updates, shows that the accuracy of the event classification process is significantly improved using our proposed approach, while missed alarms are also reduced.
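
    The recursive log-likelihood monitoring step can be sketched as an exponentially-weighted running average of each event's log-likelihood under the reference model. This is a hedged sketch: the smoothing factor `alpha` and the `threshold` are illustrative values of mine, not taken from the paper.

```python
def recursive_loglik_monitor(loglik_stream, alpha=0.05, threshold=-5.0):
    """Exponentially-weighted recursive average of per-event
    log-likelihood under the reference behaviour model. A sustained
    drop below the threshold flags model degradation, i.e. a
    behaviour change that calls for a model update."""
    avg = 0.0
    flags = []
    for ll in loglik_stream:
        avg = (1.0 - alpha) * avg + alpha * ll  # recursive update
        flags.append(avg < threshold)
    return flags

# 100 events scored as normal, then 50 events the model finds implausible
flags = recursive_loglik_monitor([-1.0] * 100 + [-20.0] * 50)
```

    The smoothing means a single outlying event does not trip the flag; only a sustained shift in behaviour does, after a short lag.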

  1. Bayesian Model Selection for LISA Pathfinder

    CERN Document Server

    Karnesis, Nikolaos; Sopuerta, Carlos F; Gibert, Ferran; Armano, Michele; Audley, Heather; Congedo, Giuseppe; Diepholz, Ingo; Ferraioli, Luigi; Hewitson, Martin; Hueller, Mauro; Korsakova, Natalia; Plagnol, Eric; Vitale, Stefano

    2013-01-01

    The main goal of the LISA Pathfinder (LPF) mission is to fully characterize the acceleration noise models and to test key technologies for future space-based gravitational-wave observatories similar to the LISA/eLISA concept. The Data Analysis (DA) team has developed complex three-dimensional models of the LISA Technology Package (LTP) experiment on-board LPF. These models are used for simulations, but more importantly, they will be used for parameter estimation purposes during flight operations. One of the tasks of the DA team is to identify the physical effects that contribute significantly to the properties of the instrument noise. One way of approaching this problem is to recover the essential parameters of the LTP which describe the data. Thus, we want to define the simplest model that efficiently explains the observations. To do so, adopting a Bayesian framework, one has to estimate the so-called Bayes factor between two competing models. In our analysis, we use three main methods to estimate...
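
    Estimating a Bayes factor can be illustrated in a conjugate toy setting where both marginal likelihoods are available in closed form. The two models and their priors below are illustrative assumptions of mine, unrelated to the LTP models.

```python
import math

def log_marginal_m0(data, sigma=1.0):
    """Model M0: x_i ~ N(0, sigma^2); no free parameters, so the
    marginal likelihood is just the likelihood."""
    return sum(-0.5 * math.log(2.0 * math.pi * sigma ** 2)
               - 0.5 * (x / sigma) ** 2 for x in data)

def log_marginal_m1(data, sigma=1.0, tau=1.0):
    """Model M1: mu ~ N(0, tau^2), x_i ~ N(mu, sigma^2). The prior is
    conjugate, so mu integrates out in closed form."""
    n, s = len(data), sum(data)
    post_var = 1.0 / (n / sigma ** 2 + 1.0 / tau ** 2)
    log_norm = (-0.5 * n * math.log(2.0 * math.pi * sigma ** 2)
                - 0.5 * math.log(tau ** 2 / post_var))
    quad = -0.5 * (sum(x * x for x in data) / sigma ** 2
                   - post_var * (s / sigma ** 2) ** 2)
    return log_norm + quad

def log_bayes_factor(data):
    """log BF_10; positive values favour the richer model M1."""
    return log_marginal_m1(data) - log_marginal_m0(data)
```

    Data clustered away from zero drive the log Bayes factor positive (the extra parameter earns its keep), while data near zero leave it negative: the marginal likelihood's built-in penalty selects the simplest model that explains the observations, exactly the criterion the abstract describes.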

  2. The GBT 4mm Receiver

    Science.gov (United States)

    Frayer, David T.; White, S.; Watts, G.; Stennes, M.; Maddalena, R. J.; Simon, R.; Pospieszalski, M.; Bryerton, E.

    2013-01-01

    The new 4mm receiver (67--93 GHz) for the Robert C. Byrd Green Bank Telescope (GBT) was built to take advantage of the improved surface accuracy of the dish. The low frequency end of the 3mm atmospheric window is not available with ALMA. (The National Radio Astronomy Observatory is a facility of the National Science Foundation operated under cooperative agreement by Associated Universities, Inc.)

  3. A Bayesian approach to matched field processing in uncertain ocean environments

    Institute of Scientific and Technical Information of China (English)

    LI Jianlong; PAN Xiang

    2008-01-01

    An approach to Bayesian Matched Field Processing (MFP) in an uncertain ocean environment is discussed. In this approach, uncertainty knowledge is modeled, and the spatial and temporal data received by the array are fully used. A mechanism for MFP is thereby found which combines model-based and data-driven methods of uncertain field processing. By theoretical derivation, simulation analysis and validation against experimental array data at sea, we find that (1) the basic components of Bayesian matched field processors are the corresponding sets of the Bartlett matched field processor, the MVDR (minimum variance distortionless response) matched field processor, etc.; (2) Bayesian MVDR/Bartlett MFP is the weighted sum of the MVDR/Bartlett MFP, where the weighting coefficients are the values of the a posteriori probability; (3) in an uncertain ocean environment, Bayesian MFP can locate the source more correctly than MVDR MFP or Bartlett MFP; and (4) Bayesian MFP can better suppress sidelobes of the ambiguity surfaces.
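
    Finding (2), the Bayesian processor as a posterior-weighted sum of conventional ambiguity surfaces, can be sketched for the Bartlett case. The two-element array, the replica vectors, and the posterior weights below are toy assumptions for illustration, not from the paper.

```python
import numpy as np

def bartlett_power(replica, csdm):
    """Conventional Bartlett matched-field power w^H K w for a
    unit-norm replica vector w and data cross-spectral matrix K."""
    w = replica / np.linalg.norm(replica)
    return float(np.real(np.conj(w) @ csdm @ w))

def bayesian_bartlett(replicas, posteriors, csdm):
    """Posterior-weighted sum of Bartlett surfaces over candidate
    environments, per finding (2)."""
    return sum(p * bartlett_power(r, csdm)
               for r, p in zip(replicas, posteriors))

# toy 2-element array; K is a noise-free data cross-spectral matrix
d_true = np.array([1.0, np.exp(1j * 0.5)])
K = np.outer(d_true, np.conj(d_true))
# replicas for two candidate environments, posterior weights 0.7 / 0.3
replicas = [d_true, np.array([1.0, np.exp(1j * 2.0)])]
ambiguity = bayesian_bartlett(replicas, [0.7, 0.3], K)
```

    Averaging the surfaces over environmental hypotheses, weighted by how plausible each hypothesis is given the data, is what lets the Bayesian processor tolerate environmental mismatch better than a single-environment Bartlett or MVDR surface.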

  4. The Diagnosis of Reciprocating Machinery by Bayesian Networks

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    A Bayesian network is a reasoning tool based on probability theory and has many advantages that other reasoning tools do not. This paper discusses the basic theory of Bayesian networks and studies the problems in constructing them. The paper also constructs a Bayesian diagnosis network for a reciprocating compressor. The example supports the conclusion that Bayesian diagnosis networks can diagnose reciprocating machinery effectively.

  5. Aggregated Residential Load Modeling Using Dynamic Bayesian Networks

    Energy Technology Data Exchange (ETDEWEB)

    Vlachopoulou, Maria; Chin, George; Fuller, Jason C.; Lu, Shuai

    2014-09-28

    It is already obvious that the future power grid will have to address higher demand for power and energy, and to incorporate renewable resources with different energy generation patterns. Demand response (DR) schemes could successfully be used to manage and balance power supply and demand under operating conditions of the future power grid. To achieve that, more advanced tools for DR management of operations and planning are necessary that can estimate the available capacity from DR resources. In this research, a Dynamic Bayesian Network (DBN) is derived, trained, and tested that can model the aggregated load of Heating, Ventilation, and Air Conditioning (HVAC) systems. DBNs can provide flexible and powerful tools for both operations and planning, due to their unique analytical capabilities. The DBN model's accuracy and flexibility of use are demonstrated by testing the model under different operational scenarios.

  6. Bayesian Uncertainty Analyses Via Deterministic Model

    Science.gov (United States)

    Krzysztofowicz, R.

    2001-05-01

    Rational decision-making requires that the total uncertainty about a variate of interest (a predictand) be quantified in terms of a probability distribution, conditional on all available information and knowledge. Suppose the state-of-knowledge is embodied in a deterministic model, which is imperfect and outputs only an estimate of the predictand. Fundamentals are presented of three Bayesian approaches to producing a probability distribution of the predictand via any deterministic model. The Bayesian Processor of Output (BPO) quantifies the total uncertainty in terms of a posterior distribution, conditional on model output. The Bayesian Processor of Ensemble (BPE) quantifies the total uncertainty in terms of a posterior distribution, conditional on an ensemble of model output. The Bayesian Forecasting System (BFS) decomposes the total uncertainty into input uncertainty and model uncertainty, which are characterized independently and then integrated into a predictive distribution.
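
    For the linear-Gaussian special case, the BPO posterior conditional on model output has a closed form. The sketch below is one illustrative conjugate case in my own notation, not the general BPO machinery.

```python
def bpo_posterior(prior_mean, prior_var, model_output, error_var):
    """Bayesian Processor of Output, linear-Gaussian special case:
    predictand W ~ N(prior_mean, prior_var) and model output
    X = W + e with e ~ N(0, error_var). Returns the posterior mean
    and variance of W given X = model_output."""
    gain = prior_var / (prior_var + error_var)
    post_mean = prior_mean + gain * (model_output - prior_mean)
    post_var = (1.0 - gain) * prior_var
    return post_mean, post_var

# a model output of 2.0 with unit error variance against a N(0, 1) prior
mean, var = bpo_posterior(0.0, 1.0, 2.0, 1.0)
```

    The posterior variance is strictly smaller than the prior variance but never zero: the deterministic model's estimate sharpens, rather than replaces, the probability distribution of the predictand, which is the point of the abstract.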

  7. A Bayesian approach to model uncertainty

    International Nuclear Information System (INIS)

    A Bayesian approach to model uncertainty is taken. For the case of a finite number of alternative models, the model uncertainty is equivalent to parameter uncertainty. A derivation based on Savage's partition problem is given

  8. Bayesian Control for Concentrating Mixed Nuclear Waste

    OpenAIRE

    Welch, Robert L.; Smith, Clayton

    2013-01-01

    A control algorithm for batch processing of mixed waste is proposed based on conditional Gaussian Bayesian networks. The network is compiled during batch staging for real-time response to sensor input.

  9. An Intuitive Dashboard for Bayesian Network Inference

    International Nuclear Information System (INIS)

    Current Bayesian network software packages provide good graphical interfaces for users who design and develop Bayesian networks for various applications. However, the intended end-users of these networks may not necessarily find such an interface appealing, and at times it can be overwhelming, particularly when the number of nodes in the network is large. To circumvent this problem, this paper presents an intuitive dashboard, which provides an additional layer of abstraction, enabling end-users to easily perform inferences over the Bayesian networks. Unlike most software packages, which display the nodes and arcs of the network, the developed tool organises the nodes based on the cause-and-effect relationship, making the user interaction more intuitive and friendly. In addition to performing various types of inference, users can conveniently use the tool to verify the behaviour of the developed Bayesian network. The tool has been developed using the QT and SMILE libraries in C++

  10. An Intuitive Dashboard for Bayesian Network Inference

    Science.gov (United States)

    Reddy, Vikas; Charisse Farr, Anna; Wu, Paul; Mengersen, Kerrie; Yarlagadda, Prasad K. D. V.

    2014-03-01

    Current Bayesian network software packages provide good graphical interfaces for users who design and develop Bayesian networks for various applications. However, the intended end-users of these networks may not necessarily find such an interface appealing, and at times it can be overwhelming, particularly when the number of nodes in the network is large. To circumvent this problem, this paper presents an intuitive dashboard, which provides an additional layer of abstraction, enabling end-users to easily perform inferences over the Bayesian networks. Unlike most software packages, which display the nodes and arcs of the network, the developed tool organises the nodes based on the cause-and-effect relationship, making the user interaction more intuitive and friendly. In addition to performing various types of inference, users can conveniently use the tool to verify the behaviour of the developed Bayesian network. The tool has been developed using the QT and SMILE libraries in C++.

  11. Nomograms for Visualization of Naive Bayesian Classifier

    OpenAIRE

    Možina, Martin; Demšar, Janez; Michael W Kattan; Zupan, Blaz

    2004-01-01

    Besides good predictive performance, the naive Bayesian classifier can also offer a valuable insight into the structure of the training data and effects of the attributes on the class probabilities. This structure may be effectively revealed through visualization of the classifier. We propose a new way to visualize the naive Bayesian model in the form of a nomogram. The advantages of the proposed method are simplicity of presentation, clear display of the effects of individual attribute value...
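
    The quantity such a nomogram plots is, for each attribute value, its log odds-ratio contribution; summing the contributions and adding the prior log odds recovers the class probability. The probabilities below are illustrative values of mine, not from the paper.

```python
import math

def nomogram_points(p_value_given_pos, p_value_given_neg):
    """Log odds-ratio contribution of one attribute value -- the
    quantity a naive Bayes nomogram draws as that value's offset."""
    return math.log(p_value_given_pos / p_value_given_neg)

def predict_log_odds(prior_pos, contributions):
    """Total log odds = prior log odds + sum of per-attribute points."""
    return math.log(prior_pos / (1.0 - prior_pos)) + sum(contributions)

def to_probability(log_odds):
    return 1.0 / (1.0 + math.exp(-log_odds))

# two attribute values with likelihood ratios 2.0 and 1.5, prior 0.3
points = [nomogram_points(0.6, 0.3), nomogram_points(0.45, 0.3)]
p = to_probability(predict_log_odds(0.3, points))
```

    Because the contributions are additive on the log-odds scale, each attribute's effect can be read off independently, which is exactly the insight into the training data that the visualization offers.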

  12. Subjective Bayesian Analysis: Principles and Practice

    OpenAIRE

    Goldstein, Michael

    2006-01-01

    We address the position of subjectivism within Bayesian statistics. We argue, first, that the subjectivist Bayes approach is the only feasible method for tackling many important practical problems. Second, we describe the essential role of the subjectivist approach in scientific analysis. Third, we consider possible modifications to the Bayesian approach from a subjectivist viewpoint. Finally, we address the issue of pragmatism in implementing the subjectivist approach.

  13. Bayesian Analysis of Multivariate Probit Models

    OpenAIRE

    Siddhartha Chib; Edward Greenberg

    1996-01-01

    This paper provides a unified simulation-based Bayesian and non-Bayesian analysis of correlated binary data using the multivariate probit model. The posterior distribution is simulated by Markov chain Monte Carlo methods, and maximum likelihood estimates are obtained by a Markov chain Monte Carlo version of the E-M algorithm. Computation of Bayes factors from the simulation output is also considered. The methods are applied to a bivariate data set, to a 534-subject, four-year longitudinal dat...

  14. Fitness inheritance in the Bayesian optimization algorithm

    OpenAIRE

    Pelikan, Martin; Sastry, Kumara

    2004-01-01

    This paper describes how fitness inheritance can be used to estimate fitness for a proportion of newly sampled candidate solutions in the Bayesian optimization algorithm (BOA). The goal of estimating fitness for some candidate solutions is to reduce the number of fitness evaluations for problems where fitness evaluation is expensive. Bayesian networks used in BOA to model promising solutions and generate the new ones are extended to allow not only for modeling and sampling candidate solutions...

  15. Kernel Bayesian Inference with Posterior Regularization

    OpenAIRE

    Song, Yang; Jun ZHU; Ren, Yong

    2016-01-01

    We propose a vector-valued regression problem whose solution is equivalent to the reproducing kernel Hilbert space (RKHS) embedding of the Bayesian posterior distribution. This equivalence provides a new understanding of kernel Bayesian inference. Moreover, the optimization problem induces a new regularization for the posterior embedding estimator, which is faster and has comparable performance to the squared regularization in kernel Bayes' rule. This regularization coincides with a former th...

  16. Bayesian Classification in Medicine: The Transferability Question *

    OpenAIRE

    Zagoria, Ronald J.; Reggia, James A.; Price, Thomas R.; Banko, Maryann

    1981-01-01

    Using probabilities derived from a geographically distant patient population, we applied Bayesian classification to categorize stroke patients by etiology. Performance was assessed both by error rate and with a new linear accuracy coefficient. This approach to patient classification was found to be surprisingly accurate when compared to classification by two neurologists and to classification by the Bayesian method using “low cost” local and subjective probabilities. We conclude that for some...

  17. Bayesian target tracking based on particle filter

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    To be able to deal with nonlinear or non-Gaussian problems, particle filters have been studied by many researchers. Based on the particle filter, the extended Kalman filter (EKF) proposal function is applied to Bayesian target tracking. Novel techniques such as the Markov chain Monte Carlo (MCMC) method and the resampling step are also introduced into Bayesian target tracking. The simulation results confirm that the improved particle filter with these techniques outperforms the basic one.
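
    A minimal bootstrap particle filter makes the predict/weight/resample loop concrete. The random-walk model, the noise levels, and the use of the transition prior as proposal (instead of the EKF proposal discussed above) are simplifying assumptions for illustration.

```python
import math
import random

def particle_filter(observations, n_particles=500, q=0.5, r=0.5, seed=1):
    """Minimal bootstrap particle filter for a 1-D random-walk target:
        x_t = x_{t-1} + N(0, q^2),    y_t = x_t + N(0, r^2).
    The transition prior is the proposal here; swapping in an EKF
    proposal only changes the prediction line below."""
    rng = random.Random(seed)
    particles = [rng.gauss(0.0, 1.0) for _ in range(n_particles)]
    estimates = []
    for y in observations:
        # predict: propagate each particle through the motion model
        particles = [x + rng.gauss(0.0, q) for x in particles]
        # weight: Gaussian measurement likelihood
        weights = [math.exp(-0.5 * ((y - x) / r) ** 2) for x in particles]
        total = sum(weights)
        weights = [w / total for w in weights]
        estimates.append(sum(w * x for w, x in zip(weights, particles)))
        # resample (multinomial) to counter weight degeneracy
        particles = rng.choices(particles, weights=weights, k=n_particles)
    return estimates

# track a target drifting from 0 toward 2
estimates = particle_filter([0.0, 0.5, 1.0, 1.5, 2.0, 2.0, 2.0])
```

    The resampling step is one of the "novel techniques" the abstract refers to: without it, after a few steps nearly all weight concentrates on a handful of particles and the estimate degenerates.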

  18. Bayesian Variable Selection in Spatial Autoregressive Models

    OpenAIRE

    Jesus Crespo Cuaresma; Philipp Piribauer

    2015-01-01

    This paper compares the performance of Bayesian variable selection approaches for spatial autoregressive models. We present two alternative approaches which can be implemented using Gibbs sampling methods in a straightforward way and allow us to deal with the problem of model uncertainty in spatial autoregressive models in a flexible and computationally efficient way. In a simulation study we show that the variable selection approaches tend to outperform existing Bayesian model averaging tech...

  19. Fuzzy Functional Dependencies and Bayesian Networks

    Institute of Scientific and Technical Information of China (English)

    LIU WeiYi(刘惟一); SONG Ning(宋宁)

    2003-01-01

    Bayesian networks have become a popular technique for representing and reasoning with probabilistic information. The fuzzy functional dependency is an important kind of data dependencies in relational databases with fuzzy values. The purpose of this paper is to set up a connection between these data dependencies and Bayesian networks. The connection is done through a set of methods that enable people to obtain the most information of independent conditions from fuzzy functional dependencies.

  20. Bayesian Models of Brain and Behaviour

    OpenAIRE

    Penny, William

    2012-01-01

    This paper presents a review of Bayesian models of brain and behaviour. We first review the basic principles of Bayesian inference. This is followed by descriptions of sampling and variational methods for approximate inference, and forward and backward recursions in time for inference in dynamical models. The review of behavioural models covers work in visual processing, sensory integration, sensorimotor integration, and collective decision making. The review of brain models covers a range of...

  1. Bayesian Modeling of a Human MMORPG Player

    CERN Document Server

    Synnaeve, Gabriel

    2010-01-01

    This paper describes an application of Bayesian programming to the control of an autonomous avatar in a multiplayer role-playing game (the example is based on World of Warcraft). We model a particular task, which consists of choosing what to do and to select which target in a situation where allies and foes are present. We explain the model in Bayesian programming and show how we could learn the conditional probabilities from data gathered during human-played sessions.

  2. Bayesian Modeling of a Human MMORPG Player

    Science.gov (United States)

    Synnaeve, Gabriel; Bessière, Pierre

    2011-03-01

    This paper describes an application of Bayesian programming to the control of an autonomous avatar in a multiplayer role-playing game (the example is based on World of Warcraft). We model a particular task, which consists of choosing what to do and to select which target in a situation where allies and foes are present. We explain the model in Bayesian programming and show how we could learn the conditional probabilities from data gathered during human-played sessions.

  3. 21 CFR 1020.10 - Television receivers.

    Science.gov (United States)

    2010-04-01

    ... components of the receiver except that portion of the neck and socket of the cathode-ray tube which normally... of a circuit or shield component. The warning label shall include the specification of operating...

  4. Bayesian inference for OPC modeling

    Science.gov (United States)

    Burbine, Andrew; Sturtevant, John; Fryer, David; Smith, Bruce W.

    2016-03-01

    The use of optical proximity correction (OPC) demands increasingly accurate models of the photolithographic process. Model building and inference techniques in the data science community have seen great strides in the past two decades which make better use of available information. This paper aims to demonstrate the predictive power of Bayesian inference as a method for parameter selection in lithographic models by quantifying the uncertainty associated with model inputs and wafer data. Specifically, the method combines the model builder's prior information about each modelling assumption with the maximization of each observation's likelihood as a Student's t-distributed random variable. Through the use of a Markov chain Monte Carlo (MCMC) algorithm, a model's parameter space is explored to find the most credible parameter values. During parameter exploration, the parameters' posterior distributions are generated by applying Bayes' rule, using a likelihood function and the a priori knowledge supplied. The MCMC algorithm used, an affine invariant ensemble sampler (AIES), is implemented by initializing many walkers which semiindependently explore the space. The convergence of these walkers to global maxima of the likelihood volume determine the parameter values' highest density intervals (HDI) to reveal champion models. We show that this method of parameter selection provides insights into the data that traditional methods do not and outline continued experiments to vet the method.
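
    The affine invariant ensemble sampler (AIES) used above rests on the Goodman-Weare "stretch move", which can be sketched in a few lines. The target below is a standard 2-D Gaussian standing in for a lithographic model's posterior; the walker count, step count, and stretch parameter a are illustrative assumptions (serial-update variant).

```python
import numpy as np

rng = np.random.default_rng(1)

def log_prob(x):
    # Stand-in posterior: standard 2-D Gaussian (not an actual OPC model)
    return -0.5 * np.sum(x ** 2)

def aies(n_walkers=40, n_steps=2000, dim=2, a=2.0):
    walkers = rng.normal(0.0, 3.0, (n_walkers, dim))
    chain = []
    for _ in range(n_steps):
        for k in range(n_walkers):
            j = rng.integers(n_walkers - 1)
            j = j if j < k else j + 1                  # a different walker
            z = ((a - 1) * rng.random() + 1) ** 2 / a  # z ~ g(z) ∝ 1/sqrt(z)
            prop = walkers[j] + z * (walkers[k] - walkers[j])   # stretch move
            log_r = (dim - 1) * np.log(z) + log_prob(prop) - log_prob(walkers[k])
            if np.log(rng.random()) < log_r:
                walkers[k] = prop
        chain.append(walkers.copy())
    return np.concatenate(chain[n_steps // 2:])        # discard burn-in

samples = aies()
print(samples.mean(axis=0).round(2), samples.std(axis=0).round(2))
```

    For the Gaussian target the sample mean and standard deviation should approach 0 and 1 per dimension; highest density intervals can then be read off the pooled samples.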

  5. Bayesian analysis of cosmic structures

    CERN Document Server

    Kitaura, Francisco-Shu

    2011-01-01

    We revise the Bayesian inference steps required to analyse the cosmological large-scale structure. Here we place special emphasis on the complications which arise due to the non-Gaussian character of the galaxy and matter distribution. In particular we investigate the advantages and limitations of the Poisson-lognormal model and discuss how to extend this work. With the lognormal prior, using the Hamiltonian sampling technique and on scales of about 4 h^{-1} Mpc, we find that the over-dense regions are excellently reconstructed; however, under-dense regions (void statistics) are quantitatively poorly recovered. Contrary to the maximum a posteriori (MAP) solution, which was shown to over-estimate the density in the under-dense regions, we obtain lower densities than in N-body simulations. This is due to the fact that the MAP solution is conservative whereas the full posterior yields samples which are consistent with the prior statistics. The lognormal prior is not able to capture the full non-linear regime at scales ...

  6. Bayesian analysis of volcanic eruptions

    Science.gov (United States)

    Ho, Chih-Hsiang

    1990-10-01

    The simple Poisson model generally gives a good fit to many volcanoes for volcanic eruption forecasting. Nonetheless, empirical evidence suggests that volcanic activity in successive equal time-periods tends to be more variable than a simple Poisson with constant eruptive rate. An alternative model is therefore examined in which the eruptive rate (λ) for a given volcano or cluster(s) of volcanoes is described by a gamma distribution (prior) rather than treated as a constant value as in the assumptions of a simple Poisson model. Bayesian analysis is performed to link the two distributions together to give the aggregate behavior of the volcanic activity. When the Poisson process is expanded to accommodate a gamma mixing distribution on λ, a consequence of this mixed (or compound) Poisson model is that the frequency distribution of eruptions in any given time-period of equal length follows the negative binomial distribution (NBD). Applications of the proposed model and comparisons between the generalized model and simple Poisson model are discussed based on the historical eruptive count data of volcanoes Mauna Loa (Hawaii) and Etna (Italy). Several relevant facts lead to the conclusion that the generalized model is preferable for practical use both in space and time.
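
    The gamma-Poisson mixture result is easy to check numerically: draw an eruptive rate λ from a gamma prior, draw a Poisson count given λ, and compare the empirical frequencies with the negative binomial pmf. The shape and scale values below are illustrative assumptions, not values fitted to Mauna Loa or Etna.

```python
import math
import numpy as np

rng = np.random.default_rng(2)

alpha, theta = 2.0, 1.5            # gamma prior on λ (assumed shape, scale)
lam = rng.gamma(alpha, theta, 200_000)
counts = rng.poisson(lam)          # one compound-Poisson count per "time period"

def nbd_pmf(k, a, th):
    # Negative binomial pmf implied by the gamma-Poisson mixture,
    # with success probability p = θ / (1 + θ)
    p = th / (1.0 + th)
    return math.exp(math.lgamma(k + a) - math.lgamma(a) - math.lgamma(k + 1)
                    + a * math.log(1.0 - p) + k * math.log(p))

empirical = np.bincount(counts, minlength=6)[:6] / len(counts)
analytic = np.array([nbd_pmf(k, alpha, theta) for k in range(6)])
diff = float(np.max(np.abs(empirical - analytic)))
print(diff)  # near zero: the simulated mixture matches the NBD
```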

  7. BAYESIAN APPROACH OF DECISION PROBLEMS

    Directory of Open Access Journals (Sweden)

    DRAGOŞ STUPARU

    2010-01-01

    Full Text Available Management is nowadays a basic vector of economic development, a concept frequently used in our country as well as all over the world. Regardless of the hierarchical level at which the managerial process takes place, decision represents its essential moment, the supreme act of managerial activity. Decisions can be met in all fields of activity, with a practically unlimited degree of coverage, and in all the functions of management. It is common knowledge that the activity of any type of manager, no matter the hierarchical level he occupies, represents a chain of interdependent decisions, their aim being the elimination or limitation of the influence of disturbing factors that may endanger the achievement of predetermined objectives; the quality of managerial decisions conditions the progress and viability of any enterprise. Therefore, one of the principal characteristics of a successful manager is the ability to adopt high-quality decisions. The quality of managerial decisions is conditioned by the manager's general level of education and specialization, the manner in which he assimilates the latest information and innovations in the theory and practice of management, and the application of modern managerial methods and techniques in the activity of management. We present below the analysis of decision problems under hazardous conditions in terms of Bayesian theory – a theory that uses probabilistic calculus.
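
    A minimal numerical sketch of the Bayesian decision setting described above: each act's expected payoff is computed under the decision maker's probabilities for the states of nature, and the act with the highest expectation is chosen. All payoffs and probabilities below are invented for illustration.

```python
# Bayesian decision rule: maximize expected payoff under the decision
# maker's (prior) probabilities for the states of nature.
states_p = [0.5, 0.25, 0.25]               # P(market: good, fair, poor) - assumed
payoffs = {
    "expand":   [80, 30, -40],             # payoff of each act per state
    "hold":     [40, 25, 5],
    "contract": [10, 10, 10],
}
expected = {act: sum(p * v for p, v in zip(states_p, row))
            for act, row in payoffs.items()}
best = max(expected, key=expected.get)
print(best, expected[best])  # → expand 37.5
```

    Updating `states_p` with new evidence via Bayes' rule and recomputing the expectations gives the posterior-analysis version of the same decision problem.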

  8. Sensitivity modeling of binary optical receivers.

    Science.gov (United States)

    Giggenbach, Dirk; Mata-Calvo, Ramon

    2015-10-01

    The sensitivity characteristics of optical receiver frontends for high-speed data communications depend on modulation format, detector type, and specific operational constraints. A general mathematical model of the receiver sensitivity that fits to analytical as well as measured data is required to compare different receiver implementations and assess the reliability of data links under varying received power as common in free-space optical communication links. In this paper, a new approach based on Q-factor modeling is presented, compared with analytical receiver models, and applied to a multitude of exemplary receiver implementations. A methodology is introduced to generally apply the model to ideal or practical binary optical receiver frontends. PMID:26479592

  9. Bayesian demography 250 years after Bayes.

    Science.gov (United States)

    Bijak, Jakub; Bryant, John

    2016-01-01

    Bayesian statistics offers an alternative to classical (frequentist) statistics. It is distinguished by its use of probability distributions to describe uncertain quantities, which leads to elegant solutions to many difficult statistical problems. Although Bayesian demography, like Bayesian statistics more generally, is around 250 years old, only recently has it begun to flourish. The aim of this paper is to review the achievements of Bayesian demography, address some misconceptions, and make the case for wider use of Bayesian methods in population studies. We focus on three applications: demographic forecasts, limited data, and highly structured or complex models. The key advantages of Bayesian methods are the ability to integrate information from multiple sources and to describe uncertainty coherently. Bayesian methods also allow for including additional (prior) information next to the data sample. As such, Bayesian approaches are complementary to many traditional methods, which can be productively re-expressed in Bayesian terms. PMID:26902889

  10. Bayesian demography 250 years after Bayes.

    Science.gov (United States)

    Bijak, Jakub; Bryant, John

    2016-01-01

    Bayesian statistics offers an alternative to classical (frequentist) statistics. It is distinguished by its use of probability distributions to describe uncertain quantities, which leads to elegant solutions to many difficult statistical problems. Although Bayesian demography, like Bayesian statistics more generally, is around 250 years old, only recently has it begun to flourish. The aim of this paper is to review the achievements of Bayesian demography, address some misconceptions, and make the case for wider use of Bayesian methods in population studies. We focus on three applications: demographic forecasts, limited data, and highly structured or complex models. The key advantages of Bayesian methods are the ability to integrate information from multiple sources and to describe uncertainty coherently. Bayesian methods also allow for including additional (prior) information next to the data sample. As such, Bayesian approaches are complementary to many traditional methods, which can be productively re-expressed in Bayesian terms.

  11. Assessment of receiver operating characteristic curve on diagnostic value of nail creatinine in acute and chronic renal failure

    Institute of Scientific and Technical Information of China (English)

    秦小琪; 李惊子; 王海燕

    2001-01-01

    Objective: To search for a diagnostic critical value of nail creatinine (NCr) for differentiating acute renal failure (ARF) from chronic renal failure (CRF). Methods: Using the receiver operating characteristic (ROC) curve method, we evaluated the diagnostic critical value of NCr as an index for differentiating ARF and CRF. Results: Because the individual distributions of NCr in ARF and CRF overlap, the sensitivity (Se) and specificity (Sp) vary with the cut-off point selected, yielding different areas under the ROC curve. With the five cut-off points chosen in this study, the area under the ROC curve was 78.9, and the peak of the curve corresponded to an NCr value of 84.9. Conclusions: Because the ROC curve method combines Se and Sp to estimate the diagnostic critical value of a disease index and judges accuracy by the area under the curve, an ROC curve area of 0.7-0.9 indicates a degree of accuracy, and an NCr value of 84.9 has clinical value as the critical value for differentiating acute from chronic renal failure.
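
    The ROC logic of such a study can be sketched with synthetic data: two overlapping NCr-like distributions, sensitivity and specificity at a candidate cut-off, and the area under the curve computed as the probability that a random diseased value exceeds a random non-diseased one. The distributions and cut-off below are assumptions, not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic stand-in for nail creatinine in CRF vs ARF patients; the two
# distributions overlap, so no cut-off separates them perfectly.
crf = rng.normal(100.0, 30.0, 300)     # "diseased" class (higher NCr)
arf = rng.normal(70.0, 30.0, 300)

cutoff = 85.0                           # one candidate diagnostic cut-off
se = float(np.mean(crf > cutoff))       # sensitivity for CRF at this cut-off
sp = float(np.mean(arf <= cutoff))      # specificity

# AUC = P(random CRF value > random ARF value), the Mann-Whitney form
auc = float(np.mean(crf[:, None] > arf[None, :]))
print(round(se, 2), round(sp, 2), round(auc, 2))
```

    Sweeping `cutoff` over a grid and plotting sensitivity against 1 - specificity traces the full ROC curve; the pairwise AUC above equals its trapezoidal area in the limit of a fine grid.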

  12. Ceramic Solar Receiver

    Science.gov (United States)

    Robertson, C., Jr.

    1984-01-01

    Solar receiver uses ceramic honeycomb matrix to absorb heat from Sun and transfer it to working fluid at temperatures of 1,095 degrees and 1,650 degrees C. Drives gas turbine engine or provides heat for industrial processes.

  13. Receiver Gain Modulation Circuit

    Science.gov (United States)

    Jones, Hollis; Racette, Paul; Walker, David; Gu, Dazhen

    2011-01-01

    A receiver gain modulation circuit (RGMC) was developed that modulates the power gain of the output of a radiometer receiver with a test signal. As the radiometer receiver switches between calibration noise references, the test signal is mixed with the calibrated noise and thus produces an ensemble set of measurements from which ensemble statistical analysis can be used to extract statistical information about the test signal. The RGMC is an enabling technology of the ensemble detector. As a key component for achieving ensemble detection and analysis, the RGMC has broad aeronautical and space applications. The RGMC can be used to test and develop new calibration algorithms, for example, to detect gain anomalies, and/or correct for slow drifts that affect climate-quality measurements over an accelerated time scale. A generalized approach to analyzing radiometer system designs yields a mathematical treatment of noise reference measurements in calibration algorithms. By treating the measurements from the different noise references as ensemble samples of the receiver state, i.e. receiver gain, a quantitative description of the non-stationary properties of the underlying receiver fluctuations can be derived. Excellent agreement has been obtained between model calculations and radiometric measurements. The mathematical formulation is equivalent to modulating the gain of a stable receiver with an externally generated signal and is the basis for ensemble detection and analysis (EDA). The concept of generating ensemble data sets using an ensemble detector is similar to the ensemble data sets generated as part of ensemble empirical mode decomposition (EEMD), with the exception of a key distinguishing factor. EEMD adds noise to the signal under study whereas EDA mixes the signal with calibrated noise. It is mixing with calibrated noise that permits the measurement of temporal-functional variability of uncertainty in the underlying process. The RGMC permits the evaluation of EDA by

  14. Landslide susceptibility mapping along road corridors in the Indian Himalayas using Bayesian logistic regression models

    Science.gov (United States)

    Das, Iswar; Stein, Alfred; Kerle, Norman; Dadhwal, Vinay K.

    2012-12-01

    Landslide susceptibility mapping (LSM) along road corridors in the Indian Himalayas is an essential exercise that helps planners and decision makers in determining the severity of probable slope failure areas. Logistic regression is commonly applied for this purpose, as it is a robust and straightforward technique that is relatively easy to handle. Ordinary logistic regression as a data-driven technique, however, does not allow inclusion of prior information. This study presents Bayesian logistic regression (BLR) for landslide susceptibility assessment along road corridors. The methodology is tested in a landslide-prone area in the Bhagirathi river valley in the Indian Himalayas. Parameter estimates from BLR are compared with those obtained from ordinary logistic regression. By means of iterative Markov Chain Monte Carlo simulation, BLR provides a rich set of results on parameter estimation. We assessed model performance by receiver operating characteristic (ROC) curve analysis, and validated the model using 50% of the landslide cells kept apart for testing and validation. The study concludes that BLR performs better in posterior parameter estimation in general and uncertainty estimation in particular.

  15. Comparison of Two Gas Selection Methodologies: An Application of Bayesian Model Averaging

    Energy Technology Data Exchange (ETDEWEB)

    Renholds, Andrea S.; Thompson, Sandra E.; Anderson, Kevin K.; Chilton, Lawrence K.

    2006-03-31

    One goal of hyperspectral imagery analysis is the detection and characterization of plumes. Characterization includes identifying the gases in the plumes, which is a model selection problem. Two gas selection methods compared in this report are Bayesian model averaging (BMA) and minimum Akaike information criterion (AIC) stepwise regression (SR). Simulated spectral data from a three-layer radiance transfer model were used to compare the two methods. Test gases were chosen to span the types of spectra observed, which exhibit peaks ranging from broad to sharp. The size and complexity of the search libraries were varied. Background materials were chosen to either replicate a remote area of eastern Washington or feature many common background materials. For many cases, BMA and SR performed the detection task comparably in terms of the receiver operating characteristic curves. For some gases, BMA performed better than SR when the size and complexity of the search library increased. This is encouraging because we expect improved BMA performance upon incorporation of prior information on background materials and gases.
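
    A toy version of the BMA side of this comparison: candidate "gas" subsets are treated as regression models, scored with BIC, and exp(-BIC/2) weights approximate posterior model probabilities. The regression setup, coefficients, and noise level below are illustrative assumptions.

```python
import itertools
import numpy as np

rng = np.random.default_rng(4)

# The "spectrum" y is generated from two of three candidate regressors;
# every subset of regressors is one candidate model.
n = 200
X = rng.normal(size=(n, 3))
y = 2.0 * X[:, 0] + 1.0 * X[:, 1] + rng.normal(0.0, 0.5, n)

def bic(cols):
    # Least-squares fit of the subset plus an intercept, then BIC
    A = np.column_stack([np.ones(n)] + [X[:, c] for c in cols])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    rss = float(np.sum((y - A @ beta) ** 2))
    return n * np.log(rss / n) + A.shape[1] * np.log(n)

models = [m for r in range(4) for m in itertools.combinations(range(3), r)]
b = np.array([bic(m) for m in models])
w = np.exp(-0.5 * (b - b.min()))
w /= w.sum()                              # approximate posterior model weights

weight_true = sum(float(w[i]) for i, m in enumerate(models) if {0, 1} <= set(m))
print(models[int(np.argmax(w))], round(weight_true, 3))
```

    Nearly all posterior weight should land on models containing the two true regressors; stepwise AIC selection would instead return a single subset with no weight attached.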

  16. Using Bayesian networks to analyze occupational stress caused by work demands: preventing stress through social support.

    Science.gov (United States)

    García-Herrero, Susana; Mariscal, M A; Gutiérrez, J M; Ritzel, Dale O

    2013-08-01

    Occupational stress is a major health hazard and a serious challenge to the effective operation of any company and represents a major problem for both individuals and organizations. Previous research has shown that high demands (e.g. workload, emotional) combined with low resources (e.g. support, control, rewards) are associated with adverse health (e.g. psychological, physical) and organizational impacts (e.g. reduced job satisfaction, sickness absence). The objective of the present work is to create a model to analyze how social support reduces the occupational stress caused by work demands. This study used existing Spanish national data on working conditions collected by the Spanish Ministry of Labour and Immigration in 2007, where 11,054 workers were interviewed by questionnaire. A probabilistic model was built using Bayesian networks to explain the relationships between work demands and occupational stress. The model also explains how social support contributes positively to reducing stress levels. The variables studied were intellectually demanding work, overwork, workday, stress, and social support. The results show the importance of social support and of receiving help from supervisors and co-workers in preventing occupational stress. The study provides a new methodology that explains and quantifies the effects of intellectually demanding work, overwork, and workday in occupational stress. Also, the study quantifies the importance of social support to reduce occupational stress.

  17. Constitution and application of reactor make-up system's fault diagnostic Bayesian networks

    International Nuclear Information System (INIS)

    A fault diagnostic Bayesian network for the reactor make-up system was constituted. The system's structural characteristics, operation rules and experts' experience were combined to build an initial network. After the fault data sets were learned with particle swarm optimization based Bayesian network structure learning, the structure of the diagnostic network was completed and used for case inference. The resulting network can analyze the diagnostic probability of every node in the net and provide decision support for fault diagnosis. (authors)
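
    Inference in a diagnostic Bayesian network of this kind reduces, in the simplest case, to Bayes' rule over a small joint distribution. The two-symptom fault net below is an invented toy, not the reactor make-up system model; all probabilities are assumed.

```python
# Hypothetical diagnostic net: Fault -> LowFlow, Fault -> Alarm.
# Exact inference by enumerating the joint distribution (Bayes' rule).
p_fault = 0.05                          # prior P(Fault) - assumed
p_lowflow = {True: 0.9, False: 0.1}     # P(LowFlow | Fault)
p_alarm = {True: 0.8, False: 0.05}      # P(Alarm | Fault)

def posterior_fault(lowflow, alarm):
    def joint(f):
        pf = p_fault if f else 1.0 - p_fault
        pl = p_lowflow[f] if lowflow else 1.0 - p_lowflow[f]
        pa = p_alarm[f] if alarm else 1.0 - p_alarm[f]
        return pf * pl * pa
    return joint(True) / (joint(True) + joint(False))

print(round(posterior_fault(True, True), 3))    # both symptoms: fault likely
print(round(posterior_fault(False, False), 3))  # no symptoms: fault unlikely
```

    A learned network of many nodes performs the same computation with smarter bookkeeping (e.g. variable elimination) instead of full enumeration.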

  18. An introduction to Gaussian Bayesian networks.

    Science.gov (United States)

    Grzegorczyk, Marco

    2010-01-01

    The extraction of regulatory networks and pathways from postgenomic data is important for drug discovery and development, as the extracted pathways reveal how genes or proteins regulate each other. Following up on the seminal paper of Friedman et al. (J Comput Biol 7:601-620, 2000), Bayesian networks have been widely applied as a popular tool to this end in systems biology research. Their popularity stems from the tractability of the marginal likelihood of the network structure, which is a consistent scoring scheme in the Bayesian context. This score is based on an integration over the entire parameter space, for which highly expensive computational procedures have to be applied when using more complex models based on differential equations; for example, see (Bioinformatics 24:833-839, 2008). This chapter gives an introduction to reverse engineering regulatory networks and pathways with Gaussian Bayesian networks, that is Bayesian networks with the probabilistic BGe scoring metric [see (Geiger and Heckerman 235-243, 1995)]. In the BGe model, the data are assumed to stem from a Gaussian distribution and a normal-Wishart prior is assigned to the unknown parameters. Gaussian Bayesian network methodology for analysing static observational, static interventional as well as dynamic (observational) time series data will be described in detail in this chapter. Finally, we apply these Bayesian network inference methods (1) to observational and interventional flow cytometry (protein) data from the well-known RAF pathway to evaluate the global network reconstruction accuracy of Bayesian network inference and (2) to dynamic gene expression time series data of nine circadian genes in Arabidopsis thaliana to reverse engineer the unknown regulatory network topology for this domain. PMID:20824469

  19. Bayesian calibration of power plant models for accurate performance prediction

    International Nuclear Information System (INIS)

    Highlights: • Bayesian calibration is applied to power plant performance prediction. • Measurements from a plant in operation are used for model calibration. • A gas turbine performance model and steam cycle model are calibrated. • An integrated plant model is derived. • Part load efficiency is accurately predicted as a function of ambient conditions. - Abstract: Gas turbine combined cycles are expected to play an increasingly important role in the balancing of supply and demand in future energy markets. Thermodynamic modeling of these energy systems is frequently applied to assist in decision making processes related to the management of plant operation and maintenance. In most cases, model inputs, parameters and outputs are treated as deterministic quantities and plant operators make decisions with limited or no regard of uncertainties. As the steady integration of wind and solar energy into the energy market induces extra uncertainties, part load operation and reliability are becoming increasingly important. In the current study, methods are proposed to not only quantify various types of uncertainties in measurements and plant model parameters using measured data, but to also assess their effect on various aspects of performance prediction. The authors aim to account for model parameter and measurement uncertainty, and for systematic discrepancy of models with respect to reality. For this purpose, the Bayesian calibration framework of Kennedy and O’Hagan is used, which is especially suitable for high-dimensional industrial problems. The article derives a calibrated model of the plant efficiency as a function of ambient conditions and operational parameters, which is also accurate in part load. The article shows that complete statistical modeling of power plants not only enhances process models, but can also increase confidence in operational decisions.

  20. Bayesian joint modeling of longitudinal and spatial survival AIDS data.

    Science.gov (United States)

    Martins, Rui; Silva, Giovani L; Andreozzi, Valeska

    2016-08-30

    Joint analysis of longitudinal and survival data has received increasing attention in the recent years, especially for analyzing cancer and AIDS data. As both repeated measurements (longitudinal) and time-to-event (survival) outcomes are observed in an individual, a joint modeling is more appropriate because it takes into account the dependence between the two types of responses, which are often analyzed separately. We propose a Bayesian hierarchical model for jointly modeling longitudinal and survival data considering functional time and spatial frailty effects, respectively. That is, the proposed model deals with non-linear longitudinal effects and spatial survival effects accounting for the unobserved heterogeneity among individuals living in the same region. This joint approach is applied to a cohort study of patients with HIV/AIDS in Brazil during the years 2002-2006. Our Bayesian joint model presents considerable improvements in the estimation of survival times of the Brazilian HIV/AIDS patients when compared with those obtained through a separate survival model and shows that the spatial risk of death is the same across the different Brazilian states. Copyright © 2016 John Wiley & Sons, Ltd. PMID:26990773

  1. Mobile sensor network noise reduction and recalibration using a Bayesian network

    Science.gov (United States)

    Xiang, Y.; Tang, Y.; Zhu, W.

    2016-02-01

    People are becoming increasingly interested in mobile air quality sensor network applications. By eliminating the inaccuracies caused by spatial and temporal heterogeneity of pollutant distributions, this method shows great potential for atmospheric research. However, systems based on low-cost air quality sensors often suffer from sensor noise and drift. For the sensing systems to operate stably and reliably in real-world applications, those problems must be addressed. In this work, we exploit the correlation of different types of sensors caused by cross sensitivity to help identify and correct the outlier readings. By employing a Bayesian network based system, we are able to recover the erroneous readings and recalibrate the drifted sensors simultaneously. Our method improves upon the state-of-the-art Bayesian belief network techniques by incorporating the virtual evidence and adjusting the sensor calibration functions recursively. Specifically, we have (1) designed a system based on the Bayesian belief network to detect and recover the abnormal readings, (2) developed methods to update the sensor calibration functions in the field without requiring ground truth, and (3) extended the Bayesian network with virtual evidence for in-field sensor recalibration. To validate our approach, we have tested our technique with metal oxide sensors measuring NO2, CO, and O3 in a real-world deployment. Compared with the existing Bayesian belief network techniques, results based on our experiment setup demonstrate that our system can reduce error by 34.1 % and recover 4 times more data on average.

  2. Forecasting the 2012 and 2014 Elections Using Bayesian Prediction and Optimization

    Directory of Open Access Journals (Sweden)

    Steven E. Rigdon

    2015-04-01

    Full Text Available This article presents a data-driven Bayesian model used to predict the state-by-state winners in the Senate and presidential elections in 2012 and 2014. The Bayesian model takes into account the proportions of polled subjects who favor each candidate and the proportion who are undecided, and produces a posterior probability that each candidate will win each state. From this, a dynamic programming algorithm is used to compute the probability mass functions for the number of electoral votes that each presidential candidate receives and the number of Senate seats that each party receives. On the final day before the 2012 election, the model gave a probability of (essentially) one that President Obama would be reelected and that the Democrats would retain control of the U.S. Senate. In 2014, the model gave a final probability of .99 that the Republicans would take control of the Senate.
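
    The two stages described above (a posterior win probability per state, then a dynamic program over states) can be sketched as follows. The three states, poll counts, and electoral-vote totals are invented, and a flat Beta(1,1) prior on each candidate's support is an assumption.

```python
import numpy as np

rng = np.random.default_rng(5)

# (respondents for A, respondents for B, electoral votes) per state - invented
polls = [(520, 480, 10), (300, 340, 6), (255, 245, 8)]

def p_win(a, b, draws=100_000):
    # Beta(1+a, 1+b) posterior for A's support; P(A wins) = P(theta > 1/2)
    theta = rng.beta(1 + a, 1 + b, draws)
    return float(np.mean(theta > 0.5))

probs_ev = [(p_win(a, b), ev) for a, b, ev in polls]
total = sum(ev for _, ev in probs_ev)

# Dynamic program: convolve the win/lose outcome of each state into the
# probability mass function of A's total electoral votes.
pmf = np.zeros(total + 1)
pmf[0] = 1.0
for p, ev in probs_ev:
    new = pmf * (1 - p)
    new[ev:] += pmf[:total + 1 - ev] * p
    pmf = new

print(round(float(pmf.sum()), 6), int(np.argmax(pmf)))
```

    The same convolution over 50 states (or 33-odd Senate races) yields the full distribution of electoral votes or Senate seats, from which the probability of winning is a tail sum.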

  3. Tactile length contraction as Bayesian inference.

    Science.gov (United States)

    Tong, Jonathan; Ngo, Vy; Goldreich, Daniel

    2016-08-01

    To perceive, the brain must interpret stimulus-evoked neural activity. This is challenging: The stochastic nature of the neural response renders its interpretation inherently uncertain. Perception would be optimized if the brain used Bayesian inference to interpret inputs in light of expectations derived from experience. Bayesian inference would improve perception on average but cause illusions when stimuli violate expectation. Intriguingly, tactile, auditory, and visual perception are all prone to length contraction illusions, characterized by the dramatic underestimation of the distance between punctate stimuli delivered in rapid succession; the origin of these illusions has been mysterious. We previously proposed that length contraction illusions occur because the brain interprets punctate stimulus sequences using Bayesian inference with a low-velocity expectation. A novel prediction of our Bayesian observer model is that length contraction should intensify if stimuli are made more difficult to localize. Here we report a tactile psychophysical study that tested this prediction. Twenty humans compared two distances on the forearm: a fixed reference distance defined by two taps with 1-s temporal separation and an adjustable comparison distance defined by two taps with temporal separation t ≤ 1 s. We observed significant length contraction: As t was decreased, participants perceived the two distances as equal only when the comparison distance was made progressively greater than the reference distance. Furthermore, the use of weaker taps significantly enhanced participants' length contraction. These findings confirm the model's predictions, supporting the view that the spatiotemporal percept is a best estimate resulting from a Bayesian inference process. PMID:27121574
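
    A simplified Gaussian version of such an observer model (a hypothetical sketch, not the authors' exact formulation) reproduces both predictions: combining a noisy measured separation with a zero-centred low-velocity prior whose width grows with the elapsed time t yields a posterior mean that contracts more for shorter t and for noisier (weaker) taps. All numbers below are illustrative.

```python
# Conjugate-Gaussian observer: measurement m ~ N(length, sigma^2), prior
# length ~ N(0, (v_prior * t)^2). The posterior mean shrinks m toward zero;
# the shrinkage grows as t falls or as sensory noise sigma rises.
def perceived_length(m, t, sigma=4.0, v_prior=10.0):
    tau2 = (v_prior * t) ** 2          # prior variance on the separation
    return m * tau2 / (tau2 + sigma ** 2)

m = 6.0                                 # true separation (cm, assumed)
print(perceived_length(m, 1.0))         # mild contraction at t = 1 s
print(perceived_length(m, 0.2))         # stronger contraction at t = 0.2 s
print(perceived_length(m, 0.2, sigma=8.0))  # weaker taps: contraction grows
```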

  4. Dimensionality reduction in Bayesian estimation algorithms

    Directory of Open Access Journals (Sweden)

    G. W. Petty

    2013-03-01

    Full Text Available An idealized synthetic database loosely resembling 3-channel passive microwave observations of precipitation against a variable background is employed to examine the performance of a conventional Bayesian retrieval algorithm. For this dataset, algorithm performance is found to be poor owing to an irreconcilable conflict between the need to find matches in the dependent database versus the need to exclude inappropriate matches. It is argued that the likelihood of such conflicts increases sharply with the dimensionality of the observation space of real satellite sensors, which may utilize 9 to 13 channels to retrieve precipitation, for example. An objective method is described for distilling the relevant information content from N real channels into a much smaller number (M) of pseudochannels while also regularizing the background (geophysical plus instrument noise) component. The pseudochannels are linear combinations of the original N channels obtained via a two-stage principal component analysis of the dependent dataset. Bayesian retrievals based on a single pseudochannel applied to the independent dataset yield striking improvements in overall performance. The differences between the conventional Bayesian retrieval and reduced-dimensional Bayesian retrieval suggest that a major potential problem with conventional multichannel retrievals – whether Bayesian or not – lies in the common but often inappropriate assumption of diagonal error covariance. The dimensional reduction technique described herein avoids this problem by, in effect, recasting the retrieval problem in a coordinate system in which the desired covariance is lower-dimensional, diagonal, and unit magnitude.
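
    The pseudochannel construction can be sketched with a single-stage principal component analysis (the paper uses a two-stage variant with background regularization, which is omitted here). The channel count, noise level, and mixing below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)

# N correlated channels driven by one underlying signal plus channel noise;
# project onto the M leading principal components ("pseudochannels").
n, N, M = 5000, 9, 2
signal = rng.normal(size=(n, 1))
obs = signal @ np.ones((1, N)) + 0.3 * rng.normal(size=(n, N))

centered = obs - obs.mean(axis=0)
cov = centered.T @ centered / (n - 1)
eigval, eigvec = np.linalg.eigh(cov)        # eigenvalues in ascending order
pseudo = centered @ eigvec[:, -M:]          # the M pseudochannels
explained = float(eigval[-M:].sum() / eigval.sum())
print(round(explained, 3))  # most of the 9-channel variance survives in 2
```

    A Bayesian retrieval can then match observations in the low-dimensional pseudochannel space, where a diagonal error covariance is a far better approximation than in the original channel space.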

  5. Tactile length contraction as Bayesian inference.

    Science.gov (United States)

    Tong, Jonathan; Ngo, Vy; Goldreich, Daniel

    2016-08-01

    To perceive, the brain must interpret stimulus-evoked neural activity. This is challenging: The stochastic nature of the neural response renders its interpretation inherently uncertain. Perception would be optimized if the brain used Bayesian inference to interpret inputs in light of expectations derived from experience. Bayesian inference would improve perception on average but cause illusions when stimuli violate expectation. Intriguingly, tactile, auditory, and visual perception are all prone to length contraction illusions, characterized by the dramatic underestimation of the distance between punctate stimuli delivered in rapid succession; the origin of these illusions has been mysterious. We previously proposed that length contraction illusions occur because the brain interprets punctate stimulus sequences using Bayesian inference with a low-velocity expectation. A novel prediction of our Bayesian observer model is that length contraction should intensify if stimuli are made more difficult to localize. Here we report a tactile psychophysical study that tested this prediction. Twenty humans compared two distances on the forearm: a fixed reference distance defined by two taps with 1-s temporal separation and an adjustable comparison distance defined by two taps with temporal separation t ≤ 1 s. We observed significant length contraction: As t was decreased, participants perceived the two distances as equal only when the comparison distance was made progressively greater than the reference distance. Furthermore, the use of weaker taps significantly enhanced participants' length contraction. These findings confirm the model's predictions, supporting the view that the spatiotemporal percept is a best estimate resulting from a Bayesian inference process.

  6. Risk-Based Operation and Maintenance Using Bayesian Networks

    DEFF Research Database (Denmark)

    Nielsen, Jannie Jessen; Sørensen, John Dalsgaard

    2011-01-01

    This paper describes how risk-based decision making can be used for maintenance planning of components exposed to degradation such as fatigue in offshore wind turbines. In fatigue models, large epistemic uncertainties are usually present. These can be reduced if monitoring results are used...

  7. Central solar energy receiver

    Science.gov (United States)

    Drost, M. Kevin

    1983-01-01

An improved tower-mounted central solar energy receiver for heating air drawn through the receiver by an induced draft fan. A number of vertically oriented, energy-absorbing, fin-shaped slats are radially arranged in a number of concentric cylindrical arrays on top of the tower, coaxially surrounding a pipe with air holes through which the fan draws the air that is heated by the slats, which in turn receive the solar radiation from a heliostat field. A number of vertically oriented, wedge-shaped columns are radially arranged in a number of concentric cylindrical clusters surrounding the slat arrays. The columns have two mirror-reflecting sides to reflect radiation into the slat arrays and one energy-absorbing side to reduce reradiation and reflection from the slat arrays.

  8. Online Variational Bayesian Filtering-Based Mobile Target Tracking in Wireless Sensor Networks

    OpenAIRE

    Bingpeng Zhou; Qingchun Chen; Tiffany Jing Li; Pei Xiao

    2014-01-01

    The received signal strength (RSS)-based online tracking for a mobile node in wireless sensor networks (WSNs) is investigated in this paper. Firstly, a multi-layer dynamic Bayesian network (MDBN) is introduced to characterize the target mobility with either directional or undirected movement. In particular, it is proposed to employ the Wishart distribution to approximate the time-varying RSS measurement precision's randomness due to the target movement. It is shown that the proposed MDBN offe...

  9. Bayesball: A Bayesian hierarchical model for evaluating fielding in major league baseball

    OpenAIRE

    Jensen, Shane T.; Shirley, Kenneth E.; Wyner, Abraham J.

    2008-01-01

    The use of statistical modeling in baseball has received substantial attention recently in both the media and academic community. We focus on a relatively under-explored topic: the use of statistical models for the analysis of fielding based on high-resolution data consisting of on-field location of batted balls. We combine spatial modeling with a hierarchical Bayesian structure in order to evaluate the performance of individual fielders while sharing information between fielders at each posi...

  10. Bayesian Methods for Radiation Detection and Dosimetry

    CERN Document Server

    Groer, Peter G

    2002-01-01

We performed work in three areas: radiation detection, and external and internal radiation dosimetry. In radiation detection we developed Bayesian techniques to estimate the net activity of high and low activity radioactive samples. These techniques have the advantage that the remaining uncertainty about the net activity is described by probability densities. Graphs of the densities show the uncertainty in pictorial form. Figure 1 below demonstrates this point. We applied stochastic processes to develop a method for obtaining Bayesian estimates of 222Rn-daughter products from observed counting rates. In external radiation dosimetry we studied and developed Bayesian methods to estimate radiation doses to an individual from radiation-induced chromosome aberrations. We analyzed chromosome aberrations after exposure to gammas and neutrons and developed a method for dose estimation after criticality accidents. The research in internal radiation dosimetry focused on parameter estimation for compartmental models from observed comp...

  11. Adaptive approximate Bayesian computation for complex models

    CERN Document Server

    Lenormand, Maxime; Deffuant, Guillaume

    2011-01-01

Approximate Bayesian computation (ABC) is a family of computational techniques in Bayesian statistics. These techniques make it possible to fit a model to data without computing the model likelihood; instead, they require simulating the model to be fitted a large number of times. A number of refinements to the original rejection-based ABC scheme have been proposed, including the sequential improvement of posterior distributions. This technique decreases the number of model simulations required, but it still presents several shortcomings that are particularly problematic for complex models that are costly to simulate. We here provide a new algorithm to perform adaptive approximate Bayesian computation, which is shown to perform better on both a toy example and a complex social model.
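For readers unfamiliar with the rejection-based ABC scheme that the refinements above build on, a minimal sketch follows; the toy problem, tolerance, and parameter values are hypothetical.

```python
import numpy as np

def rejection_abc(observed, simulate, prior_sample, n_draws, eps):
    """Minimal rejection ABC: keep prior draws whose simulated summary
    statistic lies within eps of the observed one (no likelihood needed)."""
    accepted = []
    for _ in range(n_draws):
        theta = prior_sample()
        if abs(simulate(theta) - observed) < eps:
            accepted.append(theta)
    return np.array(accepted)

# Toy problem: infer the mean of a Normal(theta, 1) from the sample mean.
rng = np.random.default_rng(1)
true_theta = 2.0
data_mean = rng.normal(true_theta, 1.0, size=200).mean()

posterior = rejection_abc(
    observed=data_mean,
    simulate=lambda th: rng.normal(th, 1.0, size=200).mean(),
    prior_sample=lambda: rng.uniform(-5.0, 5.0),  # flat prior on theta
    n_draws=20000,
    eps=0.1,
)
```

Sequential and adaptive variants such as the one proposed in the paper reduce the number of simulations by iteratively tightening eps and refining the proposal distribution.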

  12. Learning Bayesian Networks from Correlated Data

    Science.gov (United States)

    Bae, Harold; Monti, Stefano; Montano, Monty; Steinberg, Martin H.; Perls, Thomas T.; Sebastiani, Paola

    2016-05-01

    Bayesian networks are probabilistic models that represent complex distributions in a modular way and have become very popular in many fields. There are many methods to build Bayesian networks from a random sample of independent and identically distributed observations. However, many observational studies are designed using some form of clustered sampling that introduces correlations between observations within the same cluster and ignoring this correlation typically inflates the rate of false positive associations. We describe a novel parameterization of Bayesian networks that uses random effects to model the correlation within sample units and can be used for structure and parameter learning from correlated data without inflating the Type I error rate. We compare different learning metrics using simulations and illustrate the method in two real examples: an analysis of genetic and non-genetic factors associated with human longevity from a family-based study, and an example of risk factors for complications of sickle cell anemia from a longitudinal study with repeated measures.

  13. Bayesian Fusion of Multi-Band Images

    CERN Document Server

    Wei, Qi; Tourneret, Jean-Yves

    2013-01-01

    In this paper, a Bayesian fusion technique for remotely sensed multi-band images is presented. The observed images are related to the high spectral and high spatial resolution image to be recovered through physical degradations, e.g., spatial and spectral blurring and/or subsampling defined by the sensor characteristics. The fusion problem is formulated within a Bayesian estimation framework. An appropriate prior distribution exploiting geometrical consideration is introduced. To compute the Bayesian estimator of the scene of interest from its posterior distribution, a Markov chain Monte Carlo algorithm is designed to generate samples asymptotically distributed according to the target distribution. To efficiently sample from this high-dimension distribution, a Hamiltonian Monte Carlo step is introduced in the Gibbs sampling strategy. The efficiency of the proposed fusion method is evaluated with respect to several state-of-the-art fusion techniques. In particular, low spatial resolution hyperspectral and mult...

  14. Bayesian Image Reconstruction Based on Voronoi Diagrams

    CERN Document Server

    Cabrera, G F; Hitschfeld, N

    2007-01-01

    We present a Bayesian Voronoi image reconstruction technique (VIR) for interferometric data. Bayesian analysis applied to the inverse problem allows us to derive the a-posteriori probability of a novel parameterization of interferometric images. We use a variable Voronoi diagram as our model in place of the usual fixed pixel grid. A quantization of the intensity field allows us to calculate the likelihood function and a-priori probabilities. The Voronoi image is optimized including the number of polygons as free parameters. We apply our algorithm to deconvolve simulated interferometric data. Residuals, restored images and chi^2 values are used to compare our reconstructions with fixed grid models. VIR has the advantage of modeling the image with few parameters, obtaining a better image from a Bayesian point of view.

  15. Dynamic Bayesian Combination of Multiple Imperfect Classifiers

    CERN Document Server

    Simpson, Edwin; Psorakis, Ioannis; Smith, Arfon

    2012-01-01

    Classifier combination methods need to make best use of the outputs of multiple, imperfect classifiers to enable higher accuracy classifications. In many situations, such as when human decisions need to be combined, the base decisions can vary enormously in reliability. A Bayesian approach to such uncertain combination allows us to infer the differences in performance between individuals and to incorporate any available prior knowledge about their abilities when training data is sparse. In this paper we explore Bayesian classifier combination, using the computationally efficient framework of variational Bayesian inference. We apply the approach to real data from a large citizen science project, Galaxy Zoo Supernovae, and show that our method far outperforms other established approaches to imperfect decision combination. We go on to analyse the putative community structure of the decision makers, based on their inferred decision making strategies, and show that natural groupings are formed. Finally we present ...

  16. Bayesian inference of the metazoan phylogeny

    DEFF Research Database (Denmark)

    Glenner, Henrik; Hansen, Anders J; Sørensen, Martin V;

    2004-01-01

    been the only feasible combined approach but is highly sensitive to long-branch attraction. Recent development of stochastic models for discrete morphological characters and computationally efficient methods for Bayesian inference has enabled combined molecular and morphological data analysis...... with rigorous statistical approaches less prone to such inconsistencies. We present the first statistically founded analysis of a metazoan data set based on a combination of morphological and molecular data and compare the results with a traditional parsimony analysis. Interestingly, the Bayesian analyses...... such as the ecdysozoans and lophotrochozoans. Parsimony, on the contrary, shows conflicting results, with morphology being congruent to the Bayesian results and the molecular data set producing peculiarities that are largely reflected in the combined analysis....

  17. Variational Bayesian Inference of Line Spectra

    DEFF Research Database (Denmark)

    Badiu, Mihai Alin; Hansen, Thomas Lundgaard; Fleury, Bernard Henri

    2016-01-01

In this paper, we address the fundamental problem of line spectral estimation in a Bayesian framework. We target model order and parameter estimation via variational inference in a probabilistic model in which the frequencies are continuous-valued, i.e., not restricted to a grid; and the coefficients are governed by a Bernoulli-Gaussian prior model turning model order selection into binary sequence detection. Unlike earlier works which retain only point estimates of the frequencies, we undertake a more complete Bayesian treatment by estimating the posterior probability density functions (pdfs...

  18. Event generator tuning using Bayesian optimization

    CERN Document Server

    Ilten, Philip; Yang, Yunjie

    2016-01-01

    Monte Carlo event generators contain a large number of parameters that must be determined by comparing the output of the generator with experimental data. Generating enough events with a fixed set of parameter values to enable making such a comparison is extremely CPU intensive, which prohibits performing a simple brute-force grid-based tuning of the parameters. Bayesian optimization is a powerful method designed for such black-box tuning applications. In this article, we show that Monte Carlo event generator parameters can be accurately obtained using Bayesian optimization and minimal expert-level physics knowledge. A tune of the PYTHIA 8 event generator using $e^+e^-$ events, where 20 parameters are optimized, can be run on a modern laptop in just two days. Combining the Bayesian optimization approach with expert knowledge should enable producing better tunes in the future, by making it faster and easier to study discrepancies between Monte Carlo and experimental data.

  19. Hessian PDF reweighting meets the Bayesian methods

    CERN Document Server

    Paukkunen, Hannu

    2014-01-01

We discuss the Hessian PDF reweighting - a technique intended to estimate the effects that new measurements have on a set of PDFs. The method stems straightforwardly from considering new data in a usual $\chi^2$-fit and it naturally incorporates also non-zero values for the tolerance, $\Delta\chi^2>1$. In comparison to the contemporary Bayesian reweighting techniques, there is no need to generate large ensembles of PDF Monte-Carlo replicas, and the observables need to be evaluated only with the central and the error sets of the original PDFs. In spite of the apparently rather different methodologies, we find that the Hessian and the Bayesian techniques are actually equivalent if the $\Delta\chi^2$ criterion is properly included in the Bayesian likelihood function, which is a simple exponential.

  20. A Large Sample Study of the Bayesian Bootstrap

    OpenAIRE

    Lo, Albert Y.

    1987-01-01

    An asymptotic justification of the Bayesian bootstrap is given. Large-sample Bayesian bootstrap probability intervals for the mean, the variance and bands for the distribution, the smoothed density and smoothed rate function are also provided.
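As a concrete illustration of the construction the abstract analyzes, here is a minimal Bayesian bootstrap for the mean (Rubin's 1981 scheme: flat Dirichlet weights instead of multinomial resampling); the data are synthetic.

```python
import numpy as np

def bayesian_bootstrap_mean(x, n_rep=4000, seed=0):
    """Bayesian bootstrap posterior draws for the mean: each replicate
    weights the data by a flat Dirichlet(1, ..., 1) vector instead of
    resampling with replacement as in the classical bootstrap."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    # Dirichlet(1,...,1) weights, one row per replicate.
    w = rng.dirichlet(np.ones(len(x)), size=n_rep)
    return w @ x

x = np.random.default_rng(2).normal(10.0, 2.0, size=100)
draws = bayesian_bootstrap_mean(x)
lo, hi = np.percentile(draws, [2.5, 97.5])  # 95% probability interval
```

The interval (lo, hi) is the kind of large-sample Bayesian bootstrap probability interval for the mean whose asymptotic justification the paper provides.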

  1. Zero-power receiver

    Energy Technology Data Exchange (ETDEWEB)

    Brocato, Robert W.

    2016-10-04

    An unpowered signal receiver and a method for signal reception detects and responds to very weak signals using pyroelectric devices as impedance transformers and/or demodulators. In some embodiments, surface acoustic wave devices (SAW) are also used. Illustrative embodiments include satellite and long distance terrestrial communications applications.

  2. Inferring on the intentions of others by hierarchical Bayesian learning.

    Directory of Open Access Journals (Sweden)

    Andreea O Diaconescu

    2014-09-01

Full Text Available Inferring on others' (potentially time-varying) intentions is a fundamental problem during many social transactions. To investigate the underlying mechanisms, we applied computational modeling to behavioral data from an economic game in which 16 pairs of volunteers (randomly assigned to "player" or "adviser" roles) interacted. The player performed a probabilistic reinforcement learning task, receiving information about a binary lottery from a visual pie chart. The adviser, who received more predictive information, issued an additional recommendation. Critically, the game was structured such that the adviser's incentives to provide helpful or misleading information varied in time. Using a meta-Bayesian modeling framework, we found that the players' behavior was best explained by the deployment of hierarchical learning: they inferred upon the volatility of the advisers' intentions in order to optimize their predictions about the validity of their advice. Beyond learning, volatility estimates also affected the trial-by-trial variability of decisions: participants were more likely to rely on their estimates of advice accuracy for making choices when they believed that the adviser's intentions were presently stable. Finally, our model of the players' inference predicted the players' interpersonal reactivity index (IRI) scores, explicit ratings of the advisers' helpfulness and the advisers' self-reports on their chosen strategy. Overall, our results suggest that humans (i) employ hierarchical generative models to infer on the changing intentions of others, (ii) use volatility estimates to inform decision-making in social interactions, and (iii) integrate estimates of advice accuracy with non-social sources of information. The Bayesian framework presented here can quantify individual differences in these mechanisms from simple behavioral readouts and may prove useful in future clinical studies of maladaptive social cognition.

  3. Length Scales in Bayesian Automatic Adaptive Quadrature

    Directory of Open Access Journals (Sweden)

    Adam Gh.

    2016-01-01

Full Text Available Two conceptual developments in the Bayesian automatic adaptive quadrature approach to the numerical solution of one-dimensional Riemann integrals [Gh. Adam, S. Adam, Springer LNCS 7125, 1–16 (2012)] are reported. First, it is shown that the numerical quadrature which avoids overcomputing and minimizes the hidden floating point loss of precision asks for the consideration of three classes of integration domain lengths endowed with specific quadrature sums: microscopic (trapezoidal rule), mesoscopic (Simpson rule), and macroscopic (quadrature sums of high algebraic degrees of precision). Second, sensitive diagnostic tools for the Bayesian inference on macroscopic ranges, coming from the use of Clenshaw-Curtis quadrature, are derived.
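The two low-order quadrature sums named above are standard; a minimal sketch shows why the Simpson rule suits longer ("mesoscopic") subranges better than the trapezoidal rule.

```python
import math

def trapezoid(f, a, b):
    """Trapezoidal rule on [a, b]: exact for polynomials of degree <= 1."""
    return 0.5 * (b - a) * (f(a) + f(b))

def simpson(f, a, b):
    """Simpson rule on [a, b]: exact for polynomials of degree <= 3."""
    m = 0.5 * (a + b)
    return (b - a) / 6.0 * (f(a) + 4.0 * f(m) + f(b))

# On a single subinterval Simpson is markedly more accurate.
exact = math.exp(1.0) - 1.0          # integral of e^x over [0, 1]
err_trap = abs(trapezoid(math.exp, 0.0, 1.0) - exact)
err_simp = abs(simpson(math.exp, 0.0, 1.0) - exact)
```

The higher algebraic degree of precision of the Simpson rule (3 versus 1) is what makes it appropriate once the subrange is long enough that the extra function evaluation pays off.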

  4. Bayesian estimation and tracking a practical guide

    CERN Document Server

    Haug, Anton J

    2012-01-01

A practical approach to estimating and tracking dynamic systems in real-world applications. Much of the literature on performing estimation for non-Gaussian systems is short on practical methodology, while Gaussian methods often lack a cohesive derivation. Bayesian Estimation and Tracking addresses the gap in the field on both accounts, providing readers with a comprehensive overview of methods for estimating both linear and nonlinear dynamic systems driven by Gaussian and non-Gaussian noise. Featuring a unified approach to Bayesian estimation and tracking, the book emphasizes the derivation

  5. Bayesian Optimisation Algorithm for Nurse Scheduling

    CERN Document Server

    Li, Jingpeng

    2008-01-01

Our research has shown that schedules can be built mimicking a human scheduler by using a set of rules that involve domain knowledge. This chapter presents a Bayesian Optimization Algorithm (BOA) for the nurse scheduling problem that chooses suitable scheduling rules from a set for each nurse's assignment. Based on the idea of using probabilistic models, the BOA builds a Bayesian network for the set of promising solutions and samples this network to generate new candidate solutions. Computational results from 52 real data instances demonstrate the success of this approach. It is also suggested that the learning mechanism in the proposed algorithm may be suitable for other scheduling problems.

  6. A Bayesian Analysis of Spectral ARMA Model

    Directory of Open Access Journals (Sweden)

    Manoel I. Silvestre Bezerra

    2012-01-01

Full Text Available Bezerra et al. (2008) proposed a new method, based on the Yule-Walker equations, to estimate the ARMA spectral model. In this paper, a Bayesian approach is developed for this model by using the noninformative prior proposed by Jeffreys (1967). For the Bayesian computations, simulation via Markov chain Monte Carlo (MCMC) is carried out, and characteristics of the marginal posterior distributions, such as the Bayes estimator and confidence intervals for the parameters of the ARMA model, are derived. Both methods are also compared with the traditional least squares and maximum likelihood approaches, and a numerical illustration with two examples of the ARMA model is presented to evaluate the performance of the procedures.
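The Yule-Walker equations underlying the estimator can be illustrated for the pure-AR case. This is a simplified sketch with hypothetical data; the paper's extension to ARMA spectral models and the Bayesian treatment are not reproduced here.

```python
import numpy as np

def yule_walker(x, p):
    """Estimate AR(p) coefficients from sample autocovariances by
    solving the Yule-Walker equations R a = r."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    # Biased sample autocovariances r[0..p].
    r = np.array([x[: n - k] @ x[k:] / n for k in range(p + 1)])
    # Toeplitz autocovariance matrix R[i, j] = r[|i - j|].
    R = np.array([[r[abs(i - j)] for j in range(p)] for i in range(p)])
    return np.linalg.solve(R, r[1:])

# Recover the coefficient of a simulated AR(1): x_t = 0.7 x_{t-1} + e_t.
rng = np.random.default_rng(3)
e = rng.normal(size=5000)
x = np.zeros(5000)
for t in range(1, 5000):
    x[t] = 0.7 * x[t - 1] + e[t]
a_hat = yule_walker(x, p=1)
```

The fitted coefficients determine the AR part of the spectral density; the ARMA case additionally requires handling the moving-average terms, which is where the methods compared in the paper differ.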

  7. A Bayesian Concept Learning Approach to Crowdsourcing

    DEFF Research Database (Denmark)

    Viappiani, Paolo Renato; Zilles, Sandra; Hamilton, Howard J.;

    2011-01-01

    We develop a Bayesian approach to concept learning for crowdsourcing applications. A probabilistic belief over possible concept definitions is maintained and updated according to (noisy) observations from experts, whose behaviors are modeled using discrete types. We propose recommendation...... techniques, inference methods, and query selection strategies to assist a user charged with choosing a configuration that satisfies some (partially known) concept. Our model is able to simultaneously learn the concept definition and the types of the experts. We evaluate our model with simulations, showing...... that our Bayesian strategies are effective even in large concept spaces with many uninformative experts....

  8. Comparison of the Bayesian and Frequentist Approach to the Statistics

    OpenAIRE

    Hakala, Michal

    2015-01-01

The Thesis deals with an introduction to Bayesian statistics and compares the Bayesian approach with the frequentist approach to statistics. Bayesian statistics is a modern branch of statistics which provides a comprehensive alternative theory to the frequentist approach. Bayesian concepts provide solutions for problems not solvable by frequentist theory. The thesis compares the definitions, concepts and quality of statistical inference. The main interest is focused on point estimation, an in...

  9. Flood alert system based on bayesian techniques

    Science.gov (United States)

    Gulliver, Z.; Herrero, J.; Viesca, C.; Polo, M. J.

    2012-04-01

The problem of floods in the Mediterranean regions is closely linked to the occurrence of torrential storms in dry regions, where even the water supply relies on adequate water management. Like other Mediterranean basins in Southern Spain, the Guadalhorce River Basin is a medium sized watershed (3856 km2) where recurrent yearly floods occur, mainly in autumn and spring periods, driven by cold front phenomena. The torrential character of the precipitation in such small basins, with a concentration time of less than 12 hours, produces flash flood events with catastrophic effects over the city of Malaga (600000 inhabitants). From this fact arises the need for specific alert tools which can forecast these kinds of phenomena. Bayesian networks (BN) have emerged in the last decade as a very useful and reliable computational tool for water resources and for the decision making process. The joint use of Artificial Neural Networks (ANN) and BN has allowed us to recognize and simulate the two different types of hydrological behaviour in the basin: natural and regulated. This led to the establishment of causal relationships between precipitation, discharge from upstream reservoirs, and water levels at a gauging station. It was seen that a recurrent ANN model working at an hourly scale, considering daily precipitation and the two previous hourly values of reservoir discharge and water level, could provide R2 values of 0.86. The BN results slightly improve this fit, and in addition provide an uncertainty estimate for the prediction. In our current work to design a weather warning service based on Bayesian techniques, the first steps were carried out through an analysis of the correlations between the water level and rainfall at certain representative points in the basin, along with the upstream reservoir discharge. The lower correlation found between precipitation and water level emphasizes the highly regulated condition of the stream. The autocorrelations of the variables were also

  10. A Fast Iterative Bayesian Inference Algorithm for Sparse Channel Estimation

    DEFF Research Database (Denmark)

    Pedersen, Niels Lovmand; Manchón, Carles Navarro; Fleury, Bernard Henri

    2013-01-01

    representation of the Bessel K probability density function; a highly efficient, fast iterative Bayesian inference method is then applied to the proposed model. The resulting estimator outperforms other state-of-the-art Bayesian and non-Bayesian estimators, either by yielding lower mean squared estimation error...

  11. A default Bayesian hypothesis test for ANOVA designs

    NARCIS (Netherlands)

    R. Wetzels; R.P.P.P. Grasman; E.J. Wagenmakers

    2012-01-01

    This article presents a Bayesian hypothesis test for analysis of variance (ANOVA) designs. The test is an application of standard Bayesian methods for variable selection in regression models. We illustrate the effect of various g-priors on the ANOVA hypothesis test. The Bayesian test for ANOVA desig

  12. A Gentle Introduction to Bayesian Analysis : Applications to Developmental Research

    NARCIS (Netherlands)

    Van de Schoot, Rens; Kaplan, David; Denissen, Jaap; Asendorpf, Jens B.; Neyer, Franz J.; van Aken, Marcel A G

    2014-01-01

    Bayesian statistical methods are becoming ever more popular in applied and fundamental research. In this study a gentle introduction to Bayesian analysis is provided. It is shown under what circumstances it is attractive to use Bayesian estimation, and how to interpret properly the results. First, t

  13. Bayesian Just-So Stories in Psychology and Neuroscience

    Science.gov (United States)

    Bowers, Jeffrey S.; Davis, Colin J.

    2012-01-01

    According to Bayesian theories in psychology and neuroscience, minds and brains are (near) optimal in solving a wide range of tasks. We challenge this view and argue that more traditional, non-Bayesian approaches are more promising. We make 3 main arguments. First, we show that the empirical evidence for Bayesian theories in psychology is weak.…

  14. Bayesian analysis of censored response data in family-based genetic association studies.

    Science.gov (United States)

    Del Greco M, Fabiola; Pattaro, Cristian; Minelli, Cosetta; Thompson, John R

    2016-09-01

    Biomarkers are subject to censoring whenever some measurements are not quantifiable given a laboratory detection limit. Methods for handling censoring have received less attention in genetic epidemiology, and censored data are still often replaced with a fixed value. We compared different strategies for handling a left-censored continuous biomarker in a family-based study, where the biomarker is tested for association with a genetic variant, S, adjusting for a covariate, X. Allowing different correlations between X and S, we compared simple substitution of censored observations with the detection limit followed by a linear mixed effect model (LMM), Bayesian model with noninformative priors, Tobit model with robust standard errors, the multiple imputation (MI) with and without S in the imputation followed by a LMM. Our comparison was based on real and simulated data in which 20% and 40% censoring were artificially induced. The complete data were also analyzed with a LMM. In the MICROS study, the Bayesian model gave results closer to those obtained with the complete data. In the simulations, simple substitution was always the most biased method, the Tobit approach gave the least biased estimates at all censoring levels and correlation values, the Bayesian model and both MI approaches gave slightly biased estimates but smaller root mean square errors. On the basis of these results the Bayesian approach is highly recommended for candidate gene studies; however, the computationally simpler Tobit and the MI without S are both good options for genome-wide studies.
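The Tobit likelihood that performed well in the simulations above treats censored points through the normal CDF at the detection limit. A minimal sketch on synthetic data follows; the numbers are hypothetical, and SciPy's general-purpose optimizer stands in for whatever fitting routine the study used.

```python
import numpy as np
from scipy import optimize, stats

def tobit_mle(y, limit):
    """MLE for a normal mean/sd under left-censoring at a detection limit:
    censored observations contribute the log CDF at the limit, the rest
    the usual log density (the Tobit likelihood)."""
    y = np.asarray(y, dtype=float)
    cens = y <= limit

    def nll(params):
        mu, log_sd = params
        sd = np.exp(log_sd)
        ll = stats.norm.logpdf(y[~cens], mu, sd).sum()
        ll += cens.sum() * stats.norm.logcdf(limit, mu, sd)
        return -ll

    res = optimize.minimize(nll, x0=[y.mean(), np.log(y.std())],
                            method="Nelder-Mead")
    mu, log_sd = res.x
    return mu, np.exp(log_sd)

# Simulated biomarker: true mean 1.0, sd 1.0, detection limit 0.5.
rng = np.random.default_rng(4)
y_true = rng.normal(1.0, 1.0, size=500)
limit = 0.5
y_obs = np.maximum(y_true, limit)   # simple substitution with the limit
mu_sub = y_obs.mean()               # biased upward by the substitution
mu_mle, sd_mle = tobit_mle(y_obs, limit)
```

The comparison mirrors the abstract's finding: simple substitution of censored values with the detection limit biases the estimated mean, while the censoring-aware likelihood recovers it.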

  15. Bayesian Approach for Reliability Assessment of Sunshield Deployment on JWST

    Science.gov (United States)

    Kaminskiy, Mark P.; Evans, John W.; Gallo, Luis D.

    2013-01-01

Deployable subsystems are essential to mission success of most spacecraft. These subsystems enable critical functions including power, communications and thermal control. The loss of any of these functions will generally result in loss of the mission. These subsystems and their components often consist of unique designs and applications, for which various standardized data sources are not applicable for estimating reliability and for assessing risks. In this study, a Bayesian approach for reliability estimation of spacecraft deployment was developed for this purpose. This approach was then applied to the James Webb Space Telescope (JWST) Sunshield subsystem, a unique design intended for thermal control of the observatory's telescope and science instruments. In order to collect the prior information on deployable systems, detailed studies of "heritage information" were conducted, extending over 45 years of spacecraft launches. The NASA Goddard Space Flight Center (GSFC) Spacecraft Operational Anomaly and Reporting System (SOARS) data were then used to estimate the parameters of the conjugate beta prior distribution for anomaly and failure occurrence, as the most consistent set of available data that could be matched to launch histories. This allows for an empirical Bayesian prediction of the risk of an anomaly occurrence during the complex Sunshield deployment, with credibility limits, using prior deployment data and test information.
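The conjugate beta-binomial update at the heart of such an approach is simple to state. A sketch with hypothetical prior and deployment counts (not the study's actual numbers):

```python
import numpy as np

def deployment_risk(prior_a, prior_b, n_new, k_new, n_draws=100000, seed=0):
    """Conjugate beta-binomial update for a per-deployment anomaly
    probability. prior_a/prior_b encode heritage data (e.g. a fitted
    beta prior); k_new anomalies in n_new new deployments update it.
    Returns the posterior mean and a Monte Carlo 95% credibility interval."""
    a = prior_a + k_new
    b = prior_b + (n_new - k_new)
    draws = np.random.default_rng(seed).beta(a, b, size=n_draws)
    lo, hi = np.percentile(draws, [2.5, 97.5])
    return a / (a + b), (lo, hi)

# Hypothetical numbers: a Beta(2, 38) heritage prior (mean anomaly rate 5%)
# updated with 0 anomalies observed in 10 new deployments.
mean, (lo, hi) = deployment_risk(prior_a=2.0, prior_b=38.0, n_new=10, k_new=0)
```

Because the beta prior is conjugate to the binomial likelihood, the update is just the count arithmetic above, and the credibility limits come directly from the posterior beta distribution.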

  16. Most frugal explanations in Bayesian networks

    NARCIS (Netherlands)

    Kwisthout, J.H.P.

    2015-01-01

    Inferring the most probable explanation to a set of variables, given a partial observation of the remaining variables, is one of the canonical computational problems in Bayesian networks, with widespread applications in AI and beyond. This problem, known as MAP, is computationally intractable (NP-ha

  17. Bayesian semiparametric dynamic Nelson-Siegel model

    NARCIS (Netherlands)

    C. Cakmakli

    2011-01-01

    This paper proposes the Bayesian semiparametric dynamic Nelson-Siegel model where the density of the yield curve factors and thereby the density of the yields are estimated along with other model parameters. This is accomplished by modeling the error distributions of the factors according to a Diric

  18. Von Neumann was not a Quantum Bayesian.

    Science.gov (United States)

    Stacey, Blake C

    2016-05-28

    Wikipedia has claimed for over 3 years now that John von Neumann was the 'first quantum Bayesian'. In context, this reads as stating that von Neumann inaugurated QBism, the approach to quantum theory promoted by Fuchs, Mermin and Schack. This essay explores how such a claim is, historically speaking, unsupported. PMID:27091166

  19. Von Neumann Was Not a Quantum Bayesian

    OpenAIRE

    Blake C. Stacey

    2014-01-01

    Wikipedia has claimed for over three years now that John von Neumann was the "first quantum Bayesian." In context, this reads as stating that von Neumann inaugurated QBism, the approach to quantum theory promoted by Fuchs, Mermin and Schack. This essay explores how such a claim is, historically speaking, unsupported.

  20. A Bayesian Approach to Interactive Retrieval

    Science.gov (United States)

    Tague, Jean M.

    1973-01-01

    A probabilistic model for interactive retrieval is presented. Bayesian statistical decision theory principles are applied: use of prior and sample information about the relationship of document descriptions to query relevance; maximization of expected value of a utility function, to the problem of optimally restructuring search strategies in an…

  1. Bayesian regularization of diffusion tensor images

    DEFF Research Database (Denmark)

    Frandsen, Jesper; Hobolth, Asger; Østergaard, Leif;

    2007-01-01

    several directions. The measured diffusion coefficients and thereby the diffusion tensors are subject to noise, leading to possibly flawed representations of the three dimensional fibre bundles. In this paper we develop a Bayesian procedure for regularizing the diffusion tensor field, fully utilizing...

  2. Inverse Problems in a Bayesian Setting

    KAUST Repository

    Matthies, Hermann G.

    2016-02-13

    In a Bayesian setting, inverse problems and uncertainty quantification (UQ)—the propagation of uncertainty through a computational (forward) model—are strongly connected. In the form of conditional expectation the Bayesian update becomes computationally attractive. We give a detailed account of this approach via conditional approximation, various approximations, and the construction of filters. Together with a functional or spectral approach for the forward UQ there is no need for time-consuming and slowly convergent Monte Carlo sampling. The developed sampling-free non-linear Bayesian update in form of a filter is derived from the variational problem associated with conditional expectation. This formulation in general calls for further discretisation to make the computation possible, and we choose a polynomial approximation. After giving details on the actual computation in the framework of functional or spectral approximations, we demonstrate the workings of the algorithm on a number of examples of increasing complexity. At last, we compare the linear and nonlinear Bayesian update in form of a filter on some examples.
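
    The linear Bayesian update mentioned at the end of the abstract reduces, in the simplest scalar Gaussian case, to the familiar Kalman-type formula. A minimal sketch of that scalar case only (the paper's functional/spectral discretisation is not reproduced here):

```python
# Minimal sketch of a linear (Kalman-type) Bayesian update for a scalar
# parameter x observed through y = H*x + noise. This illustrates the general
# sampling-free update; the paper's polynomial/spectral machinery is omitted.

def linear_bayes_update(x_mean, x_var, y, H, noise_var):
    """Condition a Gaussian prior N(x_mean, x_var) on one linear observation."""
    K = x_var * H / (H * H * x_var + noise_var)   # gain
    return x_mean + K * (y - H * x_mean), (1.0 - K * H) * x_var

m, v = linear_bayes_update(x_mean=0.0, x_var=1.0, y=2.0, H=1.0, noise_var=1.0)
print(m, v)  # 1.0 0.5
```

    Observing y = 2 with unit noise pulls the prior mean halfway toward the data and halves the variance, exactly as conditional expectation prescribes in the Gaussian-linear case.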

  3. Comprehension and computation in Bayesian problem solving

    Directory of Open Access Journals (Sweden)

    Eric D. Johnson

    2015-07-01

    Full Text Available Humans have long been characterized as poor probabilistic reasoners when presented with explicit numerical information. Bayesian word problems provide a well-known example of this, where even highly educated and cognitively skilled individuals fail to adhere to mathematical norms. It is widely agreed that natural frequencies can facilitate Bayesian reasoning relative to normalized formats (e.g., probabilities, percentages), both by clarifying logical set-subset relations and by simplifying numerical calculations. Nevertheless, between-study performance on transparent Bayesian problems varies widely, and generally remains rather unimpressive. We suggest there has been an over-focus on this representational facilitator (i.e. transparent problem structures) at the expense of the specific logical and numerical processing requirements and the corresponding individual abilities and skills necessary for providing Bayesian-like output given specific verbal and numerical input. We further suggest that understanding this task-individual pair could benefit from considerations from the literature on mathematical cognition, which emphasizes text comprehension and problem solving, along with contributions of online executive working memory, metacognitive regulation, and relevant stored knowledge and skills. We conclude by offering avenues for future research aimed at identifying the stages in problem solving at which correct versus incorrect reasoners depart, and how individual differences might influence this time point.
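
    The facilitation effect can be made concrete with the textbook natural-frequency calculation. A minimal sketch with made-up numbers (not taken from the article):

```python
# Natural-frequency reformulation of a standard Bayesian word problem:
# out of 1000 people, how many positive tests come from true cases?
# All rates below are illustrative, not data from the article.

population = 1000
base_rate = 0.01        # 10 of 1000 have the condition
sensitivity = 0.8       # 8 of those 10 test positive
false_pos_rate = 0.096  # about 95 of the 990 healthy also test positive

true_pos = population * base_rate * sensitivity
false_pos = population * (1 - base_rate) * false_pos_rate
posterior = true_pos / (true_pos + false_pos)
print(round(posterior, 3))  # ≈ 0.078: most positives are false positives
```

    Phrased as counts ("8 of roughly 103 positives are true cases"), the set-subset relation that normalized probabilities obscure becomes visible.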

  4. Bayesian Vector Autoregressions with Stochastic Volatility

    NARCIS (Netherlands)

    Uhlig, H.F.H.V.S.

    1996-01-01

    This paper proposes a Bayesian approach to a vector autoregression with stochastic volatility, where the multiplicative evolution of the precision matrix is driven by a multivariate beta variate. Exact updating formulas are given for the nonlinear filtering of the precision matrix. Estimation of the au

  5. Scaling Bayesian network discovery through incremental recovery

    NARCIS (Netherlands)

    Castelo, J.R.; Siebes, A.P.J.M.

    1999-01-01

    Bayesian networks are a type of graphical models that, e.g., allow one to analyze the interaction among the variables in a database. A well-known problem with the discovery of such models from a database is the ``problem of high-dimensionality''. That is, the discovery of a network from a database w

  6. A Bayesian Bootstrap for a Finite Population

    OpenAIRE

    Lo, Albert Y.

    1988-01-01

    A Bayesian bootstrap for a finite population is introduced; its small-sample distributional properties are discussed and compared with those of the frequentist bootstrap for a finite population. It is also shown that the two are first-order asymptotically equivalent.
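
    The (infinite-population) Bayesian bootstrap of Rubin draws Dirichlet(1, ..., 1) weights over the observed values; a minimal sketch via sorted-uniform gaps follows. The finite-population version discussed in the paper adds a further resampling step, which this sketch omits:

```python
import random

# One Bayesian-bootstrap replicate: Dirichlet(1, ..., 1) weights obtained as
# gaps between sorted Uniform(0, 1) draws, then a weighted mean of the sample.
# The data values are made up for illustration.

def bayesian_bootstrap_weights(n, rng=random):
    cuts = sorted(rng.random() for _ in range(n - 1))
    edges = [0.0] + cuts + [1.0]
    return [edges[i + 1] - edges[i] for i in range(n)]

data = [3.1, 2.4, 5.9, 4.2]
w = bayesian_bootstrap_weights(len(data))
posterior_mean_draw = sum(wi * xi for wi, xi in zip(w, data))
print(sum(w), posterior_mean_draw)  # weights sum to 1
```

    Repeating this many times yields a posterior sample for the mean; the frequentist bootstrap instead uses multinomial counts, which the two-bootstrap comparison in the paper contrasts.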

  7. Bayesian calibration for forensic age estimation.

    Science.gov (United States)

    Ferrante, Luigi; Skrami, Edlira; Gesuita, Rosaria; Cameriere, Roberto

    2015-05-10

    Forensic medicine is increasingly called upon to assess the age of individuals. Forensic age estimation is mostly required in relation to illegal immigration and identification of bodies or skeletal remains. A variety of age estimation methods are based on dental samples and use of regression models, where the age of an individual is predicted by morphological tooth changes that take place over time. From the medico-legal point of view, regression models, with age as the dependent random variable entail that age tends to be overestimated in the young and underestimated in the old. To overcome this bias, we describe a new full Bayesian calibration method (asymmetric Laplace Bayesian calibration) for forensic age estimation that uses asymmetric Laplace distribution as the probability model. The method was compared with three existing approaches (two Bayesian and a classical method) using simulated data. Although its accuracy was comparable with that of the other methods, the asymmetric Laplace Bayesian calibration appears to be significantly more reliable and robust in case of misspecification of the probability model. The proposed method was also applied to a real dataset of values of the pulp chamber of the right lower premolar measured on x-ray scans of individuals of known age. PMID:25645903

  8. Exploiting structure in cooperative Bayesian games

    NARCIS (Netherlands)

    F.A. Oliehoek; S. Whiteson; M.T.J. Spaan

    2012-01-01

    Cooperative Bayesian games (BGs) can model decision-making problems for teams of agents under imperfect information, but require space and computation time that is exponential in the number of agents. While agent independence has been used to mitigate these problems in perfect information settings,

  9. Perfect Bayesian equilibrium. Part II: epistemic foundations

    OpenAIRE

    Bonanno, Giacomo

    2011-01-01

    In a companion paper we introduced a general notion of perfect Bayesian equilibrium which can be applied to arbitrary extensive-form games. The essential ingredient of the proposed definition is the qualitative notion of AGM-consistency. In this paper we provide an epistemic foundation for AGM-consistency based on the AGM theory of belief revision.

  10. Decision generation tools and Bayesian inference

    Science.gov (United States)

    Jannson, Tomasz; Wang, Wenjian; Forrester, Thomas; Kostrzewski, Andrew; Veeris, Christian; Nielsen, Thomas

    2014-05-01

    Digital Decision Generation (DDG) tools are important software sub-systems of Command and Control (C2) systems and technologies. In this paper, we present a special type of DDG based on Bayesian Inference, related to adverse (hostile) networks, including such important applications as terrorism-related and organized-crime networks.

  11. Von Neumann Was Not a Quantum Bayesian

    CERN Document Server

    Stacey, Blake C

    2014-01-01

    Wikipedia has claimed for over two years now that John von Neumann was the "first quantum Bayesian." In context, this reads as stating that von Neumann inaugurated QBism, the approach to quantum theory promoted by Fuchs, Mermin and Schack. This essay explores how such a claim is, historically speaking, unsupported.

  12. Bayesian calibration of car-following models

    NARCIS (Netherlands)

    Van Hinsbergen, C.P.IJ.; Van Lint, H.W.C.; Hoogendoorn, S.P.; Van Zuylen, H.J.

    2010-01-01

    Recent research has revealed that there exist large inter-driver differences in car-following behavior such that different car-following models may apply to different drivers. This study applies Bayesian techniques to the calibration of car-following models, where prior distributions on each model p

  13. Basics of Bayesian Learning - Basically Bayes

    DEFF Research Database (Denmark)

    Larsen, Jan

    Tutorial presented at the IEEE Machine Learning for Signal Processing Workshop 2006, Maynooth, Ireland, September 8, 2006. The tutorial focuses on the basic elements of Bayesian learning and its relation to classical learning paradigms. This includes a critical discussion of the pros and cons...

  14. On local optima in learning bayesian networks

    DEFF Research Database (Denmark)

    Dalgaard, Jens; Kocka, Tomas; Pena, Jose

    2003-01-01

    This paper proposes and evaluates the k-greedy equivalence search algorithm (KES) for learning Bayesian networks (BNs) from complete data. The main characteristic of KES is that it allows a trade-off between greediness and randomness, thus exploring different good local optima. When greediness...

  15. Bayesian Estimation Supersedes the "t" Test

    Science.gov (United States)

    Kruschke, John K.

    2013-01-01

    Bayesian estimation for 2 groups provides complete distributions of credible values for the effect size, group means and their difference, standard deviations and their difference, and the normality of the data. The method handles outliers. The decision rule can accept the null value (unlike traditional "t" tests) when certainty in the estimate is…

  16. Bayesian Estimation of Thermonuclear Reaction Rates

    CERN Document Server

    Iliadis, Christian; Coc, Alain; Timmes, Frank; Starrfield, Sumner

    2016-01-01

    The problem of estimating non-resonant astrophysical S-factors and thermonuclear reaction rates, based on measured nuclear cross sections, is of major interest for nuclear energy generation, neutrino physics, and element synthesis. Many different methods have been applied in the past to this problem, all of them based on traditional statistics. Bayesian methods, on the other hand, are now in widespread use in the physical sciences. In astronomy, for example, Bayesian statistics is applied to the observation of extra-solar planets, gravitational waves, and type Ia supernovae. However, nuclear physics, in particular, has been slow to adopt Bayesian methods. We present the first astrophysical S-factors and reaction rates based on Bayesian statistics. We develop a framework that incorporates robust parameter estimation, systematic effects, and non-Gaussian uncertainties in a consistent manner. The method is applied to the d(p,$\\gamma$)$^3$He, $^3$He($^3$He,2p)$^4$He, and $^3$He($\\alpha$,$\\gamma$)$^7$Be reactions,...

  17. Bayesian analysis of Markov point processes

    DEFF Research Database (Denmark)

    Berthelsen, Kasper Klitgaard; Møller, Jesper

    2006-01-01

    Recently Møller, Pettitt, Berthelsen and Reeves introduced a new MCMC methodology for drawing samples from a posterior distribution when the likelihood function is only specified up to a normalising constant. We illustrate the method in the setting of Bayesian inference for Markov point processes...

  18. Bayesian Averaging is Well-Temperated

    DEFF Research Database (Denmark)

    Hansen, Lars Kai

    2000-01-01

    Bayesian predictions are stochastic just like predictions of any other inference scheme that generalize from a finite sample. While a simple variational argument shows that Bayes averaging is generalization optimal given that the prior matches the teacher parameter distribution the situation...

  19. Modelling crime linkage with Bayesian networks

    NARCIS (Netherlands)

    J. de Zoete; M. Sjerps; D. Lagnado; N. Fenton

    2015-01-01

    When two or more crimes show specific similarities, such as a very distinct modus operandi, the probability that they were committed by the same offender becomes of interest. This probability depends on the degree of similarity and distinctiveness. We show how Bayesian networks can be used to model

  20. E-Bayesian Estimation of the Products Reliability when Testing Reveals no Failure%当试验没有观察到失效时产品可靠度的E-Bayes估计

    Institute of Scientific and Technical Information of China (English)

    韩明

    2009-01-01

    This paper develops a new method, named the E-Bayesian estimation method, to estimate reliability parameters. The E-Bayesian estimates of reliability are derived for zero-failure data from a product with a binomial distribution. First, the definition of the E-Bayesian estimate of product reliability is given; on this basis, expressions for the E-Bayesian and hierarchical Bayesian estimates of product reliability are derived. Second, properties of the E-Bayesian estimate are discussed. Finally, the new method is applied to a real zero-failure data set and, as can be seen, it is both efficient and easy to operate.
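
    A minimal sketch of the zero-failure construction, following the usual setup in the E-Bayes literature rather than this paper's exact derivation: the failure probability p gets a Beta(1, b) prior, the Bayes estimate under zero failures in n trials is 1/(n + b + 1), and the E-Bayesian estimate averages this over a Uniform(1, c) hyper-prior on b. The values of n and c below are hypothetical:

```python
import math

# E-Bayesian estimate for zero-failure binomial data (sketch).
# Bayes estimate of failure probability under Beta(1, b) prior and zero
# failures in n trials: 1/(n + b + 1). E-Bayes averages over b ~ Uniform(1, c):
#   p_EB = (1/(c-1)) * integral_1^c db / (n + b + 1)
#        = ln((n + c + 1)/(n + 2)) / (c - 1)

def e_bayes_failure_prob(n, c):
    return math.log((n + c + 1) / (n + 2)) / (c - 1)

n, c = 20, 5  # hypothetical: 20 trials with no failures, hyper-prior bound 5
p_hat = e_bayes_failure_prob(n, c)
print(round(1 - p_hat, 4))  # E-Bayesian estimate of reliability
```

    As expected, the averaged estimate lies between the Bayes estimates at the endpoints b = 1 and b = c.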

  1. Development of a Bayesian model to estimate health care outcomes in the severely wounded

    Directory of Open Access Journals (Sweden)

    Alexander Stojadinovic

    2010-08-01

    Full Text Available Alexander Stojadinovic1, John Eberhardt2, Trevor S Brown3, Jason S Hawksworth4, Frederick Gage3, Douglas K Tadaki3, Jonathan A Forsberg5, Thomas A Davis3, Benjamin K Potter5, James R Dunne6, E A Elster3. Affiliations: 1Combat Wound Initiative Program, 4Department of Surgery, Walter Reed Army Medical Center, Washington, DC, USA; 2DecisionQ Corporation, Washington, DC, USA; 3Regenerative Medicine Department, Combat Casualty Care, Naval Medical Research Center, Silver Spring, MD, USA; 5Integrated Department of Orthopaedics and Rehabilitation, 6Department of Surgery, National Naval Medical Center, Bethesda, MD, USA. Background: Graphical probabilistic models have the ability to provide insights as to how clinical factors are conditionally related. These models can be used to help us understand factors influencing health care outcomes and resource utilization, and to estimate morbidity and clinical outcomes in trauma patient populations. Study design: Thirty-two combat casualties with severe extremity injuries enrolled in a prospective observational study were analyzed using a step-wise machine-learned Bayesian belief network (BBN) and step-wise logistic regression (LR). Models were evaluated using 10-fold cross-validation to calculate the area under the curve (AUC) of receiver operating characteristic (ROC) curves. Results: Our BBN showed important associations between various factors in our data set that could not be developed using standard regression methods. Cross-validated ROC curve analysis showed that our BBN model was a robust representation of our data domain and that LR models trained on these findings were also robust: hospital-acquired infection (AUC: LR, 0.81; BBN, 0.79), intensive care unit length of stay (AUC: LR, 0.97; BBN, 0.81), and wound healing (AUC: LR, 0.91; BBN, 0.72) showed strong AUC. Conclusions: A BBN model can effectively represent clinical outcomes and biomarkers in patients hospitalized after severe wounding, and is confirmed by 10-fold
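
    The AUC figures quoted above are equivalent to a rank statistic: the probability that a randomly chosen positive case scores above a randomly chosen negative one (ties count one half). A minimal sketch with made-up scores and labels:

```python
# AUC of an ROC curve computed as the Mann-Whitney rank statistic.
# Scores and labels below are illustrative, not data from the study.

def auc(scores, labels):
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

scores = [0.9, 0.8, 0.7, 0.4, 0.3]  # model outputs
labels = [1,   1,   0,   1,   0]    # true outcomes
print(auc(scores, labels))  # 5/6 ≈ 0.833
```

    In a 10-fold cross-validation, this statistic is computed on each held-out fold (or on the pooled held-out predictions) rather than on the training data.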

  2. Ceramic high temperature receiver design and tests

    Science.gov (United States)

    Davis, S. B.

    1982-01-01

    The High Temperature Solar Thermal Receiver, which was tested at Edwards AFB, CA during the winter of 1980-1981, evolved from technologies developed over a five-year period of work. This receiver was tested at the Army Solar Furnace at White Sands, NM in 1976, where it operated successfully at 1768 deg F and showed thermal efficiencies of 85%. The results were sufficiently promising to lead ERDA to fund our development and test of a 250 kW receiver to measure the efficiency of an open cavity receiver atop a central tower of a heliostat field. This receiver was required to be scalable by design to 10, 50, and 100 MW-electric sizes to show applicability to central power tower receivers. That receiver employed rectangular silicon carbide panels and vertical stanchions to achieve scalability. The construction was shown to be fully scalable, and the receiver was operated at temperatures up to 2000 deg F, achieving the performance goals of the experiment during tests at the GIT advanced components test facility in the fall of 1978.

  3. A handbook for solar central receiver design

    Energy Technology Data Exchange (ETDEWEB)

    Falcone, P.K.

    1986-12-01

    This Handbook describes central receiver technology for solar thermal power plants. It contains a description and assessment of the major components in a central receiver system configured for utility-scale production of electricity using Rankine-cycle steam turbines. It also describes procedures to size and optimize a plant and discusses examples from recent system analyses. Information concerning site selection criteria, cost estimation, construction, and operation and maintenance is also included, which should enable readers to perform design analyses for specific applications.

  4. Computational statistics using the Bayesian Inference Engine

    Science.gov (United States)

    Weinberg, Martin D.

    2013-09-01

    This paper introduces the Bayesian Inference Engine (BIE), a general parallel, optimized software package for parameter inference and model selection. This package is motivated by the analysis needs of modern astronomical surveys and the need to organize and reuse expensive derived data. The BIE is the first platform for computational statistics designed explicitly to enable Bayesian update and model comparison for astronomical problems. Bayesian update is based on the representation of high-dimensional posterior distributions using metric-ball-tree based kernel density estimation. Among its algorithmic offerings, the BIE emphasizes hybrid tempered Markov chain Monte Carlo schemes that robustly sample multimodal posterior distributions in high-dimensional parameter spaces. Moreover, the BIE implements a full persistence or serialization system that stores the full byte-level image of the running inference and previously characterized posterior distributions for later use. Two new algorithms to compute the marginal likelihood from the posterior distribution, developed for and implemented in the BIE, enable model comparison for complex models and data sets. Finally, the BIE was designed to be a collaborative platform for applying Bayesian methodology to astronomy. It includes an extensible, object-oriented framework that implements every aspect of the Bayesian inference. By providing a variety of statistical algorithms for all phases of the inference problem, a scientist may explore a variety of approaches with a single model and data implementation. Additional technical and download details are available from http://www.astro.umass.edu/bie. The BIE is distributed under the GNU General Public License.

  5. Digital Receiver Phase Meter

    Science.gov (United States)

    Marcin, Martin; Abramovici, Alexander

    2008-01-01

    The software of a commercially available digital radio receiver has been modified to make the receiver function as a two-channel low-noise phase meter. This phase meter is a prototype in the continuing development of a phase meter for a system in which radiofrequency (RF) signals in the two channels would be outputs of a spaceborne heterodyne laser interferometer for detecting gravitational waves. The frequencies of the signals could include a common Doppler-shift component of as much as 15 MHz. The phase meter is required to measure the relative phases of the signals in the two channels at a sampling rate of 10 Hz, to within a root-power-spectral-density noise requirement, in a digital receiver. The input RF signal is first fed to the input terminal of an analog-to-digital converter (ADC). To prevent aliasing errors in the ADC, the sampling rate must be at least twice the input signal frequency. The sampling rate of the ADC is governed by a sampling clock, which also drives a digital local oscillator (DLO), which is a direct digital frequency synthesizer. The DLO produces samples of sine and cosine signals at a programmed tuning frequency. The sine and cosine samples are mixed with (that is, multiplied by) the samples from the ADC, then low-pass filtered to obtain in-phase (I) and quadrature (Q) signal components. A digital signal processor (DSP) computes the ratio between the Q and I components, computes the phase of the RF signal (relative to that of the DLO signal) as the arctangent of this ratio, and then averages successive such phase values over a time interval specified by the user.
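
    The I/Q processing chain described above can be sketched end to end; the frequencies and true phase below are illustrative values, not the instrument's actual parameters:

```python
import math

# Sketch of the I/Q phase measurement: mix digitized samples against the
# local oscillator's cosine and sine, average away the double-frequency
# term (a crude stand-in for the low-pass filter), then take the arctangent.

fs, f, phase = 100_000.0, 1_000.0, 0.7   # sample rate, tone, true phase (rad)
n_samples = 1_000                        # an integer number of tone cycles

i_acc = q_acc = 0.0
for n in range(n_samples):
    theta = 2.0 * math.pi * f * n / fs
    x = math.cos(theta + phase)          # digitized input sample
    i_acc += x * math.cos(theta)         # in-phase mix
    q_acc -= x * math.sin(theta)         # quadrature mix (sign convention)

measured = math.atan2(q_acc, i_acc)
print(round(measured, 6))  # recovers 0.7
```

    Using atan2 on the averaged Q and I components, rather than a plain arctangent of the ratio, keeps the recovered phase in the correct quadrant.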

  6. Pressure difference receiving ears

    DEFF Research Database (Denmark)

    Michelsen, Axel; Larsen, Ole Næsbye

    2007-01-01

    of such pressure difference receiving ears have been hampered by lack of suitable experimental methods. In this review, we survey the methods for collecting reliable data on the binaural directional cues at the eardrums, on how the eardrum vibrations depend on the direction of sound incidence, and on how sound...... waves behave in the air spaces leading to the interior surfaces of eardrums. A linear mathematical model with well-defined inputs is used for exploring how the directionality varies with the binaural directional cues and the amplitude and phase gain of the sound pathway to the inner surface...

  7. Adaptive antennas and receivers

    CERN Document Server

    Weiner, Melvin M

    2005-01-01

    In our modern age of remote sensing, wireless communication, and the nearly endless list of other antenna-based applications, complex problems require increasingly sophisticated solutions. Conventional antenna systems are no longer suited to high-noise or low-signal applications such as intrusion detection. Detailing highly effective approaches to non-Gaussian weak signal detection, Adaptive Antennas and Receivers provides an authoritative introduction to state-of-the-art research on the modeling, testing, and application of these technologies.Edited by innovative researcher and eminent expert

  8. Universal Darwinism as a process of Bayesian inference

    CERN Document Server

    Campbell, John O

    2016-01-01

    Many of the mathematical frameworks describing natural selection are equivalent to Bayes Theorem, also known as Bayesian updating. By definition, a process of Bayesian Inference is one which involves a Bayesian update, so we may conclude that these frameworks describe natural selection as a process of Bayesian inference. Thus natural selection serves as a counter example to a widely-held interpretation that restricts Bayesian Inference to human mental processes (including the endeavors of statisticians). As Bayesian inference can always be cast in terms of (variational) free energy minimization, natural selection can be viewed as comprising two components: a generative model of an "experiment" in the external world environment, and the results of that "experiment" or the "surprise" entailed by predicted and actual outcomes of the "experiment". Minimization of free energy implies that the implicit measure of "surprise" experienced serves to update the generative model in a Bayesian manner. This description clo...
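
    The equivalence the abstract builds on can be shown in a few lines: one step of discrete replicator dynamics (frequencies reweighted by fitness and renormalized) has exactly the form of a Bayesian update (prior reweighted by likelihood). The numbers below are illustrative:

```python
# Shared form of Bayes' rule and one step of discrete replicator dynamics:
# multiply a normalized vector by per-element weights and renormalize.
# Read (prior, likelihood) or (variant frequency, relative fitness).

def reweight(prior, weight):
    post = [p * w for p, w in zip(prior, weight)]
    total = sum(post)
    return [p / total for p in post]

freqs = [0.5, 0.3, 0.2]    # variant frequencies / prior probabilities
fitness = [1.0, 2.0, 0.5]  # relative fitness / likelihoods
print(reweight(freqs, fitness))  # ≈ [0.417, 0.5, 0.083]
```

    The fitter (more likely) variant gains frequency (posterior mass) at the expense of the others, which is the sense in which natural selection performs a Bayesian update.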

  9. Rigorous Approach in Investigation of Seismic Structure and Source Characteristics in Northeast Asia: Hierarchical and Trans-dimensional Bayesian Inversion

    Science.gov (United States)

    Mustac, M.; Kim, S.; Tkalcic, H.; Rhie, J.; Chen, Y.; Ford, S. R.; Sebastian, N.

    2015-12-01

    Conventional approaches to inverse problems suffer from non-linearity and non-uniqueness in estimations of seismic structures and source properties. Estimated results and associated uncertainties are often biased by applied regularizations and additional constraints, which are commonly introduced to solve such problems. Bayesian methods, however, provide statistically meaningful estimations of models and their uncertainties constrained by data information. In addition, hierarchical and trans-dimensional (trans-D) techniques are inherently implemented in the Bayesian framework to account for involved error statistics and model parameterizations, and, in turn, allow more rigorous estimations of the same. Here, we apply Bayesian methods throughout the entire inference process to estimate seismic structures and source properties in Northeast Asia including east China, the Korean peninsula, and the Japanese islands. Ambient noise analysis is first performed to obtain a base three-dimensional (3-D) heterogeneity model using continuous broadband waveforms from more than 300 stations. As for the tomography of surface wave group and phase velocities in the 5-70 s band, we adopt a hierarchical and trans-D Bayesian inversion method using Voronoi partition. The 3-D heterogeneity model is further improved by joint inversions of teleseismic receiver functions and dispersion data using a newly developed high-efficiency Bayesian technique. The obtained model is subsequently used to prepare 3-D structural Green's functions for the source characterization. A hierarchical Bayesian method for point source inversion using regional complete waveform data is applied to selected events from the region. The seismic structure and source characteristics with rigorously estimated uncertainties from the novel Bayesian methods provide enhanced monitoring and discrimination of seismic events in northeast Asia.

  10. Bayesian network learning for natural hazard assessments

    Science.gov (United States)

    Vogel, Kristin

    2016-04-01

    Even though quite different in occurrence and consequences, from a modelling perspective many natural hazards share similar properties and challenges. Their complex nature as well as lacking knowledge about their driving forces and potential effects make their analysis demanding. On top of the uncertainty about the modelling framework, inaccurate or incomplete event observations and the intrinsic randomness of the natural phenomenon add up to different interacting layers of uncertainty, which require a careful handling. Thus, for reliable natural hazard assessments it is crucial not only to capture and quantify involved uncertainties, but also to express and communicate uncertainties in an intuitive way. Decision-makers, who often find it difficult to deal with uncertainties, might otherwise return to familiar (mostly deterministic) proceedings. In the scope of the DFG research training group "NatRiskChange", we apply the probabilistic framework of Bayesian networks for diverse natural hazard and vulnerability studies. The great potential of Bayesian networks was already shown in previous natural hazard assessments. Treating each model component as a random variable, Bayesian networks aim at capturing the joint distribution of all considered variables. Hence, each conditional distribution of interest (e.g. the effect of precautionary measures on damage reduction) can be inferred. The (in-)dependencies between the considered variables can be learned purely data driven or be given by experts. Even a combination of both is possible. By translating the (in-)dependences into a graph structure, Bayesian networks provide direct insights into the workings of the system and allow to learn about the underlying processes. Besides numerous studies on the topic, learning Bayesian networks from real-world data remains challenging. In previous studies, e.g. on earthquake induced ground motion and flood damage assessments, we tackled the problems arising with continuous variables

  11. Cognitive Connected Vehicle Information System Design Requirement for Safety: Role of Bayesian Artificial Intelligence

    Directory of Open Access Journals (Sweden)

    Ata Khan

    2013-04-01

    Full Text Available Intelligent transportation systems (ITS) are gaining acceptance around the world, and the connected vehicle component of ITS is recognized as a high-priority research and development area in many technologically advanced countries. Connected vehicles are expected to have the capability of safe, efficient and eco-driving operations whether these are under human control or in the adaptive machine control mode of operations. The race is on to design the capability to operate in a connected traffic environment. The operational requirements can be met with cognitive vehicle design features made possible by advances in artificial intelligence-supported methodology, improved understanding of human factors, and advances in communication technology. This paper describes cognitive features and their information system requirements. The architecture of an information system is presented that supports the features of the cognitive connected vehicle. For better focus, information processing capabilities are specified and the role of Bayesian artificial intelligence is defined for data fusion. Example applications illustrate the role of information systems in integrating intelligent technology, Bayesian artificial intelligence, and abstracted human factors. Concluding remarks highlight the role of the information system and Bayesian artificial intelligence in the design of a new generation of cognitive connected vehicles.

  12. Bayesian evidence and predictivity of the inflationary paradigm

    Science.gov (United States)

    Gubitosi, Giulia; Lagos, Macarena; Magueijo, João; Allison, Rupert

    2016-06-01

    In this paper we consider the issue of paradigm evaluation by applying Bayes' theorem along the following nested hierarchy of progressively more complex structures: i) parameter estimation (within a model), ii) model selection and comparison (within a paradigm), iii) paradigm evaluation. In such a hierarchy the Bayesian evidence works both as the posterior's normalization at a given level and as the likelihood function at the next level up. Whilst raising no objections to the standard application of the procedure at the two lowest levels, we argue that it should receive a considerable modification when evaluating paradigms, when testability and fitting data are equally important. By considering toy models we illustrate how models and paradigms that are difficult to falsify are always favoured by the Bayes factor. We argue that the evidence for a paradigm should not only be high for a given dataset, but exceptional with respect to what it would have been, had the data been different. With this motivation we propose a measure which we term predictivity, as well as a prior to be incorporated into the Bayesian framework, penalising unpredictivity as much as not fitting data. We apply this measure to inflation seen as a whole, and to a scenario where a specific inflationary model is hypothetically deemed as the only one viable as a result of information alien to cosmology (e.g. Solar System gravity experiments, or particle physics input). We conclude that cosmic inflation is currently hard to falsify, but that this could change were external/additional information to cosmology to select one of its many models. We also compare this state of affairs to bimetric varying speed of light cosmology.

  13. Bayesian statistics for the calibration of the LISA Pathfinder experiment

    Science.gov (United States)

    Armano, M.; Audley, H.; Auger, G.; Binetruy, P.; Born, M.; Bortoluzzi, D.; Brandt, N.; Bursi, A.; Caleno, M.; Cavalleri, A.; Cesarini, A.; Cruise, M.; Danzmann, K.; Diepholz, I.; Dolesi, R.; Dunbar, N.; Ferraioli, L.; Ferroni, V.; Fitzsimons, E.; Freschi, M.; García Marirrodriga, C.; Gerndt, R.; Gesa, L.; Gibert, F.; Giardini, D.; Giusteri, R.; Grimani, C.; Harrison, I.; Heinzel, G.; Hewitson, M.; Hollington, D.; Hueller, M.; Huesler, J.; Inchauspé, H.; Jennrich, O.; Jetzer, P.; Johlander, B.; Karnesis, N.; Kaune, B.; Korsakova, N.; Killow, C.; Lloro, I.; Maarschalkerweerd, R.; Madden, S.; Mance, D.; Martin, V.; Martin-Porqueras, F.; Mateos, I.; McNamara, P.; Mendes, J.; Mitchell, E.; Moroni, A.; Nofrarias, M.; Paczkowski, S.; Perreur-Lloyd, M.; Pivato, P.; Plagnol, E.; Prat, P.; Ragnit, U.; Ramos-Castro, J.; Reiche, J.; Romera Perez, J. A.; Robertson, D.; Rozemeijer, H.; Russano, G.; Sarra, P.; Schleicher, A.; Slutsky, J.; Sopuerta, C. F.; Sumner, T.; Texier, D.; Thorpe, J.; Trenkel, C.; Tu, H. B.; Vitale, S.; Wanner, G.; Ward, H.; Waschke, S.; Wass, P.; Wealthy, D.; Wen, S.; Weber, W.; Wittchen, A.; Zanoni, C.; Ziegler, T.; Zweifel, P.

    2015-05-01

    The main goal of the LISA Pathfinder (LPF) mission is to estimate the acceleration noise models of the overall LISA Technology Package (LTP) experiment on board. This will be of crucial importance for future space-based Gravitational-Wave (GW) detectors, such as eLISA. Here, we present the Bayesian analysis framework to process the planned system identification experiments designed for that purpose. In particular, we focus on the analysis strategies to predict the accuracy of the parameters that describe the system in all degrees of freedom. The data sets were generated during the latest operational simulations organised by the data analysis team, and this work is part of the LTPDA Matlab toolbox.

  14. Bayesian Parameter Estimation for Latent Markov Random Fields and Social Networks

    CERN Document Server

    Everitt, Richard G

    2012-01-01

    Undirected graphical models are widely used in statistics, physics and machine vision. However Bayesian parameter estimation for undirected models is extremely challenging, since evaluation of the posterior typically involves the calculation of an intractable normalising constant. This problem has received much attention, but very little of this has focussed on the important practical case where the data consists of noisy or incomplete observations of the underlying hidden structure. This paper specifically addresses this problem, comparing two alternative methodologies. In the first of these approaches particle Markov chain Monte Carlo (Andrieu et al., 2010) is used to efficiently explore the parameter space, combined with the exchange algorithm (Murray et al., 2006) for avoiding the calculation of the intractable normalising constant (a proof showing that this combination targets the correct distribution is found in a supplementary appendix online). This approach is compared with approximate Bayesian comput...

  15. Landslide hazards mapping using uncertain Naïve Bayesian classification method

    Institute of Scientific and Technical Information of China (English)

    毛伊敏; 张茂省; 王根龙; 孙萍萍

    2015-01-01

    Landslide hazard mapping is a fundamental tool for disaster management activities in loess terrains. A major issue with landslide hazard assessment methods based on the Naïve Bayesian classification technique is the difficulty of quantifying uncertain triggering factors; the main purpose of this work is therefore to evaluate the predictive power of landslide spatial models based on an uncertain Naïve Bayesian classification method in the Baota district of Yan’an city, Shaanxi province, China. Firstly, thematic maps representing various factors related to landslide activity were generated. Secondly, using field data and GIS techniques, a landslide hazard map was produced. To improve the accuracy of the resulting landslide hazard map, strategies were designed that quantify the uncertain triggering factors, yielding landslide spatial models based on an uncertain Naïve Bayesian classification method named the NBU algorithm. The areas under the relative operating characteristic curves (AUC) for the NBU and Naïve Bayesian algorithms are 87.29% and 82.47%, respectively. Thus, the NBU algorithm can be used efficiently for landslide hazard analysis and might be widely used for the prediction of various spatial events based on uncertain classification techniques.
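    The uncertain extension (NBU) is specific to the paper, but the baseline workflow it improves on can be sketched: a Gaussian Naïve Bayes classifier scored by the area under the ROC curve (AUC). The two "triggering factors", class means, and sample sizes below are invented for illustration, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "triggering factor" values (e.g. slope, rainfall) for
# non-landslide (y=0) and landslide (y=1) map cells -- invented data.
n = 200
X0 = rng.normal([20.0, 50.0], [5.0, 10.0], size=(n, 2))   # stable cells
X1 = rng.normal([30.0, 70.0], [5.0, 10.0], size=(n, 2))   # landslide cells
X = np.vstack([X0, X1])
y = np.repeat([0, 1], n)

# Gaussian Naive Bayes: per-class mean/variance for each factor.
def fit(X, y):
    params = {}
    for c in (0, 1):
        Xc = X[y == c]
        params[c] = (Xc.mean(0), Xc.var(0), len(Xc) / len(X))
    return params

def log_odds(params, X):
    scores = []
    for c in (0, 1):
        mu, var, prior = params[c]
        ll = -0.5 * (np.log(2 * np.pi * var) + (X - mu) ** 2 / var)
        scores.append(ll.sum(1) + np.log(prior))
    s = np.stack(scores, 1)
    return s[:, 1] - s[:, 0]          # log-odds of the landslide class

# AUC as the probability that a random positive outranks a random negative.
def auc(scores, y):
    pos, neg = scores[y == 1], scores[y == 0]
    return (pos[:, None] > neg[None, :]).mean()

params = fit(X, y)
print(round(auc(log_odds(params, X), y), 3))
```

The rank-sum form of the AUC used here is exactly the "area under the relative operating characteristic curve" the abstract reports.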

  16. Radiation dose reduction in computed tomography perfusion using spatial-temporal Bayesian methods

    Science.gov (United States)

    Fang, Ruogu; Raj, Ashish; Chen, Tsuhan; Sanelli, Pina C.

    2012-03-01

    In current computed tomography (CT) examinations, the associated X-ray radiation dose is of significant concern to patients and operators, especially in CT perfusion (CTP) imaging, which has a higher radiation dose due to its cine scanning technique. A simple and cost-effective means of performing the examinations is to lower the milliampere-seconds (mAs) parameter as low as reasonably achievable in data acquisition. However, lowering the mAs parameter will unavoidably increase data noise and greatly degrade CT perfusion maps if no adequate noise control is applied during image reconstruction. To capture the essential dynamics of CT perfusion, a simple spatial-temporal Bayesian method that uses a piecewise parametric model of the residual function is used, and the model parameters are then estimated from a Bayesian formulation of prior smoothness constraints on perfusion parameters. From the fitted residual function, reliable CTP parameter maps are obtained from low-dose CT data. The merit of this scheme lies in the combination of an analytical piecewise residual function with a Bayesian framework using a simple prior spatial constraint for the CT perfusion application. On a dataset of 22 patients, this dynamic spatial-temporal Bayesian model yielded an increase in signal-to-noise ratio (SNR) of 78% and a decrease in mean-square error (MSE) of 40% at a low radiation dose of 43 mA.

  17. Fast Bayesian optimal experimental design for seismic source inversion

    KAUST Repository

    Long, Quan

    2015-07-01

    We develop a fast method for optimally designing experiments in the context of statistical seismic source inversion. In particular, we efficiently compute the optimal number and locations of the receivers or seismographs. The seismic source is modeled by a point moment tensor multiplied by a time-dependent function. The parameters include the source location, moment tensor components, and start time and frequency in the time function. The forward problem is modeled by elastodynamic wave equations. We show that the Hessian of the cost functional, which is usually defined as the square of the weighted L2 norm of the difference between the experimental data and the simulated data, is proportional to the measurement time and the number of receivers. Consequently, the posterior distribution of the parameters, in a Bayesian setting, concentrates around the "true" parameters, and we can employ Laplace approximation and speed up the estimation of the expected Kullback-Leibler divergence (expected information gain), the optimality criterion in the experimental design procedure. Since the source parameters span several magnitudes, we use a scaling matrix for efficient control of the condition number of the original Hessian matrix. We use a second-order accurate finite difference method to compute the Hessian matrix and either sparse quadrature or Monte Carlo sampling to carry out numerical integration. We demonstrate the efficiency, accuracy, and applicability of our method on a two-dimensional seismic source inversion problem. © 2015 Elsevier B.V.
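    The Laplace-approximation shortcut described above reduces, in the special case of a linear-Gaussian model, to a closed-form expected information gain. The sketch below uses that special case (the sensitivity matrix, prior width, and noise level are invented) to show the key qualitative point: adding receivers, i.e. rows of the forward operator, increases the expected Kullback-Leibler divergence between posterior and prior.

```python
import numpy as np

rng = np.random.default_rng(1)

def expected_information_gain(G, sigma_prior, sigma_noise):
    """Expected KL(posterior || prior) for the linear-Gaussian model
    y = G @ theta + noise. In the paper's nonlinear seismic setting the
    same Gaussian formula arises from a Laplace approximation of the
    posterior around the MAP point."""
    p = G.shape[1]
    prior_cov = sigma_prior**2 * np.eye(p)
    post_prec = np.linalg.inv(prior_cov) + G.T @ G / sigma_noise**2
    post_cov = np.linalg.inv(post_prec)
    _, logdet_prior = np.linalg.slogdet(prior_cov)
    _, logdet_post = np.linalg.slogdet(post_cov)
    return 0.5 * (logdet_prior - logdet_post)

# Invented sensitivities: each receiver contributes one row of G.
G_full = rng.normal(size=(10, 3))
eig_3 = expected_information_gain(G_full[:3], 1.0, 0.5)
eig_10 = expected_information_gain(G_full, 1.0, 0.5)
print(eig_3 < eig_10)   # more receivers -> more expected information
```

Each added row contributes a positive-semidefinite term to the posterior precision, so the posterior determinant can only shrink; this is the monotonicity the optimal-design search exploits.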

  18. Photogrammetric Reconstruction with Bayesian Information

    Science.gov (United States)

    Masiero, A.; Fissore, F.; Guarnieri, A.; Pirotti, F.; Vettore, A.

    2016-06-01

    Nowadays photogrammetry and laser scanning methods are the most wide spread surveying techniques. Laser scanning methods usually allow to obtain more accurate results with respect to photogrammetry, but their use have some issues, e.g. related to the high cost of the instrumentation and the typical need of high qualified personnel to acquire experimental data on the field. Differently, photogrammetric reconstruction can be achieved by means of low cost devices and by persons without specific training. Furthermore, the recent diffusion of smart devices (e.g. smartphones) embedded with imaging and positioning sensors (i.e. standard camera, GNSS receiver, inertial measurement unit) is opening the possibility of integrating more information in the photogrammetric reconstruction procedure, in order to increase its computational efficiency, its robustness and accuracy. In accordance with the above observations, this paper examines and validates new possibilities for the integration of information provided by the inertial measurement unit (IMU) into the photogrammetric reconstruction procedure, and, to be more specific, into the procedure for solving the feature matching and the bundle adjustment problems.

  19. Bayesian parameter estimation for effective field theories

    CERN Document Server

    Wesolowski, S; Furnstahl, R J; Phillips, D R; Thapaliya, A

    2015-01-01

    We present procedures based on Bayesian statistics for effective field theory (EFT) parameter estimation from data. The extraction of low-energy constants (LECs) is guided by theoretical expectations that supplement such information in a quantifiable way through the specification of Bayesian priors. A prior for natural-sized LECs reduces the possibility of overfitting, and leads to a consistent accounting of different sources of uncertainty. A set of diagnostic tools are developed that analyze the fit and ensure that the priors do not bias the EFT parameter estimation. The procedures are illustrated using representative model problems and the extraction of LECs for the nucleon mass expansion in SU(2) chiral perturbation theory from synthetic lattice data.

  20. Applications of Bayesian spectrum representation in acoustics

    Science.gov (United States)

    Botts, Jonathan M.

    This dissertation utilizes a Bayesian inference framework to enhance the solution of inverse problems where the forward model maps to acoustic spectra. A Bayesian solution to filter design inverts acoustic spectra to pole-zero locations of a discrete-time filter model. Spatial sound field analysis with a spherical microphone array is a data analysis problem that requires inversion of spatio-temporal spectra to directions of arrival. As with many inverse problems, a probabilistic analysis results in richer solutions than can be achieved with ad-hoc methods. In the filter design problem, the Bayesian inversion yields globally optimal coefficient estimates as well as an estimate of the most concise filter capable of representing the given spectrum, within a single framework. This approach is demonstrated on synthetic spectra, head-related transfer function spectra, and measured acoustic reflection spectra. The Bayesian model-based analysis of spatial room impulse responses is presented as an analogous problem with an equally rich solution. The model selection mechanism provides an estimate of the number of arrivals, which is necessary to properly infer the directions of simultaneous arrivals. Although spectrum inversion problems are fairly ubiquitous, the scope of this dissertation has been limited to these two and derivative problems. The Bayesian approach to filter design is demonstrated on an artificial spectrum to illustrate the model comparison mechanism and then on measured head-related transfer functions to show the potential range of application. Coupled with sampling methods, the Bayesian approach is shown to outperform least-squares filter design methods commonly used in commercial software, confirming the need for a global search of the parameter space. The resulting designs are shown to be comparable to those that result from global optimization methods, but the Bayesian approach has the added advantage of a filter length estimate within the same unified

  1. Bayesianism and inference to the best explanation

    Directory of Open Access Journals (Sweden)

    Valeriano IRANZO

    2008-01-01

    Full Text Available Bayesianism and Inference to the Best Explanation (IBE) are two different models of inference. Recently there has been some debate about the possibility of “bayesianizing” IBE. Firstly I explore several alternatives to include explanatory considerations in Bayes’s Theorem. Then I distinguish two different interpretations of prior probabilities: “IBE-Bayesianism” (IBE-Bay) and “frequentist-Bayesianism” (Freq-Bay). After detailing the content of the latter, I propose a rule for assessing the priors. I also argue that Freq-Bay: (i) endorses a role for explanatory value in the assessment of scientific hypotheses; (ii) avoids a purely subjectivist reading of prior probabilities; and (iii) fits better than IBE-Bayesianism with two basic facts about science, i.e., the prominent role played by empirical testing and the existence of many scientific theories in the past that failed to fulfil their promises and were subsequently abandoned.

  2. Distributed Bayesian Networks for User Modeling

    DEFF Research Database (Denmark)

    Tedesco, Roberto; Dolog, Peter; Nejdl, Wolfgang;

    2006-01-01

    The World Wide Web is a popular platform for providing eLearning applications to a wide spectrum of users. However – as users differ in their preferences, background, requirements, and goals – applications should provide personalization mechanisms. In the Web context, user models used...... by such adaptive applications are often partial fragments of an overall user model. The fragments have then to be collected and merged into a global user profile. In this paper we investigate and present algorithms able to cope with distributed, fragmented user models – based on Bayesian Networks – in the context...... of Web-based eLearning platforms. The scenario we are tackling assumes learners who use several systems over time, which are able to create partial Bayesian Networks for user models based on the local system context. In particular, we focus on how to merge these partial user models. Our merge mechanism...

  3. Machine learning a Bayesian and optimization perspective

    CERN Document Server

    Theodoridis, Sergios

    2015-01-01

    This tutorial text gives a unifying perspective on machine learning by covering both probabilistic and deterministic approaches, which rely on optimization techniques, as well as Bayesian inference, which is based on a hierarchy of probabilistic models. The book presents the major machine learning methods as they have been developed in different disciplines, such as statistics, statistical and adaptive signal processing and computer science. Focusing on the physical reasoning behind the mathematics, all the various methods and techniques are explained in depth, supported by examples and problems, giving an invaluable resource to the student and researcher for understanding and applying machine learning concepts. The book builds carefully from the basic classical methods to the most recent trends, with chapters written to be as self-contained as possible, making the text suitable for different courses: pattern recognition, statistical/adaptive signal processing, statistical/Bayesian learning, as well as shor...

  4. Bayesian image reconstruction: Application to emission tomography

    Energy Technology Data Exchange (ETDEWEB)

    Nunez, J.; Llacer, J.

    1989-02-01

    In this paper we propose a Maximum a Posteriori (MAP) method of image reconstruction in the Bayesian framework for the Poisson noise case. We use entropy to define the prior probability and likelihood to define the conditional probability. The method uses sharpness parameters which can be theoretically computed or adjusted, allowing us to obtain MAP reconstructions without the problem of the "grey" reconstructions associated with pre-Bayesian reconstructions. We have developed several ways to solve the reconstruction problem and propose a new iterative algorithm which is stable, maintains positivity and converges to feasible images faster than the Maximum Likelihood Estimate method. We have successfully applied the new method to the case of Emission Tomography, both with simulated and real data. 41 refs., 4 figs., 1 tab.
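    The paper's MAP algorithm with an entropy prior is its own contribution; as a hedged stand-in, the classic ML-EM iteration for Poisson emission data shows the multiplicative update such methods build on. The system matrix and activity values below are synthetic, and the entropy prior term is omitted.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy emission problem: 8-pixel image, 12 random projection rays.
n_pix, n_det = 8, 12
A = rng.uniform(0.0, 1.0, size=(n_det, n_pix))    # system matrix
x_true = rng.uniform(1.0, 5.0, size=n_pix)        # true activity
counts = rng.poisson(A @ x_true)                  # Poisson-distributed data

def mlem(A, counts, n_iter=200):
    """Classic ML-EM iteration for Poisson data. The multiplicative
    update preserves positivity automatically and monotonically
    increases the Poisson log-likelihood; a MAP variant like the
    paper's would add a prior term to the same update."""
    x = np.ones(A.shape[1])
    sens = A.sum(axis=0)                          # sensitivity image
    for _ in range(n_iter):
        proj = A @ x
        x *= (A.T @ (counts / np.maximum(proj, 1e-12))) / sens
    return x

x_hat = mlem(A, counts)
print(x_hat.round(2))
```

Positivity holds because every factor in the update is non-negative, which is the property the abstract highlights for its iterative algorithm.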

  5. The Bayesian Who Knew Too Much

    CERN Document Server

    Benétreau-Dupin, Yann

    2014-01-01

    In several papers, John Norton has argued that Bayesianism cannot handle ignorance adequately due to its inability to distinguish between neutral and disconfirming evidence. He argued that this inability sows confusion in, e.g., anthropic reasoning in cosmology or the Doomsday argument, by allowing one to draw unwarranted conclusions from a lack of knowledge. Norton has suggested criteria for a candidate representation of neutral support. Imprecise credences (families of credal probability functions) constitute a Bayesian-friendly framework that allows us to avoid inadequate neutral priors and better handle ignorance. The imprecise model generally agrees with Norton's representation of ignorance but requires that his criterion of self-duality be reformulated or abandoned.

  6. Learning Bayesian networks using genetic algorithm

    Institute of Scientific and Technical Information of China (English)

    Chen Fei; Wang Xiufeng; Rao Yimei

    2007-01-01

    A new method to evaluate the fitness of Bayesian networks according to observed data is provided. The main advantage of this criterion is that it is suitable for both the complete and incomplete cases, while the others are not; moreover, it greatly facilitates the computation. In order to reduce the search space, the notion of equivalence class proposed by David Chickering is adopted. Instead of using that method directly, the novel criterion, variable ordering, and equivalence class are combined; moreover, the proposed method avoids some problems caused by the previous one. Later, the genetic algorithm, which allows global convergence and is lacking in most of the methods searching for Bayesian networks, is applied to search for a good model in this space. To speed up the convergence, the genetic algorithm is combined with the greedy algorithm. Finally, the simulation shows the validity of the proposed approach.
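    The paper's fitness criterion and equivalence-class search space are its own; the sketch below illustrates the general recipe with standard stand-in ingredients: a fixed variable ordering (which guarantees acyclicity), a BIC-style score, and a small genetic algorithm with elitism, run on synthetic binary data from the chain X0 → X1 → X2.

```python
import numpy as np

rng = np.random.default_rng(6)

# Synthetic binary data from the chain X0 -> X1 -> X2 (flip prob. 0.1).
N = 2000
X0 = rng.integers(0, 2, N)
X1 = X0 ^ (rng.random(N) < 0.1)
X2 = X1 ^ (rng.random(N) < 0.1)
data = np.stack([X0, X1, X2], axis=1).astype(int)

# With the fixed ordering 0 < 1 < 2, a genome is one bit per candidate
# edge (0->1, 0->2, 1->2); every genome is acyclic by construction.
EDGES = [(0, 1), (0, 2), (1, 2)]

def bic_score(genome):
    score = 0.0
    for child in range(3):
        pa = [p for (p, c), g in zip(EDGES, genome) if g and c == child]
        keys = data[:, pa] @ (2 ** np.arange(len(pa))) if pa else np.zeros(N, int)
        for k in np.unique(keys):                 # log-likelihood per parent config
            q = data[keys == k, child].mean()
            m = int((keys == k).sum())
            if 0 < q < 1:
                score += m * (q * np.log(q) + (1 - q) * np.log(1 - q))
        score -= 0.5 * np.log(N) * 2 ** len(pa)   # BIC complexity penalty
    return score

# Tiny genetic algorithm: truncation selection, one-point crossover,
# bit-flip mutation, and elitism.
pop = rng.integers(0, 2, (20, 3))
for _ in range(30):
    fit = np.array([bic_score(g) for g in pop])
    elite = pop[np.argmax(fit)].copy()
    pool = pop[np.argsort(fit)[::-1][:10]]
    kids = [elite]
    for _ in range(19):
        a, b = pool[rng.integers(0, 10)], pool[rng.integers(0, 10)]
        cut = int(rng.integers(1, 3))
        child = np.concatenate([a[:cut], b[cut:]]) ^ (rng.random(3) < 0.2)
        kids.append(child.astype(int))
    pop = np.vstack(kids)

best = pop[np.argmax([bic_score(g) for g in pop])]
print(best)   # should recover the true edges 0->1 and 1->2
```

On this tiny search space the GA trivially finds the global optimum; the point is only to show the encode-score-evolve loop that scales to real structure learning.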

  7. Bayesian Population Projections for the United Nations.

    Science.gov (United States)

    Raftery, Adrian E; Alkema, Leontine; Gerland, Patrick

    2014-02-01

    The United Nations regularly publishes projections of the populations of all the world's countries broken down by age and sex. These projections are the de facto standard and are widely used by international organizations, governments and researchers. Like almost all other population projections, they are produced using the standard deterministic cohort-component projection method and do not yield statements of uncertainty. We describe a Bayesian method for producing probabilistic population projections for most countries that the United Nations could use. It has at its core Bayesian hierarchical models for the total fertility rate and life expectancy at birth. We illustrate the method and show how it can be extended to address concerns about the UN's current assumptions about the long-term distribution of fertility. The method is implemented in the R packages bayesTFR, bayesLife, bayesPop and bayesDem.

  8. Approximate Bayesian Computation: a nonparametric perspective

    CERN Document Server

    Blum, Michael

    2010-01-01

    Approximate Bayesian Computation is a family of likelihood-free inference techniques that are well-suited to models defined in terms of a stochastic generating mechanism. In a nutshell, Approximate Bayesian Computation proceeds by computing summary statistics s_obs from the data and simulating summary statistics for different values of the parameter theta. The posterior distribution is then approximated by an estimator of the conditional density g(theta|s_obs). In this paper, we derive the asymptotic bias and variance of the standard estimators of the posterior distribution which are based on rejection sampling and linear adjustment. Additionally, we introduce an original estimator of the posterior distribution based on quadratic adjustment and we show that its bias contains fewer terms than that of the estimator with linear adjustment. Although we find that the estimators with adjustment are not universally superior to the estimator based on rejection sampling, we find that they can achieve better perfor...
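    The rejection and linear-adjustment estimators being compared can be sketched on a toy Gaussian-mean problem. The prior range, sample size, and acceptance fraction below are invented; the adjustment step follows the standard regression idea (regress accepted parameters on their summaries and shift them to the observed summary).

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy problem: infer the mean theta of a N(theta, 1) sample.
n_obs = 50
theta_true = 2.0
s_obs = rng.normal(theta_true, 1.0, n_obs).mean()   # observed summary

# 1) Rejection ABC: simulate summaries under a flat prior and keep the
#    draws whose summaries are closest to s_obs.
n_sim, keep = 20000, 500
theta = rng.uniform(-5.0, 5.0, n_sim)
s = rng.normal(theta, 1.0 / np.sqrt(n_obs))         # summary of n_obs draws
idx = np.argsort(np.abs(s - s_obs))[:keep]
theta_acc, s_acc = theta[idx], s[idx]

# 2) Linear adjustment: regress theta on s among the accepted draws and
#    shift each accepted draw to the observed summary.
beta = np.polyfit(s_acc, theta_acc, 1)[0]
theta_adj = theta_acc + beta * (s_obs - s_acc)

print(round(theta_acc.mean(), 2), round(theta_adj.mean(), 2))
```

The adjusted sample removes the component of posterior spread that comes purely from accepting summaries at a distance from s_obs, which is why adjustment can tolerate a larger acceptance window.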

  9. Bayesian information fusion networks for biosurveillance applications.

    Science.gov (United States)

    Mnatsakanyan, Zaruhi R; Burkom, Howard S; Coberly, Jacqueline S; Lombardo, Joseph S

    2009-01-01

    This study introduces new information fusion algorithms to enhance disease surveillance systems with Bayesian decision support capabilities. A detection system was built and tested using chief complaints from emergency department visits, International Classification of Diseases Revision 9 (ICD-9) codes from records of outpatient visits to civilian and military facilities, and influenza surveillance data from health departments in the National Capital Region (NCR). Data anomalies were identified and distribution of time offsets between events in the multiple data streams were established. The Bayesian Network was built to fuse data from multiple sources and identify influenza-like epidemiologically relevant events. Results showed increased specificity compared with the alerts generated by temporal anomaly detection algorithms currently deployed by NCR health departments. Further research should be done to investigate correlations between data sources for efficient fusion of the collected data.

  10. Bayesian Magnetohydrodynamic Seismology of Coronal Loops

    CERN Document Server

    Arregui, Inigo

    2011-01-01

    We perform a Bayesian parameter inference in the context of resonantly damped transverse coronal loop oscillations. The forward problem is solved in terms of parametric results for kink waves in one-dimensional flux tubes in the thin tube and thin boundary approximations. For the inverse problem, we adopt a Bayesian approach to infer the most probable values of the relevant parameters, for given observed periods and damping times, and to extract their confidence levels. The posterior probability distribution functions are obtained by means of Markov Chain Monte Carlo simulations, incorporating observed uncertainties in a consistent manner. We find well localized solutions in the posterior probability distribution functions for two of the three parameters of interest, namely the Alfven travel time and the transverse inhomogeneity length-scale. The obtained estimates for the Alfven travel time are consistent with previous inversion results, but the method enables us to additionally constrain the transverse inho...
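    The inference machinery described above can be sketched with a random-walk Metropolis sampler. The forward relations below are only schematic stand-ins for the thin-tube, thin-boundary kink-mode solution (period proportional to the Alfvén travel time; damping time inversely proportional to the transverse inhomogeneity length-scale, with the density contrast held fixed), and the observed values and errors are invented.

```python
import numpy as np

rng = np.random.default_rng(4)

# Observed kink-oscillation period and damping time (minutes) with
# Gaussian errors -- invented numbers for illustration.
P_obs, dP = 4.6, 0.3
tau_obs, dtau = 11.0, 1.5

def forward(tau_A, l_R):
    """Schematic stand-in for the thin-tube/thin-boundary relations:
    period scales with the Alfven travel time tau_A, damping time with
    the inverse inhomogeneity length-scale l/R."""
    P = np.sqrt(2.0) * tau_A
    tau_d = (4.0 / np.pi**2) * P / l_R
    return P, tau_d

def log_post(theta):
    tau_A, l_R = theta
    if tau_A <= 0 or not (0 < l_R <= 2):      # flat priors on physical ranges
        return -np.inf
    P, tau_d = forward(tau_A, l_R)
    return -0.5 * (((P - P_obs) / dP) ** 2 + ((tau_d - tau_obs) / dtau) ** 2)

# Random-walk Metropolis sampler.
theta = np.array([3.0, 0.5])
lp = log_post(theta)
chain = []
for _ in range(20000):
    prop = theta + rng.normal(0, [0.1, 0.02])
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:  # Metropolis accept/reject
        theta, lp = prop, lp_prop
    chain.append(theta)
chain = np.array(chain)[5000:]                # discard burn-in
print(chain.mean(axis=0).round(2))
```

The observational uncertainties enter only through the Gaussian likelihood, which is the "consistent incorporation" of errors the abstract refers to.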

  11. A Bayesian nonparametric meta-analysis model.

    Science.gov (United States)

    Karabatsos, George; Talbott, Elizabeth; Walker, Stephen G

    2015-03-01

    In a meta-analysis, it is important to specify a model that adequately describes the effect-size distribution of the underlying population of studies. The conventional normal fixed-effect and normal random-effects models assume a normal effect-size population distribution, conditionally on parameters and covariates. For estimating the mean overall effect size, such models may be adequate, but for prediction, they surely are not if the effect-size distribution exhibits non-normal behavior. To address this issue, we propose a Bayesian nonparametric meta-analysis model, which can describe a wider range of effect-size distributions, including unimodal symmetric distributions, as well as skewed and more multimodal distributions. We demonstrate our model through the analysis of real meta-analytic data arising from behavioral-genetic research. We compare the predictive performance of the Bayesian nonparametric model against various conventional and more modern normal fixed-effects and random-effects models.

  12. Probabilistic forecasting and Bayesian data assimilation

    CERN Document Server

    Reich, Sebastian

    2015-01-01

    In this book the authors describe the principles and methods behind probabilistic forecasting and Bayesian data assimilation. Instead of focusing on particular application areas, the authors adopt a general dynamical systems approach, with a profusion of low-dimensional, discrete-time numerical examples designed to build intuition about the subject. Part I explains the mathematical framework of ensemble-based probabilistic forecasting and uncertainty quantification. Part II is devoted to Bayesian filtering algorithms, from classical data assimilation algorithms such as the Kalman filter, variational techniques, and sequential Monte Carlo methods, through to more recent developments such as the ensemble Kalman filter and ensemble transform filters. The McKean approach to sequential filtering in combination with coupling of measures serves as a unifying mathematical framework throughout Part II. Assuming only some basic familiarity with probability, this book is an ideal introduction for graduate students in ap...

  13. Bayesian Peak Picking for NMR Spectra

    KAUST Repository

    Cheng, Yichen

    2014-02-01

    Protein structure determination is a very important topic in structural genomics, which helps people to understand varieties of biological functions such as protein-protein interactions, protein–DNA interactions and so on. Nowadays, nuclear magnetic resonance (NMR) has often been used to determine the three-dimensional structures of proteins in vivo. This study aims to automate the peak picking step, the most important and tricky step in NMR structure determination. We propose to model the NMR spectrum by a mixture of bivariate Gaussian densities and use the stochastic approximation Monte Carlo algorithm as the computational tool to solve the problem. Under the Bayesian framework, the peak picking problem is cast as a variable selection problem. The proposed method can automatically distinguish true peaks from false ones without preprocessing the data. To the best of our knowledge, this is the first effort in the literature that tackles the peak picking problem for NMR spectrum data using a Bayesian method.

  14. A Bayesian approach to person perception.

    Science.gov (United States)

    Clifford, C W G; Mareschal, I; Otsuka, Y; Watson, T L

    2015-11-01

    Here we propose a Bayesian approach to person perception, outlining the theoretical position and a methodological framework for testing the predictions experimentally. We use the term person perception to refer not only to the perception of others' personal attributes such as age and sex but also to the perception of social signals such as direction of gaze and emotional expression. The Bayesian approach provides a formal description of the way in which our perception combines current sensory evidence with prior expectations about the structure of the environment. Such expectations can lead to unconscious biases in our perception that are particularly evident when sensory evidence is uncertain. We illustrate the ideas with reference to our recent studies on gaze perception which show that people have a bias to perceive the gaze of others as directed towards themselves. We also describe a potential application to the study of the perception of a person's sex, in which a bias towards perceiving males is typically observed.
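    The core prediction of such an account can be sketched with Gaussian cue combination: a prior centred on direct gaze pulls the percept toward "looking at me", and the pull grows as the sensory evidence becomes less reliable. All numbers below are illustrative assumptions, not fitted values from the studies cited.

```python
import numpy as np

def perceived_gaze(measured_deg, sensory_sd, prior_sd, prior_mean=0.0):
    """Posterior mean of a Gaussian likelihood (the noisy sensory
    measurement of gaze deviation, in degrees) combined with a Gaussian
    prior centred on direct gaze (0 deg)."""
    w = prior_sd**2 / (prior_sd**2 + sensory_sd**2)   # weight on the data
    return prior_mean + w * (measured_deg - prior_mean)

# A gaze deviation of 10 deg, seen clearly vs. under high uncertainty:
clear = perceived_gaze(10.0, sensory_sd=2.0, prior_sd=8.0)
foggy = perceived_gaze(10.0, sensory_sd=8.0, prior_sd=8.0)
print(round(clear, 2), round(foggy, 2))
```

The bias toward the prior is largest exactly when sensory evidence is uncertain, matching the abstract's claim that such unconscious biases are "particularly evident when sensory evidence is uncertain".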

  15. Bayesian parameter estimation for effective field theories

    Science.gov (United States)

    Wesolowski, S.; Klco, N.; Furnstahl, R. J.; Phillips, D. R.; Thapaliya, A.

    2016-07-01

    We present procedures based on Bayesian statistics for estimating, from data, the parameters of effective field theories (EFTs). The extraction of low-energy constants (LECs) is guided by theoretical expectations in a quantifiable way through the specification of Bayesian priors. A prior for natural-sized LECs reduces the possibility of overfitting, and leads to a consistent accounting of different sources of uncertainty. A set of diagnostic tools is developed that analyzes the fit and ensures that the priors do not bias the EFT parameter estimation. The procedures are illustrated using representative model problems, including the extraction of LECs for the nucleon-mass expansion in SU(2) chiral perturbation theory from synthetic lattice data.

  16. BONNSAI: correlated stellar observables in Bayesian methods

    CERN Document Server

    Schneider, F R N; Fossati, L; Langer, N; de Koter, A

    2016-01-01

    In an era of large spectroscopic surveys of stars and big data, sophisticated statistical methods become more and more important in order to infer fundamental stellar parameters such as mass and age. Bayesian techniques are powerful methods because they can match all available observables simultaneously to stellar models while taking prior knowledge properly into account. However, in most cases it is assumed that observables are uncorrelated which is generally not the case. Here, we include correlations in the Bayesian code BONNSAI by incorporating the covariance matrix in the likelihood function. We derive a parametrisation of the covariance matrix that, in addition to classical uncertainties, only requires the specification of a correlation parameter that describes how observables co-vary. Our correlation parameter depends purely on the method with which observables have been determined and can be analytically derived in some cases. This approach therefore has the advantage that correlations can be accounte...
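    The mechanics of putting a covariance matrix into the likelihood can be sketched as follows. The parametrisation here (off-diagonal terms rho·sigma_i·sigma_j built from the classical uncertainties and a single correlation parameter) is a schematic stand-in for BONNSAI's, and the observable values are invented.

```python
import numpy as np

def log_likelihood(obs, model, sigma, rho):
    """Gaussian log-likelihood with a covariance matrix built from the
    classical uncertainties sigma_i and one correlation parameter rho:
    Cov_ij = rho * sigma_i * sigma_j off the diagonal, sigma_i**2 on it.
    (Schematic stand-in for the paper's parametrisation.)"""
    d = np.asarray(obs, float) - np.asarray(model, float)
    sigma = np.asarray(sigma, float)
    cov = rho * np.outer(sigma, sigma)
    np.fill_diagonal(cov, sigma**2)
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (d @ np.linalg.solve(cov, d) + logdet + d.size * np.log(2 * np.pi))

# Residuals of the same sign are more probable when the observables
# co-vary positively -- invented "stellar" numbers for illustration.
obs = [5.2, 3.9]
model = [5.0, 3.7]
sigma = [0.2, 0.2]
print(log_likelihood(obs, model, sigma, 0.7) > log_likelihood(obs, model, sigma, 0.0))
```

With rho = 0 the expression reduces to the usual product of independent Gaussians, so the correlated likelihood is a strict generalisation of the uncorrelated one.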

  17. Bayesian optimal classification of metallic objects: a comparison of time-domain and frequency-domain EMI performance

    Science.gov (United States)

    Gao, Ping; Collins, Leslie M.; Carin, Lawrence

    2000-08-01

    Traditionally, field EMI sensors are operated in the time domain. The time-domain (TD) EMI sensor is usually a pulsed system. It contains both a transmitting coil and a receiving coil. After transmitting an excitation pulse, which generates the primary field, the receiving coil records the secondary field in the late time. Since a TD EMI sensor measures only the late-time responses, the information contained in the early-time response is lost, thus limiting the types of objects that can be discriminated. Alternatively, EMI sensors can be operated in the frequency domain (FD). In this case, the excitations are sinusoidal signals and the sensor measures the static response. The advantages and disadvantages of TD and FD EMI sensors are reviewed in this paper. For landmine and UXO detection, discrimination of targets of interest from clutter is required, since the cost of large false alarm rates is substantial amounts of money, labor and time. In order to discriminate targets from clutter, Bayesian optimal classifiers are derived. Traditional detectors for these applications only utilize the energy of the signal at the position under test or the output of a matched filter. In a real-world scenario, the depth of the underground objects is uncertain; the optimal classifier that we utilize takes these uncertainties into account as well. In this paper, we present classification performance for four metal objects using TD and FD EMI data. Experimental data were taken with the PSS-12, a standard army-issued metal detector, and the GEM-3, a prototype frequency-domain EMI sensor. Although the optimal classifier improves performance for both TD and FD data, FD classification rates are higher than those for TD systems. The theoretical basis for this result is explored.
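    A Bayesian classifier that accounts for an uncertain burial depth can be sketched by marginalising the likelihood over a depth grid. The spectral signatures, attenuation law, and noise level below are invented toy stand-ins, not the PSS-12 or GEM-3 response models.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy EMI responses: each target class has a depth-dependent signature
# observed in Gaussian noise -- all shapes and scales invented.
freqs = np.linspace(1.0, 10.0, 8)

def signature(cls, depth):
    # Class-specific spectral shape, attenuated with burial depth.
    shape = np.sin(freqs * (0.3 + 0.2 * cls))
    return np.exp(-depth) * shape

def classify(y, sigma=0.05, depths=np.linspace(0.2, 1.0, 17)):
    """Bayesian classifier with the unknown depth treated as a nuisance
    parameter: p(y | class) is the Gaussian likelihood averaged over a
    uniform depth grid, then normalised across classes."""
    post = []
    for cls in range(2):
        liks = [np.exp(-0.5 * np.sum((y - signature(cls, d)) ** 2) / sigma**2)
                for d in depths]
        post.append(np.mean(liks))
    post = np.array(post)
    return post / post.sum()

# Simulate a class-1 target at an off-grid depth and classify it.
y = signature(1, 0.63) + rng.normal(0, 0.05, freqs.size)
print(classify(y).round(3))
```

Marginalising over depth is what distinguishes this from an energy or matched-filter detector tuned to a single assumed depth, which is the comparison the abstract draws.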

  18. Group sequential control of overall toxicity incidents in clinical trials - non-Bayesian and Bayesian approaches.

    Science.gov (United States)

    Yu, Jihnhee; Hutson, Alan D; Siddiqui, Adnan H; Kedron, Mary A

    2016-02-01

    In some small clinical trials, toxicity is not a primary endpoint; however, it often has dire effects on patients' quality of life and is even life-threatening. For such clinical trials, rigorous control of the overall incidence of adverse events is desirable, while simultaneously collecting safety information. In this article, we propose group sequential toxicity monitoring strategies to control overall toxicity incidents below a certain level as opposed to performing hypothesis testing, which can be incorporated into an existing study design based on the primary endpoint. We consider two sequential methods: a non-Bayesian approach in which stopping rules are obtained based on the 'future' probability of an excessive toxicity rate; and a Bayesian adaptation modifying the proposed non-Bayesian approach, which can use the information obtained at interim analyses. Through an extensive Monte Carlo study, we show that the Bayesian approach often provides better control of the overall toxicity rate than the non-Bayesian approach. We also investigate adequate toxicity estimation after the studies. We demonstrate the applicability of our proposed methods in controlling the symptomatic intracranial hemorrhage rate for treating acute ischemic stroke patients.
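    A minimal sketch of one such Bayesian stopping rule (a generic Beta-Binomial construction, not the authors' exact procedure): stop enrolment when the posterior probability that the toxicity rate exceeds an acceptable level p_max crosses a decision threshold.

```python
import math

def beta_tail(a, b, x0, steps=20000):
    """P(p > x0) for p ~ Beta(a, b) with a, b >= 1, via trapezoidal integration."""
    log_c = math.lgamma(a + b) - math.lgamma(a) - math.lgamma(b)

    def pdf(p):
        if p <= 0.0 or p >= 1.0:
            # endpoint density is finite because a, b >= 1
            return 0.0 if (a > 1 and p <= 0.0) or (b > 1 and p >= 1.0) \
                else math.exp(log_c)
        return math.exp(log_c + (a - 1.0) * math.log(p)
                        + (b - 1.0) * math.log(1.0 - p))

    h = (1.0 - x0) / steps
    total = 0.5 * (pdf(x0) + pdf(1.0))
    for i in range(1, steps):
        total += pdf(x0 + i * h)
    return total * h

def stop_for_toxicity(n_tox, n_pat, p_max, threshold=0.80, a0=1.0, b0=1.0):
    """Beta(a0, b0) prior; posterior after n_tox toxicities in n_pat patients
    is Beta(a0 + n_tox, b0 + n_pat - n_tox). Stop if the posterior mass
    above p_max exceeds the threshold."""
    return beta_tail(a0 + n_tox, b0 + n_pat - n_tox, p_max) > threshold
```

    For example, with a uniform prior and an acceptable rate of 0.15, observing 5 toxicities in 10 patients triggers a stop, while 0 in 10 does not.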

  19. The Size-Weight Illusion is not anti-Bayesian after all: a unifying Bayesian account.

    Science.gov (United States)

    Peters, Megan A K; Ma, Wei Ji; Shams, Ladan

    2016-01-01

    When we lift two differently-sized but equally-weighted objects, we expect the larger to be heavier, but the smaller feels heavier. However, traditional Bayesian approaches with "larger is heavier" priors predict the smaller object should feel lighter; this Size-Weight Illusion (SWI) has thus been labeled "anti-Bayesian" and has stymied psychologists for generations. We propose that previous Bayesian approaches neglect the brain's inference process about density. In our Bayesian model, objects' perceived heaviness relationship is based on both their size and inferred density relationship: observers evaluate competing, categorical hypotheses about objects' relative densities, the inference about which is then used to produce the final estimate of weight. The model can qualitatively and quantitatively reproduce the SWI and explain other researchers' findings, and also makes a novel prediction, which we confirmed. This same computational mechanism accounts for other multisensory phenomena and illusions; that the SWI follows the same process suggests that competitive-prior Bayesian inference can explain human perception across many domains.

  20. Bayesian nonparametric regression with varying residual density

    OpenAIRE

    Pati, Debdeep; Dunson, David B.

    2013-01-01

    We consider the problem of robust Bayesian inference on the mean regression function allowing the residual density to change flexibly with predictors. The proposed class of models is based on a Gaussian process prior for the mean regression function and mixtures of Gaussians for the collection of residual densities indexed by predictors. Initially considering the homoscedastic case, we propose priors for the residual density based on probit stick-breaking (PSB) scale mixtures and symmetrized ...

  1. Towards Bayesian Deep Learning: A Survey

    OpenAIRE

    Wang, Hao; Yeung, Dit-Yan

    2016-01-01

    While perception tasks such as visual object recognition and text understanding play an important role in human intelligence, the subsequent tasks that involve inference, reasoning and planning require an even higher level of intelligence. The past few years have seen major advances in many perception tasks using deep learning models. For higher-level inference, however, probabilistic graphical models with their Bayesian nature are still more powerful and flexible. To achieve integrated intel...

  2. Improving Environmental Scanning Systems Using Bayesian Networks

    OpenAIRE

    Simon Welter; Jörg H. Mayer; Reiner Quick

    2013-01-01

    As companies’ environment is becoming increasingly volatile, scanning systems gain in importance. We propose a hybrid process model for such systems' information gathering and interpretation tasks that combines quantitative information derived from regression analyses and qualitative knowledge from expert interviews. For the latter, we apply Bayesian networks. We derive the need for such a hybrid process model from a literature review. We lay out our model to find a suitable set of business e...

  3. Approximate Bayesian inference for complex ecosystems

    OpenAIRE

    Michael P H Stumpf

    2014-01-01

    Mathematical models have been central to ecology for nearly a century. Simple models of population dynamics have allowed us to understand fundamental aspects underlying the dynamics and stability of ecological systems. What has remained a challenge, however, is to meaningfully interpret experimental or observational data in light of mathematical models. Here, we review recent developments, notably in the growing field of approximate Bayesian computation (ABC), that allow us to calibrate mathe...

  4. Forming Object Concept Using Bayesian Network

    OpenAIRE

    Nakamura, Tomoaki; Nagai, Takayuki

    2010-01-01

    This chapter has discussed a novel framework for object understanding, and an implementation of the proposed framework using a Bayesian network has been presented. Although the result given in this paper is a preliminary one, we have shown that the system can form object concepts by observing performance by human hands. On-line learning is left for future work. Moreover, the model should be extended so that it can represent object usage and work objects.

  5. Bayesian belief networks in business continuity.

    Science.gov (United States)

    Phillipson, Frank; Matthijssen, Edwin; Attema, Thomas

    2014-01-01

    Business continuity professionals aim to mitigate the various challenges to the continuity of their company. The goal is a coherent system of measures that encompass detection, prevention and recovery. Choices made in one part of the system affect other parts as well as the continuity risks of the company. In complex organisations, however, these relations are far from obvious. This paper proposes the use of Bayesian belief networks to expose these relations, and presents a modelling framework for this approach. PMID:25193453

  6. Informed Source Separation: A Bayesian Tutorial

    OpenAIRE

    Knuth, Kevin

    2013-01-01

    Source separation problems are ubiquitous in the physical sciences; any situation where signals are superimposed calls for source separation to estimate the original signals. In this tutorial I will discuss the Bayesian approach to the source separation problem. This approach has a specific advantage in that it requires the designer to explicitly describe the signal model in addition to any other information or assumptions that go into the problem description. This leads naturally to the idea...

  7. Market Segmentation Using Bayesian Model Based Clustering

    OpenAIRE

    Van Hattum, P.

    2009-01-01

    This dissertation deals with two basic problems in marketing: market segmentation, the grouping of persons who share common aspects, and market targeting, the focusing of marketing efforts on one or more attractive market segments. For the grouping of persons who share common aspects, a Bayesian model-based clustering approach is proposed that can be applied to data sets specifically used for market segmentation. The cluster algorithm can handle very l...

  8. Approximate Bayesian computation in population genetics.

    OpenAIRE

    Beaumont, Mark A; Zhang, Wenyang; Balding, David J.

    2002-01-01

    We propose a new method for approximate Bayesian statistical inference on the basis of summary statistics. The method is suited to complex problems that arise in population genetics, extending ideas developed in this setting by earlier authors. Properties of the posterior distribution of a parameter, such as its mean or density curve, are approximated without explicit likelihood calculations. This is achieved by fitting a local-linear regression of simulated parameter values on simulated summ...
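    The flavor of ABC can be conveyed with a rejection sampler for a binomial proportion (a textbook toy example, not the authors' regression-adjusted scheme): draw parameters from the prior, simulate data, and keep only draws whose summary statistic matches the observed one.

```python
import random

def abc_rejection(obs_successes, n_trials, n_sims=20000, tol=0, seed=1):
    """ABC rejection for a binomial success probability with a Uniform(0,1) prior.

    A prior draw p is accepted when the simulated summary statistic
    (the success count) matches the observed count within tol.
    """
    rng = random.Random(seed)
    accepted = []
    for _ in range(n_sims):
        p = rng.random()                                   # draw from the prior
        sim = sum(rng.random() < p for _ in range(n_trials))
        if abs(sim - obs_successes) <= tol:                # compare summaries
            accepted.append(p)
    return accepted
```

    With tol=0 and the full data as the summary, the accepted draws are exact posterior samples (here Beta(x+1, n-x+1)); loosening tol or using lower-dimensional summaries trades exactness for acceptance rate, which is where the regression-adjustment ideas of the paper come in.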

  9. Bayesian nonparametric duration model with censorship

    Directory of Open Access Journals (Sweden)

    Joseph Hakizamungu

    2007-10-01

    This paper is concerned with nonparametric i.i.d. duration models with censored observations. We establish, by a simple and unified approach, the general structure of a Bayesian nonparametric estimator for the survival function S. For Dirichlet prior distributions, we describe completely the structure of the posterior distribution of the survival function. These results are essentially supported by prior and posterior independence properties.
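    For intuition, in the simplest uncensored case the Dirichlet-process posterior mean of the survival function is a precision-weighted mixture of the prior guess S0 and the empirical survival curve; the sketch below illustrates this standard result (the censored case treated in the paper is more involved):

```python
def dp_posterior_survival(t, data, alpha, prior_survival):
    """Posterior mean of S(t) under a DP(alpha, S0) prior with uncensored
    i.i.d. durations: (alpha * S0(t) + n * S_emp(t)) / (alpha + n).

    alpha: prior precision; prior_survival: the prior guess S0 as a callable.
    """
    n = len(data)
    s_emp = sum(1 for x in data if x > t) / n   # empirical survival at t
    return (alpha * prior_survival(t) + n * s_emp) / (alpha + n)
```

    As alpha grows the estimate shrinks toward the prior guess S0, while alpha close to 0 recovers the empirical survival curve.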

  10. Bayesian modeling and classification of neural signals

    OpenAIRE

    Lewicki, Michael S.

    1994-01-01

    Signal processing and classification algorithms often have limited applicability resulting from an inaccurate model of the signal's underlying structure. We present here an efficient Bayesian algorithm for modeling a signal composed of the superposition of brief, Poisson-distributed functions. This methodology is applied to the specific problem of modeling and classifying extracellular neural waveforms, which are composed of a superposition of an unknown number of action potentials (APs). ...

  11. Bayesian Estimation and Inference Using Stochastic Electronics.

    Science.gov (United States)

    Thakur, Chetan Singh; Afshar, Saeed; Wang, Runchun M; Hamilton, Tara J; Tapson, Jonathan; van Schaik, André

    2016-01-01

    In this paper, we present the implementation of two types of Bayesian inference problems to demonstrate the potential of building probabilistic algorithms in hardware using a single set of building blocks, with the ability to perform these computations in real time. The first implementation, referred to as the BEAST (Bayesian Estimation and Stochastic Tracker), demonstrates a simple problem where an observer uses an underlying Hidden Markov Model (HMM) to track a target in one dimension. In this implementation, sensors make noisy observations of the target position at discrete time steps. The tracker learns the transition model for target movement and the observation model for the noisy sensors, and uses these to estimate the target position by solving the Bayesian recursive equation online. We show the tracking performance of the system and demonstrate how it can learn the observation model, the transition model, and the probability of an external distractor (noise) interfering with the observations. In the second implementation, referred to as Bayesian INference in DAG (BIND), we show how inference can be performed in a Directed Acyclic Graph (DAG) using stochastic circuits. We show how these building blocks can be easily implemented using simple digital logic gates. An advantage of the stochastic electronic implementation is that it is robust to certain types of noise, which may become an issue in integrated circuit (IC) technology with feature sizes on the order of tens of nanometers due to its low noise margin, the effect of high-energy cosmic rays and the low supply voltage. In our framework, the flipping of random individual bits does not affect system performance because information is encoded in a bit stream. PMID:27047326
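    The "Bayesian recursive equation" solved by such a tracker is, in discrete form, the standard HMM forward update: predict with the transition model, weight by the observation likelihood, and renormalise. A software sketch (a hypothetical, stateless version of what the stochastic circuits compute):

```python
def bayes_filter_step(belief, transition, likelihood):
    """One predict-update cycle of the recursive Bayesian estimator.

    belief: P(x_t | z_1:t) over discrete positions
    transition[i][j]: P(x_{t+1} = j | x_t = i)
    likelihood[j]: P(z_{t+1} | x_{t+1} = j) for the new observation
    """
    n = len(belief)
    # predict: push the belief through the transition model
    predicted = [sum(belief[i] * transition[i][j] for i in range(n))
                 for j in range(n)]
    # update: weight by the observation likelihood and renormalise
    posterior = [predicted[j] * likelihood[j] for j in range(n)]
    z = sum(posterior)   # normalising constant P(z_{t+1} | z_1:t)
    return [p / z for p in posterior]
```

    Iterating this step over incoming observations yields the online position estimate; learning the transition and observation models, as in the paper, amounts to estimating the two probability tables from data.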

  12. Bayesian biclustering of gene expression data

    OpenAIRE

    Liu Jun S; Gu Jiajun

    2008-01-01

    Abstract Background Biclustering of gene expression data searches for local patterns of gene expression. A bicluster (or a two-way cluster) is defined as a set of genes whose expression profiles are mutually similar within a subset of experimental conditions/samples. Although several biclustering algorithms have been studied, few are based on rigorous statistical models. Results We developed a Bayesian biclustering model (BBC), and implemented a Gibbs sampling procedure for its statistical in...

  13. Nonparametric Bayesian Storyline Detection from Microtexts

    OpenAIRE

    Krishnan, Vinodh; Eisenstein, Jacob

    2016-01-01

    News events and social media are composed of evolving storylines, which capture public attention for a limited period of time. Identifying these storylines would enable many high-impact applications, such as tracking public interest and opinion in ongoing crisis events. However, this requires integrating temporal and linguistic information, and prior work takes a largely heuristic approach. We present a novel online non-parametric Bayesian framework for storyline detection, using the distance...

  14. Dual Control for Approximate Bayesian Reinforcement Learning

    OpenAIRE

    Klenske, Edgar D.; Hennig, Philipp

    2015-01-01

    Control of non-episodic, finite-horizon dynamical systems with uncertain dynamics poses a tough and elementary case of the exploration-exploitation trade-off. Bayesian reinforcement learning, reasoning about the effect of actions and future observations, offers a principled solution, but is intractable. We review, then extend an old approximate approach from control theory---where the problem is known as dual control---in the context of modern regression methods, specifically generalized line...

  15. A Bayesian framework for robotic programming

    OpenAIRE

    Lebeltel, Olivier; Diard, Julien; Bessiere, Pierre; Mazer, Emmanuel

    2000-01-01

    We propose an original method for programming robots based on Bayesian inference and learning. This method formally deals with the problems of uncertainty and incomplete information that are inherent to the field. Indeed, the principal difficulties of robot programming come from the unavoidable incompleteness of the models used. We present the formalism for describing a robotic task as well as the resolution methods. This formalism is inspired by the theory of probability, suggested by the physi...

  16. Constrained bayesian inference of project performance models

    OpenAIRE

    Sunmola, Funlade

    2013-01-01

    Project performance models play an important role in the management of project success. When used for monitoring projects, they can offer predictive ability, such as indications of possible delivery problems. Approaches for monitoring project performance rely on available project information, including restrictions imposed on the project, particularly the constraints of cost, quality, scope and time. In this paper we study a Bayesian inference methodology for project performance modelling in ...

  18. Bayesian mixture models for Poisson astronomical images

    OpenAIRE

    Guglielmetti, Fabrizia; Fischer, Rainer; Dose, Volker

    2012-01-01

    Astronomical images in the Poisson regime are typically characterized by a spatially varying cosmic background, large variety of source morphologies and intensities, data incompleteness, steep gradients in the data, and few photon counts per pixel. The Background-Source separation technique is developed with the aim to detect faint and extended sources in astronomical images characterized by Poisson statistics. The technique employs Bayesian mixture models to reliably detect the background as...

  19. Bayesian Variable Selection via Particle Stochastic Search.

    Science.gov (United States)

    Shi, Minghui; Dunson, David B

    2011-02-01

    We focus on Bayesian variable selection in regression models. One challenge is to search the huge model space adequately, while identifying high posterior probability regions. In the past decades, the main focus has been on the use of Markov chain Monte Carlo (MCMC) algorithms for these purposes. In this article, we propose a new computational approach based on sequential Monte Carlo (SMC), which we refer to as particle stochastic search (PSS). We illustrate PSS through applications to linear regression and probit models.

  20. Bayesian Spatial Modelling with R-INLA

    OpenAIRE

    Finn Lindgren; Håvard Rue

    2015-01-01

    The principles behind the interface to continuous domain spatial models in the R-INLA software package for R are described. The integrated nested Laplace approximation (INLA) approach proposed by Rue, Martino, and Chopin (2009) is a computationally effective alternative to MCMC for Bayesian inference. INLA is designed for latent Gaussian models, a very wide and flexible class of models ranging from (generalized) linear mixed to spatial and spatio-temporal models. Combined with the stochastic...

  1. GNSS Software Receiver for UAVs

    DEFF Research Database (Denmark)

    Olesen, Daniel Madelung; Jakobsen, Jakob; von Benzon, Hans-Henrik;

    2016-01-01

    This paper describes the current GPS/GNSS software receiver development activities at DTU Space. GNSS software receivers have received a great deal of attention in the last two decades and numerous implementations have already been presented. DTU Space has recently started development of its own GNSS software receiver targeted at mini-UAV applications; in this paper we present our current progress and briefly discuss the benefits of software receivers in relation to our research interests.

  2. Final Safety Evaluation Report to license the construction and operation of a facility to receive, store, and dispose of 11e.(2) byproduct material near Clive, Utah (Docket No. 40-8989)

    Energy Technology Data Exchange (ETDEWEB)

    1994-01-01

    The Final Safety Evaluation Report (FSER) summarizes the US Nuclear Regulatory Commission (NRC) staff's review of Envirocare of Utah, Inc.'s (Envirocare's) application for a license to receive, store, and dispose of uranium and thorium byproduct material (as defined in Section 11e.(2) of the Atomic Energy Act of 1954, as amended) at a site near Clive, Utah. Envirocare proposes to dispose of high-volume, low-activity Section 11e.(2) byproduct material in separate earthen disposal cells on a site where the applicant currently disposes of naturally occurring radioactive material (NORM), low-level waste, and mixed waste under license by the Utah Department of Environmental Quality. The NRC staff review of the December 23, 1991, license application, as revised by page changes dated July 2 and August 10, 1992, April 5, 7, and 10, 1993, and May 3, 6, 7, 11, and 21, 1993, has identified open issues in geotechnical engineering, water resources protection, radon attenuation, financial assurance, and radiological safety. The NRC will not issue a license for the proposed action until Envirocare adequately resolves these open issues.

  3. Sparse Bayesian learning in ISAR tomography imaging

    Institute of Scientific and Technical Information of China (English)

    SU Wu-ge; WANG Hong-qiang; DENG Bin; WANG Rui-jun; QIN Yu-liang

    2015-01-01

    Inverse synthetic aperture radar (ISAR) imaging can be regarded as a narrow-band version of computer aided tomography (CT). The traditional CT imaging algorithms for ISAR, including the polar format algorithm (PFA) and the convolution back projection algorithm (CBP), usually suffer from the problems of high sidelobes and low resolution. ISAR tomography image reconstruction within a sparse Bayesian framework is considered here. First, the sparse ISAR tomography imaging model is established in light of CT imaging theory. Then, by using the compressed sensing (CS) principle, a high-resolution ISAR image can be achieved with a limited number of pulses. Since the performance of existing CS-based ISAR imaging algorithms is sensitive to a user parameter, the existing algorithms are inconvenient to use in practice. It is well known that the Bayesian recovery algorithm named sparse Bayesian learning (SBL) acts as an effective tool in regression and classification; it uses an efficient expectation-maximization procedure to estimate the necessary parameters and retains a preferable property of the l0-norm diversity measure. Motivated by this, a fully automated ISAR tomography imaging algorithm based on SBL is proposed. Experimental results based on simulated and electromagnetic (EM) data illustrate the effectiveness and the superiority of the proposed algorithm over the existing algorithms.

  4. Particle identification in ALICE: a Bayesian approach

    CERN Document Server

    Adam, Jaroslav; Aggarwal, Madan Mohan; Aglieri Rinella, Gianluca; Agnello, Michelangelo; Agrawal, Neelima; Ahammed, Zubayer; Ahmad, Shakeel; Ahn, Sang Un; Aiola, Salvatore; Akindinov, Alexander; Alam, Sk Noor; Silva De Albuquerque, Danilo; Aleksandrov, Dmitry; Alessandro, Bruno; Alexandre, Didier; Alfaro Molina, Jose Ruben; Alici, Andrea; Alkin, Anton; Millan Almaraz, Jesus Roberto; Alme, Johan; Alt, Torsten; Altinpinar, Sedat; Altsybeev, Igor; Alves Garcia Prado, Caio; Andrei, Cristian; Andronic, Anton; Anguelov, Venelin; Anticic, Tome; Antinori, Federico; Antonioli, Pietro; Aphecetche, Laurent Bernard; Appelshaeuser, Harald; Arcelli, Silvia; Arnaldi, Roberta; Arnold, Oliver Werner; Arsene, Ionut Cristian; Arslandok, Mesut; Audurier, Benjamin; Augustinus, Andre; Averbeck, Ralf Peter; Azmi, Mohd Danish; Badala, Angela; Baek, Yong Wook; Bagnasco, Stefano; Bailhache, Raphaelle Marie; Bala, Renu; Balasubramanian, Supraja; Baldisseri, Alberto; Baral, Rama Chandra; Barbano, Anastasia Maria; Barbera, Roberto; Barile, Francesco; Barnafoldi, Gergely Gabor; Barnby, Lee Stuart; Ramillien Barret, Valerie; Bartalini, Paolo; Barth, Klaus; Bartke, Jerzy Gustaw; Bartsch, Esther; Basile, Maurizio; Bastid, Nicole; Basu, Sumit; Bathen, Bastian; Batigne, Guillaume; Batista Camejo, Arianna; Batyunya, Boris; Batzing, Paul Christoph; Bearden, Ian Gardner; Beck, Hans; Bedda, Cristina; Behera, Nirbhay Kumar; Belikov, Iouri; Bellini, Francesca; Bello Martinez, Hector; Bellwied, Rene; Belmont Iii, Ronald John; Belmont Moreno, Ernesto; Belyaev, Vladimir; Benacek, Pavel; Bencedi, Gyula; Beole, Stefania; Berceanu, Ionela; Bercuci, Alexandru; Berdnikov, Yaroslav; Berenyi, Daniel; Bertens, Redmer Alexander; Berzano, Dario; Betev, Latchezar; Bhasin, Anju; Bhat, Inayat Rasool; Bhati, Ashok Kumar; Bhattacharjee, Buddhadeb; Bhom, Jihyun; Bianchi, Livio; Bianchi, Nicola; Bianchin, Chiara; Bielcik, Jaroslav; Bielcikova, Jana; Bilandzic, Ante; Biro, Gabor; Biswas, Rathijit; Biswas, Saikat; 
Bjelogrlic, Sandro; Blair, Justin Thomas; Blau, Dmitry; Blume, Christoph; Bock, Friederike; Bogdanov, Alexey; Boggild, Hans; Boldizsar, Laszlo; Bombara, Marek; Book, Julian Heinz; Borel, Herve; Borissov, Alexander; Borri, Marcello; Bossu, Francesco; Botta, Elena; Bourjau, Christian; Braun-Munzinger, Peter; Bregant, Marco; Breitner, Timo Gunther; Broker, Theo Alexander; Browning, Tyler Allen; Broz, Michal; Brucken, Erik Jens; Bruna, Elena; Bruno, Giuseppe Eugenio; Budnikov, Dmitry; Buesching, Henner; Bufalino, Stefania; Buncic, Predrag; Busch, Oliver; Buthelezi, Edith Zinhle; Bashir Butt, Jamila; Buxton, Jesse Thomas; Cabala, Jan; Caffarri, Davide; Cai, Xu; Caines, Helen Louise; Calero Diaz, Liliet; Caliva, Alberto; Calvo Villar, Ernesto; Camerini, Paolo; Carena, Francesco; Carena, Wisla; Carnesecchi, Francesca; Castillo Castellanos, Javier Ernesto; Castro, Andrew John; Casula, Ester Anna Rita; Ceballos Sanchez, Cesar; Cepila, Jan; Cerello, Piergiorgio; Cerkala, Jakub; Chang, Beomsu; Chapeland, Sylvain; Chartier, Marielle; Charvet, Jean-Luc Fernand; Chattopadhyay, Subhasis; Chattopadhyay, Sukalyan; Chauvin, Alex; Chelnokov, Volodymyr; Cherney, Michael Gerard; Cheshkov, Cvetan Valeriev; Cheynis, Brigitte; Chibante Barroso, Vasco Miguel; Dobrigkeit Chinellato, David; Cho, Soyeon; Chochula, Peter; Choi, Kyungeon; Chojnacki, Marek; Choudhury, Subikash; Christakoglou, Panagiotis; Christensen, Christian Holm; Christiansen, Peter; Chujo, Tatsuya; Chung, Suh-Urk; Cicalo, Corrado; Cifarelli, Luisa; Cindolo, Federico; Cleymans, Jean Willy Andre; Colamaria, Fabio Filippo; Colella, Domenico; Collu, Alberto; Colocci, Manuel; Conesa Balbastre, Gustavo; Conesa Del Valle, Zaida; Connors, Megan Elizabeth; Contreras Nuno, Jesus Guillermo; Cormier, Thomas Michael; Corrales Morales, Yasser; Cortes Maldonado, Ismael; Cortese, Pietro; Cosentino, Mauro Rogerio; Costa, Filippo; Crochet, Philippe; Cruz Albino, Rigoberto; Cuautle Flores, Eleazar; Cunqueiro Mendez, Leticia; Dahms, Torsten; 
Dainese, Andrea; Danisch, Meike Charlotte; Danu, Andrea; Das, Debasish; Das, Indranil; Das, Supriya; Dash, Ajay Kumar; Dash, Sadhana; De, Sudipan; De Caro, Annalisa; De Cataldo, Giacinto; De Conti, Camila; De Cuveland, Jan; De Falco, Alessandro; De Gruttola, Daniele; De Marco, Nora; De Pasquale, Salvatore; Deisting, Alexander; Deloff, Andrzej; Denes, Ervin Sandor; Deplano, Caterina; Dhankher, Preeti; Di Bari, Domenico; Di Mauro, Antonio; Di Nezza, Pasquale; Diaz Corchero, Miguel Angel; Dietel, Thomas; Dillenseger, Pascal; Divia, Roberto; Djuvsland, Oeystein; Dobrin, Alexandru Florin; Domenicis Gimenez, Diogenes; Donigus, Benjamin; Dordic, Olja; Drozhzhova, Tatiana; Dubey, Anand Kumar; Dubla, Andrea; Ducroux, Laurent; Dupieux, Pascal; Ehlers Iii, Raymond James; Elia, Domenico; Endress, Eric; Engel, Heiko; Epple, Eliane; Erazmus, Barbara Ewa; Erdemir, Irem; Erhardt, Filip; Espagnon, Bruno; Estienne, Magali Danielle; Esumi, Shinichi; Eum, Jongsik; Evans, David; Evdokimov, Sergey; Eyyubova, Gyulnara; Fabbietti, Laura; Fabris, Daniela; Faivre, Julien; Fantoni, Alessandra; Fasel, Markus; Feldkamp, Linus; Feliciello, Alessandro; Feofilov, Grigorii; Ferencei, Jozef; Fernandez Tellez, Arturo; Gonzalez Ferreiro, Elena; Ferretti, Alessandro; Festanti, Andrea; Feuillard, Victor Jose Gaston; Figiel, Jan; Araujo Silva Figueredo, Marcel; Filchagin, Sergey; Finogeev, Dmitry; Fionda, Fiorella; Fiore, Enrichetta Maria; Fleck, Martin Gabriel; Floris, Michele; Foertsch, Siegfried Valentin; Foka, Panagiota; Fokin, Sergey; Fragiacomo, Enrico; Francescon, Andrea; Frankenfeld, Ulrich Michael; Fronze, Gabriele Gaetano; Fuchs, Ulrich; Furget, Christophe; Furs, Artur; Fusco Girard, Mario; Gaardhoeje, Jens Joergen; Gagliardi, Martino; Gago Medina, Alberto Martin; Gallio, Mauro; Gangadharan, Dhevan Raja; Ganoti, Paraskevi; Gao, Chaosong; Garabatos Cuadrado, Jose; Garcia-Solis, Edmundo Javier; Gargiulo, Corrado; Gasik, Piotr Jan; Gauger, Erin Frances; Germain, Marie; Gheata, Andrei George; 
Gheata, Mihaela; Ghosh, Premomoy; Ghosh, Sanjay Kumar; Gianotti, Paola; Giubellino, Paolo; Giubilato, Piero; Gladysz-Dziadus, Ewa; Glassel, Peter; Gomez Coral, Diego Mauricio; Gomez Ramirez, Andres; Sanchez Gonzalez, Andres; Gonzalez, Victor; Gonzalez Zamora, Pedro; Gorbunov, Sergey; Gorlich, Lidia Maria; Gotovac, Sven; Grabski, Varlen; Grachov, Oleg Anatolievich; Graczykowski, Lukasz Kamil; Graham, Katie Leanne; Grelli, Alessandro; Grigoras, Alina Gabriela; Grigoras, Costin; Grigoryev, Vladislav; Grigoryan, Ara; Grigoryan, Smbat; Grynyov, Borys; Grion, Nevio; Gronefeld, Julius Maximilian; Grosse-Oetringhaus, Jan Fiete; Grosso, Raffaele; Guber, Fedor; Guernane, Rachid; Guerzoni, Barbara; Gulbrandsen, Kristjan Herlache; Gunji, Taku; Gupta, Anik; Gupta, Ramni; Haake, Rudiger; Haaland, Oystein Senneset; Hadjidakis, Cynthia Marie; Haiduc, Maria; Hamagaki, Hideki; Hamar, Gergoe; Hamon, Julien Charles; Harris, John William; Harton, Austin Vincent; Hatzifotiadou, Despina; Hayashi, Shinichi; Heckel, Stefan Thomas; Hellbar, Ernst; Helstrup, Haavard; Herghelegiu, Andrei Ionut; Herrera Corral, Gerardo Antonio; Hess, Benjamin Andreas; Hetland, Kristin Fanebust; Hillemanns, Hartmut; Hippolyte, Boris; Horak, David; Hosokawa, Ritsuya; Hristov, Peter Zahariev; Humanic, Thomas; Hussain, Nur; Hussain, Tahir; Hutter, Dirk; Hwang, Dae Sung; Ilkaev, Radiy; Inaba, Motoi; Incani, Elisa; Ippolitov, Mikhail; Irfan, Muhammad; Ivanov, Marian; Ivanov, Vladimir; Izucheev, Vladimir; Jacazio, Nicolo; Jacobs, Peter Martin; Jadhav, Manoj Bhanudas; Jadlovska, Slavka; Jadlovsky, Jan; Jahnke, Cristiane; Jakubowska, Monika Joanna; Jang, Haeng Jin; Janik, Malgorzata Anna; Pahula Hewage, Sandun; Jena, Chitrasen; Jena, Satyajit; Jimenez Bustamante, Raul Tonatiuh; Jones, Peter Graham; Jusko, Anton; Kalinak, Peter; Kalweit, Alexander Philipp; Kamin, Jason Adrian; Kang, Ju Hwan; Kaplin, Vladimir; Kar, Somnath; Karasu Uysal, Ayben; Karavichev, Oleg; Karavicheva, Tatiana; Karayan, Lilit; Karpechev, Evgeny; 
Kebschull, Udo Wolfgang; Keidel, Ralf; Keijdener, Darius Laurens; Keil, Markus; Khan, Mohammed Mohisin; Khan, Palash; Khan, Shuaib Ahmad; Khanzadeev, Alexei; Kharlov, Yury; Kileng, Bjarte; Kim, Do Won; Kim, Dong Jo; Kim, Daehyeok; Kim, Hyeonjoong; Kim, Jinsook; Kim, Minwoo; Kim, Se Yong; Kim, Taesoo; Kirsch, Stefan; Kisel, Ivan; Kiselev, Sergey; Kisiel, Adam Ryszard; Kiss, Gabor; Klay, Jennifer Lynn; Klein, Carsten; Klein, Jochen; Klein-Boesing, Christian; Klewin, Sebastian; Kluge, Alexander; Knichel, Michael Linus; Knospe, Anders Garritt; Kobdaj, Chinorat; Kofarago, Monika; Kollegger, Thorsten; Kolozhvari, Anatoly; Kondratev, Valerii; Kondratyeva, Natalia; Kondratyuk, Evgeny; Konevskikh, Artem; Kopcik, Michal; Kostarakis, Panagiotis; Kour, Mandeep; Kouzinopoulos, Charalampos; Kovalenko, Oleksandr; Kovalenko, Vladimir; Kowalski, Marek; Koyithatta Meethaleveedu, Greeshma; Kralik, Ivan; Kravcakova, Adela; Krivda, Marian; Krizek, Filip; Kryshen, Evgeny; Krzewicki, Mikolaj; Kubera, Andrew Michael; Kucera, Vit; Kuhn, Christian Claude; Kuijer, Paulus Gerardus; Kumar, Ajay; Kumar, Jitendra; Kumar, Lokesh; Kumar, Shyam; Kurashvili, Podist; Kurepin, Alexander; Kurepin, Alexey; Kuryakin, Alexey; Kweon, Min Jung; Kwon, Youngil; La Pointe, Sarah Louise; La Rocca, Paola; Ladron De Guevara, Pedro; Lagana Fernandes, Caio; Lakomov, Igor; Langoy, Rune; Lara Martinez, Camilo Ernesto; Lardeux, Antoine Xavier; Lattuca, Alessandra; Laudi, Elisa; Lea, Ramona; Leardini, Lucia; Lee, Graham Richard; Lee, Seongjoo; Lehas, Fatiha; Lemmon, Roy Crawford; Lenti, Vito; Leogrande, Emilia; Leon Monzon, Ildefonso; Leon Vargas, Hermes; Leoncino, Marco; Levai, Peter; Li, Shuang; Li, Xiaomei; Lien, Jorgen Andre; Lietava, Roman; Lindal, Svein; Lindenstruth, Volker; Lippmann, Christian; Lisa, Michael Annan; Ljunggren, Hans Martin; Lodato, Davide Francesco; Lonne, Per-Ivar; Loginov, Vitaly; Loizides, Constantinos; Lopez, Xavier Bernard; Lopez Torres, Ernesto; Lowe, Andrew John; Luettig, Philipp Johannes; 
Lunardon, Marcello; Luparello, Grazia; Lutz, Tyler Harrison; Maevskaya, Alla; Mager, Magnus; Mahajan, Sanjay; Mahmood, Sohail Musa; Maire, Antonin; Majka, Richard Daniel; Malaev, Mikhail; Maldonado Cervantes, Ivonne Alicia; Malinina, Liudmila; Mal'Kevich, Dmitry; Malzacher, Peter; Mamonov, Alexander; Manko, Vladislav; Manso, Franck; Manzari, Vito; Marchisone, Massimiliano; Mares, Jiri; Margagliotti, Giacomo Vito; Margotti, Anselmo; Margutti, Jacopo; Marin, Ana Maria; Markert, Christina; Marquard, Marco; Martin, Nicole Alice; Martin Blanco, Javier; Martinengo, Paolo; Martinez Hernandez, Mario Ivan; Martinez-Garcia, Gines; Martinez Pedreira, Miguel; Mas, Alexis Jean-Michel; Masciocchi, Silvia; Masera, Massimo; Masoni, Alberto; Mastroserio, Annalisa; Matyja, Adam Tomasz; Mayer, Christoph; Mazer, Joel Anthony; Mazzoni, Alessandra Maria; Mcdonald, Daniel; Meddi, Franco; Melikyan, Yuri; Menchaca-Rocha, Arturo Alejandro; Meninno, Elisa; Mercado-Perez, Jorge; Meres, Michal; Miake, Yasuo; Mieskolainen, Matti Mikael; Mikhaylov, Konstantin; Milano, Leonardo; Milosevic, Jovan; Mischke, Andre; Mishra, Aditya Nath; Miskowiec, Dariusz Czeslaw; Mitra, Jubin; Mitu, Ciprian Mihai; Mohammadi, Naghmeh; Mohanty, Bedangadas; Molnar, Levente; Montano Zetina, Luis Manuel; Montes Prado, Esther; Moreira De Godoy, Denise Aparecida; Perez Moreno, Luis Alberto; Moretto, Sandra; Morreale, Astrid; Morsch, Andreas; Muccifora, Valeria; Mudnic, Eugen; Muhlheim, Daniel Michael; Muhuri, Sanjib; Mukherjee, Maitreyee; Mulligan, James Declan; Gameiro Munhoz, Marcelo; Munzer, Robert Helmut; Murakami, Hikari; Murray, Sean; Musa, Luciano; Musinsky, Jan; Naik, Bharati; Nair, Rahul; Nandi, Basanta Kumar; Nania, Rosario; Nappi, Eugenio; Naru, Muhammad Umair; Ferreira Natal Da Luz, Pedro Hugo; Nattrass, Christine; Rosado Navarro, Sebastian; Nayak, Kishora; Nayak, Ranjit; Nayak, Tapan Kumar; Nazarenko, Sergey; Nedosekin, Alexander; Nellen, Lukas; Ng, Fabian; Nicassio, Maria; Niculescu, Mihai; Niedziela, Jeremi; 
Nielsen, Borge Svane; Nikolaev, Sergey; Nikulin, Sergey; Nikulin, Vladimir; Noferini, Francesco; Nomokonov, Petr; Nooren, Gerardus; Cabanillas Noris, Juan Carlos; Norman, Jaime; Nyanin, Alexander; Nystrand, Joakim Ingemar; Oeschler, Helmut Oskar; Oh, Saehanseul; Oh, Sun Kun; Ohlson, Alice Elisabeth; Okatan, Ali; Okubo, Tsubasa; Olah, Laszlo; Oleniacz, Janusz; Oliveira Da Silva, Antonio Carlos; Oliver, Michael Henry; Onderwaater, Jacobus; Oppedisano, Chiara; Orava, Risto; Oravec, Matej; Ortiz Velasquez, Antonio; Oskarsson, Anders Nils Erik; Otwinowski, Jacek Tomasz; Oyama, Ken; Ozdemir, Mahmut; Pachmayer, Yvonne Chiara; Pagano, Davide; Pagano, Paola; Paic, Guy; Pal, Susanta Kumar; Pan, Jinjin; Pandey, Ashutosh Kumar; Papikyan, Vardanush; Pappalardo, Giuseppe; Pareek, Pooja; Park, Woojin; Parmar, Sonia; Passfeld, Annika; Paticchio, Vincenzo; Patra, Rajendra Nath; Paul, Biswarup; Pei, Hua; Peitzmann, Thomas; Pereira Da Costa, Hugo Denis Antonio; Peresunko, Dmitry Yurevich; Perez Lara, Carlos Eugenio; Perez Lezama, Edgar; Peskov, Vladimir; Pestov, Yury; Petracek, Vojtech; Petrov, Viacheslav; Petrovici, Mihai; Petta, Catia; Piano, Stefano; Pikna, Miroslav; Pillot, Philippe; Ozelin De Lima Pimentel, Lais; Pinazza, Ombretta; Pinsky, Lawrence; Piyarathna, Danthasinghe; Ploskon, Mateusz Andrzej; Planinic, Mirko; Pluta, Jan Marian; Pochybova, Sona; Podesta Lerma, Pedro Luis Manuel; Poghosyan, Martin; Polishchuk, Boris; Poljak, Nikola; Poonsawat, Wanchaloem; Pop, Amalia; Porteboeuf, Sarah Julie; Porter, R Jefferson; Pospisil, Jan; Prasad, Sidharth Kumar; Preghenella, Roberto; Prino, Francesco; Pruneau, Claude Andre; Pshenichnov, Igor; Puccio, Maximiliano; Puddu, Giovanna; Pujahari, Prabhat Ranjan; Punin, Valery; Putschke, Jorn Henning; Qvigstad, Henrik; Rachevski, Alexandre; Raha, Sibaji; Rajput, Sonia; Rak, Jan; Rakotozafindrabe, Andry Malala; Ramello, Luciano; Rami, Fouad; Raniwala, Rashmi; Raniwala, Sudhir; Rasanen, Sami Sakari; Rascanu, Bogdan Theodor; Rathee, Deepika; 
Read, Kenneth Francis; Redlich, Krzysztof; Reed, Rosi Jan; Rehman, Attiq Ur; Reichelt, Patrick Simon; Reidt, Felix; Ren, Xiaowen; Renfordt, Rainer Arno Ernst; Reolon, Anna Rita; Reshetin, Andrey; Reygers, Klaus Johannes; Riabov, Viktor; Ricci, Renato Angelo; Richert, Tuva Ora Herenui; Richter, Matthias Rudolph; Riedler, Petra; Riegler, Werner; Riggi, Francesco; Ristea, Catalin-Lucian; Rocco, Elena; Rodriguez Cahuantzi, Mario; Rodriguez Manso, Alis; Roeed, Ketil; Rogochaya, Elena; Rohr, David Michael; Roehrich, Dieter; Ronchetti, Federico; Ronflette, Lucile; Rosnet, Philippe; Rossi, Andrea; Roukoutakis, Filimon; Roy, Ankhi; Roy, Christelle Sophie; Roy, Pradip Kumar; Rubio Montero, Antonio Juan; Rui, Rinaldo; Russo, Riccardo; Ryabinkin, Evgeny; Ryabov, Yury; Rybicki, Andrzej; Saarinen, Sampo; Sadhu, Samrangy; Sadovskiy, Sergey; Safarik, Karel; Sahlmuller, Baldo; Sahoo, Pragati; Sahoo, Raghunath; Sahoo, Sarita; Sahu, Pradip Kumar; Saini, Jogender; Sakai, Shingo; Saleh, Mohammad Ahmad; Salzwedel, Jai Samuel Nielsen; Sambyal, Sanjeev Singh; Samsonov, Vladimir; Sandor, Ladislav; Sandoval, Andres; Sano, Masato; Sarkar, Debojit; Sarkar, Nachiketa; Sarma, Pranjal; Scapparone, Eugenio; Scarlassara, Fernando; Schiaua, Claudiu Cornel; Schicker, Rainer Martin; Schmidt, Christian Joachim; Schmidt, Hans Rudolf; Schuchmann, Simone; Schukraft, Jurgen; Schulc, Martin; Schutz, Yves Roland; Schwarz, Kilian Eberhard; Schweda, Kai Oliver; Scioli, Gilda; Scomparin, Enrico; Scott, Rebecca Michelle; Sefcik, Michal; Seger, Janet Elizabeth; Sekiguchi, Yuko; Sekihata, Daiki; Selyuzhenkov, Ilya; Senosi, Kgotlaesele; Senyukov, Serhiy; Serradilla Rodriguez, Eulogio; Sevcenco, Adrian; Shabanov, Arseniy; Shabetai, Alexandre; Shadura, Oksana; Shahoyan, Ruben; Shahzad, Muhammed Ikram; Shangaraev, Artem; Sharma, Ankita; Sharma, Mona; Sharma, Monika; Sharma, Natasha; Sheikh, Ashik Ikbal; Shigaki, Kenta; Shou, Qiye; Shtejer Diaz, Katherin; Sibiryak, Yury; Siddhanta, Sabyasachi; Sielewicz, Krzysztof 
Marek; Siemiarczuk, Teodor; Silvermyr, David Olle Rickard; Silvestre, Catherine Micaela; Simatovic, Goran; Simonetti, Giuseppe; Singaraju, Rama Narayana; Singh, Ranbir; Singha, Subhash; Singhal, Vikas; Sinha, Bikash; Sarkar - Sinha, Tinku; Sitar, Branislav; Sitta, Mario; Skaali, Bernhard; Slupecki, Maciej; Smirnov, Nikolai; Snellings, Raimond; Snellman, Tomas Wilhelm; Song, Jihye; Song, Myunggeun; Song, Zixuan; Soramel, Francesca; Sorensen, Soren Pontoppidan; Derradi De Souza, Rafael; Sozzi, Federica; Spacek, Michal; Spiriti, Eleuterio; Sputowska, Iwona Anna; Spyropoulou-Stassinaki, Martha; Stachel, Johanna; Stan, Ionel; Stankus, Paul; Stenlund, Evert Anders; Steyn, Gideon Francois; Stiller, Johannes Hendrik; Stocco, Diego; Strmen, Peter; Alarcon Do Passo Suaide, Alexandre; Sugitate, Toru; Suire, Christophe Pierre; Suleymanov, Mais Kazim Oglu; Suljic, Miljenko; Sultanov, Rishat; Sumbera, Michal; Sumowidagdo, Suharyo; Szabo, Alexander; Szanto De Toledo, Alejandro; Szarka, Imrich; Szczepankiewicz, Adam; Szymanski, Maciej Pawel; Tabassam, Uzma; Takahashi, Jun; Tambave, Ganesh Jagannath; Tanaka, Naoto; Tarhini, Mohamad; Tariq, Mohammad; Tarzila, Madalina-Gabriela; Tauro, Arturo; Tejeda Munoz, Guillermo; Telesca, Adriana; Terasaki, Kohei; Terrevoli, Cristina; Teyssier, Boris; Thaeder, Jochen Mathias; Thakur, Dhananjaya; Thomas, Deepa; Tieulent, Raphael Noel; Timmins, Anthony Robert; Toia, Alberica; Trogolo, Stefano; Trombetta, Giuseppe; Trubnikov, Victor; Trzaska, Wladyslaw Henryk; Tsuji, Tomoya; Tumkin, Alexandr; Turrisi, Rosario; Tveter, Trine Spedstad; Ullaland, Kjetil; Uras, Antonio; Usai, Gianluca; Utrobicic, Antonija; Vala, Martin; Valencia Palomo, Lizardo; Vallero, Sara; Van Der Maarel, Jasper; Van Hoorne, Jacobus Willem; Van Leeuwen, Marco; Vanat, Tomas; Vande Vyvre, Pierre; Varga, Dezso; Vargas Trevino, Aurora Diozcora; Vargyas, Marton; Varma, Raghava; Vasileiou, Maria; Vasiliev, Andrey; Vauthier, Astrid; Vechernin, Vladimir; Veen, Annelies Marianne; Veldhoen, 
Misha; Velure, Arild; Vercellin, Ermanno; Vergara Limon, Sergio; Vernet, Renaud; Verweij, Marta; Vickovic, Linda; Viesti, Giuseppe; Viinikainen, Jussi Samuli; Vilakazi, Zabulon; Villalobos Baillie, Orlando; Villatoro Tello, Abraham; Vinogradov, Alexander; Vinogradov, Leonid; Vinogradov, Yury; Virgili, Tiziano; Vislavicius, Vytautas; Viyogi, Yogendra; Vodopyanov, Alexander; Volkl, Martin Andreas; Voloshin, Kirill; Voloshin, Sergey; Volpe, Giacomo; Von Haller, Barthelemy; Vorobyev, Ivan; Vranic, Danilo; Vrlakova, Janka; Vulpescu, Bogdan; Wagner, Boris; Wagner, Jan; Wang, Hongkai; Wang, Mengliang; Watanabe, Daisuke; Watanabe, Yosuke; Weber, Michael; Weber, Steffen Georg; Weiser, Dennis Franz; Wessels, Johannes Peter; Westerhoff, Uwe; Whitehead, Andile Mothegi; Wiechula, Jens; Wikne, Jon; Wilk, Grzegorz Andrzej; Wilkinson, Jeremy John; Williams, Crispin; Windelband, Bernd Stefan; Winn, Michael Andreas; Yang, Hongyan; Yang, Ping; Yano, Satoshi; Yasin, Zafar; Yin, Zhongbao; Yokoyama, Hiroki; Yoo, In-Kwon; Yoon, Jin Hee; Yurchenko, Volodymyr; Yushmanov, Igor; Zaborowska, Anna; Zaccolo, Valentina; Zaman, Ali; Zampolli, Chiara; Correia Zanoli, Henrique Jose; Zaporozhets, Sergey; Zardoshti, Nima; Zarochentsev, Andrey; Zavada, Petr; Zavyalov, Nikolay; Zbroszczyk, Hanna Paulina; Zgura, Sorin Ion; Zhalov, Mikhail; Zhang, Haitao; Zhang, Xiaoming; Zhang, Yonghong; Chunhui, Zhang; Zhang, Zuman; Zhao, Chengxin; Zhigareva, Natalia; Zhou, Daicui; Zhou, You; Zhou, Zhuo; Zhu, Hongsheng; Zhu, Jianhui; Zichichi, Antonino; Zimmermann, Alice; Zimmermann, Markus Bernhard; Zinovjev, Gennady; Zyzak, Maksym

    2016-01-01

We present a Bayesian approach to particle identification (PID) within the ALICE experiment. The aim is to more effectively combine the particle identification capabilities of its various detectors. After a brief explanation of the adopted methodology and formalism, the performance of the Bayesian PID approach for charged pions, kaons and protons in the central barrel of ALICE is studied. PID is performed via measurements of specific energy loss (dE/dx) and time-of-flight. PID efficiencies and misidentification probabilities are extracted and compared with Monte Carlo simulations using high purity samples of identified particles in the decay channels ${\rm K}_{\rm S}^{0}\rightarrow \pi^+\pi^-$, $\phi\rightarrow {\rm K}^-{\rm K}^+$ and $\Lambda\rightarrow{\rm p}\pi^-$ in p–Pb collisions at $\sqrt{s_{\rm NN}} = 5.02$ TeV. In order to thoroughly assess the validity of the Bayesian approach, this methodology was used to obtain corrected $p_{\rm T}$ spectra of pions, kaons, protons, and D$^0$ mesons in pp coll...
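The combination described in this record can be illustrated with a toy Bayesian PID calculation: per-detector Gaussian likelihoods for each species hypothesis are multiplied with priors and normalised. This is a minimal sketch; the priors, expected detector responses, and resolutions below are invented for illustration and are not ALICE values.

```python
# Toy Bayesian PID: combine two detector signals (dE/dx and time-of-flight)
# under Gaussian response assumptions. All numbers are illustrative.
import math

def gauss(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

species = ["pion", "kaon", "proton"]
priors = {"pion": 0.7, "kaon": 0.2, "proton": 0.1}     # assumed abundances
expected = {  # assumed expected responses per hypothesis (arbitrary units)
    "pion":   {"dedx": 1.0, "tof": 1.0},
    "kaon":   {"dedx": 1.4, "tof": 1.2},
    "proton": {"dedx": 2.0, "tof": 1.5},
}
measured = {"dedx": 1.35, "tof": 1.22}                 # one example track
sigma = {"dedx": 0.1, "tof": 0.05}                     # assumed resolutions

# Posterior over species: prior times the product of detector likelihoods.
post = {}
for s in species:
    like = 1.0
    for det in ("dedx", "tof"):
        like *= gauss(measured[det], expected[s][det], sigma[det])
    post[s] = priors[s] * like
norm = sum(post.values())
post = {s: p / norm for s, p in post.items()}
```

For this example track both signals sit close to the kaon hypothesis, so the posterior concentrates on "kaon" despite the larger pion prior.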

  5. Bayesian Analysis of Individual Level Personality Dynamics

    Directory of Open Access Journals (Sweden)

    Edward Cripps

    2016-07-01

    Full Text Available A Bayesian technique with analyses of within-person processes at the level of the individual is presented. The approach is used to examine whether the patterns of within-person responses on a 12-trial simulation task are consistent with the predictions of ITA theory (Dweck, 1999). ITA theory states that the performance of an individual with an entity theory of ability is more likely to spiral down following a failure experience than the performance of an individual with an incremental theory of ability. This is because entity theorists interpret failure experiences as evidence of a lack of ability, which they believe is largely innate and therefore relatively fixed; whilst incremental theorists believe in the malleability of abilities and interpret failure experiences as evidence of more controllable factors such as poor strategy or lack of effort. The results of our analyses support ITA theory at both the within- and between-person levels of analyses and demonstrate the benefits of Bayesian techniques for the analysis of within-person processes. These include more formal specification of the theory and the ability to draw inferences about each individual, which allows for more nuanced interpretations of individuals within a personality category, such as differences in the individual probabilities of spiralling. While Bayesian techniques have many potential advantages for the analyses of within-person processes at the individual level, ease of use is not one of them for psychologists trained in traditional frequentist statistical techniques.

  6. Bayesian Recurrent Neural Network for Language Modeling.

    Science.gov (United States)

    Chien, Jen-Tzung; Ku, Yuan-Chu

    2016-02-01

    A language model (LM) is calculated as the probability of a word sequence that provides the solution to word prediction for a variety of information systems. A recurrent neural network (RNN) is powerful for learning the large-span dynamics of a word sequence in the continuous space. However, the training of the RNN-LM is an ill-posed problem because of too many parameters from a large dictionary size and a high-dimensional hidden layer. This paper presents a Bayesian approach to regularize the RNN-LM and apply it to continuous speech recognition. We aim to penalize an overly complex RNN-LM by compensating for the uncertainty of the estimated model parameters, which is represented by a Gaussian prior. The objective function in a Bayesian classification network is formed as the regularized cross-entropy error function. The regularized model is constructed not only by calculating the regularized parameters according to the maximum a posteriori criterion but also by estimating the Gaussian hyperparameter by maximizing the marginal likelihood. A rapid approximation to the Hessian matrix is developed to implement the Bayesian RNN-LM (BRNN-LM) by selecting a small set of salient outer-products. The proposed BRNN-LM achieves a sparser model than the RNN-LM. Experiments on different corpora show the robustness of system performance by applying the rapid BRNN-LM under different conditions.
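The MAP objective described in this abstract, a cross-entropy error plus a penalty arising from a zero-mean Gaussian prior on the weights, can be sketched for a tiny softmax classifier (not an RNN). The weights, features, and prior precision `alpha` below are illustrative assumptions.

```python
# Hedged sketch of a regularised cross-entropy objective: a zero-mean
# Gaussian prior with precision alpha on the weights adds an L2 penalty
# to the negative log-likelihood. The toy softmax model is illustrative.
import math

def regularized_cross_entropy(weights, inputs, target, alpha):
    # Linear scores followed by a numerically stable log-sum-exp (softmax).
    scores = [sum(w * x for w, x in zip(row, inputs)) for row in weights]
    m = max(scores)
    log_z = m + math.log(sum(math.exp(s - m) for s in scores))
    nll = log_z - scores[target]                  # cross-entropy term
    # Gaussian prior on weights -> quadratic penalty (MAP criterion).
    penalty = 0.5 * alpha * sum(w * w for row in weights for w in row)
    return nll + penalty                          # negative log posterior

w = [[0.2, -0.1], [0.0, 0.3], [-0.2, 0.1]]        # 3-word vocab, 2 features
loss = regularized_cross_entropy(w, inputs=[1.0, 0.5], target=1, alpha=0.1)
```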

  7. Bayesian Analysis of Individual Level Personality Dynamics

    Science.gov (United States)

    Cripps, Edward; Wood, Robert E.; Beckmann, Nadin; Lau, John; Beckmann, Jens F.; Cripps, Sally Ann

    2016-01-01

    A Bayesian technique with analyses of within-person processes at the level of the individual is presented. The approach is used to examine whether the patterns of within-person responses on a 12-trial simulation task are consistent with the predictions of ITA theory (Dweck, 1999). ITA theory states that the performance of an individual with an entity theory of ability is more likely to spiral down following a failure experience than the performance of an individual with an incremental theory of ability. This is because entity theorists interpret failure experiences as evidence of a lack of ability which they believe is largely innate and therefore relatively fixed; whilst incremental theorists believe in the malleability of abilities and interpret failure experiences as evidence of more controllable factors such as poor strategy or lack of effort. The results of our analyses support ITA theory at both the within- and between-person levels of analyses and demonstrate the benefits of Bayesian techniques for the analysis of within-person processes. These include more formal specification of the theory and the ability to draw inferences about each individual, which allows for more nuanced interpretations of individuals within a personality category, such as differences in the individual probabilities of spiraling. While Bayesian techniques have many potential advantages for the analyses of processes at the level of the individual, ease of use is not one of them for psychologists trained in traditional frequentist statistical techniques. PMID:27486415

  8. Bayesian Inference of a Multivariate Regression Model

    Directory of Open Access Journals (Sweden)

    Marick S. Sinay

    2014-01-01

    Full Text Available We explore Bayesian inference of a multivariate linear regression model with use of a flexible prior for the covariance structure. The commonly adopted Bayesian setup involves the conjugate prior, multivariate normal distribution for the regression coefficients and inverse Wishart specification for the covariance matrix. Here we depart from this approach and propose a novel Bayesian estimator for the covariance. A multivariate normal prior for the unique elements of the matrix logarithm of the covariance matrix is considered. Such structure allows for a richer class of prior distributions for the covariance, with respect to strength of beliefs in prior location hyperparameters, as well as the added ability to model potential correlation amongst the covariance structure. The posterior moments of all relevant parameters of interest are calculated based upon numerical results via a Markov chain Monte Carlo procedure. The Metropolis-Hastings-within-Gibbs algorithm is invoked to account for the construction of a proposal density that closely matches the shape of the target posterior distribution. As an application of the proposed technique, we investigate a multiple regression based upon the 1980 High School and Beyond Survey.

  9. Bayesian Methods for Radiation Detection and Dosimetry

    International Nuclear Information System (INIS)

    We performed work in three areas: radiation detection, external and internal radiation dosimetry. In radiation detection we developed Bayesian techniques to estimate the net activity of high and low activity radioactive samples. These techniques have the advantage that the remaining uncertainty about the net activity is described by probability densities. Graphs of the densities show the uncertainty in pictorial form. Figure 1 below demonstrates this point. We applied stochastic processes for a method to obtain Bayesian estimates of 222Rn-daughter products from observed counting rates. In external radiation dosimetry we studied and developed Bayesian methods to estimate radiation doses to an individual with radiation induced chromosome aberrations. We analyzed chromosome aberrations after exposure to gammas and neutrons and developed a method for dose-estimation after criticality accidents. The research in internal radiation dosimetry focused on parameter estimation for compartmental models from observed compartmental activities. From the estimated probability densities of the model parameters we were able to derive the densities for compartmental activities for a two compartment catenary model at different times. We also calculated the average activities and their standard deviation for a simple two compartment model
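The idea of describing the remaining uncertainty about a net activity by a probability density can be sketched with a simple grid-based posterior for a net count rate, given a gross measurement and a separate background measurement, both Poisson. The count data, measurement time, and flat priors below are assumptions for illustration, not the authors' setup.

```python
# Illustrative sketch: posterior density of a net rate s, given
# y_gross ~ Poisson((s + b) * T) and y_bkg ~ Poisson(b * T),
# with flat priors on s and b over a grid. All data are invented.
import math

def log_poisson(k, lam):
    return k * math.log(lam) - lam - math.lgamma(k + 1)

y_gross, y_bkg, T = 30, 12, 10.0             # assumed counts and live time
s_grid = [0.05 * i for i in range(1, 101)]   # candidate net rates
b_grid = [0.05 * i for i in range(1, 101)]   # candidate background rates

# Unnormalised posterior for s, marginalising the background rate b.
post = []
for s in s_grid:
    total = 0.0
    for b in b_grid:
        total += math.exp(log_poisson(y_gross, (s + b) * T)
                          + log_poisson(y_bkg, b * T))
    post.append(total)
norm = sum(post)
post = [p / norm for p in post]              # discrete posterior density
s_mean = sum(s * p for s, p in zip(s_grid, post))
```

The list `post` plays the role of the densities the authors plot: its spread, not just the point estimate `s_mean`, conveys the remaining uncertainty.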

  10. Bayesian and Dempster–Shafer fusion

    Indian Academy of Sciences (India)

    Subhash Challa; Don Koks

    2004-04-01

    The Kalman Filter is traditionally viewed as a prediction–correction filtering algorithm. In this work we show that it can be viewed as a Bayesian fusion algorithm and derive it using Bayesian arguments. We begin with an outline of Bayes theory, using it to discuss well-known quantities such as priors, likelihood and posteriors, and we provide the basic Bayesian fusion equation. We derive the Kalman Filter from this equation using a novel method to evaluate the Chapman–Kolmogorov prediction integral. We then use the theory to fuse data from multiple sensors. Vying with this approach is the Dempster–Shafer theory, which deals with measures of “belief”, and is based on the nonclassical idea of “mass” as opposed to probability. Although these two measures look very similar, there are some differences. We point them out through outlining the ideas of the Dempster–Shafer theory and presenting the basic Dempster–Shafer fusion equation. Finally we compare the two methods, and discuss the relative merits and demerits using an illustrative example.
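The Bayesian reading of the Kalman filter described above reduces, in one dimension, to fusing a Gaussian prediction with a Gaussian measurement by precision weighting. A minimal sketch, assuming a scalar random-walk process model and illustrative noise variances:

```python
# Minimal sketch: a scalar Kalman filter as Bayesian fusion of a Gaussian
# prior (the prediction) with a Gaussian measurement likelihood.
# The random-walk model and the variances are illustrative assumptions.

def kalman_update(mean, var, z, process_var, meas_var):
    # Prediction step: the Chapman-Kolmogorov integral for a random walk
    # simply inflates the prior variance by the process noise.
    pred_mean, pred_var = mean, var + process_var
    # Correction step: Bayes' rule for two Gaussians gives a
    # precision-weighted average of prediction and measurement.
    k = pred_var / (pred_var + meas_var)          # Kalman gain
    post_mean = pred_mean + k * (z - pred_mean)
    post_var = (1.0 - k) * pred_var
    return post_mean, post_var

mean, var = 0.0, 1.0                              # diffuse initial belief
for z in [1.2, 0.9, 1.1, 1.0]:                    # made-up sensor readings
    mean, var = kalman_update(mean, var, z, process_var=0.01, meas_var=0.25)
```

Each pass through the loop is one application of the Bayesian fusion equation; the posterior variance shrinks as measurements accumulate.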

  11. Bayesian Analysis of Individual Level Personality Dynamics.

    Science.gov (United States)

    Cripps, Edward; Wood, Robert E; Beckmann, Nadin; Lau, John; Beckmann, Jens F; Cripps, Sally Ann

    2016-01-01

    A Bayesian technique with analyses of within-person processes at the level of the individual is presented. The approach is used to examine whether the patterns of within-person responses on a 12-trial simulation task are consistent with the predictions of ITA theory (Dweck, 1999). ITA theory states that the performance of an individual with an entity theory of ability is more likely to spiral down following a failure experience than the performance of an individual with an incremental theory of ability. This is because entity theorists interpret failure experiences as evidence of a lack of ability which they believe is largely innate and therefore relatively fixed; whilst incremental theorists believe in the malleability of abilities and interpret failure experiences as evidence of more controllable factors such as poor strategy or lack of effort. The results of our analyses support ITA theory at both the within- and between-person levels of analyses and demonstrate the benefits of Bayesian techniques for the analysis of within-person processes. These include more formal specification of the theory and the ability to draw inferences about each individual, which allows for more nuanced interpretations of individuals within a personality category, such as differences in the individual probabilities of spiraling. While Bayesian techniques have many potential advantages for the analyses of processes at the level of the individual, ease of use is not one of them for psychologists trained in traditional frequentist statistical techniques. PMID:27486415

  12. CERN apprentice receives award

    CERN Multimedia

    2008-01-01

    Another CERN apprentice has received an award for the quality of his work. Stéphane Küng (centre), at the UIG ceremony last November, presided over by Geneva State Councillor Pierre-François Unger, Head of the Department of Economics and Health. Electronics technician Stéphane Küng was honoured in November by the Social Foundation of the Union Industrielle Genevoise (UIG) as one of Geneva’s eight best apprentices in the field of mechatronics. The 20-year-old Genevan obtained his Federal apprentice’s certificate (Certificat fédéral de capacité - CFC) in June 2007, achieving excellent marks in his written tests at the Centre d’Enseignement Professionnel Technique et Artisanal (CEPTA). Like more than 200 youngsters before him, Stéphane Küng spent part of his four-year sandwich course working at CERN, where he followed many practical training courses and gained valuable hands-on experience in various technical groups and labs. "It’s always very gr...

  13. Posterior Consistency of the Bayesian Approach to Linear Ill-Posed Inverse Problems

    CERN Document Server

    Agapiou, Sergios; Stuart, Andrew M

    2012-01-01

    We consider a Bayesian nonparametric approach to a family of linear inverse problems in a separable Hilbert space setting, with Gaussian prior and noise distribution. A method of identifying the posterior distribution using its precision operator is presented. Working with the unbounded precision operator enables us to use partial differential equations (PDE) methodology to study posterior consistency in a frequentist sense, and in particular to obtain rates of contraction of the posterior distribution to a Dirac measure centered on the true solution. We show how these rates may be optimized by a choice of the scale parameter in the prior covariance operator. Our methods assume a relatively weak relation between the prior covariance operator, the forward operator and the noise covariance operator; more precisely, we assume that appropriate powers of these operators induce equivalent norms. We compare our results to known minimax rates of convergence in the case where the forward operator and the prior and noi...

  14. A HYBRID APPROACH FOR RELIABILITY ANALYSIS BASED ON ANALYTIC HIERARCHY PROCESS (AHP) AND BAYESIAN NETWORK (BN)

    OpenAIRE

    Muhammad Zubair

    2014-01-01

    The investigation of the nuclear accidents reveals that the accumulation of various technical and nontechnical lapses compounded the nuclear disaster. By using Analytic Hierarchy Process (AHP) and Bayesian Network (BN) the present research signifies the technical and nontechnical issues of nuclear accidents. The study exposed that besides technical fixes such as enhanced engineering safety features and better siting choices, the critical ingredient for safe operation of nuclear reactors lie i...

  15. Bayesian model for strategic level risk assessment in continuing airworthiness of air transport

    OpenAIRE

    Jayakody-Arachchige, Dhanapala

    2010-01-01

    Continuing airworthiness (CAW) of aircraft is an essential pre-requisite for the safe operation of air transport. Human errors that occur in CAW organizations and processes could undermine the airworthiness and constitute a risk to flight safety. This thesis reports on a generic Bayesian model that has been designed to assess and quantify this risk. The model removes the vagueness inherent in the subjective methods of assessment of risk and its qualitative expression. Instead, relying on a...

  16. MEG source localization of spatially extended generators of epileptic activity: comparing entropic and hierarchical bayesian approaches.

    Directory of Open Access Journals (Sweden)

    Rasheda Arman Chowdhury

    Full Text Available Localizing the generators of epileptic activity in the brain using Electro-EncephaloGraphy (EEG) or Magneto-EncephaloGraphy (MEG) signals is of particular interest during the pre-surgical investigation of epilepsy. Epileptic discharges are detectable from background brain activity, provided they are associated with spatially extended generators. Using realistic simulations of epileptic activity, this study evaluates the ability of distributed source localization methods to accurately estimate the location of the generators and their sensitivity to the spatial extent of such generators when using MEG data. Source localization methods based on two types of realistic models have been investigated: (i) brain activity may be modeled using cortical parcels and (ii) brain activity is assumed to be locally smooth within each parcel. A Data Driven Parcellization (DDP) method was used to segment the cortical surface into non-overlapping parcels, and diffusion-based spatial priors were used to model local spatial smoothness within parcels. These models were implemented within the Maximum Entropy on the Mean (MEM) and the Hierarchical Bayesian (HB) source localization frameworks. We proposed new methods in this context and compared them with other standard ones using Monte Carlo simulations of realistic MEG data involving sources of several spatial extents and depths. Detection accuracy of each method was quantified using Receiver Operating Characteristic (ROC) analysis and localization error metrics. Our results showed that methods implemented within the MEM framework were sensitive to all spatial extents of the sources, ranging from 3 cm² to 30 cm², whatever the number and size of the parcels defining the model. To reach a similar level of accuracy within the HB framework, a model using parcels larger than the size of the sources should be considered.
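Detection accuracy quantified by ROC analysis, as in this study, is often summarised by the area under the curve (AUC), which equals the probability that a randomly chosen true-source score exceeds a randomly chosen null score. A minimal rank-based sketch, with made-up scores:

```python
# Hedged sketch: AUC as the Mann-Whitney win rate between scores at true
# source locations (pos) and scores at null locations (neg). Ties count
# as half a win. The example scores are invented for illustration.
def auc(pos_scores, neg_scores):
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos_scores for n in neg_scores)
    return wins / (len(pos_scores) * len(neg_scores))

score = auc([0.9, 0.8, 0.4], [0.5, 0.3, 0.2])
```

An AUC of 1.0 means every true source outscores every null location; 0.5 is chance-level detection.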

  17. Bayesian Inference of Genetic Regulatory Networks from Time Series Microarray Data Using Dynamic Bayesian Networks

    Directory of Open Access Journals (Sweden)

    Yufei Huang

    2007-06-01

    Full Text Available Reverse engineering of genetic regulatory networks from time series microarray data is investigated. We propose a dynamic Bayesian network (DBN) model and a full Bayesian learning scheme. The proposed DBN directly models the continuous expression levels and is also associated with parameters that indicate the degree as well as the type of regulation. To learn the network from data, we propose a reversible jump Markov chain Monte Carlo (RJMCMC) algorithm. The RJMCMC algorithm can provide not only more accurate inference results than deterministic alternative algorithms but also an estimate of the a posteriori probabilities (APPs) of the network topology. The estimated APPs provide useful information on the confidence of the inferred results and can also be used for efficient Bayesian data integration. The proposed approach is tested on yeast cell cycle microarray data and the results are compared with the KEGG pathway map.

  18. Learning Local Components to Understand Large Bayesian Networks

    DEFF Research Database (Denmark)

    Zeng, Yifeng; Xiang, Yanping; Cordero, Jorge;

    2009-01-01

    Bayesian networks are known for providing an intuitive and compact representation of probabilistic information and allowing the creation of models over a large and complex domain. Bayesian learning and reasoning are nontrivial for a large Bayesian network. In parallel, it is a tough job for users... (domain experts) to extract accurate information from a large Bayesian network due to dimensional difficulty. We define a formulation of local components and propose a clustering algorithm to learn such local components given complete data. The algorithm groups together most inter-relevant attributes... in a domain. We evaluate its performance on three benchmark Bayesian networks and provide results in support. We further show that the learned components may represent local knowledge more precisely in comparison to the full Bayesian networks when working with a small amount of data...

  19. Bayesian networks as a tool for epidemiological systems analysis

    OpenAIRE

    Lewis, F.I.

    2012-01-01

    Bayesian network analysis is a form of probabilistic modeling which derives from empirical data a directed acyclic graph (DAG) describing the dependency structure between random variables. Bayesian networks are increasingly finding application in areas such as computational and systems biology, and more recently in epidemiological analyses. The key distinction between standard empirical modeling approaches, such as generalised linear modeling, and Bayesian network analyses is that the latter ...

  20. Small sample Bayesian analyses in assessment of weapon performance

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Abundant test data are required in the assessment of weapon performance. When weapon test data are insufficient, Bayesian analyses in small-sample circumstances should be considered and the test data should be provided by simulations. Several Bayesian approaches are discussed and some limitations are found. An improvement is put forward after the limitations of the available Bayesian approaches are analyzed, and the improved approach is applied to the assessment of some new weapon performance.

  1. BAYESIAN ESTIMATION OF RELIABILITY IN TWO-PARAMETER GEOMETRIC DISTRIBUTION

    Directory of Open Access Journals (Sweden)

    Sudhansu S. Maiti

    2015-12-01

    Full Text Available Bayesian estimation of the reliability of a component, R(t) = P(X ≥ t), when X follows a two-parameter geometric distribution, has been considered. The Maximum Likelihood Estimator (MLE), an unbiased estimator and a Bayesian estimator have been compared. Bayesian estimation of the component reliability R = P(X ≤ Y), arising under the stress-strength setup, when Y is assumed to follow an independent two-parameter geometric distribution, has also been discussed assuming independent priors for the parameters under different loss functions.
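The quantity R(t) = P(X ≥ t) admits a simple conjugate Bayesian treatment in the one-parameter geometric case, a simplification of the paper's two-parameter model. The data and flat Beta prior below are illustrative assumptions.

```python
# Hedged sketch: Bayesian estimation of R(t) = P(X >= t) for a
# one-parameter geometric model (a simplification of the two-parameter
# case in the record). X counts failures before the first success, so
# P(X >= t) = (1 - p)^t, and a Beta prior on p is conjugate.
import random

random.seed(0)
data = [2, 0, 1, 3, 1, 0, 2, 1]            # illustrative observations
a, b = 1.0, 1.0                             # flat Beta(1, 1) prior on p
a_post = a + len(data)                      # one success ends each trial
b_post = b + sum(data)                      # total failures observed

t = 2
# Posterior mean of R(t) by Monte Carlo over the Beta posterior on p.
draws = [random.betavariate(a_post, b_post) for _ in range(20000)]
r_hat = sum((1.0 - p) ** t for p in draws) / len(draws)
```

For this conjugate setup the Monte Carlo average approximates the exact posterior mean E[(1-p)^t], which is also available in closed form via Beta-function ratios.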

  2. Chain ladder method: Bayesian bootstrap versus classical bootstrap

    OpenAIRE

    Peters, Gareth W.; Mario V. Wüthrich; Shevchenko, Pavel V.

    2010-01-01

    The intention of this paper is to estimate a Bayesian distribution-free chain ladder (DFCL) model using approximate Bayesian computation (ABC) methodology. We demonstrate how to estimate quantities of interest in claims reserving and compare the estimates to those obtained from classical and credibility approaches. In this context, a novel numerical procedure utilising Markov chain Monte Carlo (MCMC), ABC and a Bayesian bootstrap procedure was developed in a truly distribution-free setting. T...

  3. A tutorial introduction to Bayesian models of cognitive development

    OpenAIRE

    Perfors, Amy; Tenenbaum, Joshua B.; Griffiths, Thomas L.; Xu, Fei

    2010-01-01

    We present an introduction to Bayesian inference as it is used in probabilistic models of cognitive development. Our goal is to provide an intuitive and accessible guide to the what, the how, and the why of the Bayesian approach: what sorts of problems and data the framework is most relevant for, and how and why it may be useful for developmentalists. We emphasize a qualitative understanding of Bayesian inference, but also include information about additional resources for those interested in...

  4. Bayesian just-so stories in psychology and neuroscience

    OpenAIRE

    Bowers, J.S.; Davis, Colin J

    2012-01-01

    According to Bayesian theories in psychology and neuroscience, minds and brains are (near) optimal in solving a wide range of tasks. We challenge this view and argue that more traditional, non-Bayesian approaches are more promising. We make three main arguments. First, we show that the empirical evidence for Bayesian theories in psychology is weak at best. This weakness relates to the many arbitrary ways that priors, likelihoods, and utility functions can be altered in order to account fo...

  5. Bayesian just-so stories in cognitive psychology and neuroscience.

    OpenAIRE

    Bowers, J.S.; Davis, Colin J

    2012-01-01

    According to Bayesian theories in psychology and neuroscience, minds and brains are (near) optimal in solving a wide range of tasks. We challenge this view and argue that more traditional, non-Bayesian approaches are more promising. We make three main arguments. First, we show that the empirical evidence for Bayesian theories in psychology is weak at best. This weakness relates to the many arbitrary ways that priors, likelihoods, and utility functions can be altered in order to account fo...

  6. The Bayesian Modelling Of Inflation Rate In Romania

    OpenAIRE

    Mihaela Simionescu

    2014-01-01

    Bayesian econometrics has seen a considerable increase in popularity in recent years, attracting the interest of various groups of researchers in the economic sciences as well as specialists in econometrics, commerce, industry, marketing, finance, microeconomics, macroeconomics and other domains. The purpose of this research is to provide an introduction to the Bayesian approach applied in economics, starting with Bayes' theorem. For Bayesian linear regression models the methodology of estim...

  7. Bayesian non- and semi-parametric methods and applications

    CERN Document Server

    Rossi, Peter

    2014-01-01

    This book reviews and develops Bayesian non-parametric and semi-parametric methods for applications in microeconometrics and quantitative marketing. Most econometric models used in microeconomics and marketing applications involve arbitrary distributional assumptions. As more data becomes available, a natural desire to provide methods that relax these assumptions arises. Peter Rossi advocates a Bayesian approach in which specific distributional assumptions are replaced with more flexible distributions based on mixtures of normals. The Bayesian approach can use either a large but fixed number

  8. Doing bayesian data analysis a tutorial with R and BUGS

    CERN Document Server

    Kruschke, John K

    2011-01-01

    There is an explosion of interest in Bayesian statistics, primarily because recently created computational methods have finally made Bayesian analysis accessible to a wide audience. Doing Bayesian Data Analysis, A Tutorial Introduction with R and BUGS provides an accessible approach to Bayesian data analysis, as material is explained clearly with concrete examples. The book begins with the basics, including essential concepts of probability and random sampling, and gradually progresses to advanced hierarchical modeling methods for realistic data. The text delivers comprehensive coverage of all

  9. Bayesian missing data problems EM, data augmentation and noniterative computation

    CERN Document Server

    Tan, Ming T; Ng, Kai Wang

    2009-01-01

    Bayesian Missing Data Problems: EM, Data Augmentation and Noniterative Computation presents solutions to missing data problems through explicit or noniterative sampling calculation of Bayesian posteriors. The methods are based on the inverse Bayes formulae discovered by one of the authors in 1995. Applying the Bayesian approach to important real-world problems, the authors focus on exact numerical solutions, a conditional sampling approach via data augmentation, and a noniterative sampling approach via EM-type algorithms. After introducing the missing data problems, Bayesian approach, and poste

  10. Bayesian integer frequency offset estimator for MIMO-OFDM systems

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Carrier frequency offset (CFO) in MIMO-OFDM systems can be decoupled into two parts: the fractional frequency offset (FFO) and the integer frequency offset (IFO). The problem of IFO estimation is addressed and a new IFO estimator based on the Bayesian philosophy is proposed. It is shown that the Bayesian IFO estimator is optimal among all IFO estimators. Furthermore, the Bayesian estimator can take advantage of oversampling, so that better performance can be obtained. Finally, numerical results confirm the optimality of the Bayesian estimator and validate the theoretical analysis.

  11. The bugs book a practical introduction to Bayesian analysis

    CERN Document Server

    Lunn, David; Best, Nicky; Thomas, Andrew; Spiegelhalter, David

    2012-01-01

    Introduction: Probability and Parameters; Probability; Probability distributions; Calculating properties of probability distributions; Monte Carlo integration; Monte Carlo Simulations Using BUGS; Introduction to BUGS; DoodleBUGS; Using BUGS to simulate from distributions; Transformations of random variables; Complex calculations using Monte Carlo; Multivariate Monte Carlo analysis; Predictions with unknown parameters; Introduction to Bayesian Inference; Bayesian learning; Posterior predictive distributions; Conjugate Bayesian inference; Inference about a discrete parameter; Combinations of conjugate analyses; Bayesian and classica

  12. A Bayesian Justification for Random Sampling in Sample Survey

    Directory of Open Access Journals (Sweden)

    Glen Meeden

    2012-07-01

    Full Text Available In the usual Bayesian approach to survey sampling, the sampling design plays a minimal role, at best. Although a close relationship between exchangeable prior distributions and simple random sampling has been noted, how to formally integrate simple random sampling into the Bayesian paradigm is not clear. Recently it has been argued that the sampling design can be thought of as part of a Bayesian's prior distribution. We show here that under this scenario simple random sampling can be given a Bayesian justification in survey sampling.

  13. Transient simulation of molten salt central receiver

    Science.gov (United States)

    Doupis, Dimitri; Wang, Chuan; Carcorze-Soto, Jorge; Chen, Yen-Ming; Maggi, Andrea; Losito, Matteo; Clark, Michael

    2016-05-01

    Alstom is developing concentrated solar power (CSP) utilizing 60/40 wt% NaNO3-KNO3 molten salt as the working fluid in a tower receiver for the global renewable energy market. In the CSP power generation cycle, receivers undergo a daily cyclic operation due to the transient nature of solar energy. Development of robust and efficient start-up and shut-down procedures is critical to avoiding component failures due to mechanical fatigue resulting from thermal transients, thus maintaining the performance and availability of the CSP plant. The Molten Salt Central Receiver (MSCR) is subject to thermal transients during normal daily operation, a cycle that includes warmup, filling, operation, draining, and shutdown. This paper describes a study to leverage dynamic simulation and finite element analysis (FEA) in development of start-up, shutdown, and transient operation concepts for the MSCR. The results of the FEA also verify the robustness of the MSCR design to the thermal transients anticipated during the operation of the plant.

  14. Bayesian Processor of Ensemble for Precipitation Forecasting: A Development Plan

    Science.gov (United States)

    Toth, Z.; Krzysztofowicz, R.

    2006-05-01

    The Bayesian Processor of Ensemble (BPE) is a new, theoretically-based technique for probabilistic forecasting of weather variates. It is a generalization of the Bayesian Processor of Output (BPO) developed by Krzysztofowicz and Maranzano for processing single values of multiple predictors into a posterior distribution function of a predictand. The BPE processes an ensemble of a predictand generated by multiple integrations of a numerical weather prediction (NWP) model, and optimally fuses the ensemble with climatic data in order to quantify uncertainty about the predictand. As is well known, Bayes theorem provides the optimal theoretical framework for fusing information from different sources and for obtaining the posterior distribution function of a predictand. Using a family of such distribution functions, a given raw ensemble can be mapped into a posterior ensemble, which is well calibrated, has maximum informativeness, and preserves the spatio-temporal and cross-variate dependence structure of the NWP output fields. The challenge is to develop and test the BPE suitable for operational forecasting. This talk will present the basic design components of the BPE, along with a discussion of the climatic and training data to be used in its potential application at the National Centers for Environmental Prediction (NCEP). The technique will be tested first on quasi-normally distributed variates and next on precipitation variates. For reasons of economy, the BPE will be applied on the relatively coarse resolution grid corresponding to the ensemble output, and then the posterior ensemble will be downscaled to finer grids such as that of the National Digital Forecast Database (NDFD).
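
The core fusion step the abstract describes, combining climatic information with model output via Bayes' theorem, reduces in the Gaussian case to a precision-weighted update. The following is an illustrative conjugate normal-normal sketch, not the actual BPE algorithm:

```python
def gaussian_fusion(mu_prior, var_prior, mu_ens, var_ens):
    """Precision-weighted fusion of a Gaussian climatic prior with a
    Gaussian ensemble-based likelihood; returns the posterior mean and
    variance of the predictand (conjugate normal-normal update)."""
    prec_prior = 1.0 / var_prior
    prec_ens = 1.0 / var_ens
    var_post = 1.0 / (prec_prior + prec_ens)
    mu_post = var_post * (prec_prior * mu_prior + prec_ens * mu_ens)
    return mu_post, var_post

# Example: a climatological prior N(10, 25) fused with an ensemble
# summary of mean 14 and variance 5 (hypothetical values).
mu, var = gaussian_fusion(10.0, 25.0, 14.0, 5.0)
```

The posterior mean lands between climatology and the ensemble, weighted toward the sharper (lower-variance) source, and the posterior variance is always smaller than either input variance; this is the sense in which Bayes' theorem optimally fuses the two information sources.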

  15. Python Environment for Bayesian Learning: Inferring the Structure of Bayesian Networks from Knowledge and Data.

    Science.gov (United States)

    Shah, Abhik; Woolf, Peter

    2009-06-01

    In this paper, we introduce pebl, a Python library and application for learning Bayesian network structure from data and prior knowledge that provides features unmatched by alternative software packages: the ability to use interventional data, flexible specification of structural priors, modeling with hidden variables and exploitation of parallel processing. PMID:20161541

  16. GPS Receiver Performance Inspection by Wavelet Transform

    Institute of Scientific and Technical Information of China (English)

    Xia Lin-yuan; Liu Jing-nan; Lu Liang-xi

    2003-01-01

    As a powerful analysis tool and a product of contemporary mathematical development, the wavelet transform has shown promising application potential through the research in this paper. Three aspects of GPS receiver performance are tackled: cycle slip detection, receiver noise analysis and receiver channel bias inspection. Wavelet decomposition of double-differenced observations has demonstrated that this multi-level transform can reveal cycle slips as small as 0.5 cycles without any pre-adjustment processes or satellite orbit information; it can therefore be regarded as a 'geometry-free' method. Based on an assumed property of receiver noise, the noise series is obtained at the high-frequency scale of the wavelet decomposition layers. This kind of noise influence on the GPS baseline result can be effectively eliminated during wavelet reconstruction. Through analysis of observed data, the transform has also detected a kind of receiver channel bias that had not been completely removed by the GPS receiver's processing unit during the clock offset resetting operation. The wavelet approach can thus be employed as a kind of system diagnosis in a generalized sense.
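
The cycle-slip idea can be illustrated with a first-level Haar decomposition on synthetic data. This is a toy sketch under assumed values (a flat double-differenced series with a 0.5-cycle jump), not the paper's processing chain:

```python
import math

def haar_details(x):
    """First-level Haar wavelet detail coefficients of an even-length series."""
    return [(x[2 * i] - x[2 * i + 1]) / math.sqrt(2.0)
            for i in range(len(x) // 2)]

# Synthetic double-differenced phase: flat, with a 0.5-cycle slip at index 7.
phase = [0.0] * 7 + [0.5] * 9

details = haar_details(phase)
# Only the sample pair straddling the jump yields a non-zero coefficient,
# with magnitude 0.5 / sqrt(2) ~ 0.354, well above any noise threshold.
slip_index = max(range(len(details)), key=lambda i: abs(details[i]))
```

Because the detail coefficients respond only to local differences, the jump is localized without any satellite orbit information, which is the essence of the 'geometry-free' property described above.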

  17. Bayesian inference tools for inverse problems

    Science.gov (United States)

    Mohammad-Djafari, Ali

    2013-08-01

    In this paper, the basics of Bayesian inference with a parametric model of the data are first presented. Then, the extensions needed when dealing with inverse problems are given, in particular for linear models such as deconvolution or image reconstruction in Computed Tomography (CT). The main point discussed is then the prior modeling of signals and images. A classification of these priors is presented, first into separable and Markovian models, and then into simple or hierarchical models with hidden variables. For practical applications, we also need to consider the estimation of the hyperparameters. Finally, we see that we have to infer simultaneously the unknowns, the hidden variables and the hyperparameters. Very often, the expression of this joint posterior law is too complex to be handled directly. Indeed, we can rarely obtain analytical solutions for point estimators such as the Maximum A Posteriori (MAP) or the Posterior Mean (PM). Three main tools can then be used: the Laplace approximation (LAP), Markov Chain Monte Carlo (MCMC) and Bayesian Variational Approximations (BVA). To illustrate all these aspects, we consider a deconvolution problem where we know that the input signal is sparse and propose a Student-t prior for it. Then, to handle the Bayesian computations with this model, we use the property that the Student-t distribution can be modelled as an infinite mixture of Gaussians, introducing hidden variables which are the variances. The expression of the joint posterior of the input signal samples, the hidden variables (here the inverse variances of those samples) and the hyperparameters of the problem (for example, the variance of the noise) is then given. From this point, we present the joint maximization by alternate optimization and the three possible approximation methods. Finally, the proposed methodology is applied in different applications such as mass spectrometry, spectrum estimation of quasi-periodic biological signals and
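
The Student-t scale-mixture property the abstract relies on can be checked numerically: a Gamma mixture of zero-mean Gaussians over the precision reproduces the Student-t density. A minimal sketch, assuming nu = 4 and the standard Gamma(nu/2, nu/2) mixing distribution:

```python
import math

def student_t_pdf(x, nu):
    """Standard Student-t density with nu degrees of freedom."""
    c = math.gamma((nu + 1) / 2) / (math.sqrt(nu * math.pi) * math.gamma(nu / 2))
    return c * (1 + x * x / nu) ** (-(nu + 1) / 2)

def gaussian_scale_mixture(x, nu, tau_max=40.0, n=40000):
    """Trapezoidal-rule integral of N(x | 0, 1/tau) * Gamma(tau | nu/2, nu/2)
    over the precision tau; should approximate the Student-t density."""
    a = b = nu / 2
    def integrand(tau):
        if tau <= 0:
            return 0.0
        normal = math.sqrt(tau / (2 * math.pi)) * math.exp(-tau * x * x / 2)
        gamma_pdf = (b ** a / math.gamma(a)) * tau ** (a - 1) * math.exp(-b * tau)
        return normal * gamma_pdf
    h = tau_max / n
    total = 0.5 * (integrand(0.0) + integrand(tau_max))
    total += sum(integrand(i * h) for i in range(1, n))
    return total * h
```

Treating the per-sample precisions as hidden variables in this way is exactly what makes the conditional updates Gaussian, which in turn enables the MCMC and variational schemes the abstract lists.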

  18. Reflux solar receiver design considerations

    Science.gov (United States)

    Diver, R. B.

    Reflux heat-pipe and pool-boiler receivers are being developed to improve upon the performance and life of directly-illuminated tube receiver technology used in previous successful demonstrations of dish-Stirling systems. The design of a reflux receiver involves engineering tradeoffs. In this paper, on-sun performance measurements of the Sandia pool-boiler receiver are compared with results from the reflux receiver thermal analysis model, AEETES. Flux and performance implications of various design options are analyzed and discussed.

  19. Meteorological Data Assimilation by Adaptive Bayesian Optimization.

    Science.gov (United States)

    Purser, Robert James

    1992-01-01

    The principal aim of this research is the elucidation of the Bayesian statistical principles that underlie the theory of objective meteorological analysis. In particular, emphasis is given to aspects of data assimilation that can benefit from an iterative numerical strategy. Two such aspects that are given special consideration are statistical validation of the covariance profiles and nonlinear initialization. A new, economical algorithm is presented, based on the imposition of a sparse matrix structure for all covariances and precisions held during the computations. It is shown that very large datasets may be accommodated using this structure and a good linear approximation to the analysis equations established without the need to unnaturally fragment the problem. Since the integrity of the system of analysis equations is preserved, it is a relatively straightforward matter to extend the basic analysis algorithm to one that incorporates a check on the plausibility of the statistical model assumed for background errors, the so-called "validation" problem. Two methods of validation are described within the sparse matrix framework: the first is essentially a direct extension of the Bayesian principles to embrace not only the regular analysis variables but also the parameters that determine the precise form of the covariance functions; the second is the non-Bayesian method of generalized cross-validation adapted for use within the sparse matrix framework. The latter part of this study is concerned with the establishment of a consistent dynamical balance within a forecast model, the initialization problem. The formal principles of the modern theory of initialization are reviewed and a critical examination is made of the concept of the "slow manifold". It is demonstrated, in accordance with more complete nonlinear models, that even within a simple three-mode linearized system, the notion that a universal slow manifold exists is untenable.
It is therefore argued

  20. Personalized Audio Systems - a Bayesian Approach

    DEFF Research Database (Denmark)

    Nielsen, Jens Brehm; Jensen, Bjørn Sand; Hansen, Toke Jansen;

    2013-01-01

    Modern audio systems are typically equipped with several user-adjustable parameters unfamiliar to most users listening to the system. To obtain the best possible setting, the user is forced into multi-parameter optimization with respect to the user's own objective and preference. To address this, the present paper presents a general interactive framework for the personalization of such audio systems. The framework builds on Bayesian Gaussian process regression, in which a model of the user's objective function is updated sequentially. The parameter setting to be evaluated in a given trial is...