Sample records for bayesian multi-tissue approach

  1. New Insights into the Genetic Control of Gene Expression using a Bayesian Multi-tissue Approach

    Petretto, E.; Bottolo, L.; Langley, S. R.; Heinig, M.; McDermott-Roe, Ch.; Sarwar, R.; Pravenec, Michal; Hübner, N.; Aitman, T. J.; Cook, S.A.; Richardson, S.


    Vol. 6, No. 4 (2010), e1000737. ISSN 1553-734X. R&D Projects: GA ČR(CZ) GA301/08/0166; GA MŠk(CZ) 1M0520; GA ČR GAP301/10/0290. Other grants: EC(XE) LSHG-CT-2005-019015; Fondation Leducq(FR) 06 CVD 03. Institutional research plan: CEZ:AV0Z50110509. Keywords: expression profiles * Bayesian multi-tissue approach * genetical genomics. Subject RIV: EB - Genetics; Molecular Biology. Impact factor: 5.515, year: 2010

  2. Bayesian approach to rough set

    Marwala, Tshilidzi


    This paper proposes an approach to training rough set models in a Bayesian framework using the Markov Chain Monte Carlo (MCMC) method. The prior probabilities are constructed from the prior knowledge that good rough set models have fewer rules. MCMC sampling is conducted in the rough set granule space, with the Metropolis algorithm used as the acceptance criterion. The proposed method is tested on estimating the risk of HIV given demographic data. The results obtained show that the proposed approach achieves an average accuracy of 58%, with the accuracy varying up to 66%. In addition, the Bayesian rough set gives the probabilities of the estimated HIV status as well as the linguistic rules describing how the demographic parameters drive the risk of HIV.
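    The abstract above describes Metropolis sampling over a discrete rough-set granule space with a prior favouring fewer rules. A minimal sketch of that acceptance step, using a hypothetical one-dimensional "number of rules" state and a made-up energy function (not the paper's actual model), could look like:

```python
import math
import random

def metropolis_step(state, energy, propose, rng):
    """One Metropolis update: propose a new state and accept it with
    probability min(1, exp(-(E_new - E_old)))."""
    candidate = propose(state, rng)
    delta = energy(candidate) - energy(state)
    if delta <= 0 or rng.random() < math.exp(-delta):
        return candidate  # accepted
    return state          # rejected: keep the current state

# Toy prior: fewer rules => lower energy => higher prior probability.
def energy(n_rules):
    return 0.5 * n_rules

def propose(n_rules, rng):
    return max(1, n_rules + rng.choice([-1, 1]))

rng = random.Random(0)
state = 10
samples = []
for _ in range(5000):
    state = metropolis_step(state, energy, propose, rng)
    samples.append(state)
```

    With this energy, the chain drifts toward small rule counts, mimicking how the paper's prior penalizes rule-heavy models.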

  3. A Bayesian approach to model uncertainty

    A Bayesian approach to model uncertainty is taken. For the case of a finite number of alternative models, model uncertainty is equivalent to parameter uncertainty. A derivation based on Savage's partition problem is given.
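    For a finite set of alternative models, the stated equivalence to parameter uncertainty can be illustrated by treating the model index as a discrete parameter, the standard Bayesian model averaging setup (the priors, likelihoods and predictions below are invented for illustration, not taken from the paper):

```python
def model_average(prior_probs, likelihoods, predictions):
    """With finitely many models, the model index acts like a discrete
    parameter: compute posterior model probabilities and blend each
    model's prediction by its posterior weight."""
    unnorm = [p * l for p, l in zip(prior_probs, likelihoods)]
    total = sum(unnorm)
    post = [u / total for u in unnorm]
    blended = sum(w * pred for w, pred in zip(post, predictions))
    return post, blended

# Two candidate models with equal priors but different data fit.
post, pred = model_average([0.5, 0.5], [0.2, 0.6], [1.0, 3.0])
```

    The better-fitting model receives posterior weight 0.75 and dominates the blended prediction, exactly as a better-supported parameter value would.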

  4. Bayesian Approach to Handling Informative Sampling

    Sikov, Anna


    In the case of informative sampling, the sampling scheme explicitly or implicitly depends on the response variable. As a result, the sample distribution of the response variable cannot be used for making inference about the population. In this research I investigate the problem of informative sampling from the Bayesian perspective. Application of the Bayesian approach permits solving the problems that arise due to the complexity of the models used for handling informative sampling. The main...




    Management is nowadays a basic vector of economic development, a concept frequently used in our country as well as all over the world. Regardless of the hierarchical level at which the managerial process is manifested, decision-making represents its essential moment, the supreme act of managerial activity. Decisions can be met in all fields of activity, having a practically unlimited degree of coverage, and in all the functions of management. It is common knowledge that the activity of any type of manager, no matter the hierarchical level he occupies, represents a chain of interdependent decisions, their aim being the elimination or limitation of the influence of disturbing factors that may endanger the achievement of predetermined objectives; the quality of managerial decisions conditions the progress and viability of any enterprise. Therefore, one of the principal characteristics of a successful manager is the ability to adopt optimal, high-quality decisions. The quality of managerial decisions is conditioned by the manager's general level of education and specialization, the degree to which managers keep up with the latest information and innovations in the theory and practice of management, and the application of modern managerial methods and techniques in the activity of management. We present below an analysis of decision problems under hazardous conditions in terms of Bayesian theory, a theory that uses probabilistic calculus.
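    The Bayesian treatment of decision problems under hazardous conditions that the abstract refers to amounts to maximizing expected utility over the decision-maker's probabilities for the states of nature. A toy sketch, with invented actions, payoffs and probabilities:

```python
def expected_utility(action_payoffs, probs):
    """Bayesian decision rule: score an action by its probability-weighted
    payoff over the possible states of nature."""
    return sum(p * u for p, u in zip(probs, action_payoffs))

# Illustrative managerial decision: launch vs. delay a product under
# uncertain demand (states: high / low), with P(high) = 0.6.
probs = [0.6, 0.4]
payoffs = {"launch": [100, -40], "delay": [30, 10]}
best = max(payoffs, key=lambda a: expected_utility(payoffs[a], probs))
```

    Here "launch" wins (expected payoff 44 versus 22), even though it carries the worse downside, because the probabilities tilt toward high demand.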

  6. Comparison of the Bayesian and Frequentist Approach to the Statistics

    Hakala, Michal


    The thesis deals with an introduction to Bayesian statistics and compares the Bayesian approach with the frequentist approach to statistics. Bayesian statistics is a modern branch of statistics which provides an alternative comprehensive theory to the frequentist approach. Bayesian concepts provide solutions for problems not solvable by frequentist theory. The thesis compares definitions, concepts and the quality of statistical inference. The main interest is focused on point estimation, an in...
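    A textbook instance of the comparison the thesis draws is point estimation of a binomial proportion, where the frequentist MLE and the Bayesian posterior mean under a conjugate Beta prior differ. A minimal sketch (the data and prior are illustrative):

```python
def beta_binomial_estimates(successes, trials, a=1.0, b=1.0):
    """Compare the frequentist MLE with the Bayesian posterior mean for a
    binomial proportion under a Beta(a, b) prior (conjugate update:
    posterior is Beta(successes + a, trials - successes + b))."""
    mle = successes / trials
    post_mean = (successes + a) / (trials + a + b)
    return mle, post_mean

# 9 successes in 10 trials, with a uniform Beta(1, 1) prior.
mle, post = beta_binomial_estimates(9, 10)
```

    The posterior mean (10/12 ≈ 0.833) is shrunk toward the prior mean 0.5 relative to the MLE of 0.9, a concrete example of the two schools giving different point estimates from the same data.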

  7. Integer variables estimation problems: the Bayesian approach

    G. Venuti


    In geodesy as well as in geophysics there are a number of examples where the unknown parameters are partly constrained to be integer numbers, while other parameters have a continuous range of possible values. In all such situations the ordinary least squares principle, with integer variates fixed to the most probable integer value, can lead to paradoxical results, due to the strong non-linearity of the manifold of admissible values. On the contrary, an overall estimation procedure assigning the posterior distribution to all variables, discrete and continuous, conditional on the observed quantities (the so-called Bayesian approach), has the advantage of correctly weighting the possible errors in choosing different sets of integer values, thus providing a more realistic and stable estimate even of the continuous parameters. In this paper, after a brief review of the basics of Bayesian theory in section 2, we present the natural Bayesian solution to the problem of assessing the estimable signal from noisy observations in section 3 and the Bayesian solution to cycle slip detection and repair for a stream of GPS measurements in section 4. An elementary synthetic example is discussed in section 3 to illustrate the theory presented, and more elaborate, though synthetic, examples are discussed in section 4, where realistic streams of GPS observations, with cycle slips, are simulated and then back-processed.
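    The contrast drawn above, between fixing integer variates to their most probable value and weighting all integer candidates by posterior probability, can be sketched in one dimension. The observation model below is a toy stand-in (a Gaussian likelihood over integer candidates), not the paper's GPS formulation:

```python
import math

def posterior_over_integers(y, sigma, candidates):
    """Weight each integer candidate N by the Gaussian likelihood of the
    observation y, then normalize to a posterior over the candidates."""
    weights = [math.exp(-0.5 * ((y - n) / sigma) ** 2) for n in candidates]
    total = sum(weights)
    return [w / total for w in weights]

y, sigma = 3.4, 0.5
candidates = list(range(0, 8))
post = posterior_over_integers(y, sigma, candidates)

# Bayesian estimate of the continuous residual, marginalized over N,
# versus the "fix N to the most probable integer" shortcut.
bayes_residual = sum(p * (y - n) for p, n in zip(post, candidates))
fixed_n = candidates[post.index(max(post))]
fixed_residual = y - fixed_n
```

    When the observation sits between two integers, the marginalized estimate balances the competing candidates instead of committing to one, which is the stabilizing effect the abstract attributes to the Bayesian approach.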

  8. A Bayesian Concept Learning Approach to Crowdsourcing

    Viappiani, Paolo Renato; Zilles, Sandra; Hamilton, Howard J.


    We develop a Bayesian approach to concept learning for crowdsourcing applications. A probabilistic belief over possible concept definitions is maintained and updated according to (noisy) observations from experts, whose behaviors are modeled using discrete types. We propose recommendation ... techniques, inference methods, and query selection strategies to assist a user charged with choosing a configuration that satisfies some (partially known) concept. Our model is able to simultaneously learn the concept definition and the types of the experts. We evaluate our model with simulations, showing...

  9. Particle identification in ALICE: a Bayesian approach

    Adam, Jaroslav; Aggarwal, Madan Mohan; Aglieri Rinella, Gianluca; Agnello, Michelangelo; Agrawal, Neelima; Ahammed, Zubayer; Ahmad, Shakeel; Ahn, Sang Un; Aiola, Salvatore; Akindinov, Alexander; Alam, Sk Noor; Silva De Albuquerque, Danilo; Aleksandrov, Dmitry; Alessandro, Bruno; Alexandre, Didier; Alfaro Molina, Jose Ruben; Alici, Andrea; Alkin, Anton; Millan Almaraz, Jesus Roberto; Alme, Johan; Alt, Torsten; Altinpinar, Sedat; Altsybeev, Igor; Alves Garcia Prado, Caio; Andrei, Cristian; Andronic, Anton; Anguelov, Venelin; Anticic, Tome; Antinori, Federico; Antonioli, Pietro; Aphecetche, Laurent Bernard; Appelshaeuser, Harald; Arcelli, Silvia; Arnaldi, Roberta; Arnold, Oliver Werner; Arsene, Ionut Cristian; Arslandok, Mesut; Audurier, Benjamin; Augustinus, Andre; Averbeck, Ralf Peter; Azmi, Mohd Danish; Badala, Angela; Baek, Yong Wook; Bagnasco, Stefano; Bailhache, Raphaelle Marie; Bala, Renu; Balasubramanian, Supraja; Baldisseri, Alberto; Baral, Rama Chandra; Barbano, Anastasia Maria; Barbera, Roberto; Barile, Francesco; Barnafoldi, Gergely Gabor; Barnby, Lee Stuart; Ramillien Barret, Valerie; Bartalini, Paolo; Barth, Klaus; Bartke, Jerzy Gustaw; Bartsch, Esther; Basile, Maurizio; Bastid, Nicole; Basu, Sumit; Bathen, Bastian; Batigne, Guillaume; Batista Camejo, Arianna; Batyunya, Boris; Batzing, Paul Christoph; Bearden, Ian Gardner; Beck, Hans; Bedda, Cristina; Behera, Nirbhay Kumar; Belikov, Iouri; Bellini, Francesca; Bello Martinez, Hector; Bellwied, Rene; Belmont Iii, Ronald John; Belmont Moreno, Ernesto; Belyaev, Vladimir; Benacek, Pavel; Bencedi, Gyula; Beole, Stefania; Berceanu, Ionela; Bercuci, Alexandru; Berdnikov, Yaroslav; Berenyi, Daniel; Bertens, Redmer Alexander; Berzano, Dario; Betev, Latchezar; Bhasin, Anju; Bhat, Inayat Rasool; Bhati, Ashok Kumar; Bhattacharjee, Buddhadeb; Bhom, Jihyun; Bianchi, Livio; Bianchi, Nicola; Bianchin, Chiara; Bielcik, Jaroslav; Bielcikova, Jana; Bilandzic, Ante; Biro, Gabor; Biswas, Rathijit; Biswas, Saikat; 
Bjelogrlic, Sandro; Blair, Justin Thomas; Blau, Dmitry; Blume, Christoph; Bock, Friederike; Bogdanov, Alexey; Boggild, Hans; Boldizsar, Laszlo; Bombara, Marek; Book, Julian Heinz; Borel, Herve; Borissov, Alexander; Borri, Marcello; Bossu, Francesco; Botta, Elena; Bourjau, Christian; Braun-Munzinger, Peter; Bregant, Marco; Breitner, Timo Gunther; Broker, Theo Alexander; Browning, Tyler Allen; Broz, Michal; Brucken, Erik Jens; Bruna, Elena; Bruno, Giuseppe Eugenio; Budnikov, Dmitry; Buesching, Henner; Bufalino, Stefania; Buncic, Predrag; Busch, Oliver; Buthelezi, Edith Zinhle; Bashir Butt, Jamila; Buxton, Jesse Thomas; Cabala, Jan; Caffarri, Davide; Cai, Xu; Caines, Helen Louise; Calero Diaz, Liliet; Caliva, Alberto; Calvo Villar, Ernesto; Camerini, Paolo; Carena, Francesco; Carena, Wisla; Carnesecchi, Francesca; Castillo Castellanos, Javier Ernesto; Castro, Andrew John; Casula, Ester Anna Rita; Ceballos Sanchez, Cesar; Cepila, Jan; Cerello, Piergiorgio; Cerkala, Jakub; Chang, Beomsu; Chapeland, Sylvain; Chartier, Marielle; Charvet, Jean-Luc Fernand; Chattopadhyay, Subhasis; Chattopadhyay, Sukalyan; Chauvin, Alex; Chelnokov, Volodymyr; Cherney, Michael Gerard; Cheshkov, Cvetan Valeriev; Cheynis, Brigitte; Chibante Barroso, Vasco Miguel; Dobrigkeit Chinellato, David; Cho, Soyeon; Chochula, Peter; Choi, Kyungeon; Chojnacki, Marek; Choudhury, Subikash; Christakoglou, Panagiotis; Christensen, Christian Holm; Christiansen, Peter; Chujo, Tatsuya; Chung, Suh-Urk; Cicalo, Corrado; Cifarelli, Luisa; Cindolo, Federico; Cleymans, Jean Willy Andre; Colamaria, Fabio Filippo; Colella, Domenico; Collu, Alberto; Colocci, Manuel; Conesa Balbastre, Gustavo; Conesa Del Valle, Zaida; Connors, Megan Elizabeth; Contreras Nuno, Jesus Guillermo; Cormier, Thomas Michael; Corrales Morales, Yasser; Cortes Maldonado, Ismael; Cortese, Pietro; Cosentino, Mauro Rogerio; Costa, Filippo; Crochet, Philippe; Cruz Albino, Rigoberto; Cuautle Flores, Eleazar; Cunqueiro Mendez, Leticia; Dahms, Torsten; 
Dainese, Andrea; Danisch, Meike Charlotte; Danu, Andrea; Das, Debasish; Das, Indranil; Das, Supriya; Dash, Ajay Kumar; Dash, Sadhana; De, Sudipan; De Caro, Annalisa; De Cataldo, Giacinto; De Conti, Camila; De Cuveland, Jan; De Falco, Alessandro; De Gruttola, Daniele; De Marco, Nora; De Pasquale, Salvatore; Deisting, Alexander; Deloff, Andrzej; Denes, Ervin Sandor; Deplano, Caterina; Dhankher, Preeti; Di Bari, Domenico; Di Mauro, Antonio; Di Nezza, Pasquale; Diaz Corchero, Miguel Angel; Dietel, Thomas; Dillenseger, Pascal; Divia, Roberto; Djuvsland, Oeystein; Dobrin, Alexandru Florin; Domenicis Gimenez, Diogenes; Donigus, Benjamin; Dordic, Olja; Drozhzhova, Tatiana; Dubey, Anand Kumar; Dubla, Andrea; Ducroux, Laurent; Dupieux, Pascal; Ehlers Iii, Raymond James; Elia, Domenico; Endress, Eric; Engel, Heiko; Epple, Eliane; Erazmus, Barbara Ewa; Erdemir, Irem; Erhardt, Filip; Espagnon, Bruno; Estienne, Magali Danielle; Esumi, Shinichi; Eum, Jongsik; Evans, David; Evdokimov, Sergey; Eyyubova, Gyulnara; Fabbietti, Laura; Fabris, Daniela; Faivre, Julien; Fantoni, Alessandra; Fasel, Markus; Feldkamp, Linus; Feliciello, Alessandro; Feofilov, Grigorii; Ferencei, Jozef; Fernandez Tellez, Arturo; Gonzalez Ferreiro, Elena; Ferretti, Alessandro; Festanti, Andrea; Feuillard, Victor Jose Gaston; Figiel, Jan; Araujo Silva Figueredo, Marcel; Filchagin, Sergey; Finogeev, Dmitry; Fionda, Fiorella; Fiore, Enrichetta Maria; Fleck, Martin Gabriel; Floris, Michele; Foertsch, Siegfried Valentin; Foka, Panagiota; Fokin, Sergey; Fragiacomo, Enrico; Francescon, Andrea; Frankenfeld, Ulrich Michael; Fronze, Gabriele Gaetano; Fuchs, Ulrich; Furget, Christophe; Furs, Artur; Fusco Girard, Mario; Gaardhoeje, Jens Joergen; Gagliardi, Martino; Gago Medina, Alberto Martin; Gallio, Mauro; Gangadharan, Dhevan Raja; Ganoti, Paraskevi; Gao, Chaosong; Garabatos Cuadrado, Jose; Garcia-Solis, Edmundo Javier; Gargiulo, Corrado; Gasik, Piotr Jan; Gauger, Erin Frances; Germain, Marie; Gheata, Andrei George; 
Gheata, Mihaela; Ghosh, Premomoy; Ghosh, Sanjay Kumar; Gianotti, Paola; Giubellino, Paolo; Giubilato, Piero; Gladysz-Dziadus, Ewa; Glassel, Peter; Gomez Coral, Diego Mauricio; Gomez Ramirez, Andres; Sanchez Gonzalez, Andres; Gonzalez, Victor; Gonzalez Zamora, Pedro; Gorbunov, Sergey; Gorlich, Lidia Maria; Gotovac, Sven; Grabski, Varlen; Grachov, Oleg Anatolievich; Graczykowski, Lukasz Kamil; Graham, Katie Leanne; Grelli, Alessandro; Grigoras, Alina Gabriela; Grigoras, Costin; Grigoryev, Vladislav; Grigoryan, Ara; Grigoryan, Smbat; Grynyov, Borys; Grion, Nevio; Gronefeld, Julius Maximilian; Grosse-Oetringhaus, Jan Fiete; Grosso, Raffaele; Guber, Fedor; Guernane, Rachid; Guerzoni, Barbara; Gulbrandsen, Kristjan Herlache; Gunji, Taku; Gupta, Anik; Gupta, Ramni; Haake, Rudiger; Haaland, Oystein Senneset; Hadjidakis, Cynthia Marie; Haiduc, Maria; Hamagaki, Hideki; Hamar, Gergoe; Hamon, Julien Charles; Harris, John William; Harton, Austin Vincent; Hatzifotiadou, Despina; Hayashi, Shinichi; Heckel, Stefan Thomas; Hellbar, Ernst; Helstrup, Haavard; Herghelegiu, Andrei Ionut; Herrera Corral, Gerardo Antonio; Hess, Benjamin Andreas; Hetland, Kristin Fanebust; Hillemanns, Hartmut; Hippolyte, Boris; Horak, David; Hosokawa, Ritsuya; Hristov, Peter Zahariev; Humanic, Thomas; Hussain, Nur; Hussain, Tahir; Hutter, Dirk; Hwang, Dae Sung; Ilkaev, Radiy; Inaba, Motoi; Incani, Elisa; Ippolitov, Mikhail; Irfan, Muhammad; Ivanov, Marian; Ivanov, Vladimir; Izucheev, Vladimir; Jacazio, Nicolo; Jacobs, Peter Martin; Jadhav, Manoj Bhanudas; Jadlovska, Slavka; Jadlovsky, Jan; Jahnke, Cristiane; Jakubowska, Monika Joanna; Jang, Haeng Jin; Janik, Malgorzata Anna; Pahula Hewage, Sandun; Jena, Chitrasen; Jena, Satyajit; Jimenez Bustamante, Raul Tonatiuh; Jones, Peter Graham; Jusko, Anton; Kalinak, Peter; Kalweit, Alexander Philipp; Kamin, Jason Adrian; Kang, Ju Hwan; Kaplin, Vladimir; Kar, Somnath; Karasu Uysal, Ayben; Karavichev, Oleg; Karavicheva, Tatiana; Karayan, Lilit; Karpechev, Evgeny; 
Kebschull, Udo Wolfgang; Keidel, Ralf; Keijdener, Darius Laurens; Keil, Markus; Khan, Mohammed Mohisin; Khan, Palash; Khan, Shuaib Ahmad; Khanzadeev, Alexei; Kharlov, Yury; Kileng, Bjarte; Kim, Do Won; Kim, Dong Jo; Kim, Daehyeok; Kim, Hyeonjoong; Kim, Jinsook; Kim, Minwoo; Kim, Se Yong; Kim, Taesoo; Kirsch, Stefan; Kisel, Ivan; Kiselev, Sergey; Kisiel, Adam Ryszard; Kiss, Gabor; Klay, Jennifer Lynn; Klein, Carsten; Klein, Jochen; Klein-Boesing, Christian; Klewin, Sebastian; Kluge, Alexander; Knichel, Michael Linus; Knospe, Anders Garritt; Kobdaj, Chinorat; Kofarago, Monika; Kollegger, Thorsten; Kolozhvari, Anatoly; Kondratev, Valerii; Kondratyeva, Natalia; Kondratyuk, Evgeny; Konevskikh, Artem; Kopcik, Michal; Kostarakis, Panagiotis; Kour, Mandeep; Kouzinopoulos, Charalampos; Kovalenko, Oleksandr; Kovalenko, Vladimir; Kowalski, Marek; Koyithatta Meethaleveedu, Greeshma; Kralik, Ivan; Kravcakova, Adela; Krivda, Marian; Krizek, Filip; Kryshen, Evgeny; Krzewicki, Mikolaj; Kubera, Andrew Michael; Kucera, Vit; Kuhn, Christian Claude; Kuijer, Paulus Gerardus; Kumar, Ajay; Kumar, Jitendra; Kumar, Lokesh; Kumar, Shyam; Kurashvili, Podist; Kurepin, Alexander; Kurepin, Alexey; Kuryakin, Alexey; Kweon, Min Jung; Kwon, Youngil; La Pointe, Sarah Louise; La Rocca, Paola; Ladron De Guevara, Pedro; Lagana Fernandes, Caio; Lakomov, Igor; Langoy, Rune; Lara Martinez, Camilo Ernesto; Lardeux, Antoine Xavier; Lattuca, Alessandra; Laudi, Elisa; Lea, Ramona; Leardini, Lucia; Lee, Graham Richard; Lee, Seongjoo; Lehas, Fatiha; Lemmon, Roy Crawford; Lenti, Vito; Leogrande, Emilia; Leon Monzon, Ildefonso; Leon Vargas, Hermes; Leoncino, Marco; Levai, Peter; Li, Shuang; Li, Xiaomei; Lien, Jorgen Andre; Lietava, Roman; Lindal, Svein; Lindenstruth, Volker; Lippmann, Christian; Lisa, Michael Annan; Ljunggren, Hans Martin; Lodato, Davide Francesco; Lonne, Per-Ivar; Loginov, Vitaly; Loizides, Constantinos; Lopez, Xavier Bernard; Lopez Torres, Ernesto; Lowe, Andrew John; Luettig, Philipp Johannes; 
Lunardon, Marcello; Luparello, Grazia; Lutz, Tyler Harrison; Maevskaya, Alla; Mager, Magnus; Mahajan, Sanjay; Mahmood, Sohail Musa; Maire, Antonin; Majka, Richard Daniel; Malaev, Mikhail; Maldonado Cervantes, Ivonne Alicia; Malinina, Liudmila; Mal'Kevich, Dmitry; Malzacher, Peter; Mamonov, Alexander; Manko, Vladislav; Manso, Franck; Manzari, Vito; Marchisone, Massimiliano; Mares, Jiri; Margagliotti, Giacomo Vito; Margotti, Anselmo; Margutti, Jacopo; Marin, Ana Maria; Markert, Christina; Marquard, Marco; Martin, Nicole Alice; Martin Blanco, Javier; Martinengo, Paolo; Martinez Hernandez, Mario Ivan; Martinez-Garcia, Gines; Martinez Pedreira, Miguel; Mas, Alexis Jean-Michel; Masciocchi, Silvia; Masera, Massimo; Masoni, Alberto; Mastroserio, Annalisa; Matyja, Adam Tomasz; Mayer, Christoph; Mazer, Joel Anthony; Mazzoni, Alessandra Maria; Mcdonald, Daniel; Meddi, Franco; Melikyan, Yuri; Menchaca-Rocha, Arturo Alejandro; Meninno, Elisa; Mercado-Perez, Jorge; Meres, Michal; Miake, Yasuo; Mieskolainen, Matti Mikael; Mikhaylov, Konstantin; Milano, Leonardo; Milosevic, Jovan; Mischke, Andre; Mishra, Aditya Nath; Miskowiec, Dariusz Czeslaw; Mitra, Jubin; Mitu, Ciprian Mihai; Mohammadi, Naghmeh; Mohanty, Bedangadas; Molnar, Levente; Montano Zetina, Luis Manuel; Montes Prado, Esther; Moreira De Godoy, Denise Aparecida; Perez Moreno, Luis Alberto; Moretto, Sandra; Morreale, Astrid; Morsch, Andreas; Muccifora, Valeria; Mudnic, Eugen; Muhlheim, Daniel Michael; Muhuri, Sanjib; Mukherjee, Maitreyee; Mulligan, James Declan; Gameiro Munhoz, Marcelo; Munzer, Robert Helmut; Murakami, Hikari; Murray, Sean; Musa, Luciano; Musinsky, Jan; Naik, Bharati; Nair, Rahul; Nandi, Basanta Kumar; Nania, Rosario; Nappi, Eugenio; Naru, Muhammad Umair; Ferreira Natal Da Luz, Pedro Hugo; Nattrass, Christine; Rosado Navarro, Sebastian; Nayak, Kishora; Nayak, Ranjit; Nayak, Tapan Kumar; Nazarenko, Sergey; Nedosekin, Alexander; Nellen, Lukas; Ng, Fabian; Nicassio, Maria; Niculescu, Mihai; Niedziela, Jeremi; 
Nielsen, Borge Svane; Nikolaev, Sergey; Nikulin, Sergey; Nikulin, Vladimir; Noferini, Francesco; Nomokonov, Petr; Nooren, Gerardus; Cabanillas Noris, Juan Carlos; Norman, Jaime; Nyanin, Alexander; Nystrand, Joakim Ingemar; Oeschler, Helmut Oskar; Oh, Saehanseul; Oh, Sun Kun; Ohlson, Alice Elisabeth; Okatan, Ali; Okubo, Tsubasa; Olah, Laszlo; Oleniacz, Janusz; Oliveira Da Silva, Antonio Carlos; Oliver, Michael Henry; Onderwaater, Jacobus; Oppedisano, Chiara; Orava, Risto; Oravec, Matej; Ortiz Velasquez, Antonio; Oskarsson, Anders Nils Erik; Otwinowski, Jacek Tomasz; Oyama, Ken; Ozdemir, Mahmut; Pachmayer, Yvonne Chiara; Pagano, Davide; Pagano, Paola; Paic, Guy; Pal, Susanta Kumar; Pan, Jinjin; Pandey, Ashutosh Kumar; Papikyan, Vardanush; Pappalardo, Giuseppe; Pareek, Pooja; Park, Woojin; Parmar, Sonia; Passfeld, Annika; Paticchio, Vincenzo; Patra, Rajendra Nath; Paul, Biswarup; Pei, Hua; Peitzmann, Thomas; Pereira Da Costa, Hugo Denis Antonio; Peresunko, Dmitry Yurevich; Perez Lara, Carlos Eugenio; Perez Lezama, Edgar; Peskov, Vladimir; Pestov, Yury; Petracek, Vojtech; Petrov, Viacheslav; Petrovici, Mihai; Petta, Catia; Piano, Stefano; Pikna, Miroslav; Pillot, Philippe; Ozelin De Lima Pimentel, Lais; Pinazza, Ombretta; Pinsky, Lawrence; Piyarathna, Danthasinghe; Ploskon, Mateusz Andrzej; Planinic, Mirko; Pluta, Jan Marian; Pochybova, Sona; Podesta Lerma, Pedro Luis Manuel; Poghosyan, Martin; Polishchuk, Boris; Poljak, Nikola; Poonsawat, Wanchaloem; Pop, Amalia; Porteboeuf, Sarah Julie; Porter, R Jefferson; Pospisil, Jan; Prasad, Sidharth Kumar; Preghenella, Roberto; Prino, Francesco; Pruneau, Claude Andre; Pshenichnov, Igor; Puccio, Maximiliano; Puddu, Giovanna; Pujahari, Prabhat Ranjan; Punin, Valery; Putschke, Jorn Henning; Qvigstad, Henrik; Rachevski, Alexandre; Raha, Sibaji; Rajput, Sonia; Rak, Jan; Rakotozafindrabe, Andry Malala; Ramello, Luciano; Rami, Fouad; Raniwala, Rashmi; Raniwala, Sudhir; Rasanen, Sami Sakari; Rascanu, Bogdan Theodor; Rathee, Deepika; 
Read, Kenneth Francis; Redlich, Krzysztof; Reed, Rosi Jan; Rehman, Attiq Ur; Reichelt, Patrick Simon; Reidt, Felix; Ren, Xiaowen; Renfordt, Rainer Arno Ernst; Reolon, Anna Rita; Reshetin, Andrey; Reygers, Klaus Johannes; Riabov, Viktor; Ricci, Renato Angelo; Richert, Tuva Ora Herenui; Richter, Matthias Rudolph; Riedler, Petra; Riegler, Werner; Riggi, Francesco; Ristea, Catalin-Lucian; Rocco, Elena; Rodriguez Cahuantzi, Mario; Rodriguez Manso, Alis; Roeed, Ketil; Rogochaya, Elena; Rohr, David Michael; Roehrich, Dieter; Ronchetti, Federico; Ronflette, Lucile; Rosnet, Philippe; Rossi, Andrea; Roukoutakis, Filimon; Roy, Ankhi; Roy, Christelle Sophie; Roy, Pradip Kumar; Rubio Montero, Antonio Juan; Rui, Rinaldo; Russo, Riccardo; Ryabinkin, Evgeny; Ryabov, Yury; Rybicki, Andrzej; Saarinen, Sampo; Sadhu, Samrangy; Sadovskiy, Sergey; Safarik, Karel; Sahlmuller, Baldo; Sahoo, Pragati; Sahoo, Raghunath; Sahoo, Sarita; Sahu, Pradip Kumar; Saini, Jogender; Sakai, Shingo; Saleh, Mohammad Ahmad; Salzwedel, Jai Samuel Nielsen; Sambyal, Sanjeev Singh; Samsonov, Vladimir; Sandor, Ladislav; Sandoval, Andres; Sano, Masato; Sarkar, Debojit; Sarkar, Nachiketa; Sarma, Pranjal; Scapparone, Eugenio; Scarlassara, Fernando; Schiaua, Claudiu Cornel; Schicker, Rainer Martin; Schmidt, Christian Joachim; Schmidt, Hans Rudolf; Schuchmann, Simone; Schukraft, Jurgen; Schulc, Martin; Schutz, Yves Roland; Schwarz, Kilian Eberhard; Schweda, Kai Oliver; Scioli, Gilda; Scomparin, Enrico; Scott, Rebecca Michelle; Sefcik, Michal; Seger, Janet Elizabeth; Sekiguchi, Yuko; Sekihata, Daiki; Selyuzhenkov, Ilya; Senosi, Kgotlaesele; Senyukov, Serhiy; Serradilla Rodriguez, Eulogio; Sevcenco, Adrian; Shabanov, Arseniy; Shabetai, Alexandre; Shadura, Oksana; Shahoyan, Ruben; Shahzad, Muhammed Ikram; Shangaraev, Artem; Sharma, Ankita; Sharma, Mona; Sharma, Monika; Sharma, Natasha; Sheikh, Ashik Ikbal; Shigaki, Kenta; Shou, Qiye; Shtejer Diaz, Katherin; Sibiryak, Yury; Siddhanta, Sabyasachi; Sielewicz, Krzysztof 
Marek; Siemiarczuk, Teodor; Silvermyr, David Olle Rickard; Silvestre, Catherine Micaela; Simatovic, Goran; Simonetti, Giuseppe; Singaraju, Rama Narayana; Singh, Ranbir; Singha, Subhash; Singhal, Vikas; Sinha, Bikash; Sarkar - Sinha, Tinku; Sitar, Branislav; Sitta, Mario; Skaali, Bernhard; Slupecki, Maciej; Smirnov, Nikolai; Snellings, Raimond; Snellman, Tomas Wilhelm; Song, Jihye; Song, Myunggeun; Song, Zixuan; Soramel, Francesca; Sorensen, Soren Pontoppidan; Derradi De Souza, Rafael; Sozzi, Federica; Spacek, Michal; Spiriti, Eleuterio; Sputowska, Iwona Anna; Spyropoulou-Stassinaki, Martha; Stachel, Johanna; Stan, Ionel; Stankus, Paul; Stenlund, Evert Anders; Steyn, Gideon Francois; Stiller, Johannes Hendrik; Stocco, Diego; Strmen, Peter; Alarcon Do Passo Suaide, Alexandre; Sugitate, Toru; Suire, Christophe Pierre; Suleymanov, Mais Kazim Oglu; Suljic, Miljenko; Sultanov, Rishat; Sumbera, Michal; Sumowidagdo, Suharyo; Szabo, Alexander; Szanto De Toledo, Alejandro; Szarka, Imrich; Szczepankiewicz, Adam; Szymanski, Maciej Pawel; Tabassam, Uzma; Takahashi, Jun; Tambave, Ganesh Jagannath; Tanaka, Naoto; Tarhini, Mohamad; Tariq, Mohammad; Tarzila, Madalina-Gabriela; Tauro, Arturo; Tejeda Munoz, Guillermo; Telesca, Adriana; Terasaki, Kohei; Terrevoli, Cristina; Teyssier, Boris; Thaeder, Jochen Mathias; Thakur, Dhananjaya; Thomas, Deepa; Tieulent, Raphael Noel; Timmins, Anthony Robert; Toia, Alberica; Trogolo, Stefano; Trombetta, Giuseppe; Trubnikov, Victor; Trzaska, Wladyslaw Henryk; Tsuji, Tomoya; Tumkin, Alexandr; Turrisi, Rosario; Tveter, Trine Spedstad; Ullaland, Kjetil; Uras, Antonio; Usai, Gianluca; Utrobicic, Antonija; Vala, Martin; Valencia Palomo, Lizardo; Vallero, Sara; Van Der Maarel, Jasper; Van Hoorne, Jacobus Willem; Van Leeuwen, Marco; Vanat, Tomas; Vande Vyvre, Pierre; Varga, Dezso; Vargas Trevino, Aurora Diozcora; Vargyas, Marton; Varma, Raghava; Vasileiou, Maria; Vasiliev, Andrey; Vauthier, Astrid; Vechernin, Vladimir; Veen, Annelies Marianne; Veldhoen, 
Misha; Velure, Arild; Vercellin, Ermanno; Vergara Limon, Sergio; Vernet, Renaud; Verweij, Marta; Vickovic, Linda; Viesti, Giuseppe; Viinikainen, Jussi Samuli; Vilakazi, Zabulon; Villalobos Baillie, Orlando; Villatoro Tello, Abraham; Vinogradov, Alexander; Vinogradov, Leonid; Vinogradov, Yury; Virgili, Tiziano; Vislavicius, Vytautas; Viyogi, Yogendra; Vodopyanov, Alexander; Volkl, Martin Andreas; Voloshin, Kirill; Voloshin, Sergey; Volpe, Giacomo; Von Haller, Barthelemy; Vorobyev, Ivan; Vranic, Danilo; Vrlakova, Janka; Vulpescu, Bogdan; Wagner, Boris; Wagner, Jan; Wang, Hongkai; Wang, Mengliang; Watanabe, Daisuke; Watanabe, Yosuke; Weber, Michael; Weber, Steffen Georg; Weiser, Dennis Franz; Wessels, Johannes Peter; Westerhoff, Uwe; Whitehead, Andile Mothegi; Wiechula, Jens; Wikne, Jon; Wilk, Grzegorz Andrzej; Wilkinson, Jeremy John; Williams, Crispin; Windelband, Bernd Stefan; Winn, Michael Andreas; Yang, Hongyan; Yang, Ping; Yano, Satoshi; Yasin, Zafar; Yin, Zhongbao; Yokoyama, Hiroki; Yoo, In-Kwon; Yoon, Jin Hee; Yurchenko, Volodymyr; Yushmanov, Igor; Zaborowska, Anna; Zaccolo, Valentina; Zaman, Ali; Zampolli, Chiara; Correia Zanoli, Henrique Jose; Zaporozhets, Sergey; Zardoshti, Nima; Zarochentsev, Andrey; Zavada, Petr; Zavyalov, Nikolay; Zbroszczyk, Hanna Paulina; Zgura, Sorin Ion; Zhalov, Mikhail; Zhang, Haitao; Zhang, Xiaoming; Zhang, Yonghong; Chunhui, Zhang; Zhang, Zuman; Zhao, Chengxin; Zhigareva, Natalia; Zhou, Daicui; Zhou, You; Zhou, Zhuo; Zhu, Hongsheng; Zhu, Jianhui; Zichichi, Antonino; Zimmermann, Alice; Zimmermann, Markus Bernhard; Zinovjev, Gennady; Zyzak, Maksym


    We present a Bayesian approach to particle identification (PID) within the ALICE experiment. The aim is to more effectively combine the particle identification capabilities of its various detectors. After a brief explanation of the adopted methodology and formalism, the performance of the Bayesian PID approach for charged pions, kaons and protons in the central barrel of ALICE is studied. PID is performed via measurements of specific energy loss (dE/dx) and time-of-flight. PID efficiencies and misidentification probabilities are extracted and compared with Monte Carlo simulations using high-purity samples of identified particles in the decay channels ${\rm K}_{\rm S}^{0}\rightarrow \pi^+\pi^-$, $\phi\rightarrow {\rm K}^-{\rm K}^+$ and $\Lambda\rightarrow{\rm p}\pi^-$ in p–Pb collisions at $\sqrt{s_{\rm NN}} = 5.02$ TeV. In order to thoroughly assess the validity of the Bayesian approach, this methodology was used to obtain corrected $p_{\rm T}$ spectra of pions, kaons, protons, and D$^0$ mesons in pp coll...
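    The core of such a Bayesian PID scheme is multiplying per-detector likelihoods for each species hypothesis by prior abundances and normalizing. A minimal sketch with invented numbers (not ALICE's actual priors or detector response models):

```python
def bayes_pid(priors, likelihoods):
    """Combine per-detector likelihoods for each species hypothesis with
    prior abundances: P(species | signals) ∝ prior * Π over detectors."""
    unnorm = {}
    for species, prior in priors.items():
        p = prior
        for detector in likelihoods:
            p *= detector[species]
        unnorm[species] = p
    total = sum(unnorm.values())
    return {s: p / total for s, p in unnorm.items()}

# Illustrative numbers: dE/dx mildly favours a kaon, time-of-flight
# strongly favours a kaon, while the prior favours pions.
priors = {"pion": 0.80, "kaon": 0.15, "proton": 0.05}
dedx   = {"pion": 0.30, "kaon": 0.50, "proton": 0.20}
tof    = {"pion": 0.05, "kaon": 0.90, "proton": 0.05}
post = bayes_pid(priors, [dedx, tof])
```

    Even with a strong pion prior, the combined detector evidence pushes the posterior decisively toward the kaon hypothesis, which is the gain from combining detectors that the abstract describes.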

  10. Refining gene signatures: a Bayesian approach

    Labbe Aurélie


    Background: In high-density arrays, the identification of relevant genes for disease classification is complicated not only by the curse of dimensionality but also by the highly correlated nature of the array data. In this paper, we are interested in the question of how many and which genes should be selected for disease class prediction. Our work consists of a Bayesian supervised statistical learning approach to refine gene signatures with a regularization which penalizes the correlation between the variables selected. Results: Our simulation results show that we can most often recover the correct subset of genes that predict the class as compared to other methods, even when accuracy and subset size remain the same. On real microarray datasets, we show that our approach can refine gene signatures to obtain either the same or better predictive performance than other existing methods with a smaller number of genes. Conclusions: Our novel Bayesian approach includes a prior which penalizes highly correlated features in model selection and is able to extract key genes in the highly correlated context of microarray data. The methodology in the paper is described in the context of microarray data, but can be applied to any array data (such as micro RNA, for example) as a first step towards predictive modeling of cancer pathways. A user-friendly software implementation of the method is available.

  11. A Bayesian approach to earthquake source studies

    Minson, Sarah

    Bayesian sampling has several advantages over conventional optimization approaches to solving inverse problems. It produces the distribution of all possible models sampled proportionally to how much each model is consistent with the data and the specified prior information, and thus images the entire solution space, revealing the uncertainties and trade-offs in the model. Bayesian sampling is applicable to both linear and non-linear modeling, and the values of the model parameters being sampled can be constrained based on the physics of the process being studied and do not have to be regularized. However, these methods are computationally challenging for high-dimensional problems. Until now the computational expense of Bayesian sampling has been too great for it to be practicable for most geophysical problems. I present a new parallel sampling algorithm called CATMIP for Cascading Adaptive Tempered Metropolis In Parallel. This technique, based on Transitional Markov chain Monte Carlo, makes it possible to sample distributions in many hundreds of dimensions, if the forward model is fast, or to sample computationally expensive forward models in smaller numbers of dimensions. The design of the algorithm is independent of the model being sampled, so CATMIP can be applied to many areas of research. I use CATMIP to produce a finite fault source model for the 2007 Mw 7.7 Tocopilla, Chile earthquake. Surface displacements from the earthquake were recorded by six interferograms and twelve local high-rate GPS stations. Because of the wealth of near-fault data, the source process is well-constrained. I find that the near-field high-rate GPS data have significant resolving power above and beyond the slip distribution determined from static displacements. The location and magnitude of the maximum displacement are resolved. The rupture almost certainly propagated at sub-shear velocities. 
The full posterior distribution can be used not only to calculate source parameters but also...
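    The transitional step at the heart of samplers like CATMIP, reweighting samples by a likelihood power and resampling as the tempering exponent grows, can be sketched as follows. This omits the Metropolis rejuvenation moves a full implementation would include, and uses a toy Gaussian target rather than a fault-slip forward model:

```python
import math
import random

def temper_resample(samples, loglike, beta_old, beta_new, rng):
    """One transitional-MCMC stage: reweight samples by
    likelihood^(beta_new - beta_old), then resample with replacement."""
    logw = [(beta_new - beta_old) * loglike(s) for s in samples]
    m = max(logw)                       # stabilize the exponentials
    w = [math.exp(lw - m) for lw in logw]
    total = sum(w)
    probs = [x / total for x in w]
    return rng.choices(samples, weights=probs, k=len(samples))

# Toy: prior samples uniform on [-5, 5]; Gaussian log-likelihood at 1.0.
rng = random.Random(1)
prior_samples = [rng.uniform(-5, 5) for _ in range(2000)]
loglike = lambda x: -0.5 * ((x - 1.0) / 0.3) ** 2
stage1 = temper_resample(prior_samples, loglike, 0.0, 0.5, rng)
stage2 = temper_resample(stage1, loglike, 0.5, 1.0, rng)
```

    Raising the tempering exponent gradually concentrates the ensemble on the high-likelihood region, which is what lets such samplers cope with distributions that defeat a single importance-sampling step.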

  12. A Bayesian Approach to Multifractal Extremes

    Tchiguirinskaia, Ioulia; Schertzer, Daniel; Lovejoy, Shaun


    Drivers such as climate change and rapid urbanisation will result in increasing flood problems in urban environments through this century. Problems encountered in existing flood defence strategies are often related to non-stationarity of the data, long-range dependencies and the clustering of extremes, often resulting in fat-tailed (i.e., power-law tail) probability distributions. We discuss how to better predict floods by using a physically based approach established on systems that respect a scale symmetry over a wide range of space-time scales, to determine the relationship between flood magnitude and return period for a wide range of aggregation periods. The classical quantile distributions unfortunately rely on two questionable hypotheses: stationarity and independence of the components of the time series. We point out that beyond the classical sampling of the extremes and its limitations, there is the possibility of eliminating long-range dependency by uncovering a white-noise process whose fractional integration generates the observed long-range dependent process. The results were obtained during the CEATI Project "Multifractals and physically based estimates of extreme floods". The ambition of this project was to investigate very large data sets of reasonable quality (e.g., daily stream flow data recorded for at least 20 years for several thousands of gages distributed all over Canada and the USA). The multifractal parameters, such as the mean intermittency parameter and the multifractality index, were estimated on 8332 time series. The results confirm the dependence of multifractal parameter estimates on the length of available data. Developing a metric for parameter estimation error then became a principal step in evaluating the uncertainty of the multifractal estimates. A technique for estimating confidence intervals with the help of a Bayesian approach was developed.
A detailed comparison of multifractal quantile plots and paleoflood data

  13. Modeling Social Annotation: a Bayesian Approach

    Plangprasopchok, Anon


Collaborative tagging systems, such as CiteULike, allow users to annotate objects, e.g., Web pages or scientific papers, with descriptive labels called tags. The social annotations, contributed by thousands of users, can potentially be used to infer categorical knowledge, classify documents or recommend new relevant information. Traditional text inference methods do not make the best use of socially-generated data, since they do not take into account variations in individual users' perspectives and vocabulary. In previous work, we introduced a simple probabilistic model that takes the interests of individual annotators into account in order to find hidden topics of annotated objects. Unfortunately, our proposed approach had a number of shortcomings, including overfitting, local maxima and the requirement to specify values for some parameters. In this paper we address these shortcomings in two ways. First, we extend the model to a fully Bayesian framework. Second, we describe an infinite ver...

  14. Merging Digital Surface Models Implementing Bayesian Approaches

    Sadeq, H.; Drummond, J.; Li, Z.


In this research, DSMs from different sources have been merged. The merging is based on a probabilistic model using a Bayesian approach. The input data were sourced from very-high-resolution satellite imagery sensors (e.g. WorldView-1 and Pleiades). A Bayesian approach is deemed preferable when the data obtained from the sensors are limited and obtaining many measurements is difficult or very costly; the lack of data can then be addressed by introducing a priori estimates. To infer the prior data, the roofs of the buildings are assumed to be smooth, and for that purpose local entropy has been implemented. In addition to the a priori estimates, GNSS RTK measurements were collected in the field and used as check points to assess the quality of the DSMs and to validate the merging result. The model has been applied in the West End of Glasgow, which contains different kinds of buildings, such as flat-roofed and hipped-roofed buildings. Both quantitative and qualitative methods have been employed to validate the merged DSM. The validation results show that the model successfully improved the quality of the DSMs and some characteristics such as the roof surfaces, which consequently led to better representations. In addition, the developed model has been compared with the well-established maximum-likelihood model and showed similar quantitative statistical results and better qualitative results. Although the proposed model has been applied to DSMs derived from satellite imagery, it can be applied to DSMs from any other source.
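The per-pixel fusion step of such a merge can be sketched with a simple Gaussian model: each DSM height is treated as a noisy measurement, the smoothness-derived roof prior supplies a third estimate, and the posterior is the precision-weighted average. This is a minimal illustration of the general idea, not the authors' implementation; the heights and variances below are hypothetical.

```python
import numpy as np

def fuse_dsm(z_a, var_a, z_b, var_b, z_prior, var_prior):
    """Per-pixel Gaussian fusion of two DSM height estimates with a prior.

    Under independent Gaussian errors, the posterior height is the
    precision-weighted mean of the prior and the two observations.
    Inputs may be scalars or arrays of identical shape.
    """
    precision = 1.0 / var_prior + 1.0 / var_a + 1.0 / var_b
    z_post = (z_prior / var_prior + z_a / var_a + z_b / var_b) / precision
    return z_post, 1.0 / precision    # posterior mean and variance

# One pixel: two height estimates plus a smoothness-derived roof prior.
z, v = fuse_dsm(z_a=102.0, var_a=1.0, z_b=104.0, var_b=4.0,
                z_prior=103.0, var_prior=9.0)
```

With a vague prior (large `var_prior`) the result approaches the classical maximum-likelihood combination of the two DSMs, which mirrors the comparison reported above.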

  15. Process adjustment by a Bayesian approach

    Daniel Duret


In a production or measurement situation, operators are required to correct a process using the measurement of a sample. In both cases, it is always difficult to suggest a correction from a deviation. The correction is the result of two different deviations: one in set-up and the second in production; the latter is considered as noise. The objective of this paper is to propose an original approach to calculating the best correction using a Bayesian approach. A correction formula is given under three assumptions about the maladjustment distribution: uniform, triangular and normal. The paper gives a graphical interpretation of these different assumptions and a discussion of the results. Based on these results, the paper proposes a practical rule for calculating the most likely maladjustment in the case of a normal distribution. This practical rule gives the best adjustment using a simple relation (Adjustment = K * sample mean), where K depends on the sample size, the ratio between the maladjustment and the short-term variability, and a Type I risk of large maladjustment.
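The normal-case rule can be illustrated with the standard normal-normal shrinkage result; this is a hedged sketch of the kind of relation described, not the paper's exact K. Assuming a N(0, (r*sigma)^2) prior on the maladjustment and i.i.d. normal noise with standard deviation sigma, the posterior mean of the maladjustment is K times the sample mean:

```python
def adjustment_gain(n, r):
    """Shrinkage factor K for the correction Adjustment = K * sample_mean.

    Assumes a normal prior N(0, (r*sigma)^2) on the maladjustment and
    i.i.d. normal measurement noise with standard deviation sigma; the
    posterior mean of the maladjustment is then K times the sample mean.
    n: sample size; r: ratio of maladjustment spread to short-term variability.
    """
    return n * r**2 / (n * r**2 + 1.0)

# With n = 5 and r = 1, the best correction is 5/6 of the observed deviation.
```

K grows toward 1 as the sample size n or the ratio r increases: the more informative the sample, the less the correction is shrunk toward zero.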

  16. A new approach for Bayesian model averaging

    TIAN XiangJun; XIE ZhengHui; WANG AiHui; YANG XiaoChun


Bayesian model averaging (BMA) is a recently proposed statistical method for calibrating forecast ensembles from numerical weather models. However, successful implementation of BMA requires accurate estimates of the weights and variances of the individual competing models in the ensemble. Two methods, namely the Expectation-Maximization (EM) and the Markov Chain Monte Carlo (MCMC) algorithms, are widely used for BMA model training. Both methods have their own respective strengths and weaknesses. In this paper, we first modify the BMA log-likelihood function with the aim of removing the additional limitation that requires that the BMA weights add to one, and then use a limited-memory quasi-Newtonian algorithm for solving the nonlinear optimization problem, thereby formulating a new approach for BMA (referred to as BMA-BFGS). Several groups of multi-model soil moisture simulation experiments from three land surface models show that the performance of BMA-BFGS is similar to the MCMC method in terms of simulation accuracy, and that both are superior to the EM algorithm. On the other hand, the computational cost of the BMA-BFGS algorithm is substantially less than for MCMC and is almost equivalent to that for EM.
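The reparameterization idea behind BMA-BFGS can be sketched as follows: mapping unconstrained parameters through a softmax removes the sum-to-one constraint on the weights, after which a quasi-Newton routine applies directly. This is an illustrative reconstruction under assumed choices (the softmax mapping and the toy ensemble are not from the paper):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def bma_neg_loglik(theta, forecasts, obs):
    """Negative BMA log-likelihood over unconstrained parameters.

    forecasts: (T, K) ensemble member forecasts; obs: (T,) observations.
    theta holds K unconstrained weight parameters (mapped through a
    softmax, so no explicit sum-to-one constraint) plus log-sigma.
    """
    K = forecasts.shape[1]
    w = np.exp(theta[:K] - theta[:K].max())
    w /= w.sum()                       # softmax -> valid weights
    sigma = np.exp(theta[K])           # positivity via log-parameterization
    dens = norm.pdf(obs[:, None], loc=forecasts, scale=sigma)   # (T, K)
    return -np.sum(np.log(dens @ w + 1e-300))

# Toy ensemble: member 0 is the most accurate and should get most weight.
rng = np.random.default_rng(0)
T, K = 200, 3
truth = rng.normal(size=T)
forecasts = truth[:, None] + rng.normal(scale=[0.5, 1.0, 2.0], size=(T, K))
res = minimize(bma_neg_loglik, np.zeros(K + 1), args=(forecasts, truth),
               method='L-BFGS-B')     # limited-memory quasi-Newton
w = np.exp(res.x[:K] - res.x[:K].max())
w /= w.sum()
```

The softmax makes the objective smooth and unconstrained, which is exactly what quasi-Newton methods such as (L-)BFGS require.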

  17. On an Approach to Bayesian Sample Sizing in Clinical Trials

    Muirhead, Robb J


This paper explores an approach to Bayesian sample size determination in clinical trials. The approach falls into the category of what is often called "proper Bayesian", in that it does not mix frequentist concepts with Bayesian ones. A criterion for a "successful trial" is defined in terms of a posterior probability; the probability of success is assessed using the marginal distribution of the data, and this probability forms the basis for choosing sample sizes. We illustrate with a standard problem in clinical trials, that of establishing superiority of a new drug over a control.
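The criterion can be made concrete with a conjugate normal sketch (hypothetical numbers, not the paper's example): draw the true effect from its prior, simulate the trial, and check whether the posterior probability of superiority clears a threshold; averaging over draws gives the marginal probability of a successful trial as a function of n.

```python
import numpy as np
from scipy.stats import norm

def success_probability(n, prior_mean, prior_sd, sigma, eta=0.95,
                        draws=20000, seed=1):
    """Marginal probability that a size-n trial is 'successful'.

    Success = posterior P(delta > 0) exceeds eta, with a normal prior
    on the treatment effect delta and known sampling sd sigma; the
    normal-normal conjugate posterior is available in closed form.
    """
    rng = np.random.default_rng(seed)
    delta = rng.normal(prior_mean, prior_sd, draws)    # true effects
    xbar = rng.normal(delta, sigma / np.sqrt(n))       # simulated trial means
    post_prec = 1 / prior_sd**2 + n / sigma**2
    post_mean = (prior_mean / prior_sd**2 + n * xbar / sigma**2) / post_prec
    return np.mean(norm.cdf(post_mean * np.sqrt(post_prec)) > eta)

# Success probability rises with n toward the prior mass on delta > 0.
p_small = success_probability(20, prior_mean=0.3, prior_sd=0.2, sigma=1.0)
p_large = success_probability(400, prior_mean=0.3, prior_sd=0.2, sigma=1.0)
```

Sweeping n upward until this probability reaches a target level yields the sample size; note the probability is capped by the prior probability that the drug is truly superior.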

  18. A Bayesian approach to particle identification in ALICE

    CERN. Geneva


    Among the LHC experiments, ALICE has unique particle identification (PID) capabilities exploiting different types of detectors. During Run 1, a Bayesian approach to PID was developed and intensively tested. It facilitates the combination of information from different sub-systems. The adopted methodology and formalism as well as the performance of the Bayesian PID approach for charged pions, kaons and protons in the central barrel of ALICE will be reviewed. Results are presented with PID performed via measurements of specific energy loss (dE/dx) and time-of-flight using information from the TPC and TOF detectors, respectively. Methods to extract priors from data and to compare PID efficiencies and misidentification probabilities in data and Monte Carlo using high-purity samples of identified particles will be presented. Bayesian PID results were found consistent with previous measurements published by ALICE. The Bayesian PID approach gives a higher signal-to-background ratio and a similar or larger statist...
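The combination rule at the core of such a Bayesian PID is Bayes' theorem with conditionally independent detectors. The sketch below uses hypothetical Gaussian detector responses, resolutions and priors purely for illustration; ALICE's actual response parameterizations and data-driven priors are far more detailed.

```python
import numpy as np
from scipy.stats import norm

def bayes_pid(signals, likelihood, priors):
    """Posterior species probabilities from several detector signals.

    Detectors are assumed conditionally independent given the species,
    so p(signals | species) is the product of per-detector likelihoods,
    which Bayes' theorem then combines with the priors.
    """
    species = list(priors)
    post = np.array([priors[s] * np.prod([likelihood(s, d, x)
                                          for d, x in signals.items()])
                     for s in species])
    return dict(zip(species, post / post.sum()))

# Hypothetical expected responses and resolutions, for illustration only.
expected = {'pion':   {'dEdx': 50.0, 'tof': 1.00},
            'kaon':   {'dEdx': 60.0, 'tof': 1.05},
            'proton': {'dEdx': 75.0, 'tof': 1.12}}
lik = lambda s, d, x: norm.pdf(x, expected[s][d], 2.0 if d == 'dEdx' else 0.02)
post = bayes_pid({'dEdx': 61.0, 'tof': 1.04}, lik,
                 {'pion': 0.7, 'kaon': 0.2, 'proton': 0.1})
```

Here the two detectors jointly favor the kaon hypothesis even though the prior favors pions, illustrating how combining sub-systems sharpens the identification.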

  19. A Bayesian Approach to Interactive Retrieval

    Tague, Jean M.


    A probabilistic model for interactive retrieval is presented. Bayesian statistical decision theory principles are applied: use of prior and sample information about the relationship of document descriptions to query relevance; maximization of expected value of a utility function, to the problem of optimally restructuring search strategies in an…

  20. Bayesian approach to magnetotelluric tensor decomposition

    Michel Menvielle



Magnetotelluric directional analysis and impedance tensor decomposition are basic tools to validate a local/regional composite electrical model of the underlying structure. Bayesian stochastic methods approach the problem of parameter estimation and uncertainty characterization in a fully probabilistic fashion, through the use of posterior model probabilities. We use the standard Groom-Bailey 3-D local/2-D regional composite model in our Bayesian approach. We assume that the experimental impedance estimates are contaminated with Gaussian noise and define the likelihood of a particular composite model with respect to the observed data. We use non-informative, flat priors over physically reasonable intervals for the standard Groom-Bailey decomposition parameters. We apply two numerical methods: a Markov chain Monte Carlo procedure based on the Gibbs sampler and a single-component adaptive Metropolis algorithm. From the posterior samples, we characterize the estimates and uncertainties of the individual decomposition parameters by using the respective marginal posterior probabilities. We conclude that the stochastic scheme performs reliably for a variety of models, including the multisite and multifrequency case with up to...
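A random-walk Metropolis sampler of the kind used for the decomposition parameters can be sketched generically (this version updates all components jointly; the single-component variant cycles over coordinates). The toy 1-D Gaussian log-posterior in the usage line stands in for the Groom-Bailey likelihood and is purely illustrative.

```python
import numpy as np

def metropolis(logpost, x0, step, n, seed=0):
    """Random-walk Metropolis sampler over a parameter vector.

    logpost: log posterior density (up to a constant); x0: starting
    point; step: proposal standard deviation; n: number of samples.
    """
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    lp = logpost(x)
    out = []
    for _ in range(n):
        prop = x + step * rng.normal(size=x.size)
        lp_prop = logpost(prop)
        if np.log(rng.uniform()) < lp_prop - lp:   # accept/reject
            x, lp = prop, lp_prop
        out.append(x.copy())
    return np.array(out)

# Toy stand-in for the decomposition posterior: Gaussian, mean 2, sd 0.5.
samples = metropolis(lambda x: float(-0.5 * np.sum(((x - 2.0) / 0.5) ** 2)),
                     x0=[0.0], step=0.5, n=5000)
```

Marginal histograms of the returned samples give exactly the per-parameter uncertainty summaries described above.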

  1. Personalized Audio Systems - a Bayesian Approach

    Nielsen, Jens Brehm; Jensen, Bjørn Sand; Hansen, Toke Jansen;


Modern audio systems are typically equipped with several user-adjustable parameters unfamiliar to most users listening to the system. To obtain the best possible setting, the user is forced into multi-parameter optimization with respect to the user's own objective and preference. To address this..., the present paper presents a general interactive framework for personalization of such audio systems. The framework builds on Bayesian Gaussian process regression, in which a model of the user's objective function is updated sequentially. The parameter setting to be evaluated in a given trial is...

  2. Macroscopic hotspots identification: A Bayesian spatio-temporal interaction approach.

    Dong, Ni; Huang, Helai; Lee, Jaeyoung; Gao, Mingyun; Abdel-Aty, Mohamed


    This study proposes a Bayesian spatio-temporal interaction approach for hotspot identification by applying the full Bayesian (FB) technique in the context of macroscopic safety analysis. Compared with the emerging Bayesian spatial and temporal approach, the Bayesian spatio-temporal interaction model contributes to a detailed understanding of differential trends through analyzing and mapping probabilities of area-specific crash trends as differing from the mean trend and highlights specific locations where crash occurrence is deteriorating or improving over time. With traffic analysis zones (TAZs) crash data collected in Florida, an empirical analysis was conducted to evaluate the following three approaches for hotspot identification: FB ranking using a Poisson-lognormal (PLN) model, FB ranking using a Bayesian spatial and temporal (B-ST) model and FB ranking using a Bayesian spatio-temporal interaction (B-ST-I) model. The results show that (a) the models accounting for space-time effects perform better in safety ranking than does the PLN model, and (b) the FB approach using the B-ST-I model significantly outperforms the B-ST approach in correctly identifying hotspots by explicitly accounting for the space-time variation in addition to the stable spatial/temporal patterns of crash occurrence. In practice, the B-ST-I approach plays key roles in addressing two issues: (a) how the identified hotspots have evolved over time and (b) the identification of areas that, whilst not yet hotspots, show a tendency to become hotspots. Finally, it can provide guidance to policy decision makers to efficiently improve zonal-level safety. PMID:27110645

  3. A Bayesian Networks approach to Operational Risk

    Aquaro, V.; Bardoscia, M.; Bellotti, R.; Consiglio, A.; De Carlo, F.; Ferri, G.


    A system for Operational Risk management based on the computational paradigm of Bayesian Networks is presented. The algorithm allows the construction of a Bayesian Network targeted for each bank and takes into account in a simple and realistic way the correlations among different processes of the bank. The internal losses are averaged over a variable time horizon, so that the correlations at different times are removed, while the correlations at the same time are kept: the averaged losses are thus suitable to perform the learning of the network topology and parameters; since the main aim is to understand the role of the correlations among the losses, the assessments of domain experts are not used. The algorithm has been validated on synthetic time series. It should be stressed that the proposed algorithm has been thought for the practical implementation in a mid or small sized bank, since it has a small impact on the organizational structure of a bank and requires an investment in human resources which is limited to the computational area.

  4. Probabilistic Damage Characterization Using the Computationally-Efficient Bayesian Approach

    Warner, James E.; Hochhalter, Jacob D.


This work presents a computationally-efficient approach for damage determination that quantifies uncertainty in the provided diagnosis. Given strain sensor data that are polluted with measurement errors, Bayesian inference is used to estimate the location, size, and orientation of damage. This approach uses Bayes' Theorem to combine any prior knowledge an analyst may have about the nature of the damage with information provided implicitly by the strain sensor data to form a posterior probability distribution over possible damage states. The unknown damage parameters are then estimated based on samples drawn numerically from this distribution using a Markov Chain Monte Carlo (MCMC) sampling algorithm. Several modifications are made to the traditional Bayesian inference approach to provide significant computational speedup. First, an efficient surrogate model is constructed using sparse grid interpolation to replace a costly finite element model that must otherwise be evaluated for each sample drawn with MCMC. Next, the standard Bayesian posterior distribution is modified using a weighted likelihood formulation, which is shown to improve the convergence of the sampling process. Finally, a robust MCMC algorithm, Delayed Rejection Adaptive Metropolis (DRAM), is adopted to sample the probability distribution more efficiently. Numerical examples demonstrate that the proposed framework effectively provides damage estimates with uncertainty quantification and can yield orders of magnitude speedup over standard Bayesian approaches.

  5. Sequential Bayesian technique: An alternative approach for software reliability estimation

    S Chatterjee; S S Alam; R B Misra


This paper proposes a sequential Bayesian approach, similar to a Kalman filter, for estimating reliability growth or decay of software. The main advantage of the proposed method is that it shows the variation of the parameter over time as new failure data become available. The usefulness of the method is demonstrated with some real-life data.
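The flavor of such a sequential update can be sketched with a scalar Kalman-style filter; the tracked state, noise levels and data below are hypothetical stand-ins, not the paper's model.

```python
def kalman_step(m, v, y, q, r):
    """One predict-update cycle for a scalar random-walk state.

    m, v: current posterior mean/variance of the tracked parameter;
    y: new noisy observation; q: process variance (lets the parameter
    drift, capturing reliability growth or decay); r: observation variance.
    """
    m_pred, v_pred = m, v + q           # predict: random-walk drift
    k = v_pred / (v_pred + r)           # Kalman gain
    return m_pred + k * (y - m_pred), (1 - k) * v_pred

# Hypothetical stream of log failure-rate observations trending downward
# (reliability growth); the posterior mean follows, the variance shrinks.
m, v = 0.0, 1.0
for y in [-0.1, -0.3, -0.2, -0.5, -0.6]:
    m, v = kalman_step(m, v, y, q=0.05, r=0.2)
```

Because each new failure record triggers one cheap predict-update cycle, the parameter's trajectory over time comes for free, which is the advantage highlighted in the abstract.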

  6. Remotely sensed monitoring of small reservoir dynamics: a Bayesian approach

    Eilander, D.M.; Annor, F.O.; Iannini, L.; Van de Giesen, N.C.


Multipurpose small reservoirs are important for livelihoods in rural semi-arid regions. To manage and plan these reservoirs and to assess their hydrological impact at a river basin scale, it is important to monitor their water storage dynamics. This paper introduces a Bayesian approach for monitoring small reservoirs with radar satellite images...

  7. A Bayesian approach to estimating the prehepatic insulin secretion rate

    Andersen, Kim Emil; Højbjerre, Malene

the time courses of insulin and C-peptide subsequently are used as known forcing functions. In this work we adopt a Bayesian graphical model to describe the unified model simultaneously. We develop a model that also accounts for both measurement error and process variability. The parameters are estimated...... by a Bayesian approach where efficient posterior sampling is made available through the use of Markov chain Monte Carlo methods. Hereby the ill-posed estimation problem inherited in the coupled differential equation model is regularized by the use of prior knowledge. The method is demonstrated on experimental...

  8. The subjectivity of scientists and the Bayesian approach

    Press, James S


    "Press and Tanur argue that subjectivity has not only played a significant role in the advancement of science but that science will advance more rapidly if the modern methods of Bayesian statistical analysis replace some of the more classical twentieth-century methods." — SciTech Book News. "An insightful work." ― Choice. "Compilation of interesting popular problems … this book is fascinating." — Short Book Reviews, International Statistical Institute. Subjectivity ― including intuition, hunches, and personal beliefs ― has played a key role in scientific discovery. This intriguing book illustrates subjective influences on scientific progress with historical accounts and biographical sketches of more than a dozen luminaries, including Aristotle, Galileo, Newton, Darwin, Pasteur, Freud, Einstein, Margaret Mead, and others. The treatment also offers a detailed examination of the modern Bayesian approach to data analysis, with references to the Bayesian theoretical and applied literature. Suitable for...

  9. Towards a Supra-Bayesian Approach to Merging of Information

    Sečkárová, Vladimíra

Prague: Institute of Information Theory and Automation, 2011, pp. 81-86. ISBN 978-80-903834-6-3. [The 2nd International Workshop on Decision Making with Multiple Imperfect Decision Makers. Held in Conjunction with the 25th Annual Conference on Neural Information Processing Systems (NIPS 2011). Sierra Nevada (ES), 16.12.2011-16.12.2011] R&D Projects: GA MŠk 1M0572; GA ČR GA102/08/0567 Institutional research plan: CEZ:AV0Z10750506 Keywords: decision makers * Supra-Bayesian * Bayesian solution * Merging Subject RIV: BB - Applied Statistics, Operational Research

  10. Stochastic model updating utilizing Bayesian approach and Gaussian process model

    Wan, Hua-Ping; Ren, Wei-Xin


Stochastic model updating (SMU) has been increasingly applied in quantifying structural parameter uncertainty from response variability. SMU for parameter uncertainty quantification refers to the problem of inverse uncertainty quantification (IUQ), which is a nontrivial task. Inverse problems solved with optimization usually bring about the issues of gradient computation, ill-conditionedness, and non-uniqueness. Moreover, the uncertainty present in the response makes the inverse problem more complicated. In this study, a Bayesian approach is adopted in SMU for parameter uncertainty quantification. The prominent strength of the Bayesian approach for the IUQ problem is that it solves the IUQ problem in a straightforward manner, which enables it to avoid the previous issues. However, when applied to engineering structures that are modeled with a high-resolution finite element model (FEM), the Bayesian approach is still computationally expensive, since the commonly used Markov chain Monte Carlo (MCMC) method for Bayesian inference requires a large number of model runs to guarantee convergence. Herein we reduce the computational cost in two aspects. On the one hand, the fast-running Gaussian process model (GPM) is utilized to approximate the time-consuming high-resolution FEM. On the other hand, the advanced MCMC method using the delayed rejection adaptive Metropolis (DRAM) algorithm, which incorporates a local adaptive strategy with a global adaptive strategy, is employed for Bayesian inference. In addition, we propose the use of the powerful variance-based global sensitivity analysis (GSA) in parameter selection to exclude non-influential parameters from the calibration parameters, which yields a reduced-order model and thus further alleviates the computational burden. A simulated aluminum plate and a real-world complex cable-stayed pedestrian bridge are presented to illustrate the proposed framework and verify its feasibility.

  11. Airframe integrity based on Bayesian approach

    Hurtado Cahuao, Jose Luis

Aircraft aging has become an immense challenge in terms of ensuring the safety of the fleet while controlling life cycle costs. One of the major concerns in aircraft structures is the development of fatigue cracks in the fastener holes. A probabilistic-based method has been proposed to manage this problem. In this research, Bayes' theorem is used to assess airframe integrity by updating generic data with airframe inspection data as such data are compiled. This research discusses the methodology developed for assessment of loss of airframe integrity due to fatigue cracking in the fastener holes of an aging platform. The methodology requires a probability density function (pdf) at the end of SAFE life. Subsequently, a crack growth regime begins. As the Bayesian analysis requires a prior pdf of the initial crack size, such a pdf is assumed and verified to be lognormally distributed. The prior distribution of crack size as cracks grow is modeled through a combined Inverse Power Law (IPL) model and lognormal relationships. The first set of inspections is used as the evidence for updating the crack size distribution at the various stages of aircraft life. Moreover, the materials used in the structural parts of the aircraft have variations in their properties due to calibration errors and machine alignment. A Matlab routine (PCGROW) is developed to calculate the crack distribution growth through three different crack growth models. As the first step, the material properties and the initial crack size are sampled; a standard Monte Carlo simulation is employed for this sampling process. At the corresponding aircraft age, the crack observed during the inspections is used to update the crack size distribution and proceed in time. After the updating, it is possible to estimate the probability of structural failure as a function of flight hours for a given aircraft in the future.
The results show very accurate and useful values related to the reliability

  12. A Bayesian approach to simultaneously quantify assignments and linguistic uncertainty

Chavez, Gregory M [Los Alamos National Laboratory]; Booker, Jane M [BOOKER SCIENTIFIC FREDERICKSBURG]; Ross, Timothy J [UNM]


    Subject matter expert assessments can include both assignment and linguistic uncertainty. This paper examines assessments containing linguistic uncertainty associated with a qualitative description of a specific state of interest and the assignment uncertainty associated with assigning a qualitative value to that state. A Bayesian approach is examined to simultaneously quantify both assignment and linguistic uncertainty in the posterior probability. The approach is applied to a simplified damage assessment model involving both assignment and linguistic uncertainty. The utility of the approach and the conditions under which the approach is feasible are examined and identified.

  13. Approach to the Correlation Discovery of Chinese Linguistic Parameters Based on Bayesian Method

    WANG Wei(王玮); CAI LianHong(蔡莲红)


The Bayesian approach is an important method in statistics. The Bayesian belief network is a powerful knowledge representation and reasoning tool under conditions of uncertainty. It is a graphical model that encodes probabilistic relationships among variables of interest. In this paper, an approach to Bayesian network construction is given for discovering the relationships among Chinese linguistic parameters in the corpus.

  14. A Dynamic Bayesian Approach to Computational Laban Shape Quality Analysis

    Dilip Swaminathan


kinesiology. LMA (especially Effort/Shape) emphasizes how internal feelings and intentions govern the patterning of movement throughout the whole body. As we argue, a complex understanding of intention via LMA is necessary for human-computer interaction to become embodied in ways that resemble interaction in the physical world. We thus introduce a novel, flexible Bayesian fusion approach for identifying LMA Shape qualities from raw motion capture data in real time. The method uses a dynamic Bayesian network (DBN) to fuse movement features across the body and across time and, as we discuss, can be readily adapted for low-cost video. It has delivered excellent performance in preliminary studies comprising improvisatory movements. Our approach has been incorporated in Response, a mixed-reality environment where users interact via natural, full-body human movement and enhance their bodily-kinesthetic awareness through immersive sound and light feedback, with applications to kinesiology training, Parkinson's patient rehabilitation, interactive dance, and many other areas.

  15. A Bayesian Approach to Protein Inference Problem in Shotgun Proteomics

    Li, Yong Fuga; Arnold, Randy J.; Li, Yixue; Radivojac, Predrag; Sheng, Quanhu; Tang, Haixu


The protein inference problem represents a major challenge in shotgun proteomics. In this article, we describe a novel Bayesian approach to address this challenge by incorporating the predicted peptide detectabilities as the prior probabilities of peptide identification. We propose a rigorous probabilistic model for protein inference and provide practical algorithmic solutions to this problem. We used a complex synthetic protein mixture to test our method and obtained promising results.

  16. A bayesian approach to laboratory utilization management

    Ronald G Hauser


Background: Laboratory utilization management describes a process designed to increase healthcare value by altering requests for laboratory services. A typical approach to monitoring and prioritizing interventions involves audits of laboratory orders against specific criteria, defined as rule-based laboratory utilization management. This approach has inherent limitations. First, rules are inflexible: they adapt poorly to the ambiguity of medical decision-making. Second, rules judge the context of a decision instead of the patient outcome, allowing an order to simultaneously save a life and break a rule. Third, rules can threaten physician autonomy when used in a performance evaluation. Methods: We developed an alternative to rule-based laboratory utilization. The core idea comes from a formula used in epidemiology to estimate disease prevalence. The equation relates four terms: the prevalence of disease, the proportion of positive tests, test sensitivity and test specificity. When applied to a laboratory utilization audit, the formula estimates the prevalence of disease (pretest probability [PTP]) in the patients tested. The comparison of PTPs among different providers, provider groups, or patient cohorts produces an objective evaluation of laboratory requests. We demonstrate the model in a review of tests for enterovirus (EV) meningitis. Results: The model identified subpopulations within the cohort with a low prevalence of disease. These low-prevalence groups shared demographic and seasonal factors known to protect against EV meningitis, suggesting that too many orders came from patients at low risk for EV. Conclusion: We introduce a new method for laboratory utilization management programs to audit laboratory services.
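The epidemiological formula referred to can be written out explicitly. Solving P(+) = PTP * sensitivity + (1 - PTP) * (1 - specificity) for PTP gives the pretest probability implied by an observed positivity rate; the numbers in the example are hypothetical.

```python
def pretest_probability(pos_rate, sensitivity, specificity):
    """Prevalence (PTP) implied by an observed test-positivity rate.

    Inverts P(+) = PTP * sensitivity + (1 - PTP) * (1 - specificity),
    the standard epidemiological prevalence-estimation formula.
    """
    ptp = (pos_rate + specificity - 1) / (sensitivity + specificity - 1)
    return min(max(ptp, 0.0), 1.0)   # clamp sampling noise into [0, 1]

# Hypothetical numbers: 4% positives with 95% sensitivity, 99% specificity.
ptp = pretest_probability(0.04, 0.95, 0.99)
```

Comparing this PTP across providers or patient cohorts gives the objective, outcome-oriented audit measure described above; positivity rates below the test's false-positive rate clamp to zero.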

  17. Comparison of Bayesian-utilitarian and maximin principle approaches.

    Comba, Pietro; Martuzzi, Marco; Botti, Caterina


The Precautionary Principle implies the adoption of a set of rules aimed at avoiding possible future harm associated with suspected, but not ascertained, risk factors. Several philosophical, economic and societal questions are implied by precaution-based public health decision making. The purpose of the present paper is to specify the scope of the principle by examining the notion of uncertainty involved and the implications of different approaches to the decision-making process. The Bayesian-utilitarian approach and the approach based on the maximin principle will be considered, and the different meanings of prudence in the two settings will be discussed. In the Bayesian-utilitarian approach the small number of attributable cases will end up in a low average expected value, easily regarded as acceptable in a cost-benefit analysis. In a maximin approach, on the other hand, the issue will be to consider the high etiologic fraction of a rare disease in the highest category of exposure. In the light of the aforementioned cautions in interpretation, the core difference between the two approaches has to do with the choice between averaging knowledge or equitably distributing technological risks. PMID:15212224

  18. Remotely Sensed Monitoring of Small Reservoir Dynamics: A Bayesian Approach

    Dirk Eilander


Multipurpose small reservoirs are important for livelihoods in rural semi-arid regions. To manage and plan these reservoirs and to assess their hydrological impact at a river basin scale, it is important to monitor their water storage dynamics. This paper introduces a Bayesian approach for monitoring small reservoirs with radar satellite images. The newly developed growing Bayesian classifier has a high degree of automation, can readily be extended with auxiliary information and reduces the confusion error to the land-water boundary pixels. A case study has been performed in the Upper East Region of Ghana, based on Radarsat-2 data from November 2012 until April 2013. Results show that the growing Bayesian classifier can deal with the spatial and temporal variability in synthetic aperture radar (SAR) backscatter intensities from small reservoirs. Due to its ability to incorporate auxiliary information, the algorithm is able to delineate open water from SAR imagery with a low land-water contrast in the case of wind-induced Bragg scattering or limited vegetation on the land surrounding a small reservoir.

  19. SAR imaging via iterative adaptive approach and sparse Bayesian learning

    Xue, Ming; Santiago, Enrique; Sedehi, Matteo; Tan, Xing; Li, Jian


We consider sidelobe reduction and resolution enhancement in synthetic aperture radar (SAR) imaging via an iterative adaptive approach (IAA) and a sparse Bayesian learning (SBL) method. The nonparametric weighted least squares based IAA algorithm is a robust and user parameter-free adaptive approach originally proposed for array processing. We show that it can be used to form enhanced SAR images as well. SBL has been used as a sparse signal recovery algorithm for compressed sensing. It has been shown in the literature that SBL is easy to use and can recover sparse signals more accurately than the l1-based optimization approaches, which require a delicate choice of the user parameter. We consider using a modified expectation maximization (EM) based SBL algorithm, referred to as SBL-1, which is based on a three-stage hierarchical Bayesian model. SBL-1 is not only more accurate than benchmark SBL algorithms, but also converges faster. SBL-1 is used to further enhance the resolution of the SAR images formed by IAA. Both IAA and SBL-1 are shown to be effective, requiring only a limited number of iterations, and have no need for polar-to-Cartesian interpolation of the SAR collected data. This paper characterizes the achievable performance of these two approaches by processing the complex backscatter data from both a sparse case study and a backhoe vehicle in free space with different aperture sizes.

  20. A Bayesian experimental design approach to structural health monitoring

    Farrar, Charles [Los Alamos National Laboratory; Flynn, Eric [UCSD; Todd, Michael [UCSD


Optimal system design for SHM involves two primary challenges. The first is the derivation of a proper performance function for a given system design. The second is the development of an efficient optimization algorithm for choosing a design that maximizes, or nearly maximizes, the performance function. In this paper we outline how an SHM practitioner can construct the proper performance function by casting the entire design problem into a framework of Bayesian experimental design. The approach demonstrates how the design problem necessarily ties together all steps of the SHM process.

  1. The subjectivity of scientists and the Bayesian statistical approach

    Press, James S


    Comparing and contrasting the reality of subjectivity in the work of history's great scientists and the modern Bayesian approach to statistical analysis. Scientists and researchers are taught to analyze their data from an objective point of view, allowing the data to speak for themselves rather than assigning them meaning based on expectations or opinions. But scientists have never behaved fully objectively. Throughout history, some of our greatest scientific minds have relied on intuition, hunches, and personal beliefs to make sense of empirical data, and these subjective influences have often a

  2. A Nonparametric Bayesian Approach For Emission Tomography Reconstruction

    Barat, Éric; Dautremer, Thomas


    We introduce a PET reconstruction algorithm following a nonparametric Bayesian (NPB) approach. In contrast with Expectation Maximization (EM), the proposed technique does not rely on any space discretization. Namely, the activity distribution—normalized emission intensity of the spatial Poisson process—is considered as a spatial probability density and observations are the projections of random emissions whose distribution has to be estimated. This approach is nonparametric in the sense that the quantity of interest belongs to the set of probability measures on R^k (for reconstruction in k dimensions) and it is Bayesian in the sense that we define a prior directly on this spatial measure. In this context, we propose to model the nonparametric probability density as an infinite mixture of multivariate normal distributions. As a prior for this mixture we consider a Dirichlet Process Mixture (DPM) with a Normal-Inverse Wishart (NIW) model as the base distribution of the Dirichlet Process. As in EM-family reconstruction, we use a data augmentation scheme where the set of hidden variables are the emission locations for each observed line of response in the continuous object space. Thanks to the data augmentation, we propose a Markov Chain Monte Carlo (MCMC) algorithm (Gibbs sampler) which is able to generate draws from the posterior distribution of the spatial intensity. A difference with EM is that one step of the Gibbs sampler corresponds to the generation of emission locations, while only the expected number of emissions per pixel/voxel is used in EM. Another key difference is that the estimated spatial intensity is a continuous function, so there is no need to compute a projection matrix. Finally, draws from the intensity posterior distribution allow the estimation of posterior functionals such as the variance or confidence intervals. Results are presented for simulated data based on a 2D brain phantom and compared to Bayesian MAP-EM.

  4. A novel Bayesian approach to spectral function reconstruction

    Burnier, Yannis


    We present a novel approach to the inference of spectral functions from Euclidean time correlator data that makes close contact with modern Bayesian concepts. Our method differs significantly from the maximum entropy method (MEM). A new set of axioms is postulated for the prior probability, leading to an improved expression, which is devoid of the asymptotically flat directions present in the Shannon-Jaynes entropy. Hyperparameters are integrated out explicitly, liberating us from the Gaussian approximations underlying the evidence approach of the MEM. We present a realistic test of our method in the context of the non-perturbative extraction of the heavy quark potential. Based on hard-thermal-loop correlator mock data, we establish firm requirements in the number of data points and their accuracy for a successful extraction of the potential from lattice QCD. An improved potential estimation from previously investigated quenched lattice QCD correlators is provided.

  5. A Bayesian Approach to the Partitioning of Workflows

    Chua, Freddy C


    When partitioning workflows in realistic scenarios, the knowledge of the processing units is often vague or unknown. A naive approach to addressing this issue is to perform many controlled experiments for different workloads, each consisting of multiple trials, in order to estimate the mean and variance of the specific workload. Since this controlled experimental approach can be quite costly in terms of time and resources, we propose a variant of the Gibbs Sampling algorithm that uses a sequence of Bayesian inference updates to estimate the processing characteristics of the processing units. Using the inferred characteristics of the processing units, we are able to determine the best way to split a workflow for processing it in parallel with the lowest expected completion time and least variance.
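
    As an illustrative sketch of the sequential-update idea (not the paper's exact Gibbs-sampling formulation), per-task processing times can be modelled as Normal with a conjugate Normal-Inverse-Gamma prior, so each completed task updates a unit's estimated mean and variance in closed form instead of requiring repeated controlled experiments. All names and prior values below are hypothetical:

```python
class UnitModel:
    """Sequential Bayesian estimate of a processing unit's per-task time,
    assuming Normal task times with a Normal-Inverse-Gamma (NIG) prior.
    Illustrative stand-in for the paper's Gibbs-sampling variant."""

    def __init__(self, mu0=1.0, kappa0=1e-3, alpha0=1.0, beta0=1.0):
        # weak prior: mu0 = guessed mean time, kappa0 = prior pseudo-count
        self.mu, self.kappa = mu0, kappa0
        self.alpha, self.beta = alpha0, beta0

    def update(self, t):
        # standard conjugate NIG update for a single observed task time t
        self.beta += 0.5 * self.kappa * (t - self.mu) ** 2 / (self.kappa + 1)
        self.mu = (self.kappa * self.mu + t) / (self.kappa + 1)
        self.kappa += 1
        self.alpha += 0.5

    def mean_time(self):
        return self.mu  # posterior mean of the per-task time

    def var_time(self):
        # posterior expected task-time variance, E[sigma^2] = beta/(alpha-1)
        return self.beta / (self.alpha - 1)
```

With one `UnitModel` per processing unit, a scheduler could then compare candidate workflow splits by their predicted completion-time mean and variance.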

  6. Detecting Threat E-mails using Bayesian Approach

    Banday, M Tariq; Jan, Tariq R; Shah, Nisar A


    Fraud and terrorism have a close connection in terms of the processes that enable and promote them. In the era of the Internet, its various services, including the Web, e-mail, social networks, blogs, instant messaging, chats, etc., are used in terrorism not only for communication but also for i) creation of ideology, ii) resource gathering, iii) recruitment, indoctrination and training, iv) creation of terror networks, and v) information gathering. A major challenge for law enforcement and intelligence agencies is the efficient and accurate gathering of a relevant and growing volume of crime data. This paper reports on the use of an established Naïve Bayesian filter for the classification of threat e-mails. Efficiency in filtering threat e-mail by three different Naïve Bayesian filter approaches, i.e. single keywords, weighted multiple keywords, and weighted multiple keywords with keyword context matching, is evaluated on a threat e-mail corpus created by extracting data from sources that are very close to terrorism.
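
    A minimal sketch of the single-keyword variant, assuming a plain bag-of-words Naive Bayes filter with Laplace smoothing (the corpus, keyword weights, and context matching in the paper are far richer; the toy messages below are invented):

```python
import math
from collections import Counter

def train_nb(docs, labels):
    """Train a tiny bag-of-words Naive Bayes classifier and return a
    classify(text) function. Illustrative sketch only."""
    vocab = set(w for d in docs for w in d.split())
    counts = {c: Counter() for c in set(labels)}
    priors = Counter(labels)
    for d, c in zip(docs, labels):
        counts[c].update(d.split())

    def classify(text):
        scores = {}
        for c in counts:
            total = sum(counts[c].values()) + len(vocab)  # Laplace denominator
            s = math.log(priors[c] / len(docs))           # log class prior
            for w in text.split():
                # add-one smoothed log likelihood per keyword
                s += math.log((counts[c][w] + 1) / total)
            scores[c] = s
        return max(scores, key=scores.get)

    return classify

clf = train_nb(
    ["attack plan bomb", "meeting lunch friday", "bomb threat plan"],
    ["threat", "ham", "threat"],
)
label = clf("bomb plan")  # classified as 'threat' for this toy corpus
```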

  7. Bayesian approach in MN low dose of radiation counting

    The micronucleus (MN) assay in lymphocytes is a well established technique for the assessment of genetic damage induced by ionizing radiation. Due to the presence of a natural background of MN, the net MN is obtained by subtracting this background value from the gross value. When very low doses of radiation are given, the induced MN frequency is close to, or even lower than, the predetermined background value. Furthermore, the damage distribution induced by the radiation follows a Poisson probability distribution. These two facts make it difficult to obtain the net counting rate in exposed situations. It is possible to overcome this problem using a Bayesian approach, in which the selection of prior distributions for the background and net counting rate plays an important role. In the present work we present a detailed analysis using Bayesian theory to infer the net counting rate in two different situations: a) when the background is known for an individual sample, using the exact value for the background and a Jeffreys prior for the net counting rate, and b) when the background is not known and we make use of a population background distribution as the background prior function and a constant prior for the net counting rate. (Author)
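
    For case a), the known-background situation, the posterior for the net rate can be evaluated numerically on a grid. The sketch below assumes gross counts N ~ Poisson(b + s) with known background b and a Jeffreys-type prior p(s) ∝ s^(-1/2); it is a generic illustration, not the authors' exact formulation, and the counts used are invented:

```python
import numpy as np

def net_rate_posterior(gross_counts, background, s_grid):
    """Grid posterior of the net rate s given gross Poisson counts and a
    known background rate, with prior p(s) ~ s**-0.5 (Jeffreys-type).
    s_grid must be uniformly spaced."""
    # Poisson log likelihood with total rate b + s (constant terms dropped)
    log_like = gross_counts * np.log(background + s_grid) - (background + s_grid)
    log_prior = -0.5 * np.log(np.clip(s_grid, 1e-12, None))
    log_post = log_like + log_prior
    post = np.exp(log_post - log_post.max())      # stabilize before exp
    post /= post.sum() * (s_grid[1] - s_grid[0])  # normalize on the grid
    return post

s = np.linspace(1e-6, 30.0, 2000)
post = net_rate_posterior(gross_counts=12, background=5.0, s_grid=s)
mean_net = (s * post).sum() * (s[1] - s[0])  # posterior mean net rate
```

Because the prior keeps s nonnegative, the posterior mean stays positive even when the gross count barely exceeds the background, which is exactly the low-dose regime the abstract describes.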

  8. A Bayesian, exemplar-based approach to hierarchical shape matching.

    Gavrila, Dariu M


    This paper presents a novel probabilistic approach to hierarchical, exemplar-based shape matching. No feature correspondence is needed among exemplars, just a suitable pairwise similarity measure. The approach uses a template tree to efficiently represent and match the variety of shape exemplars. The tree is generated offline by a bottom-up clustering approach using stochastic optimization. Online matching involves a simultaneous coarse-to-fine approach over the template tree and over the transformation parameters. The main contribution of this paper is a Bayesian model to estimate the a posteriori probability of the object class, after a certain match at a node of the tree. This model takes into account object scale and saliency and allows for a principled setting of the matching thresholds such that unpromising paths in the tree traversal process are eliminated early on. The proposed approach was tested in a variety of application domains. Here, results are presented on one of the more challenging domains: real-time pedestrian detection from a moving vehicle. A significant speed-up is obtained when comparing the proposed probabilistic matching approach with a manually tuned nonprobabilistic variant, both utilizing the same template tree structure. PMID:17568144

  9. Bayesian approach to the detection problem in gravitational wave astronomy

    The analysis of data from gravitational wave detectors can be divided into three phases: search, characterization, and evaluation. The evaluation of the detection--determining whether a candidate event is astrophysical in origin or some artifact created by instrument noise--is a crucial step in the analysis. The ongoing analyses of data from ground-based detectors employ a frequentist approach to the detection problem. A detection statistic is chosen, for which background levels and detection efficiencies are estimated from Monte Carlo studies. This approach frames the detection problem in terms of an infinite collection of trials, with the actual measurement corresponding to some realization of this hypothetical set. Here we explore an alternative, Bayesian approach to the detection problem, that considers prior information and the actual data in hand. Our particular focus is on the computational techniques used to implement the Bayesian analysis. We find that the parallel tempered Markov chain Monte Carlo (PTMCMC) algorithm is able to address all three phases of the analysis in a coherent framework. The signals are found by locating the posterior modes, the model parameters are characterized by mapping out the joint posterior distribution, and finally, the model evidence is computed by thermodynamic integration. As a demonstration, we consider the detection problem of selecting between models describing the data as instrument noise, or instrument noise plus the signal from a single compact galactic binary. The evidence ratios, or Bayes factors, computed by the PTMCMC algorithm are found to be in close agreement with those computed using a reversible jump Markov chain Monte Carlo algorithm.

  10. A Robust Obstacle Avoidance for Service Robot Using Bayesian Approach

    Widodo Budiharto


    The objective of this paper is to propose a robust obstacle avoidance method for a service robot in an indoor environment. The method for obstacle avoidance uses information about static obstacles on the landmark, obtained using edge detection. The speed and direction of walking people, treated as moving obstacles, are obtained by a single camera using a tracking and recognition system, with distance measured by 3 ultrasonic sensors. A new geometrical model and maneuvering method for moving-obstacle avoidance are introduced and combined with a Bayesian approach for state estimation. The obstacle avoidance problem is formulated using decision theory, with prior and posterior distributions and a loss function used to determine an optimal response based on inaccurate sensor data. Algorithms for the proposed moving-obstacle avoidance method and experimental results of its implementation on a service robot are also presented. Various experiments show that the proposed method is fast and robust, and it was successfully implemented on the service robot Srikandi II, equipped with a 4 DOF arm robot developed in our laboratory.

  11. A Bayesian Approach to Detection of Small Low Emission Sources

    Xun, Xiaolei; Carroll, Raymond J; Kuchment, Peter


    The article addresses the problem of detecting the presence and location of a small low emission source inside an object, when the background noise dominates. This problem arises, for instance, in some homeland security applications. The goal is to reach signal-to-noise ratio (SNR) levels on the order of $10^{-3}$. A Bayesian approach to this problem is implemented in 2D. The method allows inference not only about the existence of the source, but also about its location. We derive Bayes factors for model selection and estimation of location based on Markov Chain Monte Carlo (MCMC) simulation. A simulation study shows that with a sufficiently high total emission level, our method can effectively locate the source.
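
    The model-selection idea can be sketched in a toy 1-D setting: compare the evidence for "background only" against "background plus a point source somewhere", marginalizing the source location over a uniform grid instead of running MCMC. The PSF, background level, and source strength below are all hypothetical:

```python
import numpy as np

def bayes_factor_source(counts, background, psf, strength):
    """Bayes factor for 'background + point source at an unknown pixel'
    versus 'background only', with a uniform prior over source location.
    Toy 1-D grid version; the paper works in 2-D via MCMC."""
    counts = np.asarray(counts, float)
    n = counts.size
    # null-model log evidence (count factorial terms cancel between models)
    log_ev0 = np.sum(counts * np.log(background) - background)
    log_ls = []
    for loc in range(n):
        lam = background + strength * psf(np.arange(n) - loc)
        log_ls.append(np.sum(counts * np.log(lam) - lam))
    log_ls = np.array(log_ls)
    m = log_ls.max()
    # uniform location prior -> evidence is the average likelihood
    log_ev1 = m + np.log(np.mean(np.exp(log_ls - m)))
    return np.exp(log_ev1 - log_ev0)

psf = lambda d: np.exp(-0.5 * (d / 1.5) ** 2)  # hypothetical Gaussian PSF
bg, n = 50.0, 40
expected = bg + 30.0 * psf(np.arange(n) - 20)  # noiseless data with a source
bf = bayes_factor_source(expected, bg, psf, 30.0)  # exceeds 1: source favored
```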

  12. A Bayesian approach to the modelling of alpha Cen A

    Bazot, M; Christensen-Dalsgaard, J


    Determining the physical characteristics of a star is an inverse problem consisting in estimating the parameters of models for the stellar structure and evolution, knowing certain observable quantities. We use a Bayesian approach to solve this problem for alpha Cen A, which allows us to incorporate prior information on the parameters to be estimated, in order to better constrain the problem. Our strategy is based on the use of a Markov Chain Monte Carlo (MCMC) algorithm to estimate the posterior probability densities of the stellar parameters: mass, age, initial chemical composition,... We use the stellar evolutionary code ASTEC to model the star. To constrain this model both seismic and non-seismic observations were considered. Several different strategies were tested to fit these values, either using two or five free parameters in ASTEC. We are thus able to show evidence that MCMC methods become efficient with respect to more classical grid-based strategies when the number of parameters increases. The resul...
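
    The sampling machinery can be sketched with a random-walk Metropolis step; here a toy Gaussian posterior over (mass, age) stands in for the real likelihood built from ASTEC model outputs and the seismic and non-seismic observations, and the step size and target values are invented:

```python
import numpy as np

def metropolis(log_post, x0, step, n_samples, seed=0):
    """Random-walk Metropolis sampler (generic sketch). log_post is any
    function returning the log posterior density up to a constant."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, float)
    lp = log_post(x)
    chain = []
    for _ in range(n_samples):
        prop = x + step * rng.standard_normal(x.size)  # symmetric proposal
        lp_prop = log_post(prop)
        # Metropolis acceptance: accept with prob min(1, exp(lp_prop - lp))
        if np.log(rng.random()) < lp_prop - lp:
            x, lp = prop, lp_prop
        chain.append(x.copy())
    return np.array(chain)

# toy target: independent Gaussian posterior over (mass, age) near (1.1, 5.4)
log_post = lambda p: -0.5 * np.sum(((p - np.array([1.1, 5.4])) / 0.1) ** 2)
chain = metropolis(log_post, x0=[1.0, 5.0], step=0.05, n_samples=20000)
```

Discarding the first half as burn-in, the remaining samples approximate the joint posterior, from which marginal densities of each stellar parameter can be read off.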

  13. Multivariate meta-analysis of mixed outcomes: a Bayesian approach.

    Bujkiewicz, Sylwia; Thompson, John R; Sutton, Alex J; Cooper, Nicola J; Harrison, Mark J; Symmons, Deborah P M; Abrams, Keith R


    Multivariate random effects meta-analysis (MRMA) is an appropriate way for synthesizing data from studies reporting multiple correlated outcomes. In a Bayesian framework, it has great potential for integrating evidence from a variety of sources. In this paper, we propose a Bayesian model for MRMA of mixed outcomes, which extends previously developed bivariate models to the trivariate case and also allows for combination of multiple outcomes that are both continuous and binary. We have constructed informative prior distributions for the correlations by using external evidence. Prior distributions for the within-study correlations were constructed by employing external individual patient data and using a double bootstrap method to obtain the correlations between mixed outcomes. The between-study model of MRMA was parameterized in the form of a product of a series of univariate conditional normal distributions. This allowed us to place explicit prior distributions on the between-study correlations, which were constructed using external summary data. Traditionally, independent 'vague' prior distributions are placed on all parameters of the model. In contrast to this approach, we constructed prior distributions for the between-study model parameters in a way that takes into account the inter-relationship between them. This is a flexible method that can be extended to incorporate mixed outcomes other than continuous and binary and beyond the trivariate case. We have applied this model to a motivating example in rheumatoid arthritis with the aim of incorporating all available evidence in the synthesis and potentially reducing uncertainty around the estimate of interest. PMID:23630081

  14. A Bayesian approach to spectral quantitative photoacoustic tomography

    A Bayesian approach to the optical reconstruction problem associated with spectral quantitative photoacoustic tomography is presented. The approach is derived for commonly used spectral tissue models of optical absorption and scattering: the absorption is described as a weighted sum of absorption spectra of known chromophores (spatially dependent chromophore concentrations), while the scattering is described using Mie scattering theory, with the proportionality constant and spectral power law parameter both spatially-dependent. It is validated using two-dimensional test problems composed of three biologically relevant chromophores: fat, oxygenated blood and deoxygenated blood. Using this approach it is possible to estimate the Grüneisen parameter, the absolute chromophore concentrations, and the Mie scattering parameters associated with spectral photoacoustic tomography problems. In addition, the direct estimation of the spectral parameters is compared to estimates obtained by fitting the spectral parameters to estimates of absorption, scattering and Grüneisen parameter at the investigated wavelengths. It is shown with numerical examples that the direct estimation results in better accuracy of the estimated parameters. (papers)

  15. AutoClass: A Bayesian Approach to Classification

    Stutz, John; Cheeseman, Peter; Hanson, Robin; Taylor, Will; Lum, Henry, Jr. (Technical Monitor)


    We describe a Bayesian approach to the untutored discovery of classes in a set of cases, sometimes called finite mixture separation or clustering. The main difference between clustering and our approach is that we search for the "best" set of class descriptions rather than grouping the cases themselves. We describe our classes in terms of a probability distribution or density function, and the locally maximal posterior probability valued function parameters. We rate our classifications with an approximate joint probability of the data and functional form, marginalizing over the parameters. Approximation is necessitated by the computational complexity of the joint probability. Thus, we marginalize w.r.t. local maxima in the parameter space. We discuss the rationale behind our approach to classification. We give the mathematical development for the basic mixture model and describe the approximations needed for computational tractability. We instantiate the basic model with the discrete Dirichlet distribution and multivariate Gaussian density likelihoods. Then we show some results for both constructed and actual data.

  16. Point and Interval Estimation on the Degree and the Angle of Polarization. A Bayesian approach

    Maier, Daniel; Santangelo, Andrea


    Linear polarization measurements provide access to two quantities, the degree (DOP) and the angle of polarization (AOP). The aim of this work is to give a complete and concise overview of how to analyze polarimetric measurements. We review interval estimations for the DOP with a frequentist and a Bayesian approach. Point estimations for the DOP and interval estimations for the AOP are further investigated with a Bayesian approach to match observational needs. Point and interval estimations are calculated numerically for frequentist and Bayesian statistics. Monte Carlo simulations are performed to clarify the meaning of the calculations. Under observational conditions, the true DOP and AOP are unknown, so that classical statistical considerations - based on true values - are not directly usable. In contrast, Bayesian statistics handles unknown true values very well and produces point and interval estimations for DOP and AOP, directly. Using a Bayesian approach, we show how to choose DOP point estimations based...

  17. Bayesian Approach to Neuro-Rough Models for Modelling HIV

    Marwala, Tshilidzi


    This paper proposes a new neuro-rough model for modelling the risk of HIV from demographic data. The model is formulated using a Bayesian framework and trained using the Markov Chain Monte Carlo method and the Metropolis criterion. When the model was tested to estimate the risk of HIV infection given the demographic data, it was found to give an accuracy of 62%, as opposed to 58% obtained from a Bayesian formulated rough set model trained using Markov chain Monte Carlo, and 62% obtained from a Bayesian formulated multi-layered perceptron (MLP) model trained using hybrid Monte Carlo. The proposed model is able to combine the accuracy of the Bayesian MLP model and the transparency of the Bayesian rough set model.

  18. Modelling of population dynamics of red king crab using Bayesian approach

    Bakanev, Sergey ...


    Modeling population dynamics based on the Bayesian approach makes it possible to resolve these issues successfully. The integration of data from various studies into a unified model, based on the Bayesian parameter estimation method, provides a much more detailed description of the processes occurring in the population.

  19. Super-resolution in cardiac MRI using a Bayesian approach

    Velasco Toledo, Nelson; Rueda, Andrea; Santa Marta, Cristina; Romero, Eduardo


    Acquisition of proper cardiac MR images is highly limited by continued heart motion and apnea periods. A typical acquisition results in volumes with inter-slice separations of up to 8 mm. This paper presents a super-resolution strategy that estimates a high-resolution image from a set of low-resolution image series acquired in different non-orthogonal orientations. The proposal is based on a Bayesian approach that implements a Maximum a Posteriori (MAP) estimator combined with a Wiener filter. A pre-processing stage was also included, to correct or eliminate differences in the image intensities and to transform the low-resolution images to a common spatial reference system. The MAP estimation includes an observation image model that represents the different contributions to the voxel intensities based on a 3D Gaussian function. A quantitative and qualitative assessment was performed using synthetic and real images, showing that the proposed approach produces a high-resolution image with significant improvements (about 3 dB in PSNR) with respect to a simple trilinear interpolation. The Wiener filter shows little contribution to the final result, demonstrating that the MAP uniformity prior is able to filter out a large amount of the acquisition noise.

  20. A Bayesian approach to extracting meaning from system behavior

    Dress, W.B.


    The modeling relation and its reformulation to include the semiotic hierarchy is essential for the understanding, control, and successful re-creation of natural systems. This presentation will argue for a careful application of Rosen's modeling relationship to the problems of intelligence and autonomy in natural and artificial systems. To this end, the authors discuss the essential need for a correct theory of induction, learning, and probability; and suggest that modern Bayesian probability theory, developed by Cox, Jaynes, and others, can adequately meet such demands, especially on the operational level of extracting meaning from observations. The methods of Bayesian and maximum-entropy parameter estimation have been applied to measurements of system observables to directly infer the underlying differential equations generating system behavior. This approach by-passes the usual method of parameter estimation based on assuming a functional form for the observable and then estimating the parameters that would lead to the particular observed behavior. The computational savings are great since only location parameters enter into the maximum-entropy calculations; this innovation finesses the need for nonlinear parameters altogether. Such an approach more directly extracts the semantics inherent in a given system by going to the root of system meaning as expressed by abstract form or shape, rather than in syntactic particulars, such as signal amplitude and phase. Examples show how the form of a system can be followed while ignoring unnecessary details. In this sense, the authors are observing the meaning of the words rather than being concerned with their particular expression or language. For the present discussion, empirical models are embodied by the differential equations underlying, producing, or describing the behavior of a process as measured or tracked by a particular variable set--the observables. The a priori models are probability structures that

  1. A Bayesian decision approach to rainfall thresholds based flood warning

    M. L. V. Martina


    Operational real time flood forecasting systems generally require a hydrological model to run in real time as well as a series of hydro-informatics tools to transform the flood forecast into relatively simple and clear messages to the decision makers involved in flood defense. The scope of this paper is to set forth the possibility of providing flood warnings at given river sections based on the direct comparison of the quantitative precipitation forecast with critical rainfall threshold values, without the need of an on-line real time forecasting system. This approach leads to an extremely simplified alert system to be used by non technical stakeholders and could also be used to supplement the traditional flood forecasting systems in case of system failures. The critical rainfall threshold values, incorporating the soil moisture initial conditions, result from statistical analyses using long hydrological time series combined with a Bayesian utility function minimization. In the paper, results of an application of the proposed methodology to the Sieve river, a tributary of the Arno river in Italy, are given to exemplify its practical applicability.
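
    The threshold-selection idea can be sketched as an empirical expected-loss minimization over a historical record; the costs, rainfall depths, and candidate thresholds below are hypothetical stand-ins for the paper's Bayesian utility function and long hydrological time series:

```python
def best_threshold(rain, flood, c_alarm, c_miss, thresholds):
    """Pick the rainfall threshold minimizing total empirical loss over
    historical (rainfall, flood-occurred) pairs. Toy stand-in for the
    Bayesian utility-function minimization described in the abstract."""
    def loss(t):
        return sum(
            c_alarm if r >= t and not f   # warning issued, no flood
            else c_miss if r < t and f    # no warning, flood occurred
            else 0.0
            for r, f in zip(rain, flood)
        )
    return min(thresholds, key=loss)

# hypothetical record: event rainfall depths (mm) and whether a flood followed
rain = [10, 20, 30, 40, 50, 60]
flood = [False, False, False, True, True, True]
t_star = best_threshold(rain, flood, c_alarm=1.0, c_miss=5.0,
                        thresholds=[15, 35, 55])
```

In practice the threshold would also be conditioned on the soil moisture state, as the abstract notes, giving one critical value per initial-condition class.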

  2. Defining statistical perceptions with an empirical Bayesian approach

    Tajima, Satohiro


    Extracting statistical structures (including textures or contrasts) from a natural stimulus is a central challenge in both biological and engineering contexts. This study interprets the process of statistical recognition in terms of hyperparameter estimations and free-energy minimization procedures with an empirical Bayesian approach. This mathematical interpretation resulted in a framework for relating physiological insights in animal sensory systems to the functional properties of recognizing stimulus statistics. We applied the present theoretical framework to two typical models of natural images that are encoded by a population of simulated retinal neurons, and demonstrated that the resulting cognitive performances could be quantified with the Fisher information measure. The current enterprise yielded predictions about the properties of human texture perception, suggesting that the perceptual resolution of image statistics depends on visual field angles, internal noise, and neuronal information processing pathways, such as the magnocellular, parvocellular, and koniocellular systems. Furthermore, the two conceptually similar natural-image models were found to yield qualitatively different predictions, striking a note of warning against confusing the two models when describing a natural image.

  3. A new Bayesian approach to the reconstruction of spectral functions

    Burnier, Yannis


    We present a novel approach for the reconstruction of spectra from Euclidean correlator data that makes close contact with modern Bayesian concepts. It is based upon an axiomatically justified dimensionless prior distribution, which in the case of a constant prior function $m(\omega)$ only imprints smoothness on the reconstructed spectrum. In addition we are able to analytically integrate out the only relevant overall hyper-parameter $\alpha$ in the prior, removing the necessity for Gaussian approximations found e.g. in the Maximum Entropy Method. Using a quasi-Newton minimizer and high-precision arithmetic, we are then able to find the unique global extremum of $P[\rho|D]$ in the full $N_\omega \gg N_\tau$ dimensional search space. The method actually yields gradually improving reconstruction results as the quality of the supplied input data increases, without introducing the artificial peak structures often encountered in the MEM. To support these statements we present mock data analyses for the case of zero width ...

  4. An agglomerative hierarchical approach to visualization in Bayesian clustering problems.

    Dawson, K J; Belkhir, K


    Clustering problems (including the clustering of individuals into outcrossing populations, hybrid generations, full-sib families and selfing lines) have recently received much attention in population genetics. In these clustering problems, the parameter of interest is a partition of the set of sampled individuals--the sample partition. In a fully Bayesian approach to clustering problems of this type, our knowledge about the sample partition is represented by a probability distribution on the space of possible sample partitions. As the number of possible partitions grows very rapidly with the sample size, we cannot visualize this probability distribution in its entirety, unless the sample is very small. As a solution to this visualization problem, we recommend using an agglomerative hierarchical clustering algorithm, which we call the exact linkage algorithm. This algorithm is a special case of the maximin clustering algorithm that we introduced previously. The exact linkage algorithm is now implemented in our software package PartitionView. The exact linkage algorithm takes the posterior co-assignment probabilities as input and yields as output a rooted binary tree, or more generally, a forest of such trees. Each node of this forest defines a set of individuals, and the node height is the posterior co-assignment probability of this set. This provides a useful visual representation of the uncertainty associated with the assignment of individuals to categories. It is also a useful starting point for a more detailed exploration of the posterior distribution in terms of the co-assignment probabilities. PMID:19337306
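
    A simplified sketch of the agglomerative idea: repeatedly merge the two clusters whose minimum pairwise posterior co-assignment probability is largest, recording that probability as the node height. This is an illustration in the spirit of the exact linkage algorithm, not PartitionView's actual implementation, and the co-assignment matrix below is invented:

```python
import numpy as np

def maximin_tree(coassign):
    """Greedy agglomerative clustering on a posterior co-assignment matrix.
    Each merge records (sorted member indices, linkage value), where the
    linkage value is the minimum co-assignment probability across the pair
    of clusters being merged."""
    n = coassign.shape[0]
    clusters = {i: [i] for i in range(n)}
    merges = []
    while len(clusters) > 1:
        best, best_p = None, -1.0
        keys = list(clusters)
        for i in range(len(keys)):
            for j in range(i + 1, len(keys)):
                a, b = keys[i], keys[j]
                # maximin criterion: min co-assignment over cross pairs
                p = min(coassign[x, y]
                        for x in clusters[a] for y in clusters[b])
                if p > best_p:
                    best, best_p = (a, b), p
        a, b = best
        merges.append((sorted(clusters[a] + clusters[b]), best_p))
        clusters[a] = clusters[a] + clusters[b]
        del clusters[b]
    return merges

# hypothetical posterior co-assignment probabilities for 4 individuals
P = np.array([[1.0, 0.9, 0.2, 0.1],
              [0.9, 1.0, 0.25, 0.15],
              [0.2, 0.25, 1.0, 0.8],
              [0.1, 0.15, 0.8, 1.0]])
merges = maximin_tree(P)  # pairs {0,1} and {2,3} merge first, at heights 0.9 and 0.8
```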

  5. A Bayesian Approach to Identifying New Risk Factors for Dementia

    Wen, Yen-Hsia; Wu, Shihn-Sheng; Lin, Chun-Hung Richard; Tsai, Jui-Hsiu; Yang, Pinchen; Chang, Yang-Pei; Tseng, Kuan-Hua


    Dementia is one of the most disabling and burdensome health conditions worldwide. In this study, we identified new potential risk factors for dementia from nationwide longitudinal population-based data by using Bayesian statistics. We first tested the consistency of the results obtained using Bayesian statistics with those obtained using classical frequentist probability for 4 recognized risk factors for dementia, namely severe head injury, depression, diabetes mellitus, and vascular...

  6. Evolution of Subjective Hurricane Risk Perceptions: A Bayesian Approach

    David Kelly; David Letson; Forest Nelson; Nolan, David S.; Daniel Solis


    This paper studies how individuals update subjective risk perceptions in response to hurricane track forecast information, using a unique data set from an event market, the Hurricane Futures Market (HFM). We derive a theoretical Bayesian framework which predicts how traders update their perceptions of the probability of a hurricane making landfall in a certain range of coastline. Our results suggest that traders behave in a way consistent with Bayesian updating but this behavior is based on t...

  7. Modelling biogeochemical cycles in forest ecosystems: a Bayesian approach

    Bagnara, Maurizio


    Forest models are tools for explaining and predicting the dynamics of forest ecosystems. They simulate forest behavior by integrating information on the underlying processes in trees, soil and atmosphere. Bayesian calibration is the application of probability theory to parameter estimation. It is a method, applicable to all models, that quantifies output uncertainty and identifies key parameters and variables. This study aims at testing the Bayesian procedure for calibration to different t...

  8. Estimation of the correlation coefficient using the Bayesian Approach and its applications for epidemiologic research

    England Lucinda J; Moysich Kirsten B; Schisterman Enrique F; Rao Malla


    Abstract Background The Bayesian approach is one alternative for estimating correlation coefficients in which knowledge from previous studies is incorporated to improve estimation. The purpose of this paper is to illustrate the utility of the Bayesian approach for estimating correlations using prior knowledge. Methods The use of the hyperbolic tangent transformation (ρ = tanh ξ and r = tanh z) enables the investigator to take advantage of the conjugate properties of the normal distribution, w...
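    The tanh transformation mentioned above lends itself to a short sketch. The function below is a hypothetical illustration of conjugate-normal updating on the z = atanh(r) scale, with the prior weight expressed as an equivalent sample size; it is not the paper's exact formulation, and all numbers are invented.

```python
# Bayesian updating of a correlation via the Fisher z (tanh) transform:
# z = atanh(r) is approximately N(zeta, 1/(n-3)), so a normal prior on
# zeta is conjugate and the posterior mean is precision-weighted.
import math

def posterior_correlation(r_obs, n_obs, r_prior, n_prior):
    """Combine a prior correlation (worth n_prior observations)
    with an observed correlation from n_obs samples."""
    z_obs, z_prior = math.atanh(r_obs), math.atanh(r_prior)
    w_obs, w_prior = n_obs - 3, n_prior - 3          # precisions
    z_post = (w_obs * z_obs + w_prior * z_prior) / (w_obs + w_prior)
    se_post = 1.0 / math.sqrt(w_obs + w_prior)
    lo, hi = z_post - 1.96 * se_post, z_post + 1.96 * se_post
    return math.tanh(z_post), (math.tanh(lo), math.tanh(hi))

rho, ci = posterior_correlation(r_obs=0.45, n_obs=50, r_prior=0.30, n_prior=20)
print(round(rho, 3), tuple(round(x, 3) for x in ci))
```

    The posterior correlation lands between the prior and the observed value, pulled toward whichever carries more effective observations.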

  9. Prediction of road accidents: A Bayesian hierarchical approach

    Deublein, Markus; Schubert, Matthias; Adey, Bryan T.


    -lognormal regression analysis taking into account correlations amongst multiple dependent model response variables and effects of discrete accident count data e.g. over-dispersion, and (3) Bayesian inference algorithms, which are applied by means of data mining techniques supported by Bayesian Probabilistic Networks...... in order to represent non-linearity between risk indicating and model response variables, as well as different types of uncertainties which might be present in the development of the specific models.Prior Bayesian Probabilistic Networks are first established by means of multivariate regression analysis...... of the observed frequencies of the model response variables, e.g. the occurrence of an accident, and observed values of the risk indicating variables, e.g. degree of road curvature. Subsequently, parameter learning is done using updating algorithms, to determine the posterior predictive probability distributions...

  10. Uncertainty analysis using Beta-Bayesian approach in nuclear safety code validation

    Highlights: • To meet the 95/95 criterion, the Wilks' method is identical to the Bayesian approach. • The prior selection in the Bayesian approach has a strong influence on the number of code runs. • It is possible to utilize prior experience to reduce the number of code runs needed to meet the 95/95 criterion. • The variation of the probability for each code run is provided. - Abstract: Since best-estimate plus uncertainty analysis was approved by the Nuclear Regulatory Commission for nuclear reactor safety evaluation, several uncertainty assessment methods have been proposed and applied in the framework of best-estimate code validation in the nuclear industry. Among them, the Wilks' method and the Bayesian approach are the two most popular statistical methods for uncertainty quantification. This study explores the inherent relation between the two methods using the Beta distribution function as the prior in the Bayesian analysis. The Wilks' method can then be considered a special case of the Beta-Bayesian approach, equivalent to the conservative case with Wallis' "pessimistic" prior in the Bayesian analysis. However, the results do depend on the choice of the pessimistic prior function form. The analysis of the mean and variance through the Beta-Bayesian approach provides insight into the Wilks' 95/95 results of different orders: the 95/95 results of the Wilks' method become more accurate and more precise as the order increases. Furthermore, the Bayesian updating process is well demonstrated in code validation practice. The selection of the updating prior can make use of current experience of code failure and success statistics, so as to effectively predict the number of further numerical simulations needed to reach the 95/95 criterion
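    The claimed equivalence can be checked numerically. The sketch below is an illustration under stated assumptions: the per-run success probability p gets a Beta(a, b) prior, n all-successful runs give a Beta(a + n, b) posterior, and we search for the smallest n with P(p >= 0.95 | data) >= 0.95. The "pessimistic" prior is approximated here by Beta(1e-9, 1), which reproduces the familiar first-order Wilks count of 59 runs; this stand-in is my choice, not necessarily the paper's exact prior.

```python
# Beta-Bayesian reading of the 95/95 criterion: for a Beta(a, 1)
# posterior the CDF at x is x**a, so P(p >= 0.95) = 1 - 0.95**a.
from scipy.stats import beta

def runs_needed(a, b, coverage=0.95, confidence=0.95):
    """Smallest number of all-successful runs n such that the
    Beta(a + n, b) posterior puts >= `confidence` mass above `coverage`."""
    n = 0
    while beta.sf(coverage, a + n, b) < confidence:
        n += 1
    return n

print(runs_needed(1e-9, 1.0))   # near-pessimistic prior: Wilks-like count
print(runs_needed(10.0, 1.0))   # prior worth ~10 successful runs: fewer needed
```

    A more informative prior (second call) directly shows how prior experience reduces the number of required code runs, as the highlights claim.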

  11. Genetic evaluation of popcorn families using a Bayesian approach via the independence chain algorithm

    Marcos Rodovalho


    The objective of this study was to examine genetic parameters of popping expansion and grain yield in a trial of 169 half-sib families using a Bayesian approach. The independence chain algorithm with informative priors for the components of residual and family variance (inverse-gamma prior distribution) was used. Popping expansion was found to be moderately heritable, with a posterior mode of h2 of 0.34 and a 90% Bayesian confidence interval of 0.22 to 0.44. The heritability of grain yield (family level) was moderate (h2 = 0.4) with a Bayesian confidence interval of 0.28 to 0.49. The target population contains sufficient genetic variability for subsequent breeding cycles, and the Bayesian approach is a useful alternative for scientific inference in the genetic evaluation of popcorn.

  12. Nursing Home Care Quality: Insights from a Bayesian Network Approach

    Goodson, Justin; Jang, Wooseung; Rantz, Marilyn


    Purpose: The purpose of this research is twofold. The first purpose is to utilize a new methodology (Bayesian networks) for aggregating various quality indicators to measure the overall quality of care in nursing homes. The second is to provide new insight into the relationships that exist among various measures of quality and how such measures…

  13. A Bayesian Approach to Identifying New Risk Factors for Dementia

    Wen, Yen-Hsia; Wu, Shihn-Sheng; Lin, Chun-Hung Richard; Tsai, Jui-Hsiu; Yang, Pinchen; Chang, Yang-Pei; Tseng, Kuan-Hua


    Abstract Dementia is one of the most disabling and burdensome health conditions worldwide. In this study, we identified new potential risk factors for dementia from nationwide longitudinal population-based data by using Bayesian statistics. We first tested the consistency of the results obtained using Bayesian statistics with those obtained using classical frequentist probability for 4 recognized risk factors for dementia, namely severe head injury, depression, diabetes mellitus, and vascular diseases. Then, we used Bayesian statistics to verify 2 new potential risk factors for dementia, namely hearing loss and senile cataract, determined from the Taiwan's National Health Insurance Research Database. We included a total of 6546 (6.0%) patients diagnosed with dementia. We observed older age, female sex, and lower income as independent risk factors for dementia. Moreover, we verified the 4 recognized risk factors for dementia in the older Taiwanese population; their odds ratios (ORs) ranged from 3.469 to 1.207. Furthermore, we observed that hearing loss (OR = 1.577) and senile cataract (OR = 1.549) were associated with an increased risk of dementia. We found that the results obtained using Bayesian statistics for assessing risk factors for dementia, such as head injury, depression, DM, and vascular diseases, were consistent with those obtained using classical frequentist probability. Moreover, hearing loss and senile cataract were found to be potential risk factors for dementia in the older Taiwanese population. Bayesian statistics could help clinicians explore other potential risk factors for dementia and for developing appropriate treatment strategies for these patients. PMID:27227925

  14. Bayesian probabilistic network approach for managing earthquake risks of cities

    Bayraktarli, Yahya; Faber, Michael


    This paper considers the application of Bayesian probabilistic networks (BPNs) to large-scale risk based decision making in regard to earthquake risks. A recently developed risk management framework is outlined which utilises Bayesian probabilistic modelling, generic indicator based risk models and...... geographical information systems. The proposed framework comprises several modules: A module on the probabilistic description of potential future earthquake shaking intensity, a module on the probabilistic assessment of spatial variability of soil liquefaction, a module on damage assessment of buildings and a...... fourth module on the consequences of an earthquake. Each of these modules is integrated into a BPN. Special attention is given to aggregated risk, i.e. the risk contribution from assets at multiple locations in a city subjected to the same earthquake. The application of the methodology is illustrated on...

  15. An algebraic approach to structural learning Bayesian networks

    Studený, Milan

    Paris: Editions EDK, 2006 - (Bouchon-Meunier, B.; Yager, R.), s. 2284-2291 ISBN 2-84254-112-X. [IMPU 2006. Paris (FR), 02.07.2006-07.07.2006] R&D Projects: GA ČR GA201/04/0393 Institutional research plan: CEZ:AV0Z10750506 Keywords : learning Bayesian networks * standard imset * data vector Subject RIV: BA - General Mathematics

  16. A Quasirandom Approach to Integration in Bayesian Statistics

    Shaw, J. E. H.


    Practical Bayesian statistics with realistic models usually gives posterior distributions that are analytically intractable, and inferences must be made via numerical integration. In many cases, the integrands can be transformed into periodic functions on the unit $d$-dimensional cube, for which quasirandom sequences are known to give efficient numerical integration rules. This paper reviews some relevant theory, defines new criteria for identifying suitable quasirandom sequences and suggests...

  17. Regional fertility data analysis: A small area Bayesian approach

    Eduardo A. Castro; Zhen Zhang; Arnab Bhattacharjee; Martins, José M.; Taps Maiti


    Accurate estimation of demographic variables such as mortality, fertility and migrations, by age groups and regions, is important for analyses and policy. However, traditional estimates based on within cohort counts are often inaccurate, particularly when the sub-populations considered are small. We use small area Bayesian statistics to develop a model for age-specific fertility rates. In turn, such small area estimation requires accurate descriptions of spatial and cross-section dependence. ...

  18. A General and Flexible Approach to Estimating the Social Relations Model Using Bayesian Methods

    Ludtke, Oliver; Robitzsch, Alexander; Kenny, David A.; Trautwein, Ulrich


    The social relations model (SRM) is a conceptual, methodological, and analytical approach that is widely used to examine dyadic behaviors and interpersonal perception within groups. This article introduces a general and flexible approach to estimating the parameters of the SRM that is based on Bayesian methods using Markov chain Monte Carlo…

  19. Comparison between the basic least squares and the Bayesian approach for elastic constants identification

    Gogu, C; Le Riche, R; Molimard, J; Vautrin, A [Ecole des Mines de Saint Etienne, 158 cours Fauriel, 42023 Saint Etienne (France); Haftka, R; Sankar, B [University of Florida, PO Box 116250, Gainesville, FL, 32611 (United States)]


    The basic formulation of the least squares method, based on the L2 norm of the misfit, is still widely used today for identifying elastic material properties from experimental data. An alternative statistical approach is the Bayesian method. We seek here situations with significant differences between the material properties found by the two methods. For a simple three-bar truss example we illustrate three such situations in which the Bayesian approach leads to more accurate results: different magnitudes of the measurements, different uncertainty in the measurements, and correlation among measurements. When all three effects add up, the Bayesian approach can have a large advantage. We then compared the two methods for the identification of elastic constants from plate vibration natural frequencies.

  20. [Contribution of computers to pharmacokinetics, Bayesian approach and population pharmacokinetics].

    Hecquet, B


    A major objective for pharmacokineticians is to help practitioners define drug administration protocols. Protocols are generally designed for all patients, but interindividual variability calls for monitoring of each patient. Computers are widely used to determine pharmacokinetic parameters and to try to individualize drug administration. Several examples are briefly described: terminal half-life determination by regression; model fitting to experimental data; Bayesian statistics for individual dose adaptation; population pharmacokinetic methods for parameter evaluation. These methods do not replace the pharmacokinetician's judgment, but they can make it possible to tailor drug administration to individual characteristics. PMID:8680074


    Widodo Budiharto


    In this study we propose a model of an expert system to diagnose car failures and malfunctions using a Bayesian approach. An expert car failure diagnosis system is a computer system that uses the specific knowledge of an expert to resolve car problems. Our system consists of a knowledge base and solutions for diagnosing failures of the Toyota Avanza, one of the most popular cars in Indonesia today, and applies the Bayesian approach to compute the belief in each solution. We build ...

  2. A Bayesian approach to matched field processing in uncertain ocean environments

    LI Jianlong; PAN Xiang


    An approach to Bayesian Matched Field Processing (MFP) is discussed for the uncertain ocean environment. In this approach, uncertainty knowledge is modeled, and the spatial and temporal data received by the array are fully used. Therefore, a mechanism for MFP is found which well combines model-based and data-driven methods of uncertain field processing. By theoretical derivation, simulation analysis and validation against experimental array data at sea, we find that (1) the basic components of Bayesian matched field processors are the corresponding sets of the Bartlett matched field processor, the MVDR (minimum variance distortionless response) matched field processor, etc.; (2) the Bayesian MVDR/Bartlett MFP is the weighted sum of the MVDR/Bartlett MFP, where the weighting coefficients are the values of the a posteriori probability; (3) in the uncertain ocean environment, Bayesian MFP can locate the source more correctly than MVDR MFP or Bartlett MFP; and (4) Bayesian MFP can better suppress sidelobes of the ambiguity surfaces.

  3. A Bayesian approach to linear regression in astronomy

    Sereno, Mauro


    Linear regression is common in astronomical analyses. I discuss a Bayesian hierarchical modeling of data with heteroscedastic and possibly correlated measurement errors and intrinsic scatter. The method fully accounts for time evolution. The slope, the normalization, and the intrinsic scatter of the relation can evolve with the redshift. The intrinsic distribution of the independent variable is approximated using a mixture of Gaussian distributions whose means and standard deviations depend on time. The method can address scatter in the measured independent variable (a kind of Eddington bias), selection effects in the response variable (Malmquist bias), and departure from linearity in form of a knee. I tested the method with toy models and simulations and quantified the effect of biases and inefficient modeling. The R-package LIRA (LInear Regression in Astronomy) is made available to perform the regression.
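    As a much-reduced sketch of the kind of regression involved, the snippet below fits a straight line with known heteroscedastic measurement errors under a near-flat Gaussian prior, where the posterior is available in closed form. It deliberately omits intrinsic scatter, time evolution and selection effects, which are the actual contributions of the LIRA model; all data are simulated.

```python
# Conjugate Bayesian linear regression with per-point measurement
# errors (illustrative; not the LIRA hierarchical model).
import numpy as np

rng = np.random.default_rng(3)
x = rng.uniform(0, 10, 50)
sigma = rng.uniform(0.5, 1.5, 50)          # known measurement errors
y = 2.0 + 0.7 * x + rng.normal(0, sigma)   # true intercept 2.0, slope 0.7

X = np.column_stack([np.ones_like(x), x])  # design matrix [1, x]
W = np.diag(1 / sigma**2)                  # inverse-variance weights
prior_prec = np.eye(2) * 1e-6              # near-flat Gaussian prior
post_cov = np.linalg.inv(X.T @ W @ X + prior_prec)
post_mean = post_cov @ (X.T @ W @ y)
print(np.round(post_mean, 1))              # roughly [2.0, 0.7]
```

    `post_cov` is the full posterior covariance of intercept and slope, so credible intervals come for free from its diagonal.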

  4. A Bayesian Nonparametric Approach to Image Super-Resolution.

    Polatkan, Gungor; Zhou, Mingyuan; Carin, Lawrence; Blei, David; Daubechies, Ingrid


    Super-resolution methods form high-resolution images from low-resolution images. In this paper, we develop a new Bayesian nonparametric model for super-resolution. Our method uses a beta-Bernoulli process to learn a set of recurring visual patterns, called dictionary elements, from the data. Because it is nonparametric, the number of elements found is also determined from the data. We test the results on both benchmark and natural images, comparing with several other models from the research literature. We perform large-scale human evaluation experiments to assess the visual quality of the results. In a first implementation, we use Gibbs sampling to approximate the posterior. However, this algorithm is not feasible for large-scale data. To circumvent this, we then develop an online variational Bayes (VB) algorithm. This algorithm finds high quality dictionaries in a fraction of the time needed by the Gibbs sampler. PMID:26353246

  5. Inventory control of spare parts using a Bayesian approach: a case study

    K-P. Aronis; I. Magou (Ioulia); R. Dekker (Rommert); G. Tagaras (George)


    textabstractThis paper presents a case study of applying a Bayesian approach to forecast demand and subsequently determine the appropriate parameter S of an (S-1,S) inventory system for controlling spare parts of electronic equipment. First, the problem and the current policy are described. Then, t

  6. Equifinality of formal (DREAM) and informal (GLUE) bayesian approaches in hydrologic modeling?

    Vrugt, Jasper A. [Los Alamos National Laboratory]; Robinson, Bruce A. [Los Alamos National Laboratory]; Ter Braak, Cajo J. F.; Gupta, Hoshin V.


    In recent years, a strong debate has emerged in the hydrologic literature regarding what constitutes an appropriate framework for uncertainty estimation. Particularly, there is strong disagreement whether an uncertainty framework should have its roots within a proper statistical (Bayesian) context, or whether such a framework should be based on a different philosophy and implement informal measures and weaker inference to summarize parameter and predictive distributions. In this paper, we compare a formal Bayesian approach using Markov Chain Monte Carlo (MCMC) with generalized likelihood uncertainty estimation (GLUE) for assessing uncertainty in conceptual watershed modeling. Our formal Bayesian approach is implemented using the recently developed differential evolution adaptive metropolis (DREAM) MCMC scheme with a likelihood function that explicitly considers model structural, input and parameter uncertainty. Our results demonstrate that DREAM and GLUE can generate very similar estimates of total streamflow uncertainty. This suggests that formal and informal Bayesian approaches have more common ground than the hydrologic literature and ongoing debate might suggest. The main advantage of formal approaches is, however, that they attempt to disentangle the effect of forcing, parameter and model structural error on total predictive uncertainty. This is key to improving hydrologic theory and to better understand and predict the flow of water through catchments.

  7. Genetic evaluation of popcorn families using a Bayesian approach via the independence chain algorithm

    Marcos Rodovalho; Freddy Mora; Osvin Arriagada; Carlos Maldonado; Emmanuel Arnhold; Carlos Alberto Scapim


    The objective of this study was to examine genetic parameters of popping expansion and grain yield in a trial of 169 halfsib families using a Bayesian approach. The independence chain algorithm with informative priors for the components of residual and family variance (inverse-gamma prior distribution) was used. Popping expansion was found to be moderately heritable, with a posterior mode of h2 of 0.34, and 90% Bayesian confidence interval of 0.22 to 0.44. The heritability of gra...

  8. A population-based Bayesian approach to the minimal model of glucose and insulin homeostasis

    Andersen, Kim Emil; Højbjerre, Malene


    for a whole population. Traditionally it has been analysed in a deterministic set-up with only error terms on the measurements. In this work we adopt a Bayesian graphical model to describe the coupled minimal model that accounts for both measurement and process variability, and the model is extended...... to a population-based model. The estimation of the parameters are efficiently implemented in a Bayesian approach where posterior inference is made through the use of Markov chain Monte Carlo techniques. Hereby we obtain a powerful and flexible modelling framework for regularizing the ill-posed estimation problem...

  9. New approach using Bayesian Network to improve content based image classification systems

    jayech, Khlifia


    This paper proposes a new approach based on augmented naive Bayes for image classification. Initially, each image is cut into a set of blocks. For each block, we compute a vector of descriptors. Then, we propose to carry out a classification of the vectors of descriptors to build a vector of labels for each image. Finally, we propose three variants of Bayesian networks, namely the Naive Bayesian Network (NB), Tree Augmented Naive Bayes (TAN) and Forest Augmented Naive Bayes (FAN), to classify the image using the vector of labels. The results showed a marked improvement of FAN over NB and TAN.
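    The final NB classification step described above can be sketched as a tiny categorical naive Bayes over block-label vectors (TAN and FAN would additionally learn dependency edges between labels). The data and label sets below are invented for illustration.

```python
# Categorical naive Bayes over per-block label vectors, with Laplace
# smoothing (illustrative stand-in for the NB variant in the paper).
from collections import defaultdict
import math

def train_nb(examples):
    """examples: list of (label_vector, image_class)."""
    class_counts = defaultdict(int)
    feat_counts = defaultdict(lambda: defaultdict(int))  # (class, pos) -> label -> n
    for vec, c in examples:
        class_counts[c] += 1
        for pos, lab in enumerate(vec):
            feat_counts[(c, pos)][lab] += 1
    return class_counts, feat_counts

def predict_nb(model, vec, labels):
    class_counts, feat_counts = model
    total = sum(class_counts.values())
    best, best_lp = None, -math.inf
    for c, n_c in class_counts.items():
        lp = math.log(n_c / total)                       # class prior
        for pos, lab in enumerate(vec):                  # independent blocks
            counts = feat_counts[(c, pos)]
            lp += math.log((counts[lab] + 1) / (n_c + len(labels)))
        if lp > best_lp:
            best, best_lp = c, lp
    return best

data = [(("sky", "sea"), "beach"), (("sky", "grass"), "field"),
        (("sky", "sea"), "beach"), (("tree", "grass"), "field")]
model = train_nb(data)
print(predict_nb(model, ("sky", "sea"), labels={"sky", "sea", "grass", "tree"}))
```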

  10. Integer linear programming approach to learning Bayesian network structure: towards the essential graph

    Studený, Milan

    Granada : DESCAI, University of Granada, 2012, s. 307-314. ISBN 978-84-15536-57-4. [6th European Workshop on Probabilistic Graphical Models (PGM). Granada (ES), 19.09.2012-21.09.2012] R&D Projects: GA ČR GA201/08/0539 Institutional support: RVO:67985556 Keywords : learning Bayesian network structure * characteristic imset * essential graph Subject RIV: BA - General Mathematics

  11. Bayesian Belief Networks Approach for Modeling Irrigation Behavior

    Andriyas, S.; McKee, M.


    Canal operators need information to manage water deliveries to irrigators. Short-term irrigation demand forecasts can potentially provide valuable information for a canal operator who must manage an on-demand system. Such forecasts could be generated by using information about the decision-making processes of irrigators. Bayesian models of irrigation behavior can provide insight into the likely criteria which farmers use to make irrigation decisions. This paper develops a Bayesian belief network (BBN) to learn the irrigation decision-making behavior of farmers and utilizes the resulting model to make forecasts of future irrigation decisions based on factor interaction and posterior probabilities. Models for studying irrigation behavior have rarely been explored in the past. The model discussed here was built from a combination of data about biotic, climatic, and edaphic conditions under which observed irrigation decisions were made. The paper includes a case study using data collected from the Canal B region of the Sevier River, near Delta, Utah. Alfalfa, barley and corn are the main crops of the location. The model has been tested with a portion of the data to affirm its predictive capabilities. Irrigation rules were deduced in the process of learning and verified in the testing phase. It was found that most of the farmers used consistent rules throughout all years and across different types of crops. Soil moisture stress, which indicates the level of water available to the plant in the soil profile, was found to be one of the most significant likely driving forces for irrigation. Irrigations appeared to be triggered by a farmer's perception of soil stress, or by a perception of combined factors such as information about a neighbor irrigating or an apparent preference to irrigate on a weekend. Soil stress resulted in irrigation probabilities of 94.4% for alfalfa. With additional factors like weekend and irrigating when a neighbor irrigates, alfalfa irrigation

  12. Finding Clocks in Genes: A Bayesian Approach to Estimate Periodicity

    Yan Ren


    Identification of rhythmic gene expression, from metabolic cycles to circadian rhythms, is crucial for understanding the gene regulatory networks and functions of these biological processes. Recently, two algorithms, JTK_CYCLE and ARSER, have been developed to estimate the periodicity of rhythmic gene expression. JTK_CYCLE performs well for long or less noisy time series, while ARSER performs well for detecting a single rhythmic category. However, observing gene expression at high temporal resolution is not always feasible, and many scientists are interested in exploring both ultradian and circadian rhythmic categories simultaneously. In this paper, a new algorithm, named autoregressive Bayesian spectral regression (ABSR), is proposed. It estimates the period of time-course experimental data and classifies gene expression profiles into multiple rhythmic categories simultaneously. Through simulation studies, it is shown that ABSR substantially improves the accuracy of periodicity estimation and the clustering of rhythmic categories as compared to JTK_CYCLE and ARSER for data with low temporal resolution. Moreover, ABSR is insensitive to rhythmic patterns. This new scheme is applied to existing time-course mouse liver data to estimate the period of rhythms and classify the genes into ultradian, circadian, and arrhythmic categories. It is observed that 49.2% of the circadian profiles detected by JTK_CYCLE with 1-hour resolution are also detected by ABSR with only 4-hour resolution.

  13. Bayesian Statistical Approach To Binary Asteroid Orbit Determination

    Dmitrievna Kovalenko, Irina; Stoica, Radu S.


    Orbit determination from observations is one of the classical problems in celestial mechanics. Deriving the trajectory of a binary asteroid with high precision is much more complicated than that of a single asteroid. Here we present a method of orbit determination based on the Markov chain Monte Carlo (MCMC) algorithm. This method can be used for preliminary orbit determination with a relatively small number of observations, or for the adjustment of a previously determined orbit. The problem consists in determining a conditional a posteriori probability density given the observations. Applying Bayesian statistics, the a posteriori probability density of the binary asteroid orbital parameters is proportional to the product of the a priori and likelihood probability densities. The likelihood function is related to the noise probability density and can be calculated from O-C deviations (Observed minus Calculated positions). The optional a priori probability density takes into account information about the population of discovered asteroids and is used to constrain the phase space of possible orbits. As the MCMC method, the Metropolis-Hastings algorithm has been applied, with a globally convergent coefficient added. The sequence of possible orbits is derived through sampling of each orbital parameter and acceptance criteria. The method allows the phase space of every possible orbit to be characterized, considering each parameter. It can also be used to derive the single orbit with the highest posterior probability density of orbital elements.
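    The Metropolis-Hastings sampling step described above can be illustrated on a toy one-dimensional target; a real application would replace the log-posterior with the orbital-element prior plus the O-C likelihood. The proposal step size and sample count below are arbitrary illustrative choices.

```python
# Minimal Metropolis-Hastings sampler on a toy standard-normal target.
import math
import random

random.seed(0)

def log_post(x):
    # Toy log-posterior; in practice: log-prior + log-likelihood from O-C.
    return -0.5 * x * x

def metropolis_hastings(n_samples, step=1.0, x0=0.0):
    x, samples = x0, []
    for _ in range(n_samples):
        prop = x + random.gauss(0.0, step)            # symmetric proposal
        accept_lp = min(0.0, log_post(prop) - log_post(x))
        if random.random() < math.exp(accept_lp):     # MH acceptance rule
            x = prop
        samples.append(x)
    return samples

draws = metropolis_hastings(20000)
mean = sum(draws) / len(draws)
print(round(mean, 1))   # close to 0 for the standard-normal target
```

    The chain's empirical mean and spread approximate the posterior moments; for the binary-asteroid case the same loop runs over a vector of orbital elements.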

  14. Sparsely sampling the sky: a Bayesian experimental design approach

    Paykari, P.; Jaffe, A. H.


    The next generation of galaxy surveys will observe millions of galaxies over large volumes of the Universe. These surveys are expensive both in time and cost, raising questions regarding the optimal investment of this time and money. In this work, we investigate criteria for selecting amongst observing strategies for constraining the galaxy power spectrum and a set of cosmological parameters. Depending on the parameters of interest, it may be more efficient to observe a larger, but sparsely sampled, area of sky instead of a smaller contiguous area. In this work, by making use of the principles of Bayesian experimental design, we will investigate the advantages and disadvantages of the sparse sampling of the sky and discuss the circumstances in which a sparse survey is indeed the most efficient strategy. For the Dark Energy Survey (DES), we find that by sparsely observing the same area in a smaller amount of time, we only increase the errors on the parameters by a maximum of 0.45 per cent. Conversely, investing the same amount of time as the original DES to observe a sparser but larger area of sky, we can in fact constrain the parameters with errors reduced by 28 per cent.

  15. A Comparison of Hierarchical and Non-Hierarchical Bayesian Approaches for Fitting Allometric Larch (Larix spp.) Biomass Equations

    Dongsheng Chen


    Accurate biomass estimations are important for assessing and monitoring forest carbon storage. Bayesian theory has been widely applied to tree biomass models. Recently, a hierarchical Bayesian approach has received increasing attention for improving biomass models. In this study, tree biomass data were obtained by sampling 310 trees from 209 permanent sample plots from larch plantations in six regions across China. Non-hierarchical and hierarchical Bayesian approaches were used to model allometric biomass equations. We found that the total, root, stem wood, stem bark, branch and foliage biomass model relationships were statistically significant (p-values < 0.001) for both the non-hierarchical and hierarchical Bayesian approaches, but the hierarchical Bayesian approach increased the goodness-of-fit statistics over the non-hierarchical Bayesian approach. The R2 values of the hierarchical approach were higher than those of the non-hierarchical approach by 0.008, 0.018, 0.020, 0.003, 0.088 and 0.116 for the total tree, root, stem wood, stem bark, branch and foliage models, respectively. The hierarchical Bayesian approach significantly improved the accuracy of the biomass model (except for stem bark) and can reflect regional differences by using random parameters to improve the regional scale model accuracy.

  16. True versus Apparent Malaria Infection Prevalence: The Contribution of a Bayesian Approach

    Speybroeck, Niko; Praet, Nicolas; Claes, Filip; Van Hong, Nguyen; Torres, Kathy; Mao, Sokny; Van den Eede, Peter; Ta Thi, Thinh; Gamboa, Dioni; Sochantha, Tho; Ngo Duc, Thang; Coosemans, Marc; Buescher, Philippe; D'Alessandro, Umberto; Berkvens, Dirk


    Aims: To present a new approach for estimating the "true prevalence'' of malaria and apply it to datasets from Peru, Vietnam, and Cambodia. Methods: Bayesian models were developed for estimating both the malaria prevalence using different diagnostic tests (microscopy, PCR & ELISA), without the need of a gold standard, and the tests' characteristics. Several sources of information, i.e. data, expert opinions and other sources of knowledge can be integrated into the model. This approach resulti...

  17. Bayesian Approach in Estimation of Scale Parameter of Nakagami Distribution

    Azam Zaka


    Nakagami distribution is a flexible lifetime distribution that may offer a good fit to some failure data sets. It has applications in the attenuation of wireless signals traversing multiple paths, deriving unit hydrographs in hydrology, medical imaging studies, etc. In this research, we obtain Bayesian estimators of the scale parameter of the Nakagami distribution. For the posterior distribution of this parameter, we consider Uniform, Inverse Exponential and Levy priors. The three loss functions taken up are the Squared Error Loss Function (SELF), Quadratic Loss Function and Precautionary Loss Function (PLF). The performance of an estimator is assessed on the basis of its relative posterior risk. Monte Carlo simulations are used to compare the performance of the estimators. It is found that the PLF produces the least posterior risk when the Uniform prior is used, while SELF is best when the Inverse Exponential and Levy priors are used.

  18. Bayesian approach to inverse problems for functions with a variable-index Besov prior

    Jia, Junxiong; Peng, Jigen; Gao, Jinghuai


    The Bayesian approach has been adopted to solve inverse problems that reconstruct a function from noisy observations. Prior measures play a key role in the Bayesian method. Hence, many probability measures have been proposed, among which total variation (TV) is a well-known prior measure that can preserve sharp edges. However, it has two drawbacks, the staircasing effect and a lack of the discretization-invariant property. The variable-index TV prior has been proposed and analyzed in the area of image analysis for the former, and the Besov prior has been employed recently for the latter. To overcome both issues together, in this paper, we present a variable-index Besov prior measure, which is a non-Gaussian measure. Some useful properties of this new prior measure have been proven for functions defined on a torus. We have also generalized Bayesian inverse theory in infinite dimensions for our new setting. Finally, this theory has been applied to integer- and fractional-order backward diffusion problems. To the best of our knowledge, this is the first time that the Bayesian approach has been used for the fractional-order backward diffusion problem, which provides an opportunity to quantify its uncertainties.

  19. A Bayesian approach to estimate sensible and latent heat over vegetation

    C. van der Tol


    Sensible and latent heat fluxes are often calculated from bulk transfer equations combined with the energy balance. For spatial estimates of these fluxes, a combination of remotely sensed and standard meteorological data from weather stations on the ground is used. The success of this approach depends on the accuracy of the input data and on the accuracy of two variables in particular: aerodynamic and surface conductance. This paper presents a Bayesian approach to improve estimates of sensible and latent heat fluxes by using a priori estimates of aerodynamic and surface conductance alongside remote measurements of surface temperature. The method is validated for time series of half-hourly measurements in a fully grown maize field, a vineyard and a forest. It is shown that the Bayesian approach yields more accurate estimates of sensible and latent heat flux than traditional methods.

  20. A Bayesian approach to estimate sensible and latent heat over vegetated land surface

    C. van der Tol


    Sensible and latent heat fluxes are often calculated from bulk transfer equations combined with the energy balance. For spatial estimates of these fluxes, a combination of remotely sensed and standard meteorological data from weather stations is used. The success of this approach depends on the accuracy of the input data and on the accuracy of two variables in particular: aerodynamic and surface conductance. This paper presents a Bayesian approach to improve estimates of sensible and latent heat fluxes by using a priori estimates of aerodynamic and surface conductance alongside remote measurements of surface temperature. The method is validated for time series of half-hourly measurements in a fully grown maize field, a vineyard and a forest. It is shown that the Bayesian approach yields more accurate estimates of sensible and latent heat flux than traditional methods.

  1. Bayesian approach to color-difference models based on threshold and constant-stimuli methods.

    Brusola, Fernando; Tortajada, Ignacio; Lengua, Ismael; Jordá, Begoña; Peris, Guillermo


    An alternative approach based on statistical Bayesian inference is presented to deal with the development of color-difference models and the precision of parameter estimation. The approach was applied to simulated data and real data, the latter published by selected authors involved with the development of color-difference formulae using traditional methods. Our results show very good agreement between the Bayesian and classical approaches. Among other benefits, our proposed methodology allows one to determine the marginal posterior distribution of each random individual parameter of the color-difference model. In this manner, it is possible to analyze the effect of individual parameters on the statistical significance calculation of a color-difference equation. PMID:26193510

  3. A new method for E-government procurement using collaborative filtering and Bayesian approach.

    Zhang, Shuai; Xi, Chengyu; Wang, Yan; Zhang, Wenyu; Chen, Yanhong


    Nowadays, as the Internet services increase faster than ever before, government systems are reinvented as E-government services. Therefore, government procurement sectors have to face challenges brought by the explosion of service information. This paper presents a novel method for E-government procurement (eGP) to search for the optimal procurement scheme (OPS). Item-based collaborative filtering and Bayesian approach are used to evaluate and select the candidate services to get the top-M recommendations such that the involved computation load can be alleviated. A trapezoidal fuzzy number similarity algorithm is applied to support the item-based collaborative filtering and Bayesian approach, since some of the services' attributes can be hardly expressed as certain and static values but only be easily represented as fuzzy values. A prototype system is built and validated with an illustrative example from eGP to confirm the feasibility of our approach. PMID:24385869
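    One widely used similarity for trapezoidal fuzzy numbers is one minus the mean absolute difference of the four defining points. The abstract does not give the exact formula used, so the sketch below (with invented service names and attribute values) only illustrates how such a similarity can drive item-based top-M selection:

```python
def fuzzy_similarity(a, b):
    # similarity of two trapezoidal fuzzy numbers (a1, a2, a3, a4) with
    # components normalized to [0, 1]: 1 minus mean absolute point difference
    return 1.0 - sum(abs(x - y) for x, y in zip(a, b)) / 4.0

def top_m(candidates, reference, m=2):
    # rank candidate services by average fuzzy similarity of their
    # attributes to a reference (ideal) service profile
    scored = sorted(
        ((sum(fuzzy_similarity(a, r) for a, r in zip(attrs, reference))
          / len(reference), name)
         for name, attrs in candidates.items()),
        reverse=True)
    return [name for _, name in scored[:m]]

# hypothetical service profiles, two fuzzy attributes each
reference = [(0.6, 0.7, 0.8, 0.9), (0.5, 0.6, 0.7, 0.8)]
candidates = {
    "service_A": [(0.6, 0.7, 0.8, 0.9), (0.5, 0.6, 0.7, 0.8)],  # identical
    "service_B": [(0.5, 0.6, 0.7, 0.8), (0.4, 0.5, 0.6, 0.7)],  # close
    "service_C": [(0.1, 0.2, 0.3, 0.4), (0.1, 0.2, 0.3, 0.4)],  # far
}
best = top_m(candidates, reference, m=2)
```

    Restricting the subsequent Bayesian evaluation to the top-M list is what keeps the computation load manageable as the number of candidate services grows.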

  4. Analyzing Rice distributed functional magnetic resonance imaging data: a Bayesian approach

    Analyzing functional MRI data is often a hard task because these periodic signals are strongly disturbed by noise. In many cases, the signals are buried under the noise and not visible, so that detection is practically impossible. However, it is well known that the amplitude measurements of such disturbed signals follow a Rice distribution, which is characterized by two parameters. In this paper, an alternative Bayesian approach is proposed to tackle this two-parameter estimation problem. By incorporating prior knowledge into a mathematical framework, the drawbacks of the existing methods (i.e. the maximum likelihood approach and the method of moments) can be overcome. The performance of the proposed Bayesian estimator is analyzed theoretically and illustrated through simulations. Finally, the developed approach is successfully applied to measurement data for the analysis of functional MRI.
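    As a rough illustration of the two-parameter problem (not the estimator analyzed in the paper), the Rice likelihood can be evaluated on a grid under a flat prior and posterior means read off. The data simulation, grid ranges, and series evaluation of the Bessel function I0 are all assumptions made for this sketch:

```python
import math, random

def log_i0(z):
    # log of the modified Bessel function I0(z) via its power series
    # (adequate for the moderate arguments reached on this grid)
    term, total, k = 1.0, 1.0, 0
    while term > 1e-16 * total:
        k += 1
        term *= (z * z / 4.0) / (k * k)
        total += term
    return math.log(total)

def log_rice_pdf(x, nu, sigma):
    # Rice density: (x/sigma^2) exp(-(x^2+nu^2)/(2 sigma^2)) I0(x nu / sigma^2)
    s2 = sigma * sigma
    return math.log(x / s2) - (x * x + nu * nu) / (2.0 * s2) + log_i0(x * nu / s2)

def posterior_means(data, nu_grid, sigma_grid):
    # flat prior over the grid; normalized grid posterior for (nu, sigma)
    logp, pts = [], []
    for nu in nu_grid:
        for s in sigma_grid:
            logp.append(sum(log_rice_pdf(x, nu, s) for x in data))
            pts.append((nu, s))
    mx = max(logp)
    w = [math.exp(lp - mx) for lp in logp]
    z = sum(w)
    nu_hat = sum(wi * p[0] for wi, p in zip(w, pts)) / z
    sigma_hat = sum(wi * p[1] for wi, p in zip(w, pts)) / z
    return nu_hat, sigma_hat

# synthetic Rician amplitudes with nu = 2.0, sigma = 0.5
random.seed(0)
data = [math.hypot(2.0 + random.gauss(0, 0.5), random.gauss(0, 0.5))
        for _ in range(30)]
nu_hat, sigma_hat = posterior_means(
    data,
    nu_grid=[0.1 * k for k in range(31)],              # 0.0 .. 3.0
    sigma_grid=[0.25 + 0.075 * k for k in range(11)])  # 0.25 .. 1.0
```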

  5. A Gene Selection Algorithm using Bayesian Classification Approach

    Alok Sharma; Kuldip K. Paliwal


    In this study, we propose a new feature (or gene) selection algorithm using a Bayesian classification approach. The algorithm can find the gene subset crucial for the cancer classification problem. Problem statement: Gene identification plays an important role in the human cancer classification problem. Several feature selection algorithms have been proposed for analyzing and understanding influential genes using gene expression profiles. Approach: The feature selection algorithms aim to explore genes that are c...

  6. Constraining East Antarctic mass trends using a Bayesian inference approach

    Martin-Español, Alba; Bamber, Jonathan L.


    East Antarctica is an order of magnitude larger than its western neighbour and the Greenland ice sheet. It has the greatest potential to contribute to sea level rise of any source, including non-glacial contributors. It is, however, the most challenging ice mass to constrain because of a range of factors including the relative paucity of in-situ observations and the poor signal to noise ratio of Earth Observation data such as satellite altimetry and gravimetry. A recent study using satellite radar and laser altimetry (Zwally et al. 2015) concluded that the East Antarctic Ice Sheet (EAIS) had been accumulating mass at a rate of 136±28 Gt/yr for the period 2003-08. Here, we use a Bayesian hierarchical model, which has been tested on, and applied to, the whole of Antarctica, to investigate the impact of different assumptions regarding the origin of elevation changes of the EAIS. We combined GRACE, satellite laser and radar altimeter data and GPS measurements to solve simultaneously for surface processes (primarily surface mass balance, SMB), ice dynamics and glacio-isostatic adjustment over the period 2003-13. The hierarchical model partitions mass trends between SMB and ice dynamics based on physical principles and measures of statistical likelihood. Without imposing the division between these processes, the model apportions about a third of the mass trend to ice dynamics, +18 Gt/yr, and two thirds, +39 Gt/yr, to SMB. The total mass trend for that period for the EAIS was 57±20 Gt/yr. Over the period 2003-08, we obtain an ice dynamic trend of 12 Gt/yr and a SMB trend of 15 Gt/yr, with a total mass trend of 27 Gt/yr. We then imposed the condition that the surface mass balance is tightly constrained by the regional climate model RACMO2.3 and allowed height changes due to ice dynamics to occur in areas of low surface velocities (<10 m/yr), such as those in the interior of East Antarctica (a similar condition as used in Zwally 2015). The model must find a solution that

  7. Bayesian approach for three-dimensional aquifer characterization at the hanford 300 area

    H. Murakami


    This study presents a stochastic, three-dimensional characterization of a heterogeneous hydraulic conductivity field within DOE's Hanford 300 Area site, Washington, by assimilating large-scale, constant-rate injection test data with small-scale, three-dimensional electromagnetic borehole flowmeter (EBF) measurement data. We first inverted the injection test data to estimate the transmissivity field, using zeroth-order temporal moments of pressure buildup curves. We applied a newly developed Bayesian geostatistical inversion framework, the method of anchored distributions (MAD), to obtain a joint posterior distribution of geostatistical parameters and local log-transmissivities at multiple locations. The unique aspects of MAD that make it suitable for this purpose are its ability to integrate multi-scale, multi-type data within a Bayesian framework and to compute a nonparametric posterior distribution. After we combined the distribution of transmissivities with the depth-discrete relative-conductivity profile from the EBF data, we inferred the three-dimensional geostatistical parameters of the log-conductivity field, using Bayesian model-based geostatistics. Such consistent use of the Bayesian approach throughout the procedure enabled us to systematically incorporate data uncertainty into the final posterior distribution. The method was tested in a synthetic study and validated using actual data that were not part of the estimation. Results showed broader and skewed posterior distributions of geostatistical parameters except for the mean, which suggests the importance of inferring the entire distribution to quantify the parameter uncertainty.

  8. Optimal speech motor control and token-to-token variability: a Bayesian modeling approach.

    Patri, Jean-François; Diard, Julien; Perrier, Pascal


    The remarkable capacity of the speech motor system to adapt to various speech conditions is due to an excess of degrees of freedom, which enables producing similar acoustical properties with different sets of control strategies. To explain how the central nervous system selects one of the possible strategies, a common approach, in line with optimal motor control theories, is to model speech motor planning as the solution of an optimality problem based on cost functions. Despite the success of this approach, one of its drawbacks is the intrinsic contradiction between the concept of optimality and the observed experimental intra-speaker token-to-token variability. The present paper proposes an alternative approach by formulating feedforward optimal control in a probabilistic Bayesian modeling framework. This is illustrated by controlling a biomechanical model of the vocal tract for speech production and by comparing it with an existing optimal control model (GEPPETO). The essential elements of this optimal control model are presented first. From them the Bayesian model is constructed in a progressive way. Performance of the Bayesian model is evaluated based on computer simulations and compared to the optimal control model. This approach is shown to be appropriate for solving the speech planning problem while accounting for variability in a principled way. PMID:26497359

  9. A Bayesian approach to PID in ALICE, as studied in the channel D0 → Kπ

    Wilkinson, Jeremy [Physikalisches Institut, University of Heidelberg, Im Neuenheimer Feld 226 (Germany)]


    The standard particle identification (PID) method in ALICE uses a frequentist, or nσ, approach. In this method, particles are accepted via a selection on the number of standard deviations by which a signal differs from the expected detector response. This works well for most particle species, but in some cases, such as the non-resonant decay of Λc → pKπ, the combinatorial background is too high to make a significant measurement. The usage of an alternative Bayesian approach, based on a combination of the measured dE/dx and time-of-flight of the daughter tracks, is presented here. The Bayesian method bases its response on prior probabilities of a given particle species being produced with a given momentum, allowing a calculation of the probability for a daughter particle to belong to each species (π, K, p, etc.) based on the detector response seen. In order to check the validity of this method, various decay channels were tested and compared to the nσ method. Among these was the channel D0 → Kπ, which provides a valuable cross-check of both the kaon and pion responses in this approach. In addition to considering the differences in signal-to-background ratio and significance obtained when applying the various methods, a comparison will be shown between the yield obtained without PID and those found when using Bayesian and nσ approaches after being corrected for their respective PID efficiencies.
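    The core of a Bayesian PID response is Bayes' theorem applied per track: posterior(species) ∝ prior(species) × likelihood(signal | species). A toy sketch assuming Gaussian detector responses (the expected dE/dx values, resolution, and priors below are invented illustrations, not ALICE's calibration):

```python
import math

def gaussian(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def bayes_pid(signal, expected, sigma, priors):
    # posterior probability of each species given one detector signal:
    # prior(species) times Gaussian likelihood, then normalized
    post = {sp: priors[sp] * gaussian(signal, expected[sp], sigma) for sp in priors}
    z = sum(post.values())
    return {sp: p / z for sp, p in post.items()}

# invented calibration: expected dE/dx (arbitrary units) per species
expected = {"pi": 1.0, "K": 1.5, "p": 2.2}
priors = {"pi": 0.7, "K": 0.2, "p": 0.1}  # momentum-dependent in practice
post = bayes_pid(1.45, expected, sigma=0.15, priors=priors)
```

    With several detectors (e.g. dE/dx and time-of-flight), the per-species likelihoods are multiplied before normalizing, under an independence assumption.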

  10. A Bayesian network approach to linear and nonlinear acoustic echo cancellation

    Huemmer, Christian; Maas, Roland; Hofmann, Christian; Kellermann, Walter


    This article provides a general Bayesian approach to the tasks of linear and nonlinear acoustic echo cancellation (AEC). We introduce a state-space model with latent state vector modeling all relevant information of the unknown system. Based on three cases for defining the state vector (to model a linear or nonlinear echo path) and its mathematical relation to the observation, it is shown that the normalized least mean square algorithm (with fixed and adaptive stepsize), the Hammerstein group model, and a numerical sampling scheme for nonlinear AEC can be derived by applying fundamental techniques for probabilistic graphical models. As a consequence, the major contribution of this Bayesian approach is a unifying graphical-model perspective which may serve as a powerful framework for future work in linear and nonlinear AEC.
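    For the linear case, the point-estimate recursion that falls out of such a state-space treatment is the familiar NLMS update. A self-contained sketch (the echo path, step size, and filter length are illustrative choices, not values from the article):

```python
import random

def nlms(x, d, L=4, mu=0.5, eps=1e-6):
    # normalized least-mean-square echo canceller: adapt FIR weights w so
    # that the filtered far-end signal x tracks the observed echo d
    w = [0.0] * L
    buf = [0.0] * L          # most recent input samples, newest first
    errors = []
    for n in range(len(x)):
        buf = [x[n]] + buf[:-1]
        y = sum(wi * bi for wi, bi in zip(w, buf))  # echo estimate
        e = d[n] - y                                # cancellation error
        norm = eps + sum(b * b for b in buf)        # input-power normalization
        w = [wi + mu * e * bi / norm for wi, bi in zip(w, buf)]
        errors.append(e)
    return w, errors

# simulate a short FIR echo path driven by white far-end input
random.seed(0)
h = [0.5, -0.3, 0.1, 0.05]
x = [random.uniform(-1, 1) for _ in range(3000)]
d = [sum(h[k] * x[n - k] for k in range(len(h)) if n - k >= 0)
     for n in range(len(x))]
w, errors = nlms(x, d)
```

    In the Bayesian reading of the article, the fixed step size corresponds to one particular choice of state and observation noise in the underlying state-space model.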

  11. Bayesian-based estimation of acoustic surface impedance: Finite difference frequency domain approach.

    Bockman, Alexander; Fackler, Cameron; Xiang, Ning


    Acoustic performance for an interior requires an accurate description of the boundary materials' surface acoustic impedance. Analytical methods may be applied to a small class of test geometries, but inverse numerical methods provide greater flexibility. The parameter estimation problem requires minimizing the mismatch between predicted and observed acoustic field pressure. The Bayesian-network sampling approach presented here mitigates other methods' susceptibility to noise inherent to the experiment, model, and numerics. A geometry-agnostic method is developed here and its parameter estimation performance is demonstrated for an air-backed micro-perforated panel in an impedance tube. Good agreement is found with predictions from the ISO standard two-microphone, impedance-tube method, and a theoretical model for the material. Data by-products exclusive to a Bayesian approach are analyzed to assess sensitivity of the method to nuisance parameters. PMID:25920818

  12. Bayesian approach increases accuracy when selecting cowpea genotypes with high adaptability and phenotypic stability.

    Barroso, L M A; Teodoro, P E; Nascimento, M; Torres, F E; Dos Santos, A; Corrêa, A M; Sagrilo, E; Corrêa, C C G; Silva, F A; Ceccon, G


    This study aimed to verify whether a Bayesian approach could be used for the selection of upright cowpea genotypes with high adaptability and phenotypic stability; the study also evaluated the efficiency of using informative and minimally informative a priori distributions. Six trials were conducted in randomized blocks, and the grain yield of 17 upright cowpea genotypes was assessed. To represent the minimally informative a priori distributions, a probability distribution with high variance was used, and a meta-analysis concept was adopted to represent the informative a priori distributions. Bayes factors were used to conduct comparisons between the a priori distributions. The Bayesian approach was effective for selection of upright cowpea genotypes with high adaptability and phenotypic stability using the Eberhart and Russell method. Bayes factors indicated that the use of informative a priori distributions provided more accurate results than minimally informative a priori distributions. PMID:26985961


    Widodo Budiharto


    In this study we propose a model of an expert system to diagnose car failures and malfunctions using a Bayesian approach. An expert car-failure diagnosis system is a computer system that uses specific knowledge owned by an expert to resolve car problems. Our system consists of a knowledge base and solutions for diagnosing failures of the Toyota Avanza, one of the most popular cars in Indonesia today, and applies the Bayesian approach to quantify the belief in each solution. We build knowledge representations of symptoms and solutions from an expert using production rules. The experimental results show that the system is able to diagnose car failures, give solutions and also give the probability value of each solution.
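    A minimal version of such a belief computation is Bayes' rule over candidate faults with conditionally independent symptoms. The faults, symptoms, and probabilities below are invented placeholders, not the knowledge base described in the study:

```python
def diagnose(symptoms, priors, likelihood):
    # belief in each fault: prior times the product of per-symptom
    # likelihoods (naive Bayes), normalized over all faults
    post = {}
    for fault, prior in priors.items():
        p = prior
        for s in symptoms:
            p *= likelihood[fault].get(s, 0.01)  # small floor for unlisted symptoms
        post[fault] = p
    z = sum(post.values())
    return {f: p / z for f, p in post.items()}

# hypothetical knowledge base (illustrative, not the study's production rules)
priors = {"dead_battery": 0.5, "faulty_starter": 0.5}
likelihood = {
    "dead_battery":   {"no_crank": 0.90, "dim_lights": 0.8},
    "faulty_starter": {"no_crank": 0.85, "dim_lights": 0.1},
}
belief = diagnose(["no_crank", "dim_lights"], priors, likelihood)
```

    The returned dictionary plays the role of the "belief of the solution" in the abstract: each production-rule conclusion carries a posterior probability rather than a bare yes/no.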

  14. A Bayesian Approach to Multistage Fitting of the Variation of the Skeletal Age Features

    Dong Hua


    Accurate assessment of skeletal maturity is important clinically. Skeletal age assessment is usually based on features encoded in ossification centers. Therefore, it is critical to design a mechanism that captures as many characteristics of the features as possible. We have observed that, for a given feature, there exist stages of the skeletal age such that the variation pattern of the feature differs between these stages. Based on this observation, we propose a Bayesian cut fitting to describe features in response to the skeletal age. With our approach, appropriate positions for stage separation are determined automatically by a Bayesian approach, and a model is used to fit the variation of a feature within each stage. Our experimental results show that the proposed method surpasses traditional fitting using only one line or one curve, not only in the efficiency and accuracy of fitting but also in global and local feature characterization.

  15. A Hybrid Approach for Reliability Analysis Based on Analytic Hierarchy Process and Bayesian Network

    Zubair, Muhammad


    Using the analytic hierarchy process (AHP) and Bayesian networks (BN), the present research examines the technical and non-technical issues of nuclear accidents. The study revealed that technical faults were one major reason for these accidents. From another point of view, it becomes clear that human factors such as dishonesty, insufficient training, and selfishness also play a key role in causing these accidents. In this study, a hybrid approach for reliability analysis based on AHP ...

  16. Data-driven and Model-based Verification:a Bayesian Identification Approach

    Haesaert, S.; van den Hof, S.; Abate, A.


    This work develops a measurement-driven and model-based formal verification approach, applicable to systems with partly unknown dynamics. We provide a principled method, grounded on reachability analysis and on Bayesian inference, to compute the confidence that a physical system driven by external inputs and accessed under noisy measurements, verifies a temporal logic property. A case study is discussed, where we investigate the bounded- and unbounded-time safety of a partly unknown linear ti...

  17. Identification of information tonality based on Bayesian approach and neural networks

    Lande, D. V.


    A model of the identification of information tonality, based on a Bayesian approach and neural networks, is described. In the context of this paper, tonality means the positive or negative tone of both the whole information and those of its parts which relate to particular concepts. The method, whose application is presented in the paper, is based on statistical regularities connected with the presence of definite lexemes in the texts. A distinctive feature of the method is its simplicity and versatility. A...

  19. Evaluating impacts using a BACI design, ratios, and a Bayesian approach with a focus on restoration.

    Conner, Mary M; Saunders, W Carl; Bouwes, Nicolaas; Jordan, Chris


    Before-after-control-impact (BACI) designs are an effective method to evaluate natural and human-induced perturbations on ecological variables when treatment sites cannot be randomly chosen. While effect sizes of interest can be tested with frequentist methods, using Bayesian Markov chain Monte Carlo (MCMC) sampling methods, probabilities of effect sizes, such as a ≥20 % increase in density after restoration, can be directly estimated. Although BACI and Bayesian methods are used widely for assessing natural and human-induced impacts in field experiments, the application of hierarchical Bayesian modeling with MCMC sampling to BACI designs is less common. Here, we combine these approaches and extend the typical presentation of results with an easy-to-interpret ratio, which provides an answer to the main study question: "How much impact did a management action or natural perturbation have?" As an example of this approach, we evaluate the impact of a restoration project, which implemented beaver dam analogs, on survival and density of juvenile steelhead. Results indicated the probabilities of a ≥30 % increase were high for survival and density after the dams were installed, 0.88 and 0.99, respectively, while probabilities for a higher increase of ≥50 % were variable, 0.17 and 0.82, respectively. This approach demonstrates a useful extension of Bayesian methods that can easily be generalized to other study designs, from simple (e.g., single factor ANOVA, paired t test) to more complicated block designs (e.g., crossover, split-plot). This approach is valuable for estimating the probabilities of restoration impacts or other management actions. PMID:27613291
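    The quoted probabilities are simply the fraction of posterior draws in which the after/before ratio exceeds the threshold of interest. A sketch with synthetic stand-in draws (lognormal samples replace the real MCMC output; all numbers are illustrative):

```python
import random

def prob_increase(before, after, threshold=1.3):
    # posterior probability that (after / before) >= threshold, computed as
    # the fraction of paired posterior draws exceeding it
    n = sum(1 for b, a in zip(before, after) if a / b >= threshold)
    return n / len(before)

# hypothetical posterior draws standing in for MCMC output: densities whose
# log-means differ by 0.5 (about a 65% median increase after treatment)
random.seed(1)
before = [random.lognormvariate(0.0, 0.15) for _ in range(5000)]
after = [random.lognormvariate(0.5, 0.15) for _ in range(5000)]
p30 = prob_increase(before, after, 1.3)  # P(>= 30% increase)
p50 = prob_increase(before, after, 1.5)  # P(>= 50% increase)
```

    Because the statement is made directly on the ratio's posterior, no separate hypothesis test is needed: the probability of any effect size of interest can be read off the same set of draws.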

  1. Peering through a dirty window: A Bayesian approach to making mine detection decisions from noisy data

    Kercel, Stephen W.


    For several reasons, Bayesian parameter estimation is superior to other methods for extracting features of a weak signal from noise. Since it exploits prior knowledge, the analysis begins from a more advantageous starting point than other methods. Also, since "nuisance parameters" can be dropped out of the Bayesian analysis, the description of the model need not be as complete as is necessary for such methods as matched filtering. In the limit for perfectly random noise and a perfect description of the model, the signal-to-noise ratio improves as the square root of the number of samples in the data. Even with the imperfections of real-world data, the Bayesian approach comes closer to this ideal limit of performance than other methods. A major unsolved problem in landmine detection is the fusion of data from multiple sensor types. Bayesian data fusion is only beginning to be explored as a solution to the problem. In single sensor processes Bayesian analysis can sense multiple parameters from the data stream of the one sensor. It does so by computing a joint probability density function of a set of parameter values from the sensor output. However, there is no inherent requirement that the information must come from a single sensor. If multiple sensors are applied to a single process, where several different parameters are implicit in each sensor output data stream, the joint probability density function of all the parameters of interest can be computed in exactly the same manner as the single sensor case. Thus, it is just as practical to base decisions on multiple sensor outputs as it is for single sensors. This should provide a practical way to combine the outputs of dissimilar sensors, such as ground penetrating radar and electromagnetic induction devices, producing a better detection decision than could be provided by either sensor alone.
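    The multi-sensor fusion described here amounts to multiplying each sensor's likelihood into one joint posterior over the shared parameters. A one-parameter grid sketch (the Gaussian likelihoods and the two "sensor" readings are invented for illustration, not real radar or induction responses):

```python
import math

def gauss_like(theta, obs, sigma):
    # unnormalized Gaussian likelihood of observation obs given parameter theta
    return math.exp(-0.5 * ((obs - theta) / sigma) ** 2)

def grid_posterior(grid, sensors):
    # joint posterior over theta: flat prior times the product of all
    # sensors' likelihoods, normalized on the grid
    w = [1.0] * len(grid)
    for obs, sigma in sensors:
        w = [wi * gauss_like(t, obs, sigma) for wi, t in zip(w, grid)]
    z = sum(w)
    return [wi / z for wi in w]

def mean_var(grid, w):
    m = sum(t * wi for t, wi in zip(grid, w))
    return m, sum((t - m) ** 2 * wi for t, wi in zip(grid, w))

grid = [k / 100.0 for k in range(201)]  # candidate parameter values 0.00 .. 2.00
radar = (1.0, 0.3)  # (reading, noise sigma), illustrative numbers
emi = (1.1, 0.2)
m1, v1 = mean_var(grid, grid_posterior(grid, [radar]))
m2, v2 = mean_var(grid, grid_posterior(grid, [emi]))
mf, vf = mean_var(grid, grid_posterior(grid, [radar, emi]))
```

    The fused posterior is always at least as sharp as either single-sensor posterior, which is exactly why combining dissimilar sensors improves the detection decision.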

  2. An Application of Bayesian Approach in Modeling Risk of Death in an Intensive Care Unit

    Wong, Rowena Syn Yin; Ismail, Noor Azina


    Background and Objectives There are not many studies that attempt to model intensive care unit (ICU) risk of death in developing countries, especially in South East Asia. The aim of this study was to propose and describe the application of a Bayesian approach in modeling in-ICU deaths in a Malaysian ICU. Methods This was a prospective study in a mixed medical-surgery ICU in a multidisciplinary tertiary referral hospital in Malaysia. Data collection included variables that were defined in the Acute Physiology and Chronic Health Evaluation IV (APACHE IV) model. A Bayesian Markov Chain Monte Carlo (MCMC) simulation approach was applied in the development of four multivariate logistic regression predictive models for the ICU, where the main outcome measure was in-ICU mortality risk. The performance of the models was assessed through overall model fit, discrimination and calibration measures. Results from the Bayesian models were also compared against results obtained using the frequentist maximum likelihood method. Results The study involved 1,286 consecutive ICU admissions between January 1, 2009 and June 30, 2010, of which 1,111 met the inclusion criteria. Patients who were admitted to the ICU were generally younger, predominantly male, with a low co-morbidity load and mostly under mechanical ventilation. The overall in-ICU mortality rate was 18.5% and the overall mean Acute Physiology Score (APS) was 68.5. All four models exhibited good discrimination, with area under the receiver operating characteristic curve (AUC) values approximately 0.8. Calibration was acceptable (Hosmer-Lemeshow p-values > 0.05) for all models, except for model M3. Model M1 was identified as the model with the best overall performance in this study. Conclusion Four prediction models were proposed, where the best model was chosen based on its overall performance in this study. This study has also demonstrated the promising potential of the Bayesian MCMC approach as an alternative in the analysis and modeling of
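    A stripped-down version of Bayesian MCMC logistic regression is a random-walk Metropolis sampler over the coefficient, with the posterior proportional to the likelihood under a flat prior. The single-predictor synthetic dataset below is an illustration only, not the APACHE IV model:

```python
import math, random

def logit_loglik(beta, data):
    # log-likelihood of a no-intercept logistic regression with slope beta
    ll = 0.0
    for x, y in data:
        p = 1.0 / (1.0 + math.exp(-beta * x))
        ll += math.log(p if y == 1 else 1.0 - p)
    return ll

# synthetic outcomes generated with true slope 1.5 (intercept omitted for brevity)
random.seed(2)
data = []
for _ in range(300):
    x = random.uniform(-2, 2)
    p = 1.0 / (1.0 + math.exp(-1.5 * x))
    data.append((x, 1 if random.random() < p else 0))

def metropolis(data, steps=4000, scale=0.3):
    # random-walk Metropolis; flat prior, so posterior is proportional
    # to the likelihood and the prior cancels in the acceptance ratio
    beta, ll = 0.0, logit_loglik(0.0, data)
    draws = []
    for _ in range(steps):
        prop = beta + random.gauss(0.0, scale)
        ll_prop = logit_loglik(prop, data)
        if math.log(random.random()) < ll_prop - ll:
            beta, ll = prop, ll_prop
        draws.append(beta)
    return draws

draws = metropolis(data)
beta_hat = sum(draws[1000:]) / len(draws[1000:])  # posterior mean after burn-in
```

    The same machinery scales to the multivariate models of the study by proposing a vector of coefficients, and the retained draws give credible intervals directly instead of asymptotic standard errors.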

  3. A Defence of the AR4’s Bayesian Approach to Quantifying Uncertainty

    Vezer, M. A.


    The field of climate change research is a kimberlite pipe filled with philosophic diamonds waiting to be mined and analyzed by philosophers. Within the scientific literature on climate change, there is much philosophical dialogue regarding the methods and implications of climate studies. To date, however, discourse regarding the philosophy of climate science has been confined predominantly to scientific - rather than philosophical - investigations. In this paper, I hope to bring one such issue to the surface for explicit philosophical analysis: The purpose of this paper is to address a philosophical debate pertaining to the expressions of uncertainty in the International Panel on Climate Change (IPCC) Fourth Assessment Report (AR4), which, as will be noted, has received significant attention in scientific journals and books, as well as sporadic glances from the popular press. My thesis is that the AR4's Bayesian method of uncertainty analysis and uncertainty expression is justifiable on pragmatic grounds: it overcomes problems associated with vagueness, thereby facilitating communication between scientists and policy makers such that the latter can formulate decision analyses in response to the views of the former. Further, I argue that the most pronounced criticisms against the AR4's Bayesian approach, which are outlined below, are misguided. §1 Introduction Central to AR4 is a list of terms related to uncertainty that in colloquial conversations would be considered vague. The IPCC attempts to reduce the vagueness of its expressions of uncertainty by calibrating uncertainty terms with numerical probability values derived from a subjective Bayesian methodology. This style of analysis and expression has stimulated some controversy, as critics reject as inappropriate and even misleading the association of uncertainty terms with Bayesian probabilities. [...] The format of the paper is as follows. The investigation begins (§2) with an explanation of

  4. Estimation of Housing Vacancy Distributions: Basic Bayesian Approach Using Utility Data

    Kumagai, K.; Matsuda, Y.; Ono, Y.


    In this study, we analyze the quality of water hydrant data for estimating housing vacancies, based on their spatial relationships with other geographical data that we consider to be correlated with such vacancies. We compare against in-situ vacant house data in several small districts, thus verifying the applicability of the water hydrant data to the detection of vacant houses. Taking a Bayesian approach, we repeatedly apply Bayesian updating to the water hydrant data and other geographical data to classify houses as vacant or occupied. We discuss the results of this classification using the temporal intervals associated with turning off metering, fluctuations in local population density, the densities of water hydrants as indicators of vacancies, and several other geographical data. We also conduct a feasibility study on visualising the estimated housing vacancy distributions derived from fine-spatial-resolution data.
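    The repeated Bayesian updating described above can be sketched as a chain of single-evidence updates. The indicators and likelihood values below are hypothetical, purely to show the mechanics:

```python
# Each geographical indicator (metering status, population density, ...)
# updates the probability that a house is vacant via Bayes' rule.
# Indicator likelihoods here are hypothetical, not from the study.

def update(prior, p_obs_given_vacant, p_obs_given_occupied):
    """One Bayes update: P(vacant | obs) from a prior and two likelihoods."""
    num = p_obs_given_vacant * prior
    den = num + p_obs_given_occupied * (1.0 - prior)
    return num / den

p = 0.10                 # prior vacancy rate for the district
p = update(p, 0.8, 0.1)  # water meter turned off for a long interval
p = update(p, 0.6, 0.4)  # declining local population density
print(round(p, 3))       # → 0.571
```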

  5. A Bayesian approach to the semi-analytic model of galaxy formation: methodology

    Lu, Yu; Weinberg, Martin D; Katz, Neal S


    We believe that a wide range of physical processes conspire to shape the observed galaxy population, but we remain unsure of their detailed interactions. The semi-analytic model (SAM) of galaxy formation uses multi-dimensional parameterizations of the physical processes of galaxy formation and provides a tool to constrain these underlying physical interactions. Because of the high dimensionality, the parametric problem of galaxy formation may be profitably tackled with a Bayesian-inference based approach, which allows one to constrain theory with data in a statistically rigorous way. In this paper, we develop a generalized SAM using the framework of Bayesian inference. We show that, with a parallel implementation of an advanced Markov-Chain Monte-Carlo algorithm, it is now possible to rigorously sample the posterior distribution of the high-dimensional parameter space of typical SAMs. As an example, we characterize galaxy formation in the current $\Lambda$CDM cosmology using the stellar mass function of galaxies a...
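    The posterior sampling machinery referred to above can be illustrated with a minimal random-walk Metropolis sampler. The toy one-dimensional log-posterior stands in for a real SAM parameter space; only the propose/accept structure carries over:

```python
import math
import random

def log_post(theta):
    """Toy log-posterior (standard normal), a stand-in for a SAM posterior."""
    return -0.5 * theta * theta

def metropolis(n, step=1.0, seed=0):
    """Random-walk Metropolis: propose a Gaussian step, accept with
    probability min(1, posterior ratio); otherwise keep the current state."""
    rng = random.Random(seed)
    theta, samples = 0.0, []
    for _ in range(n):
        prop = theta + rng.gauss(0.0, step)
        if rng.random() < math.exp(min(0.0, log_post(prop) - log_post(theta))):
            theta = prop
        samples.append(theta)
    return samples

s = metropolis(20000)
```

With enough samples, the chain's mean and variance approach those of the target posterior (0 and 1 here); real applications run many such chains in parallel over dozens of parameters.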

  6. Bayesian Lorentzian profile fitting using Markov-Chain Monte Carlo: An observer's approach

    Gruberbauer, M; Weiss, W W


    Aims. Investigating stochastically driven pulsation puts strong requirements on the quality of (observed) pulsation frequency spectra, such as the accuracy of frequencies, amplitudes, and mode lifetimes and -- important when fitting these parameters with models -- a realistic error estimate, which can be quite different from the formal error. As has been shown by other authors, the method of fitting Lorentzian profiles to the power spectrum of time-resolved photometric or spectroscopic data via the Maximum Likelihood Estimation (MLE) procedure delivers good approximations for these quantities. We, however, intend to demonstrate that a conservative Bayesian approach allows one to treat this problem in a more consistent way. Methods. We derive a conservative Bayesian treatment for the probability of Lorentzian profiles being present in a power spectrum and describe its implementation via evaluating the probability density distribution of parameters using the Markov-Chain Monte Carlo (MCMC) technique. In addition, ...
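    A sketch of the ingredients of such a fit: a Lorentzian mean-power model, plus the standard assumption that the observed power at each frequency is exponentially distributed about the model (chi-squared with two degrees of freedom), which yields the log-likelihood that both the MLE and the Bayesian/MCMC treatments build on. The parameterization is illustrative:

```python
import math

def lorentzian(nu, height, nu0, width, noise):
    """Mean power at frequency nu: one Lorentzian profile plus flat noise."""
    return height / (1.0 + ((nu - nu0) / width) ** 2) + noise

def log_likelihood(freqs, powers, params):
    """Log-likelihood under exponentially distributed power about the model."""
    ll = 0.0
    for nu, p in zip(freqs, powers):
        m = lorentzian(nu, *params)
        ll += -math.log(m) - p / m
    return ll
```

An MCMC sampler (as in the sketch under record 5) would explore this log-likelihood, combined with priors, over (height, nu0, width, noise).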

  7. Evaluating a Bayesian approach to improve accuracy of individual photographic identification methods using ecological distribution data

    Richard Stafford


    Photographic identification of individual organisms can be possible from natural body markings. Data from photo-ID can be used to estimate important ecological and conservation metrics such as population sizes, home ranges or territories. However, poor-quality photographs or less well-studied individuals can result in a non-unique ID, potentially confounding several similar-looking individuals. Here we present a Bayesian approach that uses known data about previous sightings of individuals at specific sites as priors to help assess the problems of obtaining a non-unique ID. Using a simulation of individuals with different confidence of correct ID, we evaluate the accuracy of Bayesian-modified (posterior) probabilities. However, in most cases, the accuracy of identification decreases. Although this technique is unsuccessful, it does demonstrate the importance of computer simulations in testing such hypotheses in ecology.
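    The core of the approach above is an ordinary Bayes update: photographic match likelihoods over a candidate set are weighted by site-specific sighting priors. A minimal sketch with hypothetical individuals and numbers:

```python
# Disambiguating a non-unique photo-ID with site-specific sighting priors.
# Candidate names and all probabilities are hypothetical.

def posterior_ids(match_likelihoods, site_priors):
    """P(individual | photo, site) over the candidate set, via Bayes' rule."""
    joint = {k: match_likelihoods[k] * site_priors.get(k, 0.0)
             for k in match_likelihoods}
    total = sum(joint.values())
    return {k: v / total for k, v in joint.items()}

# Two similar-looking individuals: the photo slightly favours "A",
# but "B" has been sighted at this site far more often.
post = posterior_ids({"A": 0.6, "B": 0.4}, {"A": 0.1, "B": 0.9})
```

Here the site history shifts most of the posterior mass to "B" despite the photographic evidence mildly favouring "A".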

  8. Sequential Bayesian Detection: A Model-Based Approach

    Candy, J V


    Sequential detection theory has been known for a long time, evolving from Wald's work in the late 1940s and followed by Middleton's classic exposition in the 1960s, coupled with the concurrent enabling technology of digital computer systems and the development of sequential processors. Its development, when coupled to modern sequential model-based processors, offers a reasonable way to attack physics-based problems. In this chapter, the fundamentals of sequential detection are reviewed from the Neyman-Pearson theoretical perspective and formulated for both linear and nonlinear (approximate) Gauss-Markov, state-space representations. We review the development of modern sequential detectors and incorporate the sequential model-based processors as an integral part of their solution. Motivated by a wealth of physics-based detection problems, we show how both linear and nonlinear processors can seamlessly be embedded into the sequential detection framework to provide a powerful approach to solving non-stationary detection problems.
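    The sequential detection framework reviewed here descends from Wald's sequential probability ratio test, which can be sketched in a few lines: accumulate the log-likelihood ratio sample by sample and stop when it crosses thresholds set by the target error rates. The Gaussian mean-shift example is illustrative:

```python
import math

def sprt(samples, loglr, alpha=0.01, beta=0.01):
    """Wald's SPRT: return ('H1' | 'H0', n samples used), or
    ('undecided', n) if the data run out before a threshold is crossed."""
    upper = math.log((1 - beta) / alpha)   # cross -> accept H1
    lower = math.log(beta / (1 - alpha))   # cross -> accept H0
    s = 0.0
    for n, x in enumerate(samples, 1):
        s += loglr(x)
        if s >= upper:
            return "H1", n
        if s <= lower:
            return "H0", n
    return "undecided", len(samples)

# Gaussian mean-shift detection: H1 mean 1, H0 mean 0, unit variance.
loglr = lambda x: x - 0.5   # log N(x; 1, 1) - log N(x; 0, 1)
decision, n = sprt([1.2, 0.9, 1.4, 1.1, 0.8, 1.3, 1.0, 1.2], loglr)
```

The model-based extension replaces the i.i.d. log-likelihood ratio with one computed from a state-space (Kalman-type) innovations sequence.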

  9. Transdimensional Bayesian approach to pulsar timing noise analysis

    Ellis, J. A.; Cornish, N. J.


    The modeling of intrinsic noise in pulsar timing residual data is of crucial importance for gravitational wave detection and pulsar timing (astro)physics in general. The noise budget in pulsars is a collection of several well-studied effects including radiometer noise, pulse-phase jitter noise, dispersion measure variations, and low-frequency spin noise. However, as pulsar timing data continue to improve, nonstationary and non-power-law noise terms are beginning to manifest which are not well modeled by current noise analysis techniques. In this work, we use a transdimensional approach to model these nonstationary and non-power-law effects through the use of a wavelet basis and an interpolation-based adaptive spectral modeling. In both cases, the number of wavelets and the number of control points in the interpolated spectrum are free parameters that are constrained by the data and then marginalized over in the final inferences, thus fully incorporating our ignorance of the noise model. We show that these new methods outperform standard techniques when nonstationary and non-power-law noise is present. We also show that these methods return results consistent with the standard analyses when no such signals are present.

  10. A Bayesian Approach to Risk Informed Performance Based Regulation for Digital I and C QA Programs

    The purpose of applying Risk Informed Performance Based Regulation (RIPBR) is to reduce unnecessary conservatism existing in current regulations. This paper proposes a systematic way to find such unnecessary conservatism based on the Bayesian Belief Network (BBN) modeling technique. First, a Bayesian QA process model is developed, and the corresponding event tree is then derived from the BBN. Risk insight into different QA activities can thus be gained by comparing their contributions to final quality, to determine their necessity. Independent V and V, prescribed by RG 1.168, is selected as a case study to demonstrate the effectiveness of this approach. The proposed Bayesian approach appears to be very promising in supporting RIPBR practice for digital I and C QA programs. Related issues and future work are also discussed. It is a consensus view between licensees and regulators that there may exist unnecessary conservatism in current digital I and C QA regulatory requirements. If such conservatism can be identified and reduced, then the limited resources of both licensees and regulators can be utilized more effectively. The goal of RIPBR promoted by the USNRC is to provide a generic regulatory framework to eliminate such conservatism in all NRC regulatory activities (NRC, 1995). However, in order to take advantage of RIPBR, one needs techniques to identify unnecessary conservatism, and such techniques have not yet been fully established for digital I and C systems. This paper proposes a Bayesian-based approach to identifying unnecessary conservatism in current digital I and C QA program requirements. A QA program causal influence model is developed first, and then a corresponding event tree enumerating potential scenarios is derived from this model. Risk insight into different QA activities can then be gained by comparing their contributions to scenario results. The QA activities that do not have significant impact on results
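    The kind of risk-insight comparison described above can be sketched by computing final quality with a QA activity switched on and off and treating the difference as that activity's contribution. The model below is a deliberately tiny stand-in, not the RG 1.168 network, and all probabilities are hypothetical:

```python
# Toy QA "event tree": a defect is introduced with some probability, and each
# QA activity independently catches it with some probability. All numbers are
# hypothetical; the point is the enabled-vs-disabled comparison.

def p_quality(p_defect_introduced, activities):
    """Each activity is (p_performed, p_catch_defect_if_performed)."""
    p_escape = p_defect_introduced
    for p_perf, p_catch in activities:
        p_escape *= 1.0 - p_perf * p_catch
    return 1.0 - p_escape

with_vv    = p_quality(0.2, [(1.0, 0.7), (1.0, 0.6)])  # review + independent V&V
without_vv = p_quality(0.2, [(1.0, 0.7)])              # review only
contribution = with_vv - without_vv
```

An activity whose contribution is negligible under such a comparison is a candidate for the "unnecessary conservatism" the paper seeks to identify.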

  11. Identification of information tonality based on Bayesian approach and neural networks

    Lande, D V


    A model for the identification of information tonality, based on a Bayesian approach and neural networks, is described. In the context of this paper, tonality means the positive or negative tone of both the whole information item and those of its parts that relate to particular concepts. The method, whose application is presented in the paper, is based on statistical regularities connected with the presence of particular lexemes in texts. A distinctive feature of the method is its simplicity and versatility. Conceptually similar approaches are currently widely used for spam control.
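    A minimal naive-Bayes sketch of the lexeme-based idea: tone is inferred from statistical regularities of word occurrence, with Laplace smoothing for unseen lexemes. The training texts are hypothetical, and the paper's actual model additionally involves neural networks:

```python
import math
from collections import Counter

def train(docs):
    """docs: list of (text, class) pairs; returns per-class lexeme counts."""
    counts = {c: Counter() for c in {c for _, c in docs}}
    for text, c in docs:
        counts[c].update(text.split())
    return counts

def classify(counts, text):
    """Naive Bayes with Laplace smoothing and uniform class priors."""
    vocab = {w for cnt in counts.values() for w in cnt}
    best, best_lp = None, -math.inf
    for c, cnt in counts.items():
        total = sum(cnt.values())
        lp = sum(math.log((cnt[w] + 1) / (total + len(vocab)))
                 for w in text.split())
        if lp > best_lp:
            best, best_lp = c, lp
    return best

model = train([("good great excellent", "pos"),
               ("bad awful poor", "neg")])
```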

  12. Model‐Based Assessment of Alternative Study Designs in Pediatric Trials. Part II: Bayesian Approaches

    Smania, G; Baiardi, P; Ceci, A; Magni, P


    This study presents a pharmacokinetic‐pharmacodynamic based clinical trial simulation framework for evaluating the performance of a fixed‐sample Bayesian design (BD) and two alternative Bayesian sequential designs (BSDs) (i.e., a non‐hierarchical (NON‐H) and a semi‐hierarchical (SEMI‐H) one). Prior information was elicited from adult trials and weighted based on the expected similarity of response to treatment between the pediatric and adult populations. Study designs were evaluated in terms of: type I and II errors, sample size per arm (SS), trial duration (TD), and estimate precision. No substantial differences were observed between NON‐H and SEMI‐H. BSDs require, on average, smaller SS and TD compared to the BD, which, on the other hand, guarantees higher estimate precision. When large differences between children and adults are expected, BSDs can return very large SS. Bayesian approaches appear to outperform their frequentist counterparts in the design of pediatric trials even when little weight is given to prior information from adults. PMID:27530374
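    The flavour of a Bayesian sequential design can be sketched with a beta-binomial model: an adult-trial prior is down-weighted by a similarity factor, updated after each pediatric cohort, and the trial stops once the posterior probability of efficacy clears a threshold. All numbers (weight, cohort outcomes, threshold) are hypothetical; the posterior tail probability is estimated by Monte Carlo:

```python
import random

def prob_efficacy(a, b, threshold=0.5, n=20000, seed=1):
    """Monte Carlo estimate of P(response rate > threshold) for Beta(a, b)."""
    rng = random.Random(seed)
    return sum(rng.betavariate(a, b) > threshold for _ in range(n)) / n

w = 0.2                                  # similarity weight for the adult prior
a, b = 1 + w * 30, 1 + w * 10            # adult trial: 30 responders / 40
for successes, failures in [(7, 3), (8, 2)]:   # pediatric cohorts of 10
    a, b = a + successes, b + failures
    if prob_efficacy(a, b) > 0.95:       # stop early once efficacy is clear
        break
```

With these (hypothetical) numbers the design stops after the first cohort, illustrating how BSDs can need smaller sample sizes than a fixed-sample design.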

  13. A surrogate model enables a Bayesian approach to the inverse problem of scatterometry

    Scatterometry is an indirect optical method for determining photomask geometry parameters from scattered light intensities by solving an inverse problem. The Bayesian approach is a powerful method for solving the inverse problem. In the Bayesian framework, estimates of parameters and associated uncertainties are obtained from posterior distributions. The determination of the posterior distribution is typically based on Markov chain Monte Carlo (MCMC) methods. However, in scatterometry the evaluation of MCMC steps requires solutions of partial differential equations that are computationally expensive, and the application of MCMC methods is thus impractical. In this article we introduce a surrogate model for scatterometry, based on polynomial chaos, that can be treated by Bayesian inference. We compare the results of the surrogate model with rigorous finite element simulations and demonstrate its convergence. The accuracy reaches better than one percent for a sufficiently fine mesh, and the speed-up amounts to more than two orders of magnitude. Furthermore, we apply the surrogate model to MCMC calculations and reconstruct geometry parameters of a photomask

  14. Applying a Bayesian Approach to Identification of Orthotropic Elastic Constants from Full Field Displacement Measurements

    Le Riche R.


    A major challenge in the identification of material properties is handling different sources of uncertainty in the experiment, and in the modelling of the experiment, when estimating the resulting uncertainty in the identified properties. Numerous improvements in identification methods have provided increasingly accurate estimates of various material properties. However, characterizing the uncertainty in the identified properties is still relatively crude. Different material properties obtained from a single test are not obtained with the same confidence. Typically, the highest uncertainty is associated with the properties to which the experiment is least sensitive. In addition, the uncertainty in different properties can be strongly correlated, so that obtaining only variance estimates may be misleading. A possible approach for handling the different sources of uncertainty and estimating the uncertainty in the identified properties is the Bayesian method. This method was introduced in the late 1970s in the context of identification [1] and has since been applied to different problems, notably the identification of elastic constants from plate vibration experiments [2]-[4]. The applications of the method to these classical pointwise tests involved only a small number of measurements (typically ten natural frequencies in the previously cited vibration test), which facilitated the application of the Bayesian approach. For identifying elastic constants, full-field strain or displacement measurements provide a high number of measured quantities (one measurement per image pixel) and hence a promise of smaller uncertainties in the properties. However, the high number of measurements also represents a major computational challenge in applying the Bayesian approach to full-field measurements. To address this challenge we propose an approach based on the proper orthogonal decomposition (POD) of the full fields in order to drastically reduce their

  15. A Bayesian Approach to Combine Landsat and ALOS PALSAR Time Series for Near Real-Time Deforestation Detection

    Johannes Reiche; Sytze de Bruin; Dirk Hoekman; Jan Verbesselt; Martin Herold


    To address the need for timely information on newly deforested areas at medium resolution scale, we introduce a Bayesian approach to combine SAR and optical time series for near real-time deforestation detection. Once a new image of either of the input time series is available, the conditional probability of deforestation is computed using Bayesian updating, and deforestation events are indicated. Future observations are used to update the conditional probability of deforestation and, thus, t...

  16. A dynamic Bayesian network based approach to safety decision support in tunnel construction

    This paper presents a systemic decision approach with step-by-step procedures based on a dynamic Bayesian network (DBN), aiming to provide guidelines for dynamic safety analysis of tunnel-induced road surface damage over time. The proposed DBN-based approach can accurately capture the dynamic, updated character of geological, design and mechanical variables as construction progresses, in order to overcome deficiencies of traditional fault analysis methods. Adopting the predictive, sensitivity and diagnostic analysis techniques of DBN inference, this approach is able to perform feed-forward, concurrent and back-forward control, respectively, on a quantitative basis, and to provide real-time support before and after an accident. A case study relating to dynamic safety analysis in the construction of the Wuhan Yangtze Metro Tunnel in China is used to verify the feasibility of the proposed approach, as well as its application potential. The relationships between the DBN-based and BN-based approaches are further discussed in light of the analysis results. The proposed approach can be used as a decision tool to provide support for safety analysis in tunnel construction, and thus increase the likelihood of a successful project in a dynamic project environment. - Highlights: • A dynamic Bayesian network (DBN) based approach for safety decision support is developed. • This approach is able to perform feed-forward, concurrent and back-forward analysis and control. • A case concerning dynamic safety analysis in the Wuhan Yangtze Metro Tunnel in China is presented. • The DBN-based approach achieves higher accuracy than the traditional static BN-based approach

  17. Bayesian Multi-Energy Computed Tomography reconstruction approaches based on decomposition models

    Multi-Energy Computed Tomography (MECT) makes it possible to obtain multiple fractions of basis materials without segmentation. In medical applications, one is the soft-tissue-equivalent water fraction and the other is the hard-matter-equivalent bone fraction. Practical MECT measurements are usually obtained with polychromatic X-ray beams. Existing reconstruction approaches based on linear forward models that do not account for beam polychromaticity fail to estimate the correct decomposition fractions and result in Beam-Hardening Artifacts (BHA). Existing BHA correction approaches either need to refer to calibration measurements or suffer from the noise amplification caused by the negative-log pre-processing and the water/bone separation problem. To overcome these problems, statistical DECT reconstruction approaches based on non-linear forward models that account for beam polychromaticity show great potential for giving accurate fraction images. This work proposes a full-spectral Bayesian reconstruction approach which allows the reconstruction of high-quality fraction images from ordinary polychromatic measurements. This approach is based on a Gaussian noise model with unknown variance assigned directly to the projections, without taking the negative log. Following Bayesian inference, the decomposition fractions and observation variance are estimated using the joint Maximum A Posteriori (MAP) estimation method. Subject to an adaptive prior model assigned to the variance, the joint estimation problem is simplified into a single estimation problem. This transforms the joint MAP estimation problem into a minimization problem with a non-quadratic cost function. To solve it, the use of a monotone Conjugate Gradient (CG) algorithm with suboptimal descent steps is proposed. The performance of the proposed approach is analyzed with both simulated and experimental data. The results show that the proposed Bayesian approach is robust to noise and materials. It is also

  18. Capturing changes in flood risk with Bayesian approaches for flood damage assessment

    Vogel, Kristin; Schröter, Kai; Kreibich, Heidi; Thieken, Annegret; Müller, Meike; Sieg, Tobias; Laudan, Jonas; Kienzler, Sarah; Weise, Laura; Merz, Bruno; Scherbaum, Frank


    Flood risk is a function of hazard as well as of exposure and vulnerability. All three components change over space and time and have to be considered for reliable damage estimation and risk analysis, since these are the basis for an efficient, adaptable risk management. Hitherto, models for estimating flood damage are comparatively simple and cannot sufficiently account for changing conditions. The Bayesian network approach allows for multivariate modeling of complex systems without relying on expert knowledge about physical constraints. In a Bayesian network, each model component is considered to be a random variable. The interactions between those variables can be learned from observations or be defined by expert knowledge; even a combination of both is possible. Moreover, the probabilistic framework captures uncertainties related to the prediction and provides a probability distribution for the damage instead of a point estimate. The graphical representation of Bayesian networks helps to study the change of probabilities under changing circumstances and may thus simplify the communication between scientists and public authorities. In the framework of the DFG Research Training Group "NatRiskChange" we aim to develop Bayesian networks for flood damage and vulnerability assessments of residential buildings and companies under changing conditions. A Bayesian network learned from data collected over the last 15 years in flooded regions in the Elbe and Danube catchments (Germany) reveals the impact of many variables, such as building characteristics, precaution and warning situation, on flood damage to residential buildings. While the handling of incomplete and hybrid (discrete mixed with continuous) data is the most challenging issue in the study on residential buildings, a similar study, which focuses on the vulnerability of small to medium-sized companies, bears new challenges. Relying on a much smaller data set for the determination of the model

  19. Uncertainty analysis of pollutant build-up modelling based on a Bayesian weighted least squares approach

    Haddad, Khaled [School of Computing, Engineering and Mathematics, University of Western Sydney, Building XB, Locked Bag 1797, Penrith, NSW 2751 (Australia); Egodawatta, Prasanna [Science and Engineering Faculty, Queensland University of Technology, GPO Box 2434, Brisbane 4001 (Australia); Rahman, Ataur [School of Computing, Engineering and Mathematics, University of Western Sydney, Building XB, Locked Bag 1797, Penrith, NSW 2751 (Australia); Goonetilleke, Ashantha, E-mail: [Science and Engineering Faculty, Queensland University of Technology, GPO Box 2434, Brisbane 4001 (Australia)


    Reliable pollutant build-up prediction plays a critical role in the accuracy of urban stormwater quality modelling outcomes. However, water quality data collection is resource demanding compared to streamflow data monitoring, where a greater quantity of data is generally available. Consequently, available water quality datasets span only relatively short time scales, unlike water quantity data. Therefore, the ability to take due consideration of the variability associated with pollutant processes and natural phenomena is constrained. This in turn gives rise to uncertainty in the modelling outcomes, as research has shown that pollutant loadings on catchment surfaces and rainfall within an area can vary considerably over space and time scales. Therefore, the assessment of model uncertainty is an essential element of informed decision making in urban stormwater management. This paper presents the application of a range of regression approaches such as ordinary least squares regression, weighted least squares regression and Bayesian weighted least squares regression for the estimation of uncertainty associated with pollutant build-up prediction using limited datasets. The study outcomes confirmed that the use of ordinary least squares regression with fixed model inputs and limited observational data may not provide realistic estimates. The stochastic nature of the dependent and independent variables needs to be taken into consideration in pollutant build-up prediction. It was found that the use of the Bayesian approach along with the Monte Carlo simulation technique provides a powerful tool, which attempts to make the best use of the available knowledge in prediction and thereby presents a practical solution to counteract the limitations which are otherwise imposed on water quality modelling. - Highlights: ► Water quality data spans short time scales leading to significant model uncertainty. ► Assessment of uncertainty essential for informed decision making in water
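    The contrast between ordinary and weighted least squares that the study exploits can be shown with a closed-form straight-line fit in which noisier observations receive lower weight. The build-up data below are hypothetical:

```python
# Weighted least squares for a straight-line build-up model y = a + b*x,
# via the closed-form weighted normal equations. OLS is the special case of
# equal weights. Data and weights are hypothetical.

def wls_line(xs, ys, ws):
    sw  = sum(ws)
    sx  = sum(w * x for w, x in zip(ws, xs))
    sy  = sum(w * y for w, y in zip(ws, ys))
    sxx = sum(w * x * x for w, x in zip(ws, xs))
    sxy = sum(w * x * y for w, x, y in zip(ws, xs, ys))
    b = (sw * sxy - sx * sy) / (sw * sxx - sx * sx)
    a = (sy - b * sx) / sw
    return a, b

xs = [1, 2, 3, 4]                    # antecedent dry days
ys = [2.1, 3.9, 6.2, 7.8]            # pollutant build-up (hypothetical units)
ols = wls_line(xs, ys, [1, 1, 1, 1])
wls = wls_line(xs, ys, [4, 4, 1, 1])  # later samples noisier -> lower weight
```

The Bayesian weighted variant goes one step further, placing distributions on the coefficients and weights and propagating them (e.g. by Monte Carlo) into predictive uncertainty.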

  20. Bayesian approach for determination of energy response of some radiation detectors to neutrons

    Cabal, Arango Gonzalo Alfonso; Spurný, František; Turek, Karel

    Praha : ČVUT v Praze, 2007, s. 166-167. ISBN 978-80-01-03901-4. [Dny radiační ochrany /29./. Kouty nad Desnou, Hrubý Jeseník (CZ), 05.11.2007-09.11.2007] R&D Projects: GA ČR(CZ) GD202/05/H031; GA MŠk 1P05OC032 Institutional research plan: CEZ:AV0Z10480505 Keywords : Bayesian approach * radiation detectors * neutrons Subject RIV: BG - Nuclear, Atomic and Molecular Physics, Colliders

  1. Validation of Sustainable Development Practices Scale Using the Bayesian Approach to Item Response Theory

    Martin Hernani Merino


    There has been growing recognition of the importance of creating performance measurement tools for the economic, social and environmental management of micro and small enterprises (MSEs). In this context, this study aims to validate an instrument to assess perceptions of sustainable development practices by MSEs by means of a Graded Response Model (GRM) with a Bayesian approach to Item Response Theory (IRT). The results, based on a sample of 506 university students in Peru, suggest that a valid measurement instrument was achieved. At the end of the paper, methodological and managerial contributions are presented.

  2. A Bayesian Approach to Jointly Estimate Tire Radii and Vehicle Trajectory

    Özkan, Emre; Lundquist, Christian; Gustafsson, Fredrik


    High-precision estimation of vehicle tire radii is considered, based on measurements on individual wheel speeds and absolute position from a global navigation satellite system (GNSS). The wheel speed measurements are subject to noise with time-varying covariance that depends mainly on the road surface. The novelty lies in a Bayesian approach to estimate online the time-varying radii and noise parameters using a marginalized particle filter, where no model approximations are needed such as in ...

  3. Crystalline nucleation in undercooled liquids: A Bayesian data-analysis approach for a nonhomogeneous Poisson process

    Filipponi, A.; Di Cicco, A.; Principi, E.


    A Bayesian data-analysis approach to data sets of maximum undercooling temperatures recorded in repeated melting-cooling cycles of high-purity samples is proposed. The crystallization phenomenon is described in terms of a nonhomogeneous Poisson process driven by a temperature-dependent sample nucleation rate J(T). The method was extensively tested by computer simulations and applied to real data for undercooled liquid Ge. It proved to be particularly useful in the case of scarce data sets where the usage of binned data would degrade the available experimental information.
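    The nonhomogeneous Poisson description can be sketched by simulating the first nucleation event along a cooling ramp with the standard thinning algorithm. The rate law J(T) below is a hypothetical form chosen only to grow steeply with undercooling, not the paper's fitted model:

```python
import math
import random

def sample_nucleation_T(J, T_start, cool_rate, J_max, seed=0):
    """Thinning for a nonhomogeneous Poisson process along a cooling ramp:
    propose events at the bounding rate J_max, accept each with probability
    J(T)/J_max, and return the temperature of the first accepted event.
    J_max must bound J(T) over the temperatures actually reached."""
    rng = random.Random(seed)
    t = 0.0
    while True:
        t += rng.expovariate(J_max)       # next candidate event time
        T = T_start - cool_rate * t       # linear cooling ramp
        if rng.random() < J(T) / J_max:
            return T

# Hypothetical nucleation rate, growing steeply on undercooling (units arbitrary).
J = lambda T: 5.0 * math.exp(-(T - 900.0) / 20.0)
Ts = [sample_nucleation_T(J, 1100.0, 1.0, 10.0, seed=s) for s in range(200)]
```

Repeating many melting-cooling cycles yields the data set of maximum-undercooling temperatures to which the paper's Bayesian analysis of J(T) is then applied.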

  4. A new approach for supply chain risk management: Mapping SCOR into Bayesian network

    Mahdi Abolghasemi


    Purpose: Increases in costs and complexity in organizations, together with greater uncertainty and risk, have led managers to use risk management in order to reduce risk-taking and deviation from goals. Supply chain risk management (SCRM) has a close relationship with supply chain performance (SCP). Over the years, different methods have been used by researchers to manage supply chain risk, but most of them are either qualitative or quantitative. The Supply Chain Operations Reference (SCOR) model is a standard model for SCP evaluation which has uncertainty in its metrics. In this paper, by combining qualitative and quantitative metrics of SCOR, supply chain performance is measured with Bayesian networks. Design/methodology/approach: First, a qualitative assessment is done by recognizing uncertain metrics of the SCOR model; then, by quantifying them, supply chain performance is measured with Bayesian Networks (BNs) and the SCOR model, in which decisions on uncertain variables are made through predictive and diagnostic capabilities. Findings: After applying the proposed method in one of the biggest automotive companies in Iran, we identified key factors of supply chain performance based on the SCOR model through the predictive and diagnostic capabilities of Bayesian networks. After sensitivity analysis, we found that ‘Total cost’ and its criteria, which include the costs of labor, warranty, transportation and inventory, have the widest range and the most effect on supply chain performance. Managers should therefore take their importance into account in decision making. Decisions can be made simply by running the model in different situations. Research limitations/implications: A more precise model would consist of numerous factors, but it is difficult and sometimes impossible to solve big models if we insert all of them into a Bayesian model. We have adapted real-world characteristics to the abilities of our software and method. On the other hand, fewer data exist for some

  5. Life cycle reliability assessment of new products—A Bayesian model updating approach

    The rapidly increasing pace of product development and continuously evolving reliability requirements have made life cycle reliability assessment of new products an imperative yet difficult task. While much work has been done to estimate the reliability of new products in specific stages separately, a gap exists in carrying out life cycle reliability assessment throughout all life cycle stages. We present a Bayesian model updating approach (BMUA) for life cycle reliability assessment of new products. Novel features of this approach are the development of Bayesian information toolkits that separately include a “reliability improvement factor” and an “information fusion factor”, which allow the integration of subjective information in a specific life cycle stage and the transition of integrated information between adjacent life cycle stages. These lead to the unique characteristic of the BMUA that information generated throughout the life cycle stages is integrated coherently. To illustrate the approach, an application to the life cycle reliability assessment of a newly developed Gantry Machining Center is shown

  6. Study on mapping Quantitative Trait Loci for animal complex binary traits using Bayesian-Markov chain Monte Carlo approach

    LIU; Jianfeng; ZHANG; Yuan; ZHANG; Qin; WANG; Lixian; ZHANG; Jigang


    It is a challenging issue to map Quantitative Trait Loci (QTL) underlying complex discrete traits, which usually show discontinuous distributions and carry less information, using conventional statistical methods. The Bayesian-Markov chain Monte Carlo (Bayesian-MCMC) approach is the key procedure for mapping QTL for complex binary traits, providing a complete posterior distribution for the QTL parameters using all prior information. As a consequence, Bayesian estimates of all variables of interest can be obtained straightforwardly from their posterior samples simulated by the MCMC algorithm. In our study, the utility of Bayesian-MCMC is demonstrated using several simulated outbred full-sib animal families with different family structures, for a complex binary trait underlain by both a QTL and polygenes. Under the identity-by-descent-based variance component random model, three MCMC-based samplers, including Gibbs sampling, the Metropolis algorithm and reversible jump MCMC, were implemented to generate the joint posterior distribution of all unknowns, so that the QTL parameters were obtained by Bayesian statistical inference. The results showed that the Bayesian-MCMC approach works well and is robust under different family structures and QTL effects. As family size increases and the number of families decreases, the accuracy of the parameter estimates improves. When the true QTL has a small effect, using an outbred-population experimental design with large family size is the optimal mapping strategy.

  7. Multimethod, multistate Bayesian hierarchical modeling approach for use in regional monitoring of wolves.

    Jiménez, José; García, Emilio J; Llaneza, Luis; Palacios, Vicente; González, Luis Mariano; García-Domínguez, Francisco; Múñoz-Igualada, Jaime; López-Bao, José Vicente


    In many cases, the first step in large-carnivore management is to obtain objective, reliable, and cost-effective estimates of population parameters through procedures that are reproducible over time. However, monitoring predators over large areas is difficult, and the data have a high level of uncertainty. We devised a practical multimethod and multistate modeling approach based on Bayesian hierarchical site-occupancy models that combined multiple survey methods to estimate different population states for use in monitoring large predators at a regional scale. We used wolves (Canis lupus) as our model species and generated reliable estimates of the number of sites with wolf reproduction (presence of pups). We used 2 wolf data sets from Spain (Western Galicia in 2013 and Asturias in 2004) to test the approach. Based on howling surveys, the naïve estimate (i.e., based only on observations) of the number of sites with reproduction was 9 and 25 sites in Western Galicia and Asturias, respectively. Our model showed 33.4 (SD 9.6) and 34.4 (3.9) sites with wolf reproduction, respectively. The estimated proportion of sites occupied by reproducing wolves was 0.67 (SD 0.19) and 0.76 (0.11), respectively. This approach can be used to design more cost-effective monitoring programs (i.e., to define the sampling effort needed per site). Our approach should inspire well-coordinated surveys across multiple administrative borders and populations and lead to improved decision making for management of large carnivores on a landscape level. The use of this Bayesian framework provides a simple way to visualize the degree of uncertainty around population-parameter estimates and thus gives managers and stakeholders an intuitive approach to interpreting monitoring results. Our approach can be widely applied to large spatial scales in wildlife monitoring where detection probabilities differ between population states and where several methods are being used to estimate different population

  8. A hierarchical Bayesian-MAP approach to inverse problems in imaging

    Raj, Raghu G.


    We present a novel approach to inverse problems in imaging based on a hierarchical Bayesian-MAP (HB-MAP) formulation. In this paper we specifically focus on the difficult and basic inverse problem of multi-sensor (tomographic) imaging wherein the source object of interest is viewed from multiple directions by independent sensors. Given the measurements recorded by these sensors, the problem is to reconstruct the image (of the object) with a high degree of fidelity. We employ a probabilistic graphical modeling extension of the compound Gaussian distribution as a global image prior into a hierarchical Bayesian inference procedure. Since the prior employed by our HB-MAP algorithm is general enough to subsume a wide class of priors, including those typically employed in compressive sensing (CS) algorithms, the HB-MAP algorithm offers a vehicle to extend the capabilities of current CS algorithms to include truly global priors. After rigorously deriving the regression algorithm for solving our inverse problem from first principles, we demonstrate the performance of the HB-MAP algorithm on Monte Carlo trials and on real empirical data (natural scenes). In all cases we find that our algorithm outperforms previous approaches in the literature, including filtered back-projection and a variety of state-of-the-art CS algorithms. We conclude with directions of future research emanating from this work.

  9. Bayesian Mediation Analysis

    Yuan, Ying; MacKinnon, David P.


    This article proposes Bayesian analysis of mediation effects. Compared to conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian mediation analysis, inference is straightforward and exact, which makes it appealing for studies with small samples. Third, the Bayesian approach is conceptua...
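
    The abstract is truncated, but the core computation it describes can be sketched: given posterior draws for the two path coefficients a (X to M) and b (M to Y), the posterior of the mediated effect is the distribution of the product ab. The independent normal posteriors and all numbers below are hypothetical, for illustration only.

```python
import random

rng = random.Random(0)
# Hypothetical posterior summaries for paths a (X -> M) and b (M -> Y)
a_mean, a_sd = 0.5, 0.1
b_mean, b_sd = 0.4, 0.1

# Draw from the joint posterior of (a, b) and form the mediated effect ab
ab = [rng.gauss(a_mean, a_sd) * rng.gauss(b_mean, b_sd) for _ in range(20000)]
ab.sort()
lo = ab[int(0.025 * len(ab))]  # 2.5% posterior quantile
hi = ab[int(0.975 * len(ab))]  # 97.5% posterior quantile
```

    Unlike a normal-theory interval, this credible interval automatically reflects the skewness of the product distribution, which matters in small samples.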

  10. Adaptive Methods within a Sequential Bayesian Approach for Structural Health Monitoring

    Huff, Daniel W.

    Structural integrity is an important characteristic of performance for critical components used in applications such as aeronautics, materials, construction and transportation. When appraising the structural integrity of these components, evaluation methods must be accurate. In addition to possessing capability to perform damage detection, the ability to monitor the level of damage over time can provide extremely useful information in assessing the operational worthiness of a structure and in determining whether the structure should be repaired or removed from service. In this work, a sequential Bayesian approach with active sensing is employed for monitoring crack growth within fatigue-loaded materials. The monitoring approach is based on predicting crack damage state dynamics and modeling crack length observations. Since fatigue loading of a structural component can change while in service, an interacting multiple model technique is employed to estimate probabilities of different loading modes and incorporate this information in the crack length estimation problem. For the observation model, features are obtained from regions of high signal energy in the time-frequency plane and modeled for each crack length damage condition. Although this observation model approach exhibits high classification accuracy, the resolution characteristics can change depending upon the extent of the damage. Therefore, several different transmission waveforms and receiver sensors are considered to create multiple modes for making observations of crack damage. Resolution characteristics of the different observation modes are assessed using a predicted mean squared error criterion and observations are obtained using the predicted, optimal observation modes based on these characteristics. Calculation of the predicted mean square error metric can be computationally intensive, especially if performed in real time, and an approximation method is proposed. With this approach, the real time

  11. Heart rate variability estimation in photoplethysmography signals using Bayesian learning approach.

    Alqaraawi, Ahmed; Alwosheel, Ahmad; Alasaad, Amr


    Heart rate variability (HRV) has become a marker for various health and disease conditions. Photoplethysmography (PPG) sensors integrated in wearable devices such as smart watches and phones are widely used to measure heart activities. HRV requires accurate estimation of the time interval between consecutive peaks in the PPG signal. However, the PPG signal is very sensitive to motion artefact, which may lead to poor HRV estimation if false peaks are detected. In this Letter, the authors propose a probabilistic approach based on Bayesian learning to better estimate HRV from PPG signals recorded by wearable devices and to enhance the performance of the automatic multiscale-based peak detection (AMPD) algorithm used for peak detection. The authors' experiments show that their approach enhances the performance of the AMPD algorithm in terms of a number of HRV-related metrics, such as sensitivity, positive predictive value, and average temporal resolution. PMID:27382483

  12. Combining Bayesian source imaging with an equivalent dipole approach to solve the intracranial EEG source localization problem.

    Le Cam, Steven; Caune, Vairis; Ranta, Radu; Korats, Gundars; Louis-Dorr, Valerie


    The brain source localization problem has been extensively studied in the past years, yielding a large panel of methodologies, each bringing its own strengths and weaknesses. Combining several of these approaches might help in enhancing their respective performance. Our study is carried out in the particular context of intracranial recordings, with the objective of explaining the measurements based on a reduced number of dipolar activities. We take advantage of the sparse nature of the Bayesian approaches to separate the noise from the source space, and to distinguish between several source contributions on the electrodes. This first step provides accurate estimates of the dipole projections, which can be used as an entry to an equivalent current dipole fitting procedure. We demonstrate on simulations that the localization results are significantly enhanced by this post-processing step when up to five dipoles are activated simultaneously. PMID:26736344

  13. Default Bayesian analysis for multi-way tables: a data-augmentation approach

    Polson, Nicholas G


    This paper proposes a strategy for regularized estimation in multi-way contingency tables, which are common in meta-analyses and multi-center clinical trials. Our approach is based on data augmentation, and appeals heavily to a novel class of Polya-Gamma distributions. Our main contributions are to build up the relevant distributional theory and to demonstrate three useful features of this data-augmentation scheme. First, it leads to simple EM and Gibbs-sampling algorithms for posterior inference, circumventing the need for analytic approximations, numerical integration, Metropolis-Hastings, or variational methods. Second, it allows modelers much more flexibility when choosing priors, which have traditionally come from the Dirichlet or logistic-normal family. For example, our approach allows users to incorporate Bayesian analogues of classical penalized-likelihood techniques (e.g. the lasso or bridge) in computing regularized estimates for log-odds ratios. Finally, our data-augmentation scheme naturally sugg...
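
    The scheme rests on the Polya-Gamma integral identity (Polson, Scott and Windle), which writes a binomial likelihood in the log-odds ψ as a Gaussian mixture:

```latex
\frac{(e^{\psi})^{a}}{(1+e^{\psi})^{b}}
  = 2^{-b}\, e^{\kappa\psi} \int_{0}^{\infty} e^{-\omega\psi^{2}/2}\, p(\omega)\, \mathrm{d}\omega,
\qquad \kappa = a - \tfrac{b}{2},
```

    where p(ω) is the density of a Polya-Gamma PG(b, 0) random variable. Conditional on the augmented ω, every likelihood term is Gaussian in ψ, which is what yields the simple Gibbs and EM updates the abstract refers to.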

  14. A hierarchical Bayesian approach for reconstructing the Initial Mass Function of Single Stellar Populations

    Dries, M; Koopmans, L V E


    Recent studies based on the integrated light of distant galaxies suggest that the initial mass function (IMF) might not be universal. Variations of the IMF with galaxy type and/or formation time may have important consequences for our understanding of galaxy evolution. We have developed a new stellar population synthesis (SPS) code specifically designed to reconstruct the IMF. We implement a novel approach combining regularization with hierarchical Bayesian inference. Within this approach we use a parametrized IMF prior to regulate a direct inference of the IMF. This direct inference gives more freedom to the IMF and allows the model to deviate from parametrized models when demanded by the data. We use Markov Chain Monte Carlo sampling techniques to reconstruct the best parameters for the IMF prior, the age, and the metallicity of a single stellar population. We present our code and apply our model to a number of mock single stellar populations with different ages, metallicities, and IMFs. When systematic unc...

  15. An evaluation of tsunami hazard using Bayesian approach in the Indian Ocean

    Yadav, R. B. S.; Tsapanos, T. M.; Tripathi, J. N.; Chopra, S.


    hazard assessment in the Indian Ocean region. Tsunami hazard is calculated in the Indian Ocean using a Bayesian approach. The maximum tsunami intensity is estimated as 5.39 ± 0.30 for the Indian Ocean. The tsunami intensity for 50 years with 90% probability is estimated as 4.96 ± 0.25. The tsunami activity rate is estimated as 0.08 events/year and the β-value as 0.81 ± 0.19. Tsunami hazard is also calculated for the Andaman-Sumatra-Java (ASJ) region.

  16. A unifying probabilistic Bayesian approach to derive electron density from MRI for radiation therapy treatment planning

    MRI significantly improves the accuracy and reliability of target delineation in radiation therapy for certain tumors due to its superior soft tissue contrast compared to CT. A treatment planning process with MRI as the sole imaging modality will eliminate systematic CT/MRI co-registration errors, reduce cost and radiation exposure, and simplify clinical workflow. However, MRI lacks the key electron density information necessary for accurate dose calculation and generating reference images for patient setup. The purpose of this work is to develop a unifying method to derive electron density from standard T1-weighted MRI. We propose to combine both intensity and geometry information into a unifying probabilistic Bayesian framework for electron density mapping. For each voxel, we compute two conditional probability density functions (PDFs) of electron density given (1) its T1-weighted MRI intensity, and (2) its geometry in a reference anatomy, obtained by deformable image registration between the MRI of the atlas and test patient. The two conditional PDFs containing intensity and geometry information are combined into a unifying posterior PDF, whose mean value corresponds to the optimal electron density value under the mean-square error criterion. We evaluated the algorithm's accuracy of electron density mapping and its ability to detect bone in the head for eight patients, using an additional patient as the atlas or template. Mean absolute HU error between the estimated and true CT, as well as receiver operating characteristics for bone detection (HU > 200), were calculated. The performance was compared with a global intensity approach based on T1 and no density correction (set whole head to water). The proposed technique significantly reduced the errors in electron density estimation, with a mean absolute HU error of 126, compared with 139 for deformable registration (p = 2 × 10−4), 283 for the intensity approach (p = 2 × 10−6) and 282 without
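
    If both per-voxel conditional PDFs are approximated as Gaussians, their combination has a closed form: the product of two Gaussian densities is Gaussian with a precision-weighted mean. The paper's PDFs need not be Gaussian, and the HU-like numbers below are hypothetical; this only illustrates the fusion step.

```python
def fuse_gaussians(mu1, var1, mu2, var2):
    """Product of two Gaussian PDFs is Gaussian with precision-weighted
    mean and precision equal to the sum of the input precisions."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    var = 1.0 / (w1 + w2)
    mu = var * (w1 * mu1 + w2 * mu2)
    return mu, var

# Hypothetical per-voxel estimates: intensity-based vs. atlas/geometry-based
mu, var = fuse_gaussians(250.0, 50.0**2, 180.0, 30.0**2)
```

    The fused mean is pulled toward the more certain (lower-variance) source, and the fused variance is smaller than either input, matching the intuition that the two information channels reinforce each other.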

  17. Exploring the Influence of Neighborhood Characteristics on Burglary Risks: A Bayesian Random Effects Modeling Approach

    Hongqiang Liu


    A Bayesian random effects modeling approach was used to examine the influence of neighborhood characteristics on burglary risks in Jianghan District, Wuhan, China. This random effects model is essentially spatial; a spatially structured random effects term and an unstructured random effects term are added to the traditional non-spatial Poisson regression model. Based on social disorganization and routine activity theories, five covariates extracted from the available data at the neighborhood level were used in the modeling. Three regression models were fitted and compared by the deviance information criterion to identify which model best fit our data. A comparison of the results from the three models indicates that the Bayesian random effects model is superior to the non-spatial models in fitting the data and estimating regression coefficients. Our results also show that neighborhoods with above average bar density and department store density have higher burglary risks. Neighborhood-specific burglary risks and posterior probabilities of neighborhoods having a burglary risk greater than 1.0 were mapped, indicating the neighborhoods that should warrant more attention and be prioritized for crime intervention and reduction. Implications and limitations of the study are discussed in our concluding section.
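
    The specification described (structured plus unstructured random effects added to a Poisson regression) is plausibly of the Besag-York-Mollié type, which can be written as:

```latex
y_i \sim \mathrm{Poisson}(E_i \theta_i), \qquad
\log \theta_i = \mathbf{x}_i^{\top}\boldsymbol{\beta} + u_i + v_i,
```

    where y_i is the burglary count in neighborhood i, E_i the expected count, u_i a spatially structured (e.g. intrinsic CAR) effect, and v_i an unstructured N(0, σ²) effect. Dropping u_i and v_i recovers the non-spatial Poisson regression against which the model is compared.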

  18. Quantifying uncertainty and resilience on coral reefs using a Bayesian approach

    Coral reefs are rapidly deteriorating globally. The contemporary management option favors managing for resilience to provide reefs with the capacity to tolerate human-induced disturbances. Yet resilience is most commonly defined as the capacity of a system to absorb disturbances without changing fundamental processes or functionality. Quantifying no change, or the uncertainty of a null hypothesis, is nonsensical using frequentist statistics, but is achievable using a Bayesian approach. This study outlines a practical Bayesian framework that quantifies the resilience of coral reefs using two inter-related models. The first model examines the functionality of coral reefs in the context of their reef-building capacity, whereas the second model examines the recovery rates of coral cover after disturbances. Quantifying intrinsic rates of increase in coral cover and habitat-specific, steady-state equilibria are useful proxies of resilience. A reduction in the intrinsic rate of increase following a disturbance, or the slowing of recovery over time, can be useful indicators of stress; a change in the steady-state equilibrium suggests a phase shift. Quantifying the uncertainty of key reef-building processes and recovery parameters, and comparing these parameters against benchmarks, facilitates the detection of loss of resilience and provides signals of imminent change. (letter)

  19. Time-calibrated phylogenomics of the classical swine fever viruses: genome-wide Bayesian coalescent approach.

    Kwon, Taehyung; Yoon, Sook Hee; Kim, Kyu-Won; Caetano-Anolles, Kelsey; Cho, Seoae; Kim, Heebal


    The phylogeny of classical swine fever virus (CSFV), the causative agent of classical swine fever (CSF), has been investigated extensively. However, no evolutionary research has been performed using the whole CSFV genome. In this study, we used 37 published genome sequences to investigate the time-calibrated phylogenomics of CSFV. In phylogenomic trees based on Bayesian inference (BI) and maximum likelihood (ML), the 37 isolates were categorized into five genetic types (1.1, 1.2, 2.1, 2.3, and 3.4). Subgenotype 1.1 is divided into 3 groups and 1 unclassified isolate, 2.1 into 4 groups, 2.3 into 2 groups and 1 unclassified isolate, and subgenotypes 1.2 and 3.4 consisted of one isolate each. We did not observe an apparent temporal or geographical relationship between isolates. Of the 14 genomic regions, NS4B showed the most powerful phylogenetic signal. Results of this evolutionary study using a Bayesian coalescent approach indicate that CSFV has evolved at a rate of 1.3 × 10-4 substitutions per site per year. The most recent common ancestor of CSFV appeared 2770.2 years ago, which was about 8000 years after pig domestication. The effective population size of CSFV underwent a slow increase until the 1950s, after which it has remained constant. PMID:25815768

  1. Assessment of successful smoking cessation by psychological factors using the Bayesian network approach.

    Yang, Xiaorong; Li, Suyun; Pan, Lulu; Wang, Qiang; Li, Huijie; Han, Mingkui; Zhang, Nan; Jiang, Fan; Jia, Chongqi


    The association between psychological factors and smoking cessation is complicated and inconsistent in published research, and the joint effect of psychological factors on smoking cessation is unclear. This study explored how psychological factors jointly affect the success of smoking cessation using a Bayesian network approach. A community-based case-control study was designed with 642 adult male successful smoking quitters as the cases, and 700 adult male failed smoking quitters as the controls. General self-efficacy (GSE), trait coping style (positive-trait coping style (PTCS) and negative-trait coping style (NTCS)) and self-rating anxiety (SA) were evaluated by the GSE Scale, the Trait Coping Style Questionnaire and the SA Scale, respectively. A Bayesian network was applied to evaluate the relationship between psychological factors and successful smoking cessation. The local conditional probability table of smoking cessation indicated that different joint conditions of psychological factors led to different outcomes for smoking cessation. Among smokers with high PTCS, high NTCS and low SA, only 36.40% successfully quit smoking. However, among smokers with low pack-years of smoking, high GSE, high PTCS and high SA, 63.64% successfully quit smoking. Our study indicates that psychological factors jointly influence smoking cessation outcome. Different solutions should therefore be developed for different joint situations in practical tobacco-control interventions. PMID:26264661

  2. Forecasting Rainfall Time Series with stochastic output approximated by neural networks Bayesian approach

    Cristian Rodriguez Rivero


    Estimating the annual availability of water for the agricultural sector is vital in places where rainfall is scarce, as is the case in northwestern Argentina. This work models and simulates monthly rainfall time series from one geographical location in Catamarca, Valle El Viejo Portezuelo. The monthly cumulative rainfall series is treated as a stochastic output approximated by neural networks under a Bayesian approach: we propose an algorithm based on artificial neural networks (ANNs) trained with Bayesian inference. The prediction is evaluated on 20% of the available data, which span 2000 to 2010. A new analysis for modelling, simulation and computational prediction of cumulative rainfall from one geographical location is presented, using only the historical time series of daily measurements in mm of water as input. Preliminary results of the annual forecast with a prediction horizon of one and a half years (18 months) are presented. The methodology employs artificial-neural-network tools together with statistical and computational analysis to complete missing information and to characterize the qualitative and quantitative behaviour of the series. Preliminary results for different prediction horizons of the proposed filter are also shown, together with a comparison against the Gaussian process filter used in the literature.

  3. Application of a Bayesian approach to stochastic delineation of capture zones.

    Feyen, L; Dessalegn, A M; De Smedt, F; Gebremeskel, S; Batelaan, O


    This paper presents a Bayesian Monte Carlo method for evaluating the uncertainty in the delineation of well capture zones and its application to a wellfield in a heterogeneous, multiaquifer system. In the method presented, Bayes' rule is used to update prior distributions for the unknown parameters of the stochastic model for the hydraulic conductivity, and to calculate probability-based weights for parameter realizations using head residuals. These weights are then assigned to the corresponding capture zones obtained using forward particle tracking. Statistical analysis of the set of weighted protection zones results in a probability distribution for the capture zones. The suitability of the Bayesian stochastic method for a multilayered system is investigated, using the wellfield Het Rot at Nieuwrode, Belgium, located in a three-layered aquifer system, as an example. The hydraulic conductivity of the production aquifer is modeled as a spatially correlated random function with uncertain parameters. The aquitard and overlying unconfined aquifer are assigned random, homogeneous conductivities. The stochastic results are compared with deterministic capture zones obtained with a calibrated model for the area. The predictions of the stochastic approach are more conservative and indicate that parameter uncertainty should be taken into account in the delineation of well capture zones. PMID:15318777
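
    The residual-based weighting step can be sketched as follows, assuming a Gaussian head-error model with known variance; the sum-of-squared-residual values are hypothetical and stand in for the misfit of each conductivity realization.

```python
import math

def bayes_mc_weights(sse, sigma2=1.0):
    """Likelihood-based weights for Monte Carlo realizations from head
    residuals: w_k proportional to exp(-SSE_k / (2 sigma^2)), normalized."""
    logw = [-s / (2.0 * sigma2) for s in sse]
    m = max(logw)                       # stabilize the exponentials
    w = [math.exp(lw - m) for lw in logw]
    total = sum(w)
    return [x / total for x in w]

# Hypothetical squared head residuals for 4 conductivity realizations
weights = bayes_mc_weights([2.0, 5.0, 1.0, 10.0])
```

    Each weight is then attached to the capture zone traced for that realization, so the probability that a point lies in the capture zone is the sum of the weights of the realizations whose zones cover it.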

  4. Nuclear mass predictions for the crustal composition of neutron stars: A Bayesian neural network approach

    Utama, R.; Piekarewicz, J.; Prosper, H. B.


    Background: Besides their intrinsic nuclear-structure value, nuclear mass models are essential for astrophysical applications, such as r -process nucleosynthesis and neutron-star structure. Purpose: To overcome the intrinsic limitations of existing "state-of-the-art" mass models through a refinement based on a Bayesian neural network (BNN) formalism. Methods: A novel BNN approach is implemented with the goal of optimizing mass residuals between theory and experiment. Results: A significant improvement (of about 40%) in the mass predictions of existing models is obtained after BNN refinement. Moreover, these improved results are now accompanied by proper statistical errors. Finally, by constructing a "world average" of these predictions, a mass model is obtained that is used to predict the composition of the outer crust of a neutron star. Conclusions: The power of the Bayesian neural network method has been successfully demonstrated by a systematic improvement in the accuracy of the predictions of nuclear masses. Extension to other nuclear observables is a natural next step that is currently under investigation.
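
    On the simplest reading, the "world average" of the BNN-refined predictions is an inverse-variance weighted mean, which the BNN error bars make possible. The masses and variances below are hypothetical placeholders, not the paper's values.

```python
def world_average(predictions, variances):
    """Inverse-variance weighted combination of model predictions:
    weight each model by 1/variance and report the combined variance."""
    w = [1.0 / v for v in variances]
    total = sum(w)
    mean = sum(wi * p for wi, p in zip(w, predictions)) / total
    var = 1.0 / total                  # combined (smaller) variance
    return mean, var

# Hypothetical predicted masses (MeV) and BNN variances from three models
m, v = world_average([938.1, 938.4, 938.2], [0.04, 0.09, 0.01])
```

    The combined variance is smaller than any individual one, which is why a statistically weighted average of refined models can outperform each model alone.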

  5. Bayesian clustering of fuzzy feature vectors using a quasi-likelihood approach.

    Marttinen, Pekka; Tang, Jing; De Baets, Bernard; Dawyndt, Peter; Corander, Jukka


    Bayesian model-based classifiers, both unsupervised and supervised, have been studied extensively and their value and versatility have been demonstrated on a wide spectrum of applications within science and engineering. A majority of the classifiers are built on the assumption of intrinsic discreteness of the considered data features or on the discretization of them prior to the modeling. On the other hand, Gaussian mixture classifiers have also been utilized to a large extent for continuous features in the Bayesian framework. Often the primary reason for discretization in the classification context is the simplification of the analytical and numerical properties of the models. However, the discretization can be problematic due to its ad hoc nature and the decreased statistical power to detect the correct classes in the resulting procedure. We introduce an unsupervised classification approach for fuzzy feature vectors that utilizes a discrete model structure while preserving the continuous characteristics of data. This is achieved by replacing the ordinary likelihood by a binomial quasi-likelihood to yield an analytical expression for the posterior probability of a given clustering solution. The resulting model can be justified from an information-theoretic perspective. Our method is shown to yield highly accurate clusterings for challenging synthetic and empirical data sets. PMID:19029547

  6. A Bayesian approach to the ICH Q8 definition of design space.

    Peterson, John J


    Manufacturers of pharmaceuticals and biopharmaceuticals are facing increased regulatory pressure to understand how their manufacturing processes work and to be able to quantify the reliability and robustness of their manufacturing processes. In particular, the ICH Q8 guidance has introduced the concept of design space. The ICH Q8 defines design space as "the multidimensional combination and interaction of input variables (e.g., material attributes) and process parameters that have been demonstrated to provide assurance of quality." However, relatively little has been put forth to date on how to construct a design space from data composed of such variables. This study presents a Bayesian approach to design space based upon a type of credible region first appearing in Peterson's work. This study considers the issues of constructing a Bayesian design space, design space reliability, the inclusion of process noise variables, and utilization of prior information, as well as an outline for organizing information about a design space so that manufacturing engineers can make informed changes as may be needed within the design space. PMID:18781528
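
    The design-space reliability idea can be sketched with a posterior-predictive Monte Carlo check: an operating condition belongs to the design space if the predicted probability of meeting the quality specification exceeds a target. The linear process model, posterior summaries, and spec limits below are all hypothetical, not ICH Q8's or Peterson's actual model.

```python
import random

rng = random.Random(42)

def in_spec(y):
    # Hypothetical quality acceptance region: assay between 95 and 105
    return 95.0 <= y <= 105.0

def reliability(x, n=10000):
    """Posterior-predictive probability that quality is in spec at
    operating condition x, under an illustrative posterior."""
    hits = 0
    for _ in range(n):
        slope = rng.gauss(2.0, 0.2)       # posterior draw of model slope
        intercept = rng.gauss(90.0, 1.0)  # posterior draw of intercept
        noise = rng.gauss(0.0, 1.0)       # process noise
        hits += in_spec(intercept + slope * x + noise)
    return hits / n

# x is inside the Bayesian design space if reliability(x) meets a target,
# e.g. reliability(x) >= 0.90
r = reliability(5.0)
```

    Mapping reliability(x) over the operating region, rather than checking mean responses only, is what distinguishes this credible-region view of design space from overlapping mean-response contours.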

  7. Estimation of under-reported visceral leishmaniasis (VL) cases in Bihar: a Bayesian approach

    A Ranjan


    Background: Visceral leishmaniasis (VL) is a major health problem in the state of Bihar and adjoining areas in India. In the absence of any active surveillance mechanism for the disease, there seems to be gross under-reporting of VL cases. Objective: The objective of this study was to estimate the extent of under-reporting of VL cases in Bihar using a pooled analysis of published papers. Method: We calculated the pooled common ratio (RRMH) based on three studies and combined it with a prior distribution of the ratio using the inverse-variance weighting method. A Bayesian method was used to estimate the posterior distribution of the "under-reporting factor" (ratio of unreported to reported cases). Results: The posterior distribution of the ratio of unreported to reported cases yielded a mean of 3.558, with 95% posterior limits of 2.81 and 4.50. Conclusion: The Bayesian approach gives evidence that the total number of VL cases in the state may be more than three times the currently reported figure.
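
    The prior-plus-pooled-estimate combination can be sketched as a conjugate normal update on the log of the under-reporting ratio; the means and variances below are hypothetical stand-ins, not the study's data.

```python
import math

def combine_normal(prior_mean, prior_var, like_mean, like_var):
    """Conjugate normal update: posterior from a normal prior and a normal
    likelihood summary (both here on the log-ratio scale)."""
    w0, w1 = 1.0 / prior_var, 1.0 / like_var
    var = 1.0 / (w0 + w1)
    mean = var * (w0 * prior_mean + w1 * like_mean)
    return mean, var

# Hypothetical log-ratio summaries: prior and pooled Mantel-Haenszel estimate
m, v = combine_normal(math.log(3.0), 0.10, math.log(4.0), 0.05)
ratio = math.exp(m)                       # posterior point estimate
lo = math.exp(m - 1.96 * math.sqrt(v))    # lower 95% posterior limit
hi = math.exp(m + 1.96 * math.sqrt(v))    # upper 95% posterior limit
```

    Working on the log scale keeps the posterior limits positive and matches the usual treatment of ratio estimates in meta-analysis.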

  8. A simple Bayesian approach to quantifying confidence level of adverse event incidence proportion in small samples.

    Liu, Fang


    In both clinical development and post-marketing of a new therapy or a new treatment, incidence of an adverse event (AE) is always a concern. When sample sizes are small, large sample-based inferential approaches on an AE incidence proportion in a certain time period no longer apply. In this brief discussion, we introduce a simple Bayesian framework to quantify, in small sample studies and the rare AE case, (1) the confidence level that the incidence proportion of a particular AE p is over or below a threshold, (2) the lower or upper bounds on p with a certain level of confidence, and (3) the minimum required number of patients with an AE before we can be certain that p surpasses a specific threshold, or the maximum allowable number of patients with an AE after which we can no longer be certain that p is below a certain threshold, given a certain confidence level. The method is easy to understand and implement; the interpretation of the results is intuitive. This article also demonstrates the usefulness of simple Bayesian concepts when it comes to answering practical questions. PMID:26098967
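
    With a Beta prior on the incidence proportion p and a binomial count of AEs, the posterior is again Beta, so the quantities in (1)-(3) reduce to Beta tail probabilities and quantiles. A minimal sketch with hypothetical numbers (2 AEs in 20 patients, a 10% threshold, and a uniform Beta(1, 1) prior), using a Monte Carlo estimate of the tail probability:

```python
import random

rng = random.Random(7)

def ae_confidence(n_events, n_patients, threshold, a0=1.0, b0=1.0, n=100000):
    """Posterior is Beta(a0 + n_events, b0 + n_patients - n_events).
    Return the posterior probability that p exceeds `threshold`
    (Monte Carlo estimate from posterior draws)."""
    a = a0 + n_events
    b = b0 + n_patients - n_events
    draws = (rng.betavariate(a, b) for _ in range(n))
    return sum(d > threshold for d in draws) / n

# Hypothetical small-sample study: 2 AEs in 20 patients, 10% threshold
conf = ae_confidence(2, 20, 0.10)
```

    The same posterior yields the credible bounds on p and, by scanning n_events, the minimum or maximum number of AE patients compatible with a stated confidence level.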

  9. Dynamic probability evaluation of safety levels of earth-rockfill dams using Bayesian approach

    Zi-wu FAN; Shu-hai JIANG; Ming ZHANG


    In order to accurately predict and control the aging process of dams, new information should be collected continuously to renew the quantitative evaluation of dam safety levels. Owing to the complex structural characteristics of dams, it is quite difficult to predict the time-varying factors affecting their safety levels. It is not feasible to employ dynamic reliability indices to evaluate the actual safety levels of dams. Based on the relevant regulations for dam safety classification in China, a dynamic probability description of dam safety levels was developed. Using the Bayesian approach and effective information mining, as well as real-time information, this study achieved more rational evaluation and prediction of dam safety levels. With the Bayesian expression of discrete stochastic variables, the a priori probabilities of the dam safety levels determined by experts were combined with the likelihood probability of the real-time check information, and the probability information for the evaluation of dam safety levels was renewed. The probability index was then applied to dam rehabilitation decision-making. This method helps reduce the difficulty and uncertainty of the evaluation of dam safety levels and complies with the current safe decision-making regulations for dams in China. It also enhances the application of current risk analysis methods for dam safety levels.
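
    The renewal step described, combining an expert prior over discrete safety levels with the likelihood of the latest check information, is a plain discrete Bayes update. The levels, prior, and likelihoods below are hypothetical illustrations, not the Chinese classification values.

```python
def update_safety_levels(prior, likelihood):
    """Discrete Bayes update: posterior over safety levels from an expert
    prior and the likelihood of the latest inspection data."""
    post = {s: prior[s] * likelihood[s] for s in prior}
    z = sum(post.values())                 # normalizing constant
    return {s: p / z for s, p in post.items()}

# Hypothetical expert prior and inspection likelihoods P(data | level)
prior = {"safe": 0.70, "degraded": 0.25, "unsafe": 0.05}
like = {"safe": 0.20, "degraded": 0.60, "unsafe": 0.90}
post = update_safety_levels(prior, like)
```

    Repeating the update as each new batch of monitoring data arrives gives the dynamic probability description of safety levels that feeds the rehabilitation decision.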

  10. A risk management process for reinforced concrete structures by coupling modelling, monitoring and Bayesian approaches

    The impact of steel corrosion on the durability of reinforced concrete structures has long been a major concern in civil engineering. The main electrochemical mechanisms of steel corrosion are now well known. The material and structural degradation is attributed to the progressive formation of an expansive corrosion product at the steel-concrete interface. To assess the structure lifetime quantitatively, a two-stage service life model has been widely accepted. So far, research attention has mainly been given to corrosion in un-cracked concrete. In practice, however, one is often confronted with reinforcement corrosion in already cracked concrete. How to quantify the corrosion risk is of great interest for the long-term durability of these cracked structures. To this end, this paper proposes a service life model for the corrosion process by carbonation in cracked or un-cracked concrete, depending on the observation or monitoring data available. Some recent experimental investigations are used to calibrate the models. The models are then applied to a shell structure to quantify the corrosion process and determine the optimal maintenance strategy. As corrosion processes are very difficult to model and are subject to material and environmental random variations, an example of structure reassessment is presented taking into account in situ information by means of Bayesian approaches. The coupling of monitoring, modelling and updating leads to a new global maintenance strategy for infrastructure. In conclusion, this paper presents a unified methodology coupling predictive models, observations and Bayesian approaches in order to assess the degradation degree of an ageing structure. The particular case of corrosion is treated in an innovative way through the development of a service life model that takes into account cracking effects on the kinetics of the phenomena. At the material level, the dominant factors are the crack opening and the crack nature.

  11. A Bayesian approach for temporally scaling climate for modeling ecological systems.

    Post van der Burg, Max; Anteau, Michael J; McCauley, Lisa A; Wiltermuth, Mark T


    With climate change becoming more of a concern, many ecologists are including climate variables in their system and statistical models. The Standardized Precipitation Evapotranspiration Index (SPEI) is a drought index that has potential advantages in modeling ecological response variables, including a flexible computation of the index over different timescales. However, little development has been made in terms of the choice of timescale for SPEI. We developed a Bayesian modeling approach for estimating the timescale for SPEI and demonstrated its use in modeling wetland hydrologic dynamics in two different eras (i.e., historical [pre-1970] and contemporary [post-2003]). Our goal was to determine whether differences in climate between the two eras could explain changes in the amount of water in wetlands. Our results showed that wetland water surface areas tended to be larger in wetter conditions, but also changed less in response to climate fluctuations in the contemporary era. We also found that the average timescale parameter was greater in the historical period, compared with the contemporary period. We were not able to determine whether this shift in timescale was due to a change in the timing of wet-dry periods or whether it was due to changes in the way wetlands responded to climate. Our results suggest that perhaps some interaction between climate and hydrologic response may be at work, and further analysis is needed to determine which has a stronger influence. Despite this, we suggest that our modeling approach enabled us to estimate the relevant timescale for SPEI and make inferences from those estimates. Likewise, our approach provides a mechanism for using prior information with future data to assess whether these patterns may continue over time. We suggest that ecologists consider using temporally scalable climate indices in conjunction with Bayesian analysis for assessing the role of climate in ecological systems. PMID:27217947

  13. Bayesian data analysis

    Gelman, Andrew; Stern, Hal S; Dunson, David B; Vehtari, Aki; Rubin, Donald B


    FUNDAMENTALS OF BAYESIAN INFERENCE: Probability and Inference; Single-Parameter Models; Introduction to Multiparameter Models; Asymptotics and Connections to Non-Bayesian Approaches; Hierarchical Models. FUNDAMENTALS OF BAYESIAN DATA ANALYSIS: Model Checking; Evaluating, Comparing, and Expanding Models; Modeling Accounting for Data Collection; Decision Analysis. ADVANCED COMPUTATION: Introduction to Bayesian Computation; Basics of Markov Chain Simulation; Computationally Efficient Markov Chain Simulation; Modal and Distributional Approximations. REGRESSION MODELS: Introduction to Regression Models; Hierarchical Linear

  14. An urban flood risk assessment method using the Bayesian Network approach

    Åström, Helena Lisa Alexandra

    Flooding is one of the most damaging natural hazards to human societies. Recent decades have shown that flooding constitutes major threats worldwide, and due to anticipated climate change the occurrence of damaging flood events is expected to increase. Urban areas are especially vulnerable to...... flood risk scoping, flood risk assessment (FRA), and adaptation implementation and involves an ongoing process of assessment, reassessment, and response. This thesis mainly focuses on the FRA phase of FRM. FRA includes hazard analysis and impact assessment (combined called a risk analysis), adaptation...... Bayesian Network (BN) approach is developed, and the method is exemplified in an urban catchment. BNs have become an increasingly popular method for describing complex systems and aiding decision-making under uncertainty. In environmental management, BNs have mainly been utilized in ecological assessments...

  15. A Bayesian approach to quantifying uncertainty from experimental noise in DEER spectroscopy

    Edwards, Thomas H.; Stoll, Stefan


    Double Electron-Electron Resonance (DEER) spectroscopy is a solid-state pulse Electron Paramagnetic Resonance (EPR) experiment that measures distances between unpaired electrons, most commonly between protein-bound spin labels separated by 1.5-8 nm. From the experimental data, a distance distribution P(r) is extracted using Tikhonov regularization. The disadvantage of this method is that it does not directly provide error bars for the resulting P(r), rendering correct interpretation difficult. Here we introduce a Bayesian statistical approach that quantifies uncertainty in P(r) arising from noise and numerical regularization. This method provides credible intervals (error bars) of P(r) at each r. This allows practitioners to answer whether or not small features are significant, whether or not apparent shoulders are significant, and whether or not two distance distributions are significantly different from each other. In addition, the method quantifies uncertainty in the regularization parameter.
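The idea of attaching per-point credible intervals to a Tikhonov-regularized solution can be sketched on a toy linear inverse problem: a Gaussian likelihood plus a Gaussian smoothness prior yields a closed-form Gaussian posterior that can be sampled. The kernel, sizes, and noise level below are invented stand-ins, not the real DEER model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear inverse problem s = K @ p + noise standing in for the DEER kernel.
n_t, n_r = 80, 40
t = np.linspace(0, 1, n_t)[:, None]
r = np.linspace(0, 1, n_r)[None, :]
K = np.exp(-((t - r) ** 2) / 0.02)            # smooth stand-in kernel
p_true = np.exp(-((np.linspace(0, 1, n_r) - 0.5) ** 2) / 0.005)
sigma = 0.05
s = K @ p_true + sigma * rng.normal(size=n_t)

# Gaussian likelihood + Gaussian smoothness (Tikhonov) prior => Gaussian posterior
lam = 10.0
L = np.diff(np.eye(n_r), 2, axis=0)            # second-difference operator
A = K.T @ K / sigma**2 + lam * L.T @ L
cov = np.linalg.inv(A)
cov = (cov + cov.T) / 2                        # symmetrize for sampling
mean = cov @ (K.T @ s) / sigma**2              # MAP / posterior mean

# 95% credible interval at each r from posterior samples
samples = rng.multivariate_normal(mean, cov, size=2000)
lo, hi = np.percentile(samples, [2.5, 97.5], axis=0)
```

The `(lo, hi)` band is the analogue of the per-r error bars the paper provides; the actual method additionally treats the regularization parameter as uncertain rather than fixing `lam`.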

  16. Spectro-photometric distances to stars: a general-purpose Bayesian approach

    Santiago, Basílio X; Anders, Friedrich; Chiappini, Cristina; Girardi, Léo; Rocha-Pinto, Helio J; Balbinot, Eduardo; da Costa, Luiz N; Maia, Marcio A G; Schultheis, Mathias; Steinmetz, Matthias; Miglio, Andrea; Montalbán, Josefina; Schneider, Donald P; Beers, Timothy C; Frinchaboy, Peter M; Lee, Young Sun; Zasowski, Gail


    We have developed a procedure that estimates distances to stars using measured spectroscopic and photometric quantities. It employs a Bayesian approach to build the probability distribution function over stellar evolutionary models given the data, delivering estimates of expected distance for each star individually. Our method provides several alternative distance estimates for each star in the output, along with their associated uncertainties. The code was first tested on simulations, successfully recovering input distances to mock stars with errors that scale with the uncertainties in the adopted spectro-photometric parameters, as expected. The code was then validated by comparing our distance estimates to parallax measurements from the Hipparcos mission for nearby stars (< 60 pc), to asteroseismic distances of CoRoT red giant stars, and to known distances of well-studied open and globular clusters. The photometric data of these reference samples cover both the optical and near infra-red wavelengths. The...

  17. Introducing a Bayesian Approach to Determining Degree of Fit With Existing Rorschach Norms.

    Giromini, Luciano; Viglione, Donald J; McCullaugh, Joseph


    This article offers a new methodological approach to investigate the degree of fit between an independent sample and 2 existing sets of norms. Specifically, with a new adaptation of a Bayesian method, we developed a user-friendly procedure to compare the mean values of a given sample to those of 2 different sets of Rorschach norms. To illustrate our technique, we used a small, U.S. community sample of 80 adults and tested whether it resembled more closely the standard Comprehensive System norms (CS 600; Exner, 2003), or a recently introduced, internationally based set of Rorschach norms (Meyer, Erdberg, & Shaffer, 2007 ). Strengths and limitations of this new statistical technique are discussed. PMID:25257792
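The comparison described above can be illustrated with the simplest possible version: score a sample's likelihood under each of two fixed reference distributions and convert the difference into a posterior model probability with equal prior odds. The "norms" and sample values below are invented, not the actual CS 600 or international Rorschach norms, and the real method compares many variables jointly:

```python
import math

norm_a = (4.0, 1.5)    # (mean, sd) for one variable under hypothetical norms A
norm_b = (5.2, 1.8)    # same variable under hypothetical norms B
sample = [4.1, 5.0, 3.8, 4.6, 4.2, 3.9, 4.8, 4.4]

def loglik(data, mu, sd):
    # Gaussian log-likelihood of the sample under one set of norms.
    return sum(-0.5 * ((v - mu) / sd) ** 2 - math.log(sd * math.sqrt(2 * math.pi))
               for v in data)

la = loglik(sample, *norm_a)
lb = loglik(sample, *norm_b)
p_a = 1 / (1 + math.exp(lb - la))   # posterior P(norms A | data), equal priors
```

Here the sample mean (4.35) sits closer to norms A, so `p_a` comes out near 0.89: the data are judged to resemble norms A more closely.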

  18. Kepler Presearch Data Conditioning II - A Bayesian Approach to Systematic Error Correction

    Smith, Jeffrey C; Van Cleve, Jeffrey E; Jenkins, Jon M; Barclay, Thomas S; Fanelli, Michael N; Girouard, Forrest R; Kolodziejczak, Jeffery J; McCauliff, Sean D; Morris, Robert L; Twicken, Joseph D


    With the unprecedented photometric precision of the Kepler Spacecraft, significant systematic and stochastic errors on transit signal levels are observable in the Kepler photometric data. These errors, which include discontinuities, outliers, systematic trends and other instrumental signatures, obscure astrophysical signals. The Presearch Data Conditioning (PDC) module of the Kepler data analysis pipeline tries to remove these errors while preserving planet transits and other astrophysically interesting signals. The completely new noise and stellar variability regime observed in Kepler data poses a significant problem to standard cotrending methods such as SYSREM and TFA. Variable stars are often of particular astrophysical interest so the preservation of their signals is of significant importance to the astrophysical community. We present a Bayesian Maximum A Posteriori (MAP) approach where a subset of highly correlated and quiet stars is used to generate a cotrending basis vector set which is in turn used t...
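At its core, MAP cotrending with a Gaussian prior on the basis-vector coefficients reduces to a ridge-regression fit, with the fitted trend then subtracted from the light curve. A sketch under stated assumptions: the synthetic basis, prior width, and noise level below are illustrative, not Kepler's actual PDC pipeline:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "cotrending basis vectors" (random-walk trends standing in for
# trends extracted from quiet, highly correlated stars) and a target light curve.
n_cad, n_basis = 500, 4
basis = rng.normal(size=(n_cad, n_basis)).cumsum(axis=0)
coef_true = np.array([0.5, -0.2, 0.0, 0.1])
flux = basis @ coef_true + 0.3 * rng.normal(size=n_cad)

prior_var, noise_var = 1.0, 0.3 ** 2
# MAP estimate = ridge solution: (B'B/sigma^2 + I/tau^2)^-1 B'y/sigma^2
A = basis.T @ basis / noise_var + np.eye(n_basis) / prior_var
coef_map = np.linalg.solve(A, basis.T @ flux / noise_var)
corrected = flux - basis @ coef_map     # systematics removed
```

The prior term is what distinguishes MAP from a plain least-squares cotrend: it shrinks the fit toward the prior and so resists removing genuine stellar variability along with the systematics.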

  19. An efficient multiple particle filter based on the variational Bayesian approach

    Ait-El-Fquih, Boujemaa


    This paper addresses the filtering problem in large-dimensional systems, in which conventional particle filters (PFs) remain computationally prohibitive owing to the large number of particles needed to obtain reasonable performances. To overcome this drawback, a class of multiple particle filters (MPFs) has been recently introduced in which the state-space is split into low-dimensional subspaces, and then a separate PF is applied to each subspace. In this paper, we adopt the variational Bayesian (VB) approach to propose a new MPF, the VBMPF. The proposed filter is computationally more efficient since the propagation of each particle requires generating one (new) particle only, while in the standard MPFs a set of (children) particles needs to be generated. In a numerical test, the proposed VBMPF behaves better than the PF and MPF.
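For reference, the "conventional PF" that the multiple particle filter builds on is the bootstrap filter: propagate particles through the process model, weight by the observation likelihood, and resample. A minimal one-dimensional sketch (the model and noise levels are invented; this is the baseline PF, not the proposed VBMPF):

```python
import numpy as np

rng = np.random.default_rng(2)

# 1-D random-walk state observed in Gaussian noise.
T, N = 50, 500
q, r = 0.1, 0.5                       # process / observation noise std
x_true = np.cumsum(q * rng.normal(size=T))
y = x_true + r * rng.normal(size=T)

particles = np.zeros(N)
est = np.empty(T)
for t in range(T):
    particles = particles + q * rng.normal(size=N)         # propagate
    w = np.exp(-0.5 * ((y[t] - particles) / r) ** 2)       # weight by likelihood
    w /= w.sum()
    est[t] = w @ particles                                 # posterior mean
    particles = rng.choice(particles, size=N, p=w)         # resample
```

In high dimensions the number of particles needed by this scheme explodes, which is exactly the motivation for splitting the state space into subspaces as in the MPF/VBMPF.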

  20. Genomic Predictions of Obesity Related Phenotypes in a Pig model using GBLUP and Bayesian Approaches

    Pant, Sameer Dinkar; Do, Duy Ngoc; Janss, Luc;

    Whole genome prediction (WGP) based on GBLUP and Bayesian models (e.g. A, B, C and R) are routinely used in animal breeding but have not been well tested in a genetic mapping population that segregates for QTLs. In our pig model experiment, purebred Duroc and Yorkshire sows were crossed with...... to partition genomic variances attributable to different SNP groups based on their biological and functional role via systems genetics / biology approaches. We apply different methods to group SNPs: (i) functional relevance of SNPs for obesity based on data mining, (ii) groups based on positions in...... the genome, and significance based on previous genome-wide association study in the same data set. Preliminary results from our studies in production pigs indicate that BPL models have higher accuracy but more bias than GBLUP method, although using different power parameters has no effect on...

  1. Bayesian model-based approach for developing a river water quality index

    Ali, Zalina Mohd; Ibrahim, Noor Akma; Mengersen, Kerrie; Shitan, Mahendran; Juahir, Hafizan


    Six main pollutants have been previously identified by expert opinion to determine river condition in Malaysia. The pollutants were Dissolved Oxygen (DO), Biochemical Oxygen Demand (BOD), Chemical Oxygen Demand (COD), Suspended Solid (SS), potential of Hydrogen (pH) and Ammonia (AN). The selected variables together with the respective weights have been applied to calculate the water quality index of all rivers in Malaysia. However, the relative weights established in DOE-WQI formula are subjective in nature and not unanimously agreed upon, as indicated by different weight being proposed for the same variables by various panels of experts. Focusing on the Langat River, a Bayesian model-based approach was introduced for the first time in this study to obtain new objective relative weights. The new weights used in WQI calculation are shown to be capable of capturing similar distributions in water quality compared with the existing DOE-WQI.

  2. An empirical Bayesian and Buhlmann approach with non-homogenous Poisson process

    Noviyanti, Lienda


    All general insurance companies in Indonesia have to adjust their current premium rates according to the maximum and minimum limit rates in the new regulation established by the Financial Services Authority (Otoritas Jasa Keuangan / OJK). In this research, we estimated premium rates by means of the Bayesian and the Buhlmann approaches using historical claim frequency and claim severity for five risk groups. We assumed a Poisson-distributed claim frequency and a Normally distributed claim severity. In particular, we used a non-homogenous Poisson process for estimating the parameters of claim frequency. We found that the estimated premium rates are higher than the actual current rate. With regard to the OJK upper and lower limit rates, the estimates among the five risk groups vary: some fall within the interval and some fall outside it.
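The Bühlmann part of such an estimate blends each group's own mean experience with the collective mean via a credibility factor Z = n / (n + s²/a). A self-contained sketch; the yearly claim counts for the five risk groups below are invented for illustration, not the paper's data:

```python
import statistics as st

# Hypothetical yearly claim counts, one list per risk group.
groups = [[2, 3, 1, 4, 2], [0, 1, 1, 0, 2], [5, 4, 6, 5, 5],
          [1, 2, 1, 1, 0], [3, 2, 2, 4, 3]]

group_means = [st.mean(g) for g in groups]
overall_mean = st.mean(group_means)                  # collective premium
s2 = st.mean(st.variance(g) for g in groups)         # expected within-group variance
a = st.variance(group_means) - s2 / len(groups[0])   # between-group variance
n = len(groups[0])                                   # years of experience
Z = n / (n + s2 / a)                                 # credibility factor

# Credibility premium per group: blend of own mean and collective mean.
premiums = [Z * m + (1 - Z) * overall_mean for m in group_means]
```

Each premium lies between the group's own mean and the collective mean; with clearly heterogeneous groups (as here) Z is close to 1, so a group's own experience dominates.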

  3. Rigorous Approach in Investigation of Seismic Structure and Source Characteristics in Northeast Asia: Hierarchical and Trans-dimensional Bayesian Inversion

    Mustac, M.; Kim, S.; Tkalcic, H.; Rhie, J.; Chen, Y.; Ford, S. R.; Sebastian, N.


    Conventional approaches to inverse problems suffer from non-linearity and non-uniqueness in estimations of seismic structures and source properties. Estimated results and associated uncertainties are often biased by applied regularizations and additional constraints, which are commonly introduced to solve such problems. Bayesian methods, however, provide statistically meaningful estimations of models and their uncertainties constrained by data information. In addition, hierarchical and trans-dimensional (trans-D) techniques are inherently implemented in the Bayesian framework to account for involved error statistics and model parameterizations, and, in turn, allow more rigorous estimations of the same. Here, we apply Bayesian methods throughout the entire inference process to estimate seismic structures and source properties in Northeast Asia including east China, the Korean peninsula, and the Japanese islands. Ambient noise analysis is first performed to obtain a base three-dimensional (3-D) heterogeneity model using continuous broadband waveforms from more than 300 stations. As for the tomography of surface wave group and phase velocities in the 5-70 s band, we adopt a hierarchical and trans-D Bayesian inversion method using Voronoi partition. The 3-D heterogeneity model is further improved by joint inversions of teleseismic receiver functions and dispersion data using a newly developed high-efficiency Bayesian technique. The obtained model is subsequently used to prepare 3-D structural Green's functions for the source characterization. A hierarchical Bayesian method for point source inversion using regional complete waveform data is applied to selected events from the region. The seismic structure and source characteristics with rigorously estimated uncertainties from the novel Bayesian methods provide enhanced monitoring and discrimination of seismic events in northeast Asia.

  4. Uncertainty analysis of pollutant build-up modelling based on a Bayesian weighted least squares approach.

    Haddad, Khaled; Egodawatta, Prasanna; Rahman, Ataur; Goonetilleke, Ashantha


    Reliable pollutant build-up prediction plays a critical role in the accuracy of urban stormwater quality modelling outcomes. However, water quality data collection is resource demanding compared to streamflow data monitoring, where a greater quantity of data is generally available. Consequently, available water quality datasets span only relatively short time scales unlike water quantity data. Therefore, the ability to take due consideration of the variability associated with pollutant processes and natural phenomena is constrained. This in turn gives rise to uncertainty in the modelling outcomes as research has shown that pollutant loadings on catchment surfaces and rainfall within an area can vary considerably over space and time scales. Therefore, the assessment of model uncertainty is an essential element of informed decision making in urban stormwater management. This paper presents the application of a range of regression approaches such as ordinary least squares regression, weighted least squares regression and Bayesian weighted least squares regression for the estimation of uncertainty associated with pollutant build-up prediction using limited datasets. The study outcomes confirmed that the use of ordinary least squares regression with fixed model inputs and limited observational data may not provide realistic estimates. The stochastic nature of the dependent and independent variables needs to be taken into consideration in pollutant build-up prediction. It was found that the use of the Bayesian approach along with the Monte Carlo simulation technique provides a powerful tool, which attempts to make the best use of the available knowledge in prediction and thereby presents a practical solution to counteract the limitations which are otherwise imposed on water quality modelling. PMID:23454702
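The contrast between ordinary and weighted least squares can be sketched on a build-up-style power law, y = a·xᵇ, linearised as log y = log a + b·log x. The data, noise structure, and parameter values below are synthetic; the paper's Bayesian WLS additionally treats the weights themselves as uncertain and propagates that uncertainty by Monte Carlo:

```python
import numpy as np

rng = np.random.default_rng(3)

x = np.linspace(1, 7, 30)                     # e.g. antecedent dry days (illustrative)
sigma = 0.1 + 0.1 * x                         # heteroscedastic noise, known here
logy = np.log(4.0) + 0.35 * np.log(x) + sigma * rng.normal(size=30)

X = np.column_stack([np.ones_like(x), np.log(x)])

# Ordinary least squares: every observation weighted equally.
beta_ols = np.linalg.lstsq(X, logy, rcond=None)[0]

# Weighted least squares: observations weighted by their precision 1/sigma^2.
W = np.diag(1 / sigma ** 2)
beta_wls = np.linalg.solve(X.T @ W @ X, X.T @ W @ logy)
```

By construction, the WLS solution minimizes the precision-weighted residual sum of squares, which is why it gives more realistic estimates than OLS when observation variances differ.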

  5. A Bayesian network approach for modeling local failure in lung cancer

    Oh, Jung Hun; Craft, Jeffrey; Al Lozi, Rawan; Vaidya, Manushka; Meng, Yifan; Deasy, Joseph O; Bradley, Jeffrey D; El Naqa, Issam, E-mail: [Department of Radiation Oncology, Mallinckrodt Institute of Radiology, Washington University School of Medicine, MO 63110 (United States)


    Locally advanced non-small cell lung cancer (NSCLC) patients suffer from a high local failure rate following radiotherapy. Despite many efforts to develop new dose-volume models for early detection of tumor local failure, there was no reported significant improvement in their application prospectively. Based on recent studies of biomarker proteins' role in hypoxia and inflammation in predicting tumor response to radiotherapy, we hypothesize that combining physical and biological factors with a suitable framework could improve the overall prediction. To test this hypothesis, we propose a graphical Bayesian network framework for predicting local failure in lung cancer. The proposed approach was tested using two different datasets of locally advanced NSCLC patients treated with radiotherapy. The first dataset was collected retrospectively, which comprises clinical and dosimetric variables only. The second dataset was collected prospectively in which in addition to clinical and dosimetric information, blood was drawn from the patients at various time points to extract candidate biomarkers as well. Our preliminary results show that the proposed method can be used as an efficient method to develop predictive models of local failure in these patients and to interpret relationships among the different variables in the models. We also demonstrate the potential use of heterogeneous physical and biological variables to improve the model prediction. With the first dataset, we achieved better performance compared with competing Bayesian-based classifiers. With the second dataset, the combined model had a slightly higher performance compared to individual physical and biological models, with the biological variables making the largest contribution. Our preliminary results highlight the potential of the proposed integrated approach for predicting post-radiotherapy local failure in NSCLC patients.

  6. What would judgment and decision making research be like if we took a Bayesian approach to hypothesis testing?

    William J. Matthews


    Judgment and decision making research overwhelmingly uses null hypothesis significance testing as the basis for statistical inference. This article examines an alternative, Bayesian approach which emphasizes the choice between two competing hypotheses and quantifies the balance of evidence provided by the data---one consequence of which is that experimental results may be taken to strongly favour the null hypothesis. We apply a recently-developed ``Bayesian $t$-test'' to existing studies of the anchoring effect in judgment, and examine how the change in approach affects both the tone of hypothesis testing and the substantive conclusions that one draws. We compare the Bayesian approach with Fisherian and Neyman-Pearson testing, examining its relationship to conventional $p$-values, the influence of effect size, and the importance of prior beliefs about the likely state of nature. The results give a sense of how Bayesian hypothesis testing might be applied to judgment and decision making research, and of both the advantages and challenges that a shift to this approach would entail.
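The key quantity here, the Bayes factor, can be approximated coarsely from BIC: BF01 ≈ exp((BIC1 − BIC0)/2) for a one-sample test. This is a rough stand-in for the default JZS "Bayesian t-test" discussed in the article, not the article's own procedure, and the data below are invented:

```python
import numpy as np

def bf01_bic(x):
    # BIC-approximate Bayes factor in favour of the null (mean = 0)
    # for a one-sample test; > 1 favours the null, < 1 favours H1.
    n = len(x)
    rss0 = np.sum(x ** 2)                     # H0: mean fixed at 0
    rss1 = np.sum((x - x.mean()) ** 2)        # H1: mean free
    bic0 = n * np.log(rss0 / n)
    bic1 = n * np.log(rss1 / n) + np.log(n)   # one extra parameter under H1
    return float(np.exp((bic1 - bic0) / 2))

rng = np.random.default_rng(4)
bf = bf01_bic(rng.normal(0, 1, 50))           # data generated under the null
```

Unlike a p-value, this quantity can exceed 1 and thereby express positive evidence for the null, which is the consequence the article emphasizes.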

  7. A study of finite mixture model: Bayesian approach on financial time series data

    Phoong, Seuk-Yen; Ismail, Mohd Tahir


    Recently, statisticians have emphasized fitting finite mixture models using Bayesian methods. A finite mixture model is a mixture of distributions used to model a statistical distribution, while the Bayesian method is a statistical method used to fit the mixture model. The Bayesian method is widely used because it has asymptotic properties which provide remarkable results. In addition, the Bayesian method also shows a consistency characteristic, which means the parameter estimates are close to the predictive distributions. In the present paper, the number of components for the mixture model is studied using the Bayesian Information Criterion. Identifying the number of components is important because a wrong choice may lead to an invalid result. The Bayesian method is then utilized to fit the k-component mixture model in order to explore the relationship between rubber prices and stock market prices for Malaysia, Thailand, the Philippines and Indonesia. The results showed that there is a negative relationship between rubber prices and stock market prices for all selected countries.
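Selecting the number of mixture components with BIC can be sketched end-to-end: fit k-component Gaussian mixtures by EM and keep the k with the lowest BIC. The data below are synthetic draws from two well-separated Gaussians, a stand-in for the price series in the record, and the EM implementation is a plain textbook version:

```python
import numpy as np

rng = np.random.default_rng(5)
x = np.concatenate([rng.normal(-3, 1, 300), rng.normal(3, 1, 300)])

def em_gmm(x, k, iters=200):
    # Plain EM for a k-component 1-D Gaussian mixture, then its BIC.
    mu = np.quantile(x, (np.arange(k) + 0.5) / k)   # spread the initial means
    var = np.full(k, x.var())
    w = np.full(k, 1.0 / k)
    for _ in range(iters):
        dens = w * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) \
               / np.sqrt(2 * np.pi * var)
        resp = dens / dens.sum(1, keepdims=True)     # E-step: responsibilities
        nk = resp.sum(0)
        w, mu = nk / len(x), resp.T @ x / nk         # M-step: weights, means
        var = (resp * (x[:, None] - mu) ** 2).sum(0) / nk
    dens = w * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
    loglik = np.log(dens.sum(1)).sum()
    n_params = 3 * k - 1                             # weights + means + variances
    return loglik, n_params * np.log(len(x)) - 2 * loglik

ll1, bic1 = em_gmm(x, 1)
ll2, bic2 = em_gmm(x, 2)   # the lower BIC identifies the preferred k
```

For clearly bimodal data the two-component model wins on BIC despite its extra parameters, which is the kind of component-count decision the paper makes before fitting the final mixture.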

  8. Controlling the degree of caution in statistical inference with the Bayesian and frequentist approaches as opposite extremes

    Bickel, David R


    In statistical practice, whether a Bayesian or frequentist approach is used in inference depends not only on the availability of prior information but also on the attitude taken toward partial prior information, with frequentists tending to be more cautious than Bayesians. The proposed framework defines that attitude in terms of a specified amount of caution, thereby enabling data analysis at the level of caution desired and on the basis of any prior information. The caution parameter represents the attitude toward partial prior information in much the same way as a loss function represents the attitude toward risk. When there is very little prior information and nonzero caution, the resulting inferences correspond to those of the candidate confidence intervals and p-values that are most similar to the credible intervals and hypothesis probabilities of the specified Bayesian posterior. On the other hand, in the presence of a known physical distribution of the parameter, inferences are based only on the corres...

  9. On merging rainfall data from diverse sources using a Bayesian approach

    Bhattacharya, Biswa; Tarekegn, Tegegne


    Numerous studies have presented comparisons of satellite rainfall products, such as from the Tropical Rainfall Measuring Mission (TRMM), with rain gauge data and have concluded, in general, that the two sources of data are comparable at suitable space and time scales. The comparison is not a straightforward one as they employ different measurement techniques and are dependent on very different space-time scales of measurements. The number of available gauges in a catchment also influences the comparability and thus adds to the complexity. TRMM rainfall data have also been used directly in hydrological modelling. As the space-time scale reduces, so does the accuracy of these models. It seems that combining the two, or more, sources of rainfall data can enormously benefit hydrological studies. Various rainfall data, due to the differences in their space-time structure, contain information about the spatio-temporal distribution of rainfall which is not available to a single source of data. In order to harness this benefit we have developed a method of merging these two (or more) rainfall products under the framework of the Bayesian Data Fusion (BDF) principle. By applying this principle the rainfall data from the various sources can be combined into a single time series of rainfall data. The usefulness of the approach has been explored in a case study on the Lake Tana Basin of the Upper Blue Nile Basin in Ethiopia. A 'leave one rain gauge out' cross-validation technique was employed to evaluate the accuracy of the rainfall time series against rainfall interpolated from rain gauge data using Inverse Distance Weighting (referred to as IDW), TRMM and the fused data (BDF). The result showed that the BDF prediction was better than TRMM and IDW. Further evaluation of the three rainfall estimates was done by evaluating the capability of predicting observed stream flow using the lumped conceptual rainfall-runoff model NAM. Visual inspection of the
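The simplest instance of the Bayesian Data Fusion idea, with Gaussian errors, is precision-weighted averaging: each source's estimate is weighted by the inverse of its error variance. The gauge/TRMM values and variances below are illustrative, not from the case study:

```python
def fuse(estimates, variances):
    # Posterior mean and variance when each source i reports x_i ~ N(truth, v_i).
    prec = [1 / v for v in variances]
    mean = sum(x * p for x, p in zip(estimates, prec)) / sum(prec)
    return mean, 1 / sum(prec)

gauge, trmm = 12.0, 18.0                       # mm of rainfall (hypothetical)
fused, fused_var = fuse([gauge, trmm], [4.0, 9.0])
```

The fused value lands between the sources, closer to the more precise one, and its variance is smaller than either source's alone, which is why a fused series can outperform both the gauge interpolation and the satellite product.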

  10. Bayesian Recovery of Clipped OFDM Signals: A Receiver-based Approach

    Al-Rabah, Abdullatif R.


    Recently, orthogonal frequency-division multiplexing (OFDM) has been adopted for high-speed wireless communications due to its robustness against multipath fading. However, one of the main fundamental drawbacks of OFDM systems is the high peak-to-average-power ratio (PAPR). Several techniques have been proposed for PAPR reduction. Most of these techniques require transmitter-based (pre-compensated) processing. On the other hand, receiver-based alternatives would save the power and reduce the transmitter complexity. By keeping this in mind, a possible approach is to limit the amplitude of the OFDM signal to a predetermined threshold and equivalently a sparse clipping signal is added. Then, estimating this clipping signal at the receiver to recover the original signal. In this work, we propose a Bayesian receiver-based low-complexity clipping signal recovery method for PAPR reduction. The method is able to i) effectively reduce the PAPR via simple clipping scheme at the transmitter side, ii) use Bayesian recovery algorithm to reconstruct the clipping signal at the receiver side by measuring part of subcarriers, iii) perform well in the absence of statistical information about the signal (e.g. clipping level) and the noise (e.g. noise variance), and at the same time iv is energy efficient due to its low complexity. Specifically, the proposed recovery technique is implemented in data-aided based. The data-aided method collects clipping information by measuring reliable 
data subcarriers, thus makes full use of spectrum for data transmission without the need for tone reservation. The study is extended further to discuss how to improve the recovery of the clipping signal utilizing some features of practical OFDM systems i.e., the oversampling and the presence of multiple receivers. Simulation results demonstrate the superiority of the proposed technique over other recovery algorithms. The overall objective is to show that the receiver-based Bayesian technique is highly

  11. A user-friendly forest model with a multiplicative mathematical structure: a Bayesian approach to calibration

    M. Bagnara


    Forest models are being increasingly used to study ecosystem functioning, through the reproduction of carbon fluxes and productivity in very different forests all over the world. Over the last two decades, the need for simple and "easy to use" models for practical applications, characterized by few parameters and equations, has become clear, and some have been developed for this purpose. These models aim to represent the main drivers underlying forest ecosystem processes while being applicable to the widest possible range of forest ecosystems. Recently, it has also become clear that model performance should be assessed not only in terms of accuracy of estimations and predictions, but also in terms of estimates of model uncertainties. Therefore, the Bayesian approach has increasingly been applied to calibrate forest models, with the aim of estimating the uncertainty of their results and of comparing their performances. Some forest models, considered to be user-friendly, rely on a multiplicative or quasi-multiplicative mathematical structure, which is known to cause problems during the calibration process, mainly due to high correlations between parameters. In a Bayesian framework using Markov Chain Monte Carlo sampling, this is likely to impair the reaching of proper convergence of the chains and sampling from the correct posterior distribution. Here we show two methods to reach proper convergence when using a forest model with a multiplicative structure, applying different algorithms with different numbers of iterations during the Markov Chain Monte Carlo or a two-step calibration. The results showed that recently proposed algorithms for adaptive calibration do not confer a clear advantage over the Metropolis–Hastings Random Walk algorithm for the forest model used here. Moreover, the calibration remains time consuming and mathematically difficult, so the advantages of using a fast and user-friendly model can be lost due to the calibration

  12. Modelling developmental instability as the joint action of noise and stability: a Bayesian approach

    Lens Luc


Abstract Background: Fluctuating asymmetry is assumed to measure individual and population-level developmental stability. The latter may in turn show an association with stress, which can be observed through asymmetry-stress correlations. However, the recent literature does not support a ubiquitous relationship. Very little is known about why some studies show relatively strong associations while others completely fail to find such a correlation. We propose a new Bayesian statistical framework to examine these associations. Results: We consider developmental stability – i.e. the individual buffering capacity – as the biologically relevant trait and show (i) that little variation in developmental stability can explain observed variation in fluctuating asymmetry when the distribution of developmental stability is highly skewed, and (ii) that a previously developed tool (the hypothetical repeatability of fluctuating asymmetry) contains only limited information about variation in developmental stability, which stands in sharp contrast to the earlier established close association between repeatability and developmental instability. Conclusion: We provide tools to generate valuable information about the distribution of between-individual variation in developmental stability. A simple linear transformation of a previous model leads to completely different conclusions. Thus, theoretical modelling of asymmetry and stability appears to be very sensitive to the scale of inference. More research is urgently needed to gain better insight into the developmental mechanisms of noise and stability. Although the model is likely an oversimplification of reality, new insights can be incorporated into the Bayesian statistical approach to obtain more reliable estimates.

  13. A fuzzy Bayesian network approach to quantify the human behaviour during an evacuation

    Ramli, Nurulhuda; Ghani, Noraida Abdul; Ahmad, Nazihah


A Bayesian Network (BN) has been regarded as a successful representation of the inter-relationships among factors affecting human behavior during an emergency. This paper extends earlier work on quantifying the variables in a BN model of human behavior during an evacuation using a well-known direct probability elicitation technique. To overcome judgment bias and reduce the expert's burden in providing precise probability values, a new elicitation approach is required. This study proposes a new fuzzy BN approach for quantifying human behavior during an evacuation. The methodology involves three major phases: 1) development of a qualitative model representing human factors during an evacuation, 2) quantification of the BN model using fuzzy probabilities, and 3) inference and interpretation of the BN results. A case study of three inter-dependent human evacuation factors – danger assessment ability, information about the threat, and stressful conditions – is used to illustrate the application of the proposed method. This approach serves as an alternative to the conventional probability elicitation technique for understanding human behavior during an evacuation.
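To make the quantification phase concrete, here is a minimal sketch (with invented numbers and a hypothetical three-node structure, not taken from the paper) in which triangular fuzzy probabilities are defuzzified by their centroid and then used in ordinary BN marginalization:

```python
# Triangular fuzzy probabilities elicited from an expert (hypothetical values),
# defuzzified by the centroid of the triangle: (a + b + c) / 3.
def centroid(tri):
    a, b, c = tri
    return (a + b + c) / 3.0

# Root-node marginals: P(D = good danger assessment), P(I = informed)
p_d = centroid((0.5, 0.6, 0.7))
p_i = centroid((0.3, 0.4, 0.5))

# Fuzzy CPT for P(S = stressed | D, I), indexed by (d_good, i_informed)
cpt = {
    (True, True): centroid((0.1, 0.2, 0.3)),
    (True, False): centroid((0.3, 0.4, 0.5)),
    (False, True): centroid((0.5, 0.6, 0.7)),
    (False, False): centroid((0.7, 0.8, 0.9)),
}

# Standard BN marginalization: P(S) = sum_{d,i} P(S | d, i) P(d) P(i)
p_s = sum(
    cpt[(d, i)]
    * (p_d if d else 1 - p_d)
    * (p_i if i else 1 - p_i)
    for d in (True, False)
    for i in (True, False)
)
print(round(p_s, 4))  # marginal probability of a stressful condition
```

The centroid defuzzification used here is one common choice; the paper's fuzzy-probability machinery may differ, but the overall flow (elicit fuzzy numbers, reduce to crisp CPT entries, run ordinary BN inference) is the pattern being illustrated.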

  14. Meta-analysis for 2 x 2 tables: a Bayesian approach.

    Carlin, J B


    This paper develops and implements a fully Bayesian approach to meta-analysis, in which uncertainty about effects in distinct but comparable studies is represented by an exchangeable prior distribution. Specifically, hierarchical normal models are used, along with a parametrization that allows a unified approach to deal easily with both clinical trial and case-control study data. Monte Carlo methods are used to obtain posterior distributions for parameters of interest, integrating out the unknown parameters of the exchangeable prior or 'random effects' distribution. The approach is illustrated with two examples, the first involving a data set on the effect of beta-blockers after myocardial infarction, and the second based on a classic data set comprising 14 case-control studies on the effects of smoking on lung cancer. In both examples, rather different conclusions from those previously published are obtained. In particular, it is claimed that widely used methods for meta-analysis, which involve complete pooling of 'O-E' values, lead to understatement of uncertainty in the estimation of overall or typical effect size. PMID:1349763
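A grid-based sketch of the exchangeable hierarchical-normal idea described above, using hypothetical study summaries; the grid posterior stands in for the paper's Monte Carlo integration over the random-effects distribution:

```python
import numpy as np

# Hypothetical study summaries: log-odds ratios and their sampling variances
y = np.array([-0.65, -0.11, -0.26, -0.45, 0.05])
v = np.array([0.12, 0.08, 0.05, 0.15, 0.10])

mu_grid = np.linspace(-1.5, 1.0, 301)    # typical effect mu
tau_grid = np.linspace(0.0, 1.5, 151)    # between-study sd tau

# Joint grid posterior under flat priors: marginally, y_i ~ N(mu, v_i + tau^2)
log_post = np.zeros((len(mu_grid), len(tau_grid)))
for j, tau in enumerate(tau_grid):
    s2 = v + tau**2
    log_post[:, j] = np.sum(
        -0.5 * np.log(2 * np.pi * s2)
        - 0.5 * (y[None, :] - mu_grid[:, None]) ** 2 / s2,
        axis=1,
    )
post = np.exp(log_post - log_post.max())
post /= post.sum()

mu_marg = post.sum(axis=1)               # marginal posterior of mu
mu_mean = np.sum(mu_grid * mu_marg)
mu_sd = np.sqrt(np.sum((mu_grid - mu_mean) ** 2 * mu_marg))

# Complete-pooling ("fixed-effect", tau = 0) comparison
fe_sd = np.sqrt(1.0 / np.sum(1.0 / v))
print(mu_mean, mu_sd, fe_sd)
```

Because the between-study variance is integrated over rather than fixed at zero, the posterior sd of the typical effect exceeds the complete-pooling sd, illustrating the understatement of uncertainty the abstract attributes to pooled 'O-E' methods.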

  15. A Bayesian approach to PET reconstruction using image-modeling Gibbs priors: Implementation and comparison

We demonstrate that (i) classical methods of image reconstruction from projections can be improved upon by considering the output of such a method as a distorted version of the original image and applying a Bayesian approach to estimate the original image from it (based on a model of distortion and on a Gibbs distribution as the prior), and (ii) by selecting an "image-modeling" prior distribution (i.e., one such that a random sample from it is likely to share important characteristics of the images of the application area) one can improve over another Gibbs prior formulated using only pairwise interactions. We illustrate our approach using simulated Positron Emission Tomography (PET) data from realistic brain phantoms. Since algorithm performance ultimately depends on the diagnostic task being performed, we examine a number of different medically relevant figures of merit to give a fair comparison. Based on a training-and-testing evaluation strategy, we demonstrate that statistically significant improvements can be obtained using the proposed approach.

  16. A Bayesian approach to PET reconstruction using image-modeling Gibbs priors: Implementation and comparison

Chan, M.T. [Univ. of Southern California, Los Angeles, CA (United States)]; Herman, G.T. [Univ. of Pennsylvania, Philadelphia, PA (United States)]; Levitan, E. [Technion, Haifa (Israel)]


We demonstrate that (i) classical methods of image reconstruction from projections can be improved upon by considering the output of such a method as a distorted version of the original image and applying a Bayesian approach to estimate the original image from it (based on a model of distortion and on a Gibbs distribution as the prior), and (ii) by selecting an "image-modeling" prior distribution (i.e., one such that a random sample from it is likely to share important characteristics of the images of the application area) one can improve over another Gibbs prior formulated using only pairwise interactions. We illustrate our approach using simulated Positron Emission Tomography (PET) data from realistic brain phantoms. Since algorithm performance ultimately depends on the diagnostic task being performed, we examine a number of different medically relevant figures of merit to give a fair comparison. Based on a training-and-testing evaluation strategy, we demonstrate that statistically significant improvements can be obtained using the proposed approach.

  17. A Bayesian Game-Theoretic Approach for Distributed Resource Allocation in Fading Multiple Access Channels

    Gaoning He


A Bayesian game-theoretic model is developed to design and analyze the resource allocation problem in K-user fading multiple access channels (MACs), where the users are assumed to selfishly maximize their average achievable rates with incomplete information about the fading channel gains. In such a game-theoretic study, the central question is whether a Bayesian equilibrium exists, and if so, whether the network operates efficiently at the equilibrium point. We prove that there exists exactly one Bayesian equilibrium in our game. Furthermore, we study the network sum-rate maximization problem by assuming that the users coordinate according to a symmetric strategy profile. This result also serves as an upper bound for the Bayesian equilibrium. Finally, simulation results are provided to show the network efficiency at the unique Bayesian equilibrium and to compare it with other strategies.

  18. Cyclist activity and injury risk analysis at signalized intersections: a Bayesian modelling approach.

    Strauss, Jillian; Miranda-Moreno, Luis F; Morency, Patrick


This study proposes a two-equation Bayesian modelling approach to simultaneously study cyclist injury occurrence and bicycle activity at signalized intersections as joint outcomes. The approach deals with the potential presence of endogeneity and unobserved heterogeneity and is used to identify factors associated with both cyclist injuries and volumes; its application to identifying high-risk corridors is also illustrated. The approach is applied to Montreal, Quebec, Canada, using an extensive inventory of a large sample of signalized intersections containing disaggregate motor-vehicle traffic volumes and bicycle flows, geometric design, traffic control and built environment characteristics in the vicinity of the intersections. Cyclist injury data for the period 2003-2008 are used. Manual bicycle counts were standardized using temporal and weather adjustment factors to obtain average annual daily volumes. Results confirm and quantify the effects of both bicycle and motor-vehicle flows on cyclist injury occurrence. Accordingly, more cyclists at an intersection translate into more cyclist injuries but lower injury rates, due to the non-linear association between bicycle volume and injury occurrence. Furthermore, the results emphasize the importance of turning motor-vehicle movements. The presence of bus stops and total crosswalk length increase cyclist injury occurrence, whereas the presence of a raised median has the opposite effect. Bicycle activity through intersections was found to increase with employment, number of metro stations, land use mix, area of commercial land use, length of bicycle facilities and the presence of schools within 50-800 m of the intersection. Intersections with three approaches are expected to have fewer cyclists than those with four. Using Bayesian analysis, expected injury frequency and injury rates were estimated for each intersection and used to rank corridors. Corridors with high bicycle volumes

  19. A multinomial logit model-Bayesian network hybrid approach for driver injury severity analyses in rear-end crashes.

    Chen, Cong; Zhang, Guohui; Tarefder, Rafiqul; Ma, Jianming; Wei, Heng; Guan, Hongzhi


Rear-end crashes are among the most common types of traffic crashes in the U.S. A good understanding of their characteristics and contributing factors is of practical importance. Previously, both multinomial logit models and Bayesian network methods have been used in crash modeling and analysis, although each has its own application restrictions and limitations. In this study, a hybrid approach is developed that combines multinomial logit models and Bayesian network methods for comprehensively analyzing driver injury severities in rear-end crashes, based on state-wide crash data collected in New Mexico from 2010 to 2011. A multinomial logit model is developed to investigate and identify significant contributing factors for rear-end crash driver injury severities, classified into three categories: no injury, injury, and fatality. The identified significant factors are then used to establish a Bayesian network that explicitly formulates statistical associations between injury severity outcomes and explanatory attributes, including driver behavior, demographic features, vehicle factors, geometric and environmental characteristics, etc. The test results demonstrate that the proposed hybrid approach performs reasonably well. The Bayesian network inference analyses indicate that factors including truck involvement, inferior lighting conditions, windy weather conditions, and the number of vehicles involved could significantly increase driver injury severity in rear-end crashes. The developed methodology and estimation results provide insights for developing effective countermeasures to reduce rear-end crash injury severities and improve traffic system safety performance. PMID:25888994

  20. A Bayesian approach for solar resource potential assessment using satellite images

The need for more sustainable and more protective development opens new possibilities for renewable energy. Among the different renewable energy sources, the direct conversion of sunlight into electricity by solar photovoltaic (PV) technology seems the most promising and represents a technically viable solution to energy demands. But installation and deployment of PV energy require solar resource data for utility planning, accommodating grid capacity, and formulating future adaptive policies. Currently, the best approach to determine the solar resource at a given site is based on the use of satellite images. However, the computation of solar resource (a non-linear process) from satellite images is unfortunately not straightforward. From a signal processing point of view, it falls within non-stationary, non-linear/non-Gaussian dynamical inverse problems. In this paper, we propose a Bayesian approach combining satellite images and in situ data. We propose original observation and transition functions taking advantage of the characteristics of both types of data involved. A simulation study of solar irradiance is carried out with this method, and a French Guiana solar resource potential map for the year 2010 is given.

  1. Applications of Bayesian approach in modelling risk of malaria-related hospital mortality

    Simbeye Jupiter S


Abstract Background: Malaria is a major public health problem in Malawi; however, quantifying its burden in a population is a challenge. Routine hospital data provide a proxy for measuring the incidence of severe malaria and for crudely estimating morbidity rates. Using such data, this paper proposes a method to describe trends, patterns and factors associated with in-hospital mortality attributed to the disease. Methods: We develop semiparametric regression models which allow joint analysis of nonlinear effects of calendar time and continuous covariates, spatially structured variation, unstructured heterogeneity, and other fixed covariates. Modelling and inference use the fully Bayesian approach via Markov Chain Monte Carlo (MCMC) simulation techniques. The methodology is applied to analyse data arising from paediatric wards in Zomba district, Malawi, between 2002 and 2003. Results and Conclusion: We observe that the risk of dying in hospital is lower in the dry season and for children who travel less than 5 km to the hospital, but increases for those who are referred to the hospital. The results also indicate significant differences in both structured and unstructured spatial effects, and the health facility effects reveal considerable differences by type of facility or practice. More importantly, our approach shows non-linearities in the effect of metrical covariates on the probability of dying in hospital. The study emphasizes that the methodological framework used provides a useful tool for analysing data of this and similar structure.

  2. Estimating basal properties of glaciers from surface measurements: a non-linear Bayesian inversion approach

    M. J. Raymond


We propose a new approach to indirectly estimate the basal properties of a glacier, i.e. bedrock topography and basal slipperiness, from observations of surface topography and surface velocities. We demonstrate how a maximum a posteriori estimate of basal conditions can be determined using a Bayesian inference approach in combination with an analytical linearisation of the forward model. Using synthetic data, we show that for non-linear media and a non-linear sliding law only a few forward-step model evaluations are needed for convergence. The forward step is solved with a numerical finite-element model using the full Stokes equations. Forward Fréchet derivatives are approximated through analytical small-perturbation solutions. This approximation is a key feature of the method, and the effects of this approximation on model performance are analyzed. The number of iterations needed for convergence increases with the amplitude of the basal perturbations, but generally fewer than ten iterations are needed.

  3. Using Bayesian network and AHP method as a marketing approach tools in defining tourists’ preferences

    Nataša Papić-Blagojević


The marketing approach is associated with market conditions and with achieving a company's long-term profitability by satisfying consumers' needs. In tourism, this approach need not be related only to promoting one tourist destination; it also concerns the relationship between a travel agency and its clients, with travel agencies adjusting their offers to their clients' needs. In that sense, it is important to analyze the behavior of tourists in earlier periods, taking their preferences into consideration. Using a Bayesian network, the connections between tourists with similar tastes, and the relationships between them, can be displayed graphically. On the other hand, the analytic hierarchy process (AHP) is used to rank tourist attractions, also relying on past experience. In this paper we examine possible applications of these two models in tourism in Serbia. The example is hypothetical, but it will serve as a base for future research. Three types of tourism are chosen as representative of Vojvodina – cultural, rural and business tourism – because they are the bright spots of tourist development in this area. Applied to these forms, the analytic hierarchy process has shown its strength in predicting tourists' preferences.

  4. Application of the Bayesian approach for derivation of PDFs for concentration ratio values

Concentration ratios (CRs) are used to derive activity concentrations in wild plants and animals. Usually, compilations of CR values encompass a wide range of element–organism combinations, extracted from different studies with statistical information reported at varying degrees of detail. To produce a more robust estimation of distribution parameters, data from different studies are normally pooled using classical statistical methods. However, there is inherent subjectivity involved in pooling CR data, in the sense that there is a tacit assumption that the CRs under any arbitrarily defined biota category belong to the same population. Here, Bayesian inference is introduced as an alternative way of estimating the distribution parameters of CRs. This approach, in contrast to classical methods, is more flexible and also allows the various assumptions required when combining data to be defined more explicitly. Taking selected data from the recently compiled wildlife transfer database as a working example, attempts are made to refine the pooling approaches previously used and to consider situations when empirical data are limited.
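The normal–normal conjugate update below is a minimal illustration of how pooling assumptions can be made explicit in a Bayesian setting: each study mean is treated as a noisy observation of a common mean (the "same population" assumption stated openly). The log10(CR) summaries and the prior are hypothetical, not taken from the wildlife transfer database:

```python
import math

# Hypothetical log10(CR) summaries from three studies: (mean, sd of the mean).
# CRs are treated as lognormal, so pooling is done on the log scale.
studies = [(-1.2, 0.30), (-0.9, 0.15), (-1.5, 0.40)]

# Weakly informative prior on the pooled log10(CR) mean
prior_mean, prior_sd = 0.0, 2.0

# Sequential conjugate normal updates: precision-weighted averaging
post_mean, post_var = prior_mean, prior_sd**2
for m, s in studies:
    w = 1.0 / s**2                                  # study precision
    new_var = 1.0 / (1.0 / post_var + w)
    post_mean = new_var * (post_mean / post_var + w * m)
    post_var = new_var

post_sd = math.sqrt(post_var)
print(post_mean, post_sd)
```

The precision-weighted result shows the appeal and the danger of pooling at once: the pooled sd is smaller than any single study's sd, which is only justified if the same-population assumption actually holds.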

  5. A Bayesian approach to efficient differential allocation for resampling-based significance testing

    Soi Sameer


Abstract Background: Large-scale statistical analyses have become hallmarks of post-genomic era biological research due to advances in high-throughput assays and the integration of large biological databases. One accompanying issue is the simultaneous estimation of p-values for a large number of hypothesis tests. In many applications, a parametric assumption in the null distribution such as normality may be unreasonable, and resampling-based p-values are the preferred procedure for establishing statistical significance. Using resampling-based procedures for multiple testing is computationally intensive and typically requires large numbers of resamples. Results: We present a new approach to more efficiently assign resamples (such as bootstrap samples or permutations) within a nonparametric multiple testing framework. We formulated a Bayesian-inspired approach to this problem and devised an algorithm that adapts the assignment of resamples iteratively with negligible space and running-time overhead. In two experimental studies, a breast cancer microarray dataset and a genome-wide association study dataset for Parkinson's disease, we demonstrated that our differential allocation procedure is substantially more accurate than the traditional uniform resample allocation. Conclusion: Our experiments demonstrate that using a more sophisticated allocation strategy can improve inference for hypothesis testing without a drastic increase in the amount of computation on randomized data. Moreover, the gain in efficiency increases with the number of tests. R code for our algorithm and the shortcut method are available at
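One way such differential allocation can work (an illustrative stand-in, not the authors' algorithm): track a Beta posterior over each test's p-value and stop spending resamples on tests that are already clearly non-significant, so the budget concentrates on borderline hypotheses:

```python
import math
import random

random.seed(1)

def normal_cdf(z):
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

def adaptive_perm_pvalues(stats, null_draw, alpha=0.05, batch=50,
                          max_resamples=2000, cut=0.999):
    """Allocate permutation resamples adaptively across many tests.

    Each test keeps a Beta(1 + hits, 1 + misses) posterior over its
    p-value; we stop resampling a test once the posterior probability
    that p > alpha exceeds `cut`, i.e. it is clearly non-significant.
    """
    results = []
    for t in stats:
        hits, n = 0, 0
        while n < max_resamples:
            hits += sum(null_draw() >= t for _ in range(batch))
            n += batch
            # Normal approximation to the Beta posterior of the p-value
            mean = (1 + hits) / (2 + n)
            sd = math.sqrt(mean * (1 - mean) / (3 + n))
            if normal_cdf((alpha - mean) / sd) < 1 - cut:
                break
        results.append((hits / n, n))  # (p-value estimate, resamples used)
    return results

# Null distribution: |N(0,1)|; observed statistics for three hypotheses
out = adaptive_perm_pvalues([3.5, 1.0, 0.2], lambda: abs(random.gauss(0, 1)))
print(out)
```

In this sketch the clearly null test (statistic 0.2) is abandoned after the first batch, while the strongly significant one (3.5) receives the full resampling budget, which is the qualitative behaviour the abstract describes.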

  6. Bayesian approach to spectral function reconstruction for Euclidean quantum field theories.

    Burnier, Yannis; Rothkopf, Alexander


We present a novel approach to the inference of spectral functions from Euclidean time correlator data that makes close contact with modern Bayesian concepts. Our method differs significantly from the maximum entropy method (MEM). A new set of axioms is postulated for the prior probability, leading to an improved expression which is devoid of the asymptotically flat directions present in the Shannon-Jaynes entropy. Hyperparameters are integrated out explicitly, liberating us from the Gaussian approximations underlying the evidence approach of the maximum entropy method. We present a realistic test of our method in the context of the nonperturbative extraction of the heavy quark potential. Based on hard-thermal-loop correlator mock data, we establish firm requirements on the number of data points and their accuracy for a successful extraction of the potential from lattice QCD. Finally we reinvestigate quenched lattice QCD correlators from a previous study and provide an improved potential estimation at T = 2.33T_C. PMID:24237510

  7. A Bayesian reliability approach to the performance assessment of a geological waste repository

This paper discusses the task of selecting a suitable site for a high-level waste repository (HLWR), which is certainly a complex one: one must address engineering and economic factors of the proposed facility and site as well as environmental, public health, safety, and sociopolitical factors. Acknowledging the complexity of the siting problem for a HLWR leads one readily to conclude that a formal analysis, including the use of a performance assessment model (PAM), is needed to assist the designated decision makers in their task of selecting a suitable site. The overall goal of a PAM is to aid the decision makers in making the best possible technical siting decision. For a number of reasons, the authors believe that combining Bayesian decision theory and reliability methodology provides the best approach to constructing a useful PAM for assisting in the siting of a HLWR. This combination allows one to formally integrate existing relevant information, professional judgement, and component model outputs to produce conditionally estimated probabilities for a decision-tree approach to the radionuclide release problem of a proposed HLWR. If loss functions are available, it also allows one to calculate the expected costs or losses from possible radionuclide releases. This latter calculation may be very important in selecting the final site from among a number of alternative sites.

  8. Bayesian forecasting of temporal gene expression by using an autoregressive panel data approach.

    Nascimento, M; E Silva, F F; Sáfadi, T; Nascimento, A C C; Barroso, L M A; Glória, L S; de S Carvalho, B


    We propose and evaluate a novel approach for forecasting gene expression over non-observed times in longitudinal trials under a Bayesian viewpoint. One of the aims is to cluster genes that share similar expression patterns over time and then use this similarity to predict relative expression at time points of interest. Expression values of 106 genes expressed during the cell cycle of Saccharomyces cerevisiae were used and genes were partitioned into five distinct clusters of sizes 33, 32, 21, 16, and 4. After removing the last observed time point, the agreements of signals (upregulated or downregulated) considering the predicted expression level were 72.7, 81.3, 76.2, 68.8, and 50.0%, respectively, for each cluster. The percentage of credibility intervals that contained the true values of gene expression for a future time was ~90%. The methodology performed well, providing a valid forecast of gene expression values by fitting an autoregressive panel data model. This approach is easily implemented with other time-series models and when Poisson and negative binomial probability distributions are assumed for the gene expression data. PMID:27323205
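A minimal, non-Bayesian sketch of the underlying autoregressive idea: genes in a cluster share a temporal pattern, so an AR(1) coefficient pooled across the panel can forecast each gene's next expression value. The data are simulated and the least-squares fit stands in for the paper's Bayesian machinery:

```python
import numpy as np

rng = np.random.default_rng(0)

# Shared cluster pattern: a geometrically decaying AR(1) signal
phi_true, T, n_genes = 0.8, 12, 6
x = 5.0 * phi_true ** np.arange(T)          # x[t] = phi * x[t-1] exactly

# Each gene follows the cluster pattern at its own scale, plus small noise
scales = np.linspace(0.5, 1.5, n_genes)[:, None]
panel = scales * x[None, :] + rng.normal(0, 0.05, size=(n_genes, T))

# Pooled least-squares AR(1) coefficient across the whole panel
prev = panel[:, :-1].ravel()
curr = panel[:, 1:].ravel()
phi_hat = prev @ curr / (prev @ prev)

# One-step-ahead forecast of each gene's next (non-observed) expression value
forecast = phi_hat * panel[:, -1]
print(phi_hat)
```

Pooling transitions across the cluster is what makes the coefficient estimable from short series; a Bayesian treatment would add priors on the coefficient and report credibility intervals for the forecasts, as the abstract describes.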

  9. Real-time eruption forecasting using the material Failure Forecast Method with a Bayesian approach

    Boué, A.; Lesage, P.; Cortés, G.; Valette, B.; Reyes-Dávila, G.


Many attempts at deterministic forecasting of eruptions and landslides have been made using the material Failure Forecast Method (FFM). This method consists of fitting an empirical power law to precursory patterns of seismicity or deformation. Until now, most studies have presented hindsight forecasts based on complete time series of precursors and do not evaluate the ability of the method to carry out real-time forecasting with partial precursory sequences. In this study, we present a rigorous approach to the FFM designed for real-time applications on volcano-seismic precursors. We use a Bayesian approach based on FFM theory and an automatic classification of seismic events. The probability distributions of the data, deduced from the performance of this classification, are used as input. As output, it provides the probability of the forecast time at each observation time before the eruption. The spread of the a posteriori probability density function of the prediction time, and its stability with respect to the observation time, are used as criteria to evaluate the reliability of the forecast. We test the method on precursory accelerations of long-period seismicity prior to vulcanian explosions at Volcán de Colima (Mexico). For explosions preceded by a single phase of seismic acceleration, we obtain accurate and reliable forecasts using approximately 80% of the whole precursory sequence. It is, however, more difficult to apply the method to multiple acceleration patterns.
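The FFM power law itself can be illustrated compactly: for the classical exponent alpha = 2, the inverse of the precursor rate decreases linearly and reaches zero at the failure time, so a straight-line fit to the inverse rate yields a forecast. The sketch below uses synthetic, noise-free data (the paper's contribution is the Bayesian treatment of exactly this fit under noisy, partial sequences):

```python
import numpy as np

# Hypothetical precursory sequence: seismic event rate accelerating toward
# a failure time t_f = 10.0 as rate(t) = C / (t_f - t)  (FFM with alpha = 2)
t_f_true, C = 10.0, 5.0
t = np.linspace(0.0, 8.0, 33)           # observations stop before failure
rate = C / (t_f_true - t)

# Inverse rate is linear in t: 1/rate = (t_f - t) / C.
# Fit a line and extrapolate its zero crossing to forecast t_f.
slope, intercept = np.polyfit(t, 1.0 / rate, 1)
t_f_est = -intercept / slope
print(t_f_est)
```

With real data the rate is noisy and the forecast's zero-crossing is uncertain, which is why the abstract evaluates the spread and stability of the posterior over the prediction time rather than a single extrapolated value.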

  10. A Bayesian approach to analyzing phenotype microarray data enables estimation of microbial growth parameters.

    Gerstgrasser, Matthias; Nicholls, Sarah; Stout, Michael; Smart, Katherine; Powell, Chris; Kypraios, Theodore; Stekel, Dov


    Biolog phenotype microarrays (PMs) enable simultaneous, high throughput analysis of cell cultures in different environments. The output is high-density time-course data showing redox curves (approximating growth) for each experimental condition. The software provided with the Omnilog incubator/reader summarizes each time-course as a single datum, so most of the information is not used. However, the time courses can be extremely varied and often contain detailed qualitative (shape of curve) and quantitative (values of parameters) information. We present a novel, Bayesian approach to estimating parameters from Phenotype Microarray data, fitting growth models using Markov Chain Monte Carlo (MCMC) methods to enable high throughput estimation of important information, including length of lag phase, maximal "growth" rate and maximum output. We find that the Baranyi model for microbial growth is useful for fitting Biolog data. Moreover, we introduce a new growth model that allows for diauxic growth with a lag phase, which is particularly useful where Phenotype Microarrays have been applied to cells grown in complex mixtures of substrates, for example in industrial or biotechnological applications, such as worts in brewing. Our approach provides more useful information from Biolog data than existing, competing methods, and allows for valuable comparisons between data series and across different models. PMID:26762475
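A sketch of fitting the Baranyi growth model to a simulated redox curve; a coarse grid search over two parameters stands in for the paper's MCMC, and all parameter values are illustrative:

```python
import numpy as np

def baranyi(t, y0, ymax, mu, lag):
    """Baranyi-Roberts log-population curve (natural-log scale)."""
    h0 = mu * lag
    a = t + np.log(np.exp(-mu * t) + np.exp(-h0) - np.exp(-mu * t - h0)) / mu
    return y0 + mu * a - np.log(1 + (np.exp(mu * a) - 1) / np.exp(ymax - y0))

# Simulated, noise-free "growth" curve with lag phase 4 h and rate 0.6 per h
t = np.linspace(0, 24, 49)
obs = baranyi(t, y0=2.0, ymax=9.0, mu=0.6, lag=4.0)

# Coarse grid search over (mu, lag) with y0 and ymax held fixed -- a
# stand-in for posterior exploration by MCMC
best = min(
    ((mu, lag)
     for mu in np.arange(0.2, 1.01, 0.05)
     for lag in np.arange(0.0, 8.01, 0.5)),
    key=lambda p: np.sum((baranyi(t, 2.0, 9.0, p[0], p[1]) - obs) ** 2),
)
print(best)
```

The point of the paper's Bayesian fit is that, unlike this point-estimate search, it returns posterior distributions for the lag, maximal rate and maximum output, allowing principled comparison across conditions and across growth models (e.g. their diauxic extension).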

  11. A Bayesian approach to the characterization of electroencephalographic recordings in premature infants

    Mitchell, Timothy J.

    Preterm infants are particularly susceptible to cerebral injury, and electroencephalographic (EEG) recordings provide an important diagnostic tool for determining cerebral health. However, interpreting these EEG recordings is challenging and requires the skills of a trained electroencephalographer. Because these EEG specialists are rare, an automated interpretation of newborn EEG recordings would increase access to an important diagnostic tool for physicians. To automate this procedure, we employ a novel Bayesian approach to compute the probability of EEG features (waveforms) including suppression, delta brushes, and delta waves. The power of this approach lies not only in its ability to closely mimic the techniques used by EEG specialists, but also its ability to be generalized to identify other waveforms that may be of interest for future work. The results of these calculations are used in a program designed to output simple statistics related to the presence or absence of such features. Direct comparison of the software with expert human readers has indicated satisfactory performance, and the algorithm has shown promise in its ability to distinguish between infants with normal neurodevelopmental outcome and those with poor neurodevelopmental outcome.

  12. Finding the optimal statistical model to describe target motion during radiotherapy delivery—a Bayesian approach

    Herschtal, A.; Foroudi, F.; Greer, P. B.; Eade, T. N.; Hindson, B. R.; Kron, T.


    Early approaches to characterizing errors in target displacement during a fractionated course of radiotherapy assumed that the underlying fraction-to-fraction variability in target displacement, known as the ‘treatment error’ or ‘random error’, could be regarded as constant across patients. More recent approaches have modelled target displacement allowing for differences in random error between patients. However, until recently it has not been feasible to compare the goodness of fit of alternate models of random error rigorously. This is because the large volumes of real patient data necessary to distinguish between alternative models have only very recently become available. This work uses real-world displacement data collected from 365 patients undergoing radical radiotherapy for prostate cancer to compare five candidate models for target displacement. The simplest model assumes constant random errors across patients, while other models allow for random errors that vary according to one of several candidate distributions. Bayesian statistics and Markov Chain Monte Carlo simulation of the model parameters are used to compare model goodness of fit. We conclude that modelling the random error as inverse gamma distributed provides a clearly superior fit over all alternatives considered. This finding can facilitate more accurate margin recipes and correction strategies.

  13. A Bayesian geostatistical approach for evaluating the uncertainty of contaminant mass discharges from point sources

    Troldborg, M.; Nowak, W.; Binning, P. J.; Bjerg, P. L.


    Estimates of mass discharge (mass/time) are increasingly being used when assessing risks of groundwater contamination and designing remedial systems at contaminated sites. Mass discharge estimates are, however, prone to rather large uncertainties as they integrate uncertain spatial distributions of both concentration and groundwater flow velocities. For risk assessments or any other decisions that are being based on mass discharge estimates, it is essential to address these uncertainties. We present a novel Bayesian geostatistical approach for quantifying the uncertainty of the mass discharge across a multilevel control plane. The method decouples the flow and transport simulation and has the advantage of avoiding the heavy computational burden of three-dimensional numerical flow and transport simulation coupled with geostatistical inversion. It may therefore be of practical relevance to practitioners compared to existing methods that are either too simple or computationally demanding. The method is based on conditional geostatistical simulation and accounts for i) heterogeneity of both the flow field and the concentration distribution through Bayesian geostatistics (including the uncertainty in covariance functions), ii) measurement uncertainty, and iii) uncertain source zone geometry and transport parameters. The method generates multiple equally likely realizations of the spatial flow and concentration distribution, which all honour the measured data at the control plane. The flow realizations are generated by analytical co-simulation of the hydraulic conductivity and the hydraulic gradient across the control plane. These realizations are made consistent with measurements of both hydraulic conductivity and head at the site. An analytical macro-dispersive transport solution is employed to simulate the mean concentration distribution across the control plane, and a geostatistical model of the Box-Cox transformed concentration data is used to simulate observed

  14. Comparison of Bayesian and frequentist approaches in modelling risk of preterm birth near the Sydney Tar Ponds, Nova Scotia, Canada

    Canty Angelo


    Abstract Background This study compares the Bayesian and frequentist (non-Bayesian) approaches to modelling the association between the risk of preterm birth and maternal proximity to hazardous waste and pollution from the Sydney Tar Pond site in Nova Scotia, Canada. Methods The data include 1604 observed cases of preterm birth out of a total population of 17559 at risk of preterm birth from 144 enumeration districts in the Cape Breton Regional Municipality. Other covariates include the distance from the Tar Pond; the unemployment rate; the proportion of persons who are separated, divorced or widowed; the proportion of persons who have no high school diploma; the proportion of persons living alone; the proportion of single-parent families; and average income. Bayesian hierarchical Poisson regression, quasi-likelihood Poisson regression and weighted linear regression models were fitted to the data. Results The results of the analyses were compared, together with their limitations. Conclusion The results of the weighted linear regression and the quasi-likelihood Poisson regression agree with the result from the Bayesian hierarchical modelling, which incorporates the spatial effects.

  15. Characterization of a Saccharomyces cerevisiae fermentation process for production of a therapeutic recombinant protein using a multivariate Bayesian approach.

    Fu, Zhibiao; Baker, Daniel; Cheng, Aili; Leighton, Julie; Appelbaum, Edward; Aon, Juan


    The principle of quality by design (QbD) has been widely applied to biopharmaceutical manufacturing processes. Process characterization is an essential step to implement the QbD concept to establish the design space and to define the proven acceptable ranges (PAR) for critical process parameters (CPPs). In this study, we present characterization of a Saccharomyces cerevisiae fermentation process using risk assessment analysis, statistical design of experiments (DoE), and the multivariate Bayesian predictive approach. The critical quality attributes (CQAs) and CPPs were identified with a risk assessment. The statistical model for each attribute was established using the results from the DoE study with consideration given to interactions between CPPs. Both the conventional overlapping contour plot and the multivariate Bayesian predictive approaches were used to establish the region of process operating conditions where all attributes met their specifications simultaneously. The quantitative Bayesian predictive approach was chosen to define the PARs for the CPPs, which apply to the manufacturing control strategy. Experience from the 10,000 L manufacturing scale process validation, including 64 continued process verification batches, indicates that the CPPs remain under a state of control and within the established PARs. The end product quality attributes were within their drug substance specifications. The probability generated with the Bayesian approach was also used as a tool to assess CPP deviations. This approach can be extended to characterize other production processes and to quantify a reliable operating region. © 2016 American Institute of Chemical Engineers Biotechnol. Prog., 32:799-812, 2016. PMID:27095416
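The Bayesian predictive step described above can be illustrated with a small Monte Carlo sketch. All numbers below (the fitted model, its posterior spread, the specification limit, and the 95% probability target) are hypothetical, chosen only to show how a PAR emerges from posterior predictive probabilities:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical fitted model for one CQA (e.g. purity, %) versus one CPP
# (e.g. temperature, degC): purity = b0 + b1*(T - 30) + e.
# Posterior draws of the coefficients and residual sd are assumed, not real data.
b0 = rng.normal(98.0, 0.2, 5000)
b1 = rng.normal(-0.15, 0.03, 5000)
sd = np.abs(rng.normal(0.3, 0.05, 5000))

spec_lower = 97.0  # assumed lower specification for the CQA

def prob_in_spec(temp):
    """Posterior predictive probability the CQA meets spec at this CPP setting."""
    pred = b0 + b1 * (temp - 30.0) + rng.normal(0.0, sd)
    return float(np.mean(pred >= spec_lower))

# Scan the CPP range; the PAR is where the probability exceeds a target, e.g. 0.95.
temps = np.arange(26, 37)
par = [t for t in temps if prob_in_spec(t) >= 0.95]
print(par)
```

With several CQAs, the same scan would use the joint probability that all attributes meet their specifications simultaneously, which is what distinguishes this approach from overlapping contour plots.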

  16. A robust Bayesian approach to modeling epistemic uncertainty in common-cause failure models

    In a standard Bayesian approach to the alpha-factor model for common-cause failure, a precise Dirichlet prior distribution models epistemic uncertainty in the alpha-factors. This Dirichlet prior is then updated with observed data to obtain a posterior distribution, which forms the basis for further inferences. In this paper, we adapt the imprecise Dirichlet model of Walley to represent epistemic uncertainty in the alpha-factors. In this approach, epistemic uncertainty is expressed more cautiously via lower and upper expectations for each alpha-factor, along with a learning parameter which determines how quickly the model learns from observed data. For this application, we focus on elicitation of the learning parameter, and find that values in the range of 1 to 10 seem reasonable. The approach is compared with Kelly and Atwood's minimally informative Dirichlet prior for the alpha-factor model, which incorporated precise mean values for the alpha-factors, but which was otherwise quite diffuse. Next, we explore the use of a set of Gamma priors to model epistemic uncertainty in the marginal failure rate, expressed via a lower and upper expectation for this rate, again along with a learning parameter. As zero counts are generally less of an issue here, we find that the choice of this learning parameter is less crucial. Finally, we demonstrate how both epistemic uncertainty models can be combined to arrive at lower and upper expectations for all common-cause failure rates. Thereby, we effectively provide a full sensitivity analysis of common-cause failure rates, properly reflecting epistemic uncertainty of the analyst on all levels of the common-cause failure model.
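The imprecise Dirichlet update described above has a closed form: with learning parameter s and n_i of N observed events, the posterior expectation of an alpha-factor lies between n_i/(N+s) and (n_i+s)/(N+s). A minimal sketch, with hypothetical counts:

```python
# Hypothetical counts of common-cause failure events by multiplicity.
counts = [10, 3, 1]          # n_1, n_2, n_3: events involving 1, 2, 3 components
s = 2.0                      # learning parameter (the paper suggests roughly 1 to 10)
N = sum(counts)

def idm_bounds(n_i, N, s):
    """Walley's imprecise Dirichlet model: lower/upper posterior expectation
    of an alpha-factor after observing n_i of N events (vacuous prior set)."""
    return n_i / (N + s), (n_i + s) / (N + s)

bounds = [idm_bounds(n, N, s) for n in counts]
for (lo, hi), n in zip(bounds, counts):
    print(f"n={n}: [{lo:.3f}, {hi:.3f}]")
```

Note how a larger s widens every interval: the model learns more slowly from the same data, which is exactly the cautiousness the elicitation of the learning parameter controls.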

  17. Bayesian approaches to reverse engineer cellular systems: a simulation study on nonlinear Gaussian networks

    Ramoni Marco F


    Abstract Background Reverse engineering cellular networks is currently one of the most challenging problems in systems biology. Dynamic Bayesian networks (DBNs) seem to be particularly suitable for inferring relationships between cellular variables from the analysis of time series measurements of mRNA or protein concentrations. As evaluating inference results on a real dataset is controversial, the use of simulated data has been proposed. However, DBN approaches that use continuous variables, thus avoiding the information loss associated with discretization, have not yet been extensively assessed, and most of the proposed approaches have dealt with linear Gaussian models. Results We propose a generalization of dynamic Gaussian networks to accommodate nonlinear dependencies between variables. As a benchmark dataset to test the new approach, we used data from a mathematical model of cell cycle control in budding yeast that realistically reproduces the complexity of a cellular system. We evaluated the ability of the networks to describe the dynamics of cellular systems and their precision in reconstructing the true underlying causal relationships between variables. We also tested the robustness of the results by analyzing the effect of noise on the data, and the impact of a different sampling time. Conclusion The results confirmed that DBNs with Gaussian models can be effectively exploited for a first-level analysis of data from complex cellular systems. The inferred models are parsimonious and have a satisfying goodness of fit. Furthermore, the networks not only offer a phenomenological description of the dynamics of cellular systems, but are also able to suggest hypotheses concerning the causal interactions between variables. The proposed nonlinear generalization of Gaussian models yielded models characterized by a slightly lower goodness of fit than the linear model, but a better ability to recover the true underlying connections between variables.

  18. A Robust Bayesian Approach to an Optimal Replacement Policy for Gas Pipelines

    José Pablo Arias-Nicolás


    In the paper, we address Bayesian sensitivity issues when integrating experts’ judgments with available historical data in a case study about strategies for the preventive maintenance of low-pressure cast iron pipelines in an urban gas distribution network. We are interested in replacement priorities, as determined by the failure rates of pipelines deployed under different conditions. We relax the assumptions, made in previous papers, about the prior distributions on the failure rates and study changes in replacement priorities under different choices of generalized moment-constrained classes of priors. We focus on the set of non-dominated actions, and among them, we propose the least sensitive action as the optimal choice to rank different classes of pipelines, providing a sound approach to the sensitivity problem. Moreover, we are also interested in determining which classes have a failure rate exceeding a given acceptable value, considered as the threshold determining no need for replacement. Graphical tools are introduced to help decision-makers determine whether pipelines are to be replaced and the corresponding priorities.

  19. Monitoring schistosomiasis risk in East China over space and time using a Bayesian hierarchical modeling approach.

    Hu, Yi; Ward, Michael P; Xia, Congcong; Li, Rui; Sun, Liqian; Lynn, Henry; Gao, Fenghua; Wang, Qizhi; Zhang, Shiqing; Xiong, Chenglong; Zhang, Zhijie; Jiang, Qingwu


    Schistosomiasis remains a major public health problem and causes substantial economic impact in east China, particularly along the Yangtze River Basin. Disease forecasting and surveillance can assist in the development and implementation of more effective intervention measures to control disease. In this study, we applied a Bayesian hierarchical spatio-temporal model to describe trends in schistosomiasis risk in Anhui Province, China, using annual parasitological and environmental data for the period 1997-2010. A computationally efficient approach, Integrated Nested Laplace Approximation, was used for model inference. A zero-inflated, negative binomial model best described the spatio-temporal dynamics of schistosomiasis risk. It predicted that the disease risk would generally be low and stable except for some specific, local areas during the period 2011-2014. High-risk counties were identified in the forecasting maps: three in which the risk remained high, and two in which risk would become high. The results indicated that schistosomiasis risk has been reduced to consistently low levels throughout much of this region of China; however, some counties were identified in which progress in schistosomiasis control was less than satisfactory. Whilst maintaining overall control, specific interventions in the future should focus on these refractory counties as part of a strategy to eliminate schistosomiasis from this region. PMID:27053447

  20. Oral intercourse or secondary transfer? A Bayesian approach of salivary amylase and foreign DNA findings.

    Breathnach, Michelle; Moore, Elizabeth


    The Bayesian approach allows forensic scientists to evaluate the significance of scientific evidence in light of two conflicting hypotheses. This aids the investigator in calculating a numerical value for the probability that the scientific findings support one hypothesis over the other. In cases where oral intercourse is alleged, α-amylase, an indicator of saliva, is detected on penile swabs. The value of this finding is unknown, as it may indicate the presence of saliva resulting from oral intercourse; however, it may also represent saliva arising from innocent means, such as background levels of salivary α-amylase in the male population due to secondary transfer. It is therefore difficult to attach significance to this finding without background information and knowledge. A population study of the background levels of salivary α-amylase was performed by analysing items of underwear worn under normal circumstances by 69 male volunteers. The Phadebas press test was used to screen the garments for amylase-containing stains, and the positive areas were subjected to further confirmation of saliva by the RSID-Saliva kit. Of the underwear screened, 44% had stains containing amylase. This study determined the background level of salivary α-amylase and DNA on the inside front of male underwear, which has potential implications for the interpretation of evidence in alleged oral intercourse. PMID:23683908
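The Bayesian evaluation described above reduces to a likelihood ratio for the amylase finding. A toy calculation, using the study's 44% background rate as the denominator and an assumed (not reported) detection probability as the numerator:

```python
# Likelihood ratio for the finding "amylase detected on penile swab":
# H1: oral intercourse occurred; H2: background / secondary transfer only.
p_e_given_h1 = 0.90   # ASSUMED detection probability after oral intercourse
p_e_given_h2 = 0.44   # background rate observed on male underwear in the study

lr = p_e_given_h1 / p_e_given_h2
print(round(lr, 2))

# Bayes' rule in odds form: posterior odds = LR * prior odds.
prior_odds = 1.0      # assumed neutral prior, for illustration only
posterior_odds = lr * prior_odds
posterior_prob = posterior_odds / (1 + posterior_odds)
```

An LR near 2 means the finding only weakly supports H1, which is the practical consequence of the high background rate the study measured.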

  1. A virtual experimental technique for data collection for a Bayesian network approach to human reliability analysis

    Bayesian network (BN) is a powerful tool for human reliability analysis (HRA) as it can characterize the dependency among different human performance shaping factors (PSFs) and associated actions. It can also quantify the importance of different PSFs that may cause a human error. Data required to fully quantify a BN for HRA in offshore emergency situations are not readily available. For many situations, there is little or no appropriate data. This presents significant challenges in assigning the prior and conditional probabilities required by the BN approach. To handle the data scarcity problem, this paper presents a data collection methodology using a virtual environment for a simplified BN model of offshore emergency evacuation. A two-level, three-factor experiment is used to collect human performance data under different mustering conditions. Collected data are integrated in the BN model and the results are compared with a previous study. The work demonstrates that the BN model can assess the human failure likelihood effectively. In addition, the BN model provides the opportunity to incorporate new evidence and handle complex interactions among PSFs and associated actions.
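The kind of BN quantification described above can be sketched with a miniature two-PSF network; inference is by straightforward enumeration. The conditional probability tables below are hypothetical placeholders, not the paper's experimentally collected values:

```python
# Minimal two-PSF Bayesian network for human error probability, with
# hypothetical CPTs (the paper's model and virtual-experiment data are richer).
p_stress = {True: 0.3, False: 0.7}      # PSF 1: high stress during mustering
p_training = {True: 0.8, False: 0.2}    # PSF 2: adequate training

# P(error | stress, training)
p_error = {
    (True, True): 0.10, (True, False): 0.40,
    (False, True): 0.02, (False, False): 0.15,
}

def p_human_error():
    """Marginal human failure likelihood by enumeration over the PSFs."""
    return sum(
        p_stress[s] * p_training[t] * p_error[(s, t)]
        for s in (True, False) for t in (True, False)
    )

def p_stress_given_error():
    """Posterior on the stress PSF after observing an error (Bayes' rule),
    i.e. how new evidence propagates back through the network."""
    joint = sum(p_stress[True] * p_training[t] * p_error[(True, t)]
                for t in (True, False))
    return joint / p_human_error()

print(round(p_human_error(), 4))
```

The virtual-experiment data described in the abstract would replace these placeholder CPT entries with empirically observed frequencies.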

  2. Receiver-based recovery of clipped OFDM signals for PAPR reduction: A Bayesian approach

    Ali, Anum


    Clipping is one of the simplest peak-to-average power ratio reduction schemes for orthogonal frequency division multiplexing (OFDM). Deliberately clipping the transmission signal degrades system performance, and clipping mitigation is required at the receiver for information restoration. In this paper, we acknowledge the sparse nature of the clipping signal and propose a low-complexity Bayesian clipping estimation scheme. The proposed scheme utilizes a priori information about the sparsity rate and noise variance for enhanced recovery. At the same time, the proposed scheme is robust against inaccurate estimates of the clipping signal statistics. The undistorted phase property of the clipped signal, as well as the clipping likelihood, is utilized for enhanced reconstruction. Furthermore, motivated by the nature of modern OFDM-based communication systems, we extend our clipping reconstruction approach to multiple antenna receivers and multi-user OFDM. We also address the problem of channel estimation from pilots contaminated by the clipping distortion. Numerical findings are presented that depict favorable results for the proposed scheme compared to the established sparse reconstruction schemes.
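The sparsity and undistorted-phase properties exploited above can be demonstrated in a few lines of simulation (the QPSK subcarriers and the clipping threshold are illustrative choices, not the paper's setup):

```python
import numpy as np

rng = np.random.default_rng(1)

# OFDM time-domain symbol: IFFT of random QPSK data on N subcarriers.
N = 256
data = (rng.choice([-1, 1], N) + 1j * rng.choice([-1, 1], N)) / np.sqrt(2)
x = np.fft.ifft(data) * np.sqrt(N)   # unit average power

# Amplitude clipping at a threshold keeps the phase (undistorted phase property).
thresh = 1.6 * np.sqrt(np.mean(np.abs(x) ** 2))
clipped = np.where(np.abs(x) > thresh, thresh * x / np.abs(x), x)

# The clipping distortion c = clipped - x is nonzero only where |x| exceeded
# the threshold, i.e. it is sparse -- the structure the Bayesian estimator exploits.
c = clipped - x
sparsity_rate = np.mean(np.abs(c) > 1e-12)
print(sparsity_rate)
```

For a roughly Rayleigh-distributed envelope, only a small fraction of samples exceeds 1.6 times the RMS level, so the distortion vector is sparse and well suited to sparse Bayesian recovery.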

  3. Bridging Inter- and Intraspecific Trait Evolution with a Hierarchical Bayesian Approach.

    Kostikova, Anna; Silvestro, Daniele; Pearman, Peter B; Salamin, Nicolas


    The evolution of organisms is crucially dependent on the evolution of intraspecific variation. Its interactions with selective agents in the biotic and abiotic environments underlie many processes, such as intraspecific competition, resource partitioning and, eventually, species formation. Nevertheless, comparative models of trait evolution neither allow explicit testing of hypotheses related to the evolution of intraspecific variation nor do they simultaneously estimate rates of trait evolution by accounting for both trait mean and variance. Here, we present a model of phenotypic trait evolution using a hierarchical Bayesian approach that simultaneously incorporates interspecific and intraspecific variation. We assume that species-specific trait means evolve under a simple Brownian motion process, whereas species-specific trait variances are modeled with Brownian or Ornstein-Uhlenbeck processes. After evaluating the power of the method through simulations, we examine whether life-history traits impact evolution of intraspecific variation in the Eriogonoideae (buckwheat family, Polygonaceae). Our model is readily extendible to more complex scenarios of the evolution of inter- and intraspecific variation and presents a step toward more comprehensive comparative models for macroevolutionary studies. PMID:26911152

  4. A Bayesian Inferential Approach to Quantify the Transmission Intensity of Disease Outbreak

    Adiveppa S. Kadi


    Background. The emergence of infectious diseases such as pandemic influenza (H1N1) 2009 has become a great concern, posing new challenges to health authorities worldwide. To control these diseases, various studies have been developed in the field of mathematical modelling, which is a useful tool for understanding epidemiological dynamics and their dependence on social mixing patterns. Method. We used a Bayesian approach to quantify the disease outbreak through the key epidemiological parameter, the basic reproduction number (R0), using effective contacts, defined as the sum of the product of incidence cases and the probability of the generation time distribution. We estimated R0 from daily case incidence data for pandemic influenza A/H1N1 2009 in India, for the initial phase. Result. The estimated R0, with a 95% credible interval, is consistent with several other studies on the same strain. Through sensitivity analysis, our study indicates that infectiousness affects the estimate of R0. Conclusion. The basic reproduction number R0 provides useful information to the public health system for controlling the disease through mitigation strategies such as vaccination and quarantine.
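The effective-contacts construction described above can be sketched numerically. The incidence series and generation-time distribution below are invented for illustration, and the paper fits R0 with a full Bayesian likelihood rather than this crude ratio estimator:

```python
# Toy estimate of R0 from daily incidence and a generation-time distribution
# (hypothetical numbers; not the India A/H1N1 2009 data).
incidence = [1, 2, 3, 5, 8, 12, 19, 30]   # daily case counts
w = [0.2, 0.5, 0.3]                        # P(generation time = 1, 2, 3 days)

def effective_contacts(t):
    """Sum of past incidence weighted by the generation-time distribution."""
    return sum(w[s - 1] * incidence[t - s]
               for s in (1, 2, 3) if t - s >= 0)

# Ratio of new cases to effective contacts on each day with a full history,
# averaged as a crude point estimate of R0 in the initial growth phase.
ratios = [incidence[t] / effective_contacts(t) for t in range(3, len(incidence))]
r0_hat = sum(ratios) / len(ratios)
print(round(r0_hat, 2))
```

A Bayesian treatment would instead place a prior on R0 and build a Poisson likelihood around these effective contacts, yielding the credible interval the abstract reports.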

  5. Identification of failure type in corroded pipelines: a Bayesian probabilistic approach.

    Breton, T; Sanchez-Gheno, J C; Alamilla, J L; Alvarez-Ramirez, J


    Spillover of hazardous materials from transport pipelines can lead to catastrophic events with serious and dangerous environmental impact, potential fire events and human fatalities. The problem is more serious for large pipelines when the construction material is under environmental corrosion conditions, as in the petroleum and gas industries. In this way, predictive models can provide a suitable framework for risk evaluation, maintenance policies and substitution procedure design that should be oriented to reduce increased hazards. This work proposes a Bayesian probabilistic approach to identify and predict the type of failure (leakage or rupture) for steel pipelines under realistic corroding conditions. In the first step of the modeling process, the mechanical performance of the pipe is considered for establishing conditions under which either leakage or rupture failure can occur. In the second step, experimental burst tests are used to introduce a mean probabilistic boundary defining a region where the type of failure is uncertain. In the boundary vicinity, the failure discrimination is carried out with a probabilistic model where the events are considered as random variables. In turn, the model parameters are estimated with available experimental data and contrasted with a real catastrophic event, showing good discrimination capacity. The results are discussed in terms of policies oriented to inspection and maintenance of large-size pipelines in the oil and gas industry. PMID:20378244

  6. A unified Bayesian semiparametric approach to assess discrimination ability in survival analysis.

    Zhao, Lili; Feng, Dai; Chen, Guoan; Taylor, Jeremy M G


    The discriminatory ability of a marker for censored survival data is routinely assessed by the time-dependent ROC curve and the c-index. The time-dependent ROC curve evaluates the ability of a biomarker to predict whether a patient lives past a particular time t. The c-index measures the global concordance of the marker and the survival time regardless of the time point. We propose a Bayesian semiparametric approach to estimate these two measures. The proposed estimators are based on the conditional distribution of the survival time given the biomarker and the empirical biomarker distribution. The conditional distribution is estimated by a linear-dependent Dirichlet process mixture model. The resulting ROC curve is smooth as it is estimated by a mixture of parametric functions. The proposed c-index estimator is shown to be more efficient than the commonly used Harrell's c-index since it uses all pairs of data rather than only informative pairs. The proposed estimators are evaluated through simulations and illustrated using a lung cancer dataset. PMID:26676324
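For contrast with the proposed estimator, the commonly used Harrell's c-index mentioned above counts only informative (usable) pairs under censoring. A self-contained sketch on toy censored data (values invented for illustration):

```python
# Harrell's c-index on a toy censored dataset: the fraction of usable pairs in
# which the subject with the higher marker value also fails earlier.
times  = [2.0, 4.0, 5.0, 7.0, 9.0]
events = [1,   1,   0,   1,   0]      # 1 = death observed, 0 = censored
marker = [3.1, 1.5, 2.8, 1.9, 0.7]    # higher marker assumed to mean worse prognosis

def harrell_c(times, events, marker):
    num = den = 0
    n = len(times)
    for i in range(n):
        for j in range(n):
            # A pair is usable only if subject i is observed to fail before j;
            # pairs where the earlier time is censored carry no ordering information.
            if events[i] == 1 and times[i] < times[j]:
                den += 1
                if marker[i] > marker[j]:
                    num += 1
                elif marker[i] == marker[j]:
                    num += 0.5          # ties in the marker count as half
    return num / den

print(round(harrell_c(times, events, marker), 3))
```

The abstract's efficiency claim rests exactly on the pairs this estimator discards: the Bayesian semiparametric model borrows strength from all pairs via the estimated conditional survival distribution.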

  7. A Bayesian semiparametric approach with change points for spatial ordinal data.

    Cai, Bo; Lawson, Andrew B; McDermott, Suzanne; Aelion, C Marjorie


    The change-point model has drawn much attention over the past few decades. It can accommodate the jump process, which allows for changes of the effects before and after the change point. Intellectual disability is a long-term disability that impacts performance in cognitive aspects of life and usually has its onset prior to birth. Among many potential causes, soil chemical exposures are associated with the risk of intellectual disability in children. Motivated by a study for soil metal effects on intellectual disability, we propose a Bayesian hierarchical spatial model with change points for spatial ordinal data to detect the unknown threshold effects. The spatial continuous latent variable underlying the spatial ordinal outcome is modeled by the multivariate Gaussian process, which captures spatial variation and is centered at the nonlinear mean. The mean function is modeled by using the penalized smoothing splines for some covariates with unknown change points and the linear regression for the others. Some identifiability constraints are used to define the latent variable. A simulation example is presented to evaluate the performance of the proposed approach with the competing models. A retrospective cohort study for intellectual disability in South Carolina is used as an illustration. PMID:23070600

  8. A Bayesian Approach to Service Selection for Secondary Users in Cognitive Radio Networks

    Elaheh Homayounvala


    In cognitive radio networks, where secondary users (SUs) use the time-frequency gaps of primary users' (PUs') licensed spectrum opportunistically, the throughput experienced by SUs depends not only on the traffic load of the PUs but also on the PUs' service type. Each service has its own pattern of channel usage, and if the SUs know the dominant pattern of primary channel usage, they can make a better decision about which service to use at a specific time to take best advantage of the primary channel, in terms of higher achievable throughput. However, for practical reasons it is difficult to directly inform SUs of the PUs' dominant services in each area. This paper proposes a learning mechanism embedded in SUs to sense the primary channel for a specific length of time. Upon sensing a free primary channel, this algorithm recommends that the SUs choose the best service in order to get the best performance, in terms of maximum achieved throughput and minimum experienced delay. The proposed learning mechanism is based on a Bayesian approach that can predict the performance of a requested service for a given SU. Simulation results show that this service selection method significantly outperforms blind opportunistic SU service selection.

  9. Risk assessment of pre-hospital trauma airway management by anaesthesiologists using the predictive Bayesian approach

    Nakstad Anders R


    Abstract Introduction Endotracheal intubation (ETI) has been considered an essential part of pre-hospital advanced life support. Pre-hospital ETI, however, is a complex intervention even for airway specialists such as anaesthesiologists working as pre-hospital emergency physicians. We therefore wanted to investigate the quality of pre-hospital airway management by anaesthesiologists in severely traumatised patients and identify possible areas for improvement. Method We performed a risk assessment according to the predictive Bayesian approach in a typical anaesthesiologist-manned Norwegian helicopter emergency medical service (HEMS). The main focus of the risk assessment was the event in which a patient arrives in the emergency department without ETI despite a pre-hospital indication for it. Results In the risk assessment, we assigned a high probability (29%) to the event assessed, that a patient arrives without ETI despite a pre-hospital indication. However, several uncertainty factors in the risk assessment were identified, related to data quality, indications for use of ETI, patient outcome and the need for special training of ETI providers. Conclusion Our risk assessment indicated a high probability that trauma patients with an indication for pre-hospital ETI do not receive it in the studied HEMS. The uncertainty factors identified in the assessment should be further investigated to better understand the problem assessed and its consequences for patients. Better quality of pre-hospital airway management data could contribute to a reduction of these uncertainties.

  10. A Parallel and Incremental Approach for Data-Intensive Learning of Bayesian Networks.

    Yue, Kun; Fang, Qiyu; Wang, Xiaoling; Li, Jin; Liu, Weiyi


    Bayesian network (BN) has been adopted as the underlying model for representing and inferring uncertain knowledge. As the basis of realistic applications centered on probabilistic inferences, learning a BN from data is a critical subject of machine learning, artificial intelligence, and big data paradigms. Currently, it is necessary to extend the classical methods for learning BNs with respect to data-intensive computing or in cloud environments. In this paper, we propose a parallel and incremental approach for data-intensive learning of BNs from massive, distributed, and dynamically changing data by extending the classical scoring and search algorithm and using MapReduce. First, we adopt the minimum description length as the scoring metric and give the two-pass MapReduce-based algorithms for computing the required marginal probabilities and scoring the candidate graphical model from sample data. Then, we give the corresponding strategy for extending the classical hill-climbing algorithm to obtain the optimal structure, as well as that for storing a BN by pairs. Further, in view of the dynamic characteristics of the changing data, we give the concept of influence degree to measure the coincidence of the current BN with new data, and then propose the corresponding two-pass MapReduce-based algorithms for BNs incremental learning. Experimental results show the efficiency, scalability, and effectiveness of our methods. PMID:25622335
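The MDL scoring metric adopted above trades goodness of fit against model complexity. A single-node sketch on toy binary data (the paper computes the required counts with two-pass MapReduce jobs over distributed data; here they are computed directly):

```python
import math
from collections import Counter

# Toy binary sample data for two variables; a stand-in for the massive,
# distributed datasets the paper targets.
data = [
    {"A": 0, "B": 0}, {"A": 0, "B": 0}, {"A": 1, "B": 1},
    {"A": 1, "B": 1}, {"A": 1, "B": 0}, {"A": 0, "B": 1},
    {"A": 1, "B": 1}, {"A": 0, "B": 0},
]

def mdl_score(child, parents, data):
    """MDL score of one node given a candidate parent set: negative
    log-likelihood plus (log2 N)/2 bits per free parameter (smaller is better)."""
    n = len(data)
    joint = Counter((tuple(r[p] for p in parents), r[child]) for r in data)
    marg = Counter(tuple(r[p] for p in parents) for r in data)
    loglik = sum(c * math.log2(c / marg[pa]) for (pa, _), c in joint.items())
    n_params = 2 ** len(parents)   # binary variables: one free parameter per config
    return -loglik + 0.5 * math.log2(n) * n_params

# Hill-climbing compares candidate structures by their total score; here the
# edge A -> B scores (slightly) better than no edge at all.
print(mdl_score("B", ["A"], data) < mdl_score("B", [], data))
```

The full structure score is the sum of such node scores, which is what makes the counting step decomposable across MapReduce tasks.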

  11. Bayesian probabilistic approach for predicting backbone structures in terms of protein blocks.

    de Brevern, A G; Etchebest, C; Hazout, S


    By using an unsupervised cluster analyzer, we have identified a local structural alphabet composed of 16 folding patterns of five consecutive C(alpha) ("protein blocks"). The dependence that exists between successive blocks is explicitly taken into account. A Bayesian approach based on the relation protein block-amino acid propensity is used for prediction and leads to a success rate close to 35%. Sharing sequence windows associated with certain blocks into "sequence families" improves the prediction accuracy by 6%. This prediction accuracy exceeds 75% when keeping the first four predicted protein blocks at each site of the protein. In addition, two different strategies are proposed: the first one defines the number of protein blocks in each site needed for respecting a user-fixed prediction accuracy, and alternatively, the second one defines the different protein sites to be predicted with a user-fixed number of blocks and a chosen accuracy. This last strategy applied to the ubiquitin conjugating enzyme (alpha/beta protein) shows that 91% of the sites may be predicted with a prediction accuracy larger than 77% considering only three blocks per site. The prediction strategies proposed improve our knowledge about sequence-structure dependence and should be very useful in ab initio protein modelling. PMID:11025540

  12. Fast Bayesian approach for modal identification using free vibration data, Part I - Most probable value

    Zhang, Feng-Liang; Ni, Yan-Chun; Au, Siu-Kui; Lam, Heung-Fai


    The identification of modal properties from field testing of civil engineering structures is becoming economically viable, thanks to the advent of modern sensor and data acquisition technology. Its demand is driven by innovative structural designs and increased performance requirements of dynamic-prone structures that call for a close cross-checking or monitoring of their dynamic properties and responses. Existing instrumentation capabilities and modal identification techniques allow structures to be tested under free vibration, forced vibration (known input) or ambient vibration (unknown broadband loading). These tests can be considered complementary rather than competing as they are based on different modeling assumptions in the identification model and have different implications on costs and benefits. Uncertainty arises naturally in the dynamic testing of structures due to measurement noise, sensor alignment error, modeling error, etc. This is especially relevant in field vibration tests because the test condition in the field environment can hardly be controlled. In this work, a Bayesian statistical approach is developed for modal identification using the free vibration response of structures. A frequency domain formulation is proposed that makes statistical inference based on the Fast Fourier Transform (FFT) of the data in a selected frequency band. This significantly simplifies the identification model because only the modes dominating the frequency band need to be included. It also legitimately ignores the information in the excluded frequency bands that are either irrelevant or difficult to model, thereby significantly reducing modeling error risk. The posterior probability density function (PDF) of the modal parameters is derived rigorously from modeling assumptions and Bayesian probability logic. Computational difficulties associated with calculating the posterior statistics, including the most probable value (MPV) and the posterior covariance matrix

  13. An Integrated Approach to Battery Health Monitoring using Bayesian Regression, Classification and State Estimation

    National Aeronautics and Space Administration — The application of the Bayesian theory of managing uncertainty and complexity to regression and classification in the form of Relevance Vector Machine (RVM), and to...

  14. Bayesian Approaches to Non-parametric Estimation of Densities on the Unit Interval

    Song Li; Silvapulle, Mervyn J.; Param Silvapulle; Xibin Zhang


    This paper investigates nonparametric estimation of density on [0,1]. The kernel estimator of density on [0,1] has been found to be sensitive to both bandwidth and kernel. This paper proposes a unified Bayesian framework for choosing both the bandwidth and kernel function. In a simulation study, the Bayesian bandwidth estimator performed better than others, and kernel estimators were sensitive to the choice of the kernel and the shapes of the population densities on [0,1]. The simulation and ...

  15. Operational risk modelling and organizational learning in structured finance operations: a Bayesian network approach

    Andrew Sanford; Imad Moosa


    This paper describes the development of a tool, based on a Bayesian network model, that provides a posteriori predictions of operational risk events, aggregate operational loss distributions, and Operational Value-at-Risk, for a structured finance operations unit located within one of Australia's major banks. The Bayesian network, based on a previously developed causal framework, has been designed to model the smaller and more frequent, attritional operational loss events. Given the limited ava...

  16. Macroeconomic and credit forecasts in a small economy during crisis: A large Bayesian VAR approach

    Dimitris P. Louzis


    We examine the ability of large-scale vector autoregressions (VARs) to produce accurate macroeconomic (output and inflation) and credit (loans and lending rates) forecasts in Greece during the latest sovereign debt crisis. We implement recently proposed Bayesian shrinkage techniques and evaluate the information content of forty-two (42) monthly macroeconomic and financial variables in a large Bayesian VAR context, using a five-year out-of-sample forecasting period from 2008 to 2013. The e...
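
The shrinkage step for a dense VAR can be sketched with a Normal prior centered at zero, whose posterior mean coincides with a ridge-type estimator. This is a generic stand-in for the Minnesota-style priors typically used with large BVARs, with made-up data and a hypothetical tightness value.

```python
import numpy as np

rng = np.random.default_rng(5)
T, k = 60, 8                              # short sample, many variables
Y = rng.normal(size=(T, k))
X, y = Y[:-1], Y[1:]                      # VAR(1): regress y_t on y_{t-1}

lam = 5.0                                 # prior tightness (larger = more shrinkage)
B_ols = np.linalg.solve(X.T @ X, X.T @ y)             # unrestricted estimate
B_shrunk = np.linalg.solve(X.T @ X + lam * np.eye(k), X.T @ y)  # shrunk posterior mean
```

With many variables and a short crisis-era sample, the unrestricted coefficients are noisy; shrinkage pulls them toward zero, which is the mechanism behind the forecast gains the abstract reports.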

  17. No Customer Left Behind: A Distribution-Free Bayesian Approach to Accounting for Missing Xs in Marketing Models

    Yi Qian; Hui Xie


    In marketing applications, it is common that some key covariates in a regression model, such as marketing mix variables or consumer profiles, are subject to missingness. The convenient method that excludes the consumers with missingness in any covariate can result in a substantial loss of efficiency and may lead to strong selection bias in the estimation of consumer preferences and sensitivities. To solve these problems, we propose a new Bayesian distribution-free approach, which can ensure t...

  18. National evaluation of Chinese coastal erosion to sea level rise using a Bayesian approach

    In this paper a causal Bayesian network is developed to predict the decadal-scale response of China's shoreline to sea-level rise. The Bayesian model defines relationships among six factors of the Chinese coastal system: coastal geomorphology, mean tide range, mean wave height, coastal slope, relative sea-level rise rate and shoreline erosion rate. Using the Bayesian probabilistic model, we make a quantitative assessment of China's shoreline evolution in response to different future sea-level rise rates. Results indicate that the probability of coastal erosion at high and very high rates increases from 28% to 32.3% when the relative sea-level rise rate is 4∼6 mm/a, and to 44.9% when it exceeds 6 mm/a. A hindcast evaluation of the Bayesian model shows that the model correctly predicts 79.3% of the cases. Model testing indicates that the Bayesian model has higher predictive capability for stable and very highly eroding coasts than for moderately and highly eroding coasts. This study demonstrates that the Bayesian model is suited to predicting decadal-scale Chinese coastal erosion associated with sea-level rise.
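
The kind of query such a network answers can be sketched with a single conditional probability table. The rows below are illustrative only (the paper's CPTs are learned from six coastal factors), but they are chosen to reproduce the abstract's headline probabilities of 28%, 32.3% and 44.9%.

```python
import numpy as np

slr_classes = ["<4 mm/a", "4-6 mm/a", ">6 mm/a"]
erosion_classes = ["low", "moderate", "high", "very high"]

# P(erosion class | SLR class); rows are SLR scenarios and sum to 1.
# Illustrative numbers, not the paper's learned CPT.
cpt = np.array([
    [0.40, 0.320, 0.18, 0.100],   # <4 mm/a  -> P(high or very high) = 0.280
    [0.38, 0.297, 0.20, 0.123],   # 4-6 mm/a -> 0.323
    [0.30, 0.251, 0.27, 0.179],   # >6 mm/a  -> 0.449
])

def p_severe(slr_idx):
    """Probability of 'high' or 'very high' erosion given the SLR scenario."""
    return cpt[slr_idx, 2] + cpt[slr_idx, 3]
```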

  19. Coping with Trial-to-Trial Variability of Event Related Signals: A Bayesian Inference Approach

    Ding, Mingzhou; Chen, Youghong; Knuth, Kevin H.; Bressler, Steven L.; Schroeder, Charles E.


    In electro-neurophysiology, single-trial brain responses to a sensory stimulus or a motor act are commonly assumed to result from the linear superposition of a stereotypic event-related signal (e.g. the event-related potential or ERP) that is invariant across trials and some ongoing brain activity often referred to as noise. To extract the signal, one performs an ensemble average of the brain responses over many identical trials to attenuate the noise. To date, h s simple signal-plus-noise (SPN) model has been the dominant approach in cognitive neuroscience. Mounting empirical evidence has shown that the assumptions underlying this model may be overly simplistic. More realistic models have been proposed that account for the trial-to-trial variability of the event-related signal as well as the possibility of multiple differentially varying components within a given ERP waveform. The variable-signal-plus-noise (VSPN) model, which has been demonstrated to provide the foundation for separation and characterization of multiple differentially varying components, has the potential to provide a rich source of information for questions related to neural functions that complement the SPN model. Thus, being able to estimate the amplitude and latency of each ERP component on a trial-by-trial basis provides a critical link between the perceived benefits of the VSPN model and its many concrete applications. In this paper we describe a Bayesian approach to deal with this issue and the resulting strategy is referred to as the differentially Variable Component Analysis (dVCA). We compare the performance of dVCA on simulated data with Independent Component Analysis (ICA) and analyze neurobiological recordings from monkeys performing cognitive tasks.
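
Why the SPN assumption breaks down under trial-to-trial variability can be shown with a toy simulation: when the latency of a component jitters across trials, the ensemble average is attenuated and broadened relative to the true single-trial waveform. The Gaussian "ERP" shape and jitter level below are illustrative stand-ins, not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0, 1, 500)

def erp(t, latency, amp=1.0, width=0.05):
    """A single Gaussian-shaped component standing in for an ERP waveform."""
    return amp * np.exp(-0.5 * ((t - latency) / width) ** 2)

n_trials = 200
# SPN assumption: the signal is identical on every trial.
fixed = np.array([erp(t, 0.40) for _ in range(n_trials)])
# VSPN reality: the latency jitters from trial to trial.
jittered = np.array([erp(t, 0.40 + rng.normal(0, 0.05)) for _ in range(n_trials)])

avg_fixed = fixed.mean(axis=0)
avg_jitter = jittered.mean(axis=0)
# Latency jitter attenuates and broadens the ensemble average, which is
# why trial-by-trial estimation (as in dVCA) can recover information the
# average destroys.
```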

  20. Bayesian approach to the analysis of neutron Brillouin scattering data on liquid metals

    De Francesco, A.; Guarini, E.; Bafile, U.; Formisano, F.; Scaccia, L.


    When the dynamics of liquids and disordered systems at the mesoscopic level is investigated by means of inelastic scattering (e.g., neutron or x-ray), spectra are often characterized by a poor definition of the excitation lines and spectroscopic features in general, and one important issue is to establish how many of these lines need to be included in the modeling function and to estimate their parameters. Furthermore, when strongly damped excitations are present, commonly used and widespread fitting algorithms are particularly affected by the choice of initial values of the parameters. An inadequate choice may lead to an inefficient exploration of the parameter space, resulting in the algorithm getting stuck in a local minimum. In this paper, we present a Bayesian approach to the analysis of neutron Brillouin scattering data in which the number of excitation lines is treated as unknown and estimated along with the other model parameters. We propose a joint estimation procedure based on a reversible-jump Markov chain Monte Carlo algorithm, which efficiently explores the parameter space, producing a probabilistic measure to quantify the uncertainty on the number of excitation lines as well as reliable parameter estimates. The method proposed could turn out to be of great importance in extracting physical information from experimental data, especially when the detection of spectral features is complicated not only by the properties of the sample, but also by limited instrumental resolution and count statistics. The approach is tested on a generated data set and then applied to real experimental spectra of neutron Brillouin scattering from a liquid metal, previously analyzed in a more traditional way.

  1. A Bayesian belief network approach for assessing uncertainty in conceptual site models at contaminated sites

    Thomsen, Nanna I.; Binning, Philip J.; McKnight, Ursula S.; Tuxen, Nina; Bjerg, Poul L.; Troldborg, Mads


    A key component in risk assessment of contaminated sites is the formulation of a conceptual site model (CSM). A CSM is a simplified representation of reality and forms the basis for the mathematical modeling of contaminant fate and transport at the site. The CSM should therefore identify the most important site-specific features and processes that may affect the contaminant transport behavior at the site. However, the development of a CSM will always be associated with uncertainties due to limited data and lack of understanding of the site conditions. CSM uncertainty is often found to be a major source of model error and it should therefore be accounted for when evaluating uncertainties in risk assessments. We present a Bayesian belief network (BBN) approach for constructing CSMs and assessing their uncertainty at contaminated sites. BBNs are graphical probabilistic models that are effective for integrating quantitative and qualitative information, and thus can strengthen decisions when empirical data are lacking. The proposed BBN approach facilitates a systematic construction of multiple CSMs, and then determines the belief in each CSM using a variety of data types and/or expert opinion at different knowledge levels. The developed BBNs combine data from desktop studies and initial site investigations with expert opinion to assess which of the CSMs are more likely to reflect the actual site conditions. The method is demonstrated on a Danish field site, contaminated with chlorinated ethenes. Four different CSMs are developed by combining two contaminant source zone interpretations (presence or absence of a separate phase contamination) and two geological interpretations (fractured or unfractured clay till). The beliefs in each of the CSMs are assessed sequentially based on data from three investigation stages (a screening investigation, a more detailed investigation, and an expert consultation) to demonstrate that the belief can be updated as more information
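
The sequential belief updating over the four CSMs can be sketched as repeated application of Bayes' rule. The stage likelihoods below are purely illustrative placeholders; in the paper they come from site data and expert elicitation at different knowledge levels.

```python
import numpy as np

# Four candidate CSMs: (source: separate phase present?, geology: fractured?)
csms = ["no-phase/unfractured", "no-phase/fractured",
        "phase/unfractured", "phase/fractured"]

prior = np.full(4, 0.25)                 # uniform prior belief over CSMs

# Likelihood of each stage's findings under each CSM (illustrative numbers).
stage_likelihoods = [
    np.array([0.2, 0.3, 0.2, 0.3]),      # screening investigation
    np.array([0.1, 0.2, 0.2, 0.5]),      # detailed investigation
    np.array([0.1, 0.1, 0.2, 0.6]),      # expert consultation
]

belief = prior.copy()
for lik in stage_likelihoods:
    belief = belief * lik                # Bayes rule, stage by stage
    belief /= belief.sum()               # renormalize to a probability vector
```

After the three stages the belief concentrates on the CSM most consistent with all the evidence, mirroring how the paper narrows down its four candidate models.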

  2. A New Approach for Time Series Forecasting: Bayesian Enhanced by Fractional Brownian Motion with Application to Rainfall Series

    Cristian Rodriguez Rivero


    A new predictor algorithm based on a Bayesian enhanced approach (BEA) for long-term chaotic time series using artificial neural networks (ANN) is presented. The technique, based on stochastic models, uses Bayesian inference with fractional Brownian motion as the data model and a Beta model as prior information. However, the need for experimental data to specify and estimate causal models has not changed. Indeed, the Bayes method provides another way to incorporate prior knowledge into forecasting models; the simplest representations of prior knowledge in forecasting models are hard to beat in many forecasting situations, either because prior knowledge is insufficient to improve on models or because prior knowledge leads to the conclusion that the situation is stable. This work addresses long-term time series prediction, giving forecast horizons of up to 18 steps ahead. The forecasted values and validation data are presented for benchmark chaotic series such as the Mackey-Glass, Lorenz, Henon, Logistic, Rössler, Ikeda and quadratic one-dimensional map series, as well as monthly cumulative rainfall collected at Despeñaderos, Cordoba, Argentina. The computational results are evaluated against several previously proposed non-linear ANN predictors on high-roughness series, showing the better performance of the Bayesian enhanced approach in long-term forecasting.

  3. Exploring Neighborhood Influences on Small-Area Variations in Intimate Partner Violence Risk: A Bayesian Random-Effects Modeling Approach

    Enrique Gracia


    This paper uses spatial data on cases of intimate partner violence against women (IPVAW) to examine neighborhood-level influences on small-area variations in IPVAW risk in a police district of the city of Valencia (Spain). To analyze area variations in IPVAW risk and their association with neighborhood-level explanatory variables, we use a Bayesian spatial random-effects modeling approach, as well as disease mapping methods to represent risk probabilities in each area. Analyses show that IPVAW cases are more likely in areas of high immigrant concentration, high public disorder and crime, and high physical disorder. Results also show a spatial component indicating remaining variability attributable to spatially structured random effects. Bayesian spatial modeling offers a new perspective for identifying IPVAW high- and low-risk areas, and provides a new avenue for the design of better-informed prevention and intervention strategies.

  4. A Bayesian Approach for Evaluation of Determinants of Health System Efficiency Using Stochastic Frontier Analysis and Beta Regression.

    Şenel, Talat; Cengiz, Mehmet Ali


    Public expenditure on health is one of the most important issues facing governments today, and these increased expenditures are putting pressure on public budgets. Health policy makers have therefore focused on the performance of their health systems, and many countries have introduced reforms to improve it. This study investigates the most important determinants of healthcare efficiency for OECD countries using a second-stage approach to Bayesian stochastic frontier analysis (BSFA). There are two steps in this study. First, we measure the healthcare efficiency of 29 OECD countries by BSFA using data from the OECD Health Database. At the second stage, we examine the multiple relationships between healthcare efficiency and the characteristics of healthcare systems across OECD countries using Bayesian beta regression. PMID:27118987
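
The second-stage model is a beta regression, which links covariates to a response constrained to (0, 1), such as an efficiency score. A minimal sketch of its logit-link log-likelihood follows; the coefficients, precision value and tiny data set are all hypothetical, and the paper estimates the model in a Bayesian way rather than by direct likelihood evaluation.

```python
import numpy as np
from math import lgamma

def beta_loglik(y, x, beta0, beta1, phi):
    """Log-likelihood of a logit-link beta regression for responses in (0, 1).

    mu is the mean, phi the precision; the beta shape parameters are
    a = mu*phi and b = (1-mu)*phi.
    """
    mu = 1.0 / (1.0 + np.exp(-(beta0 + beta1 * x)))      # mean in (0, 1)
    a, b = mu * phi, (1 - mu) * phi
    const = np.array([lgamma(phi) - lgamma(ai) - lgamma(bi)
                      for ai, bi in zip(a, b)])
    return float(np.sum(const + (a - 1) * np.log(y) + (b - 1) * np.log(1 - y)))

# Toy example: efficiency scores vs. a standardized system characteristic
x = np.array([-1.0, 0.0, 1.0])
y = np.array([0.62, 0.71, 0.80])
ll = beta_loglik(y, x, beta0=0.9, beta1=0.4, phi=30.0)
```

Parameters whose fitted means track the observed efficiencies yield a higher log-likelihood than parameters with the slope sign flipped, which is the basic signal the Bayesian estimation exploits.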

  5. A Bayesian kriging approach for blending satellite and ground precipitation observations

    Verdin, Andrew; Rajagopalan, Balaji; Kleiber, William; Funk, Christopher C.


    Drought and flood management practices require accurate estimates of precipitation. Gauge observations, however, are often sparse in regions with complicated terrain, clustered in valleys, and of poor quality. Consequently, the spatial extent of wet events is poorly represented. Satellite-derived precipitation data are an attractive alternative, though they tend to underestimate the magnitude of wet events due to their dependency on retrieval algorithms and the indirect relationship between satellite infrared observations and precipitation intensities. Here we offer a Bayesian kriging approach for blending precipitation gauge data and the Climate Hazards Group Infrared Precipitation satellite-derived precipitation estimates for Central America, Colombia, and Venezuela. First, the gauge observations are modeled as a linear function of satellite-derived estimates and any number of other variables—for this research we include elevation. Prior distributions are defined for all model parameters and the posterior distributions are obtained simultaneously via Markov chain Monte Carlo sampling. The posterior distributions of these parameters are required for spatial estimation, and thus are obtained prior to implementing the spatial kriging model. This functional framework is applied to model parameters obtained by sampling from the posterior distributions, and the residuals of the linear model are subject to a spatial kriging model. Consequently, the posterior distributions and uncertainties of the blended precipitation estimates are obtained. We demonstrate this method by applying it to pentadal and monthly total precipitation fields during 2009. The model's performance and its inherent ability to capture wet events are investigated. We show that this blending method significantly improves upon the satellite-derived estimates and is also competitive in its ability to represent wet events. This procedure also provides a means to estimate a full conditional distribution
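
The trend step of this blending scheme can be sketched with ordinary least squares: model gauge precipitation as a linear function of the satellite estimate and elevation, and keep the residuals for the subsequent spatial kriging. The paper samples all parameters by MCMC and so obtains full posterior distributions; this point-estimate version, on synthetic data, shows only the decomposition.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 50
sat = rng.gamma(2.0, 20.0, n)             # satellite precipitation estimates (mm)
elev = rng.uniform(0, 3000, n)            # station elevations (m)
# Synthetic "gauge" observations generated from a known linear trend
gauge = 5 + 1.2 * sat + 0.004 * elev + rng.normal(0, 3, n)

# Trend: gauge ~ intercept + satellite + elevation
X = np.column_stack([np.ones(n), sat, elev])
coef, *_ = np.linalg.lstsq(X, gauge, rcond=None)
resid = gauge - X @ coef                  # these residuals would be kriged
```

In the full method the kriged residual surface is added back to the trend at ungauged locations, giving blended estimates with quantified uncertainty.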

  6. Dissection of a Complex Disease Susceptibility Region Using a Bayesian Stochastic Search Approach to Fine Mapping.

    Chris Wallace


    Identification of candidate causal variants in regions associated with risk of common diseases is complicated by linkage disequilibrium (LD) and multiple association signals. Nonetheless, accurate maps of these variants are needed, both to fully exploit detailed cell-specific chromatin annotation data to highlight disease-causal mechanisms and cells, and for design of the functional studies that will ultimately be required to confirm causal mechanisms. We adapted a Bayesian evolutionary stochastic search algorithm to the fine mapping problem, and demonstrated its improved performance over conventional stepwise and regularised regression through simulation studies. We then applied it to fine map the established multiple sclerosis (MS) and type 1 diabetes (T1D) associations in the IL-2RA (CD25) gene region. For T1D, both stepwise and stochastic search approaches identified four T1D association signals, with the major effect tagged by the single nucleotide polymorphism rs12722496. In contrast, for MS, the stochastic search found two distinct competing models: a single candidate causal variant, tagged by rs2104286 and reported previously using stepwise analysis; and a more complex model with two association signals, one of which was tagged by the major T1D-associated rs12722496 and the other by rs56382813. There is low to moderate LD between rs2104286 and both rs12722496 and rs56382813 (r2 ≃ 0.3), and our two-SNP model could not be recovered through a forward stepwise search after conditioning on rs2104286. Both signals in the two-variant model for MS affect CD25 expression on distinct subpopulations of CD4+ T cells, which are key cells in the autoimmune process. The results support a shared causal variant for T1D and MS. Our study illustrates the benefit of using a purposely designed model search strategy for fine mapping and the advantage of combining disease and protein expression data.

  7. A Bayesian Network Approach for Offshore Risk Analysis Through Linguistic Variables


    This paper presents a new approach for offshore risk analysis that is capable of dealing with linguistic probabilities in Bayesian networks (BNs). In this paper, linguistic probabilities are used to describe the occurrence likelihood of hazardous events that may cause possible accidents in offshore operations. In order to use fuzzy information, an f-weighted valuation function is proposed to transform linguistic judgements into crisp probability distributions, which can easily be put into a BN to model causal relationships among risk factors. The use of linguistic variables makes it easier for human experts to express their knowledge, and the transformation of linguistic judgements into crisp probabilities can significantly reduce the cost of computing, modifying and maintaining a BN model. The flexibility of the method allows for multiple forms of information to be used to quantify model relationships, including formally assessed expert opinion when quantitative data are lacking, or when only qualitative or vague statements can be made. The model is a modular representation of uncertain knowledge caused by randomness, vagueness and ignorance. This makes the risk analysis of offshore engineering systems more functional and easier in many assessment contexts. Specifically, the proposed f-weighted valuation function takes into account not only the dominating values, but also the α-level values that are ignored by conventional valuation methods. A case study of the collision risk between a Floating Production, Storage and Off-loading (FPSO) unit and authorised vessels due to human elements during operation is used to illustrate the application of the proposed model.
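
The general idea of turning linguistic likelihood terms into crisp probabilities can be sketched with triangular fuzzy numbers and centroid defuzzification. This is a generic stand-in: the paper's own f-weighted valuation function additionally weights the α-level cuts, and the term definitions below are hypothetical.

```python
# Linguistic likelihood terms as triangular fuzzy numbers (low, mode, high)
# on the probability scale [0, 1]. Illustrative definitions only.
fuzzy_terms = {
    "very unlikely": (0.0, 0.05, 0.15),
    "unlikely":      (0.1, 0.25, 0.4),
    "possible":      (0.3, 0.5, 0.7),
    "likely":        (0.6, 0.75, 0.9),
    "very likely":   (0.85, 0.95, 1.0),
}

def defuzzify(term):
    """Centroid of a triangular membership function: the simplest way to
    collapse a fuzzy judgement into a single crisp probability."""
    lo, mode, hi = fuzzy_terms[term]
    return (lo + mode + hi) / 3.0

crisp = {t: defuzzify(t) for t in fuzzy_terms}
```

The crisp values can then populate the conditional probability tables of a BN, which is the step the paper's valuation function performs with more care about dominating and α-level values.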

  8. Bayesian inference along Markov Chain Monte Carlo approach for PWR core loading pattern optimization

    Highlights: ► The BIMCMC method performs very well and is comparable to GA and PSO techniques. ► The technique shows strong potential for optimization. ► It is observed that the performance of the method is quite adequate. ► The BIMCMC is very easy to implement. -- Abstract: Despite remarkable progress in optimization procedures, inherent complexities in nuclear reactor structure and strong interdependence among the fundamental indices, namely economic, neutronic, thermo-hydraulic and environmental effects, make it necessary to evaluate the most efficient arrangement of a reactor core. In this paper a reactor core reloading technique based on Bayesian inference along Markov Chain Monte Carlo, BIMCMC, is addressed in the context of obtaining an optimal configuration of fuel assemblies in reactor cores. Markov Chain Monte Carlo with the Metropolis–Hastings algorithm is applied for sampling variables and their acceptance. The proposed algorithm can be used for in-core fuel management optimization problems in pressurized water reactors. Considerable work has been expended on loading pattern optimization, but no preferred approach has yet emerged. To evaluate the proposed technique, increasing the effective multiplication factor Keff of a WWER-1000 core while flattening power and keeping the power peaking factor below a specific limit is considered as the objective function in a first test case, and flattening of power in a second test case; other variables such as burn-up and cycle length can also be taken into account. The results, convergence rate and reliability of the new method are compared to published data resulting from particle swarm optimization and genetic algorithm; the outcome is quite promising and demonstrates the potential of the technique for optimization applications in the nuclear engineering field.
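
The Metropolis–Hastings search over discrete core configurations can be sketched on a toy objective: arrange "assemblies" so that adjacent reactivity differences are small, a crude stand-in for power flattening. Everything here (the reactivity values, the objective, the temperature) is illustrative, not the paper's WWER-1000 model.

```python
import numpy as np

rng = np.random.default_rng(4)
reactivity = rng.uniform(0.8, 1.2, 12)     # toy assembly worths

def score(order):
    """Higher is 'flatter': negative sum of squared adjacent differences."""
    arr = reactivity[order]
    return -np.sum(np.diff(arr) ** 2)

order = rng.permutation(12)                # random initial loading pattern
T = 0.01                                   # temperature of the acceptance rule
for _ in range(5000):
    i, j = rng.integers(0, 12, 2)
    prop = order.copy()
    prop[i], prop[j] = prop[j], prop[i]    # propose swapping two assemblies
    # Metropolis acceptance: always take improvements, sometimes take
    # worsening moves, so the chain can escape local optima.
    if rng.random() < np.exp(min(0.0, (score(prop) - score(order)) / T)):
        order = prop
best = score(order)
```

In the real problem the score would be Keff and the power peaking factor evaluated by a core simulator, but the propose/accept loop is the same.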

  9. BClass: A Bayesian Approach Based on Mixture Models for Clustering and Classification of Heterogeneous Biological Data

    Arturo Medrano-Soto


    Based on mixture models, we present a Bayesian method (called BClass) to classify biological entities (e.g. genes) when variables of quite heterogeneous nature are analyzed. Various statistical distributions are used to model the continuous/categorical data commonly produced by genetic experiments and large-scale genomic projects. We calculate the posterior probability of each entry belonging to each element (group) in the mixture. In this way, an original set of heterogeneous variables is transformed into a set of purely homogeneous characteristics represented by the probabilities of each entry belonging to the groups. The number of groups in the analysis is controlled dynamically by rendering groups 'alive' or 'dormant' depending upon the number of entities classified within them. Using standard Metropolis-Hastings and Gibbs sampling algorithms, we constructed a sampler to approximate posterior moments and grouping probabilities. Since this method does not require the definition of similarity measures, it is especially suitable for data mining and knowledge discovery in biological databases. We applied BClass to classify genes in RegulonDB, a database specialized in information about the transcriptional regulation of gene expression in the bacterium Escherichia coli. The classification obtained is consistent with current knowledge and allowed prediction of missing values for a number of genes. BClass is object-oriented and fully programmed in Lisp-Stat. The output grouping probabilities are analyzed and interpreted using graphical (dynamically linked plots) and query-based approaches. We discuss the advantages of using Lisp-Stat as a programming language as well as the problems we faced when the data volume increased exponentially due to the ever-growing number of genomic projects.
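
The grouping probability BClass reports per entry is just the posterior membership probability under a mixture model. A two-component Gaussian sketch with known parameters shows the computation; BClass itself samples the parameters by MCMC and handles heterogeneous (continuous and categorical) variables.

```python
import numpy as np

# A two-component Gaussian mixture with fixed, illustrative parameters
weights = np.array([0.6, 0.4])
means = np.array([0.0, 4.0])
sds = np.array([1.0, 1.0])

def membership(x):
    """Posterior probability that observation x belongs to each component:
    component density times weight, renormalized (Bayes rule)."""
    dens = (weights * np.exp(-0.5 * ((x - means) / sds) ** 2)
            / (sds * np.sqrt(2 * np.pi)))
    return dens / dens.sum()

p = membership(2.0)   # a point equidistant from the two component means
```

At x = 2.0 the component densities are equal, so the posterior reduces to the prior weights (0.6, 0.4); points near a component mean are assigned to it almost certainly.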

  10. A Nonparametric Bayesian Approach to Seismic Hazard Modeling Using the ETAS Framework

    Ross, G.


    The epidemic-type aftershock sequence (ETAS) model is one of the most popular tools for modeling seismicity and quantifying risk in earthquake-prone regions. Under the ETAS model, the occurrence times of earthquakes are treated as a self-exciting Poisson process where each earthquake briefly increases the probability of subsequent earthquakes occurring soon afterwards, which captures the fact that large mainshocks tend to produce long sequences of aftershocks. A triggering kernel controls the amount by which the probability increases based on the magnitude of each earthquake, and the rate at which it then decays over time. This triggering kernel is usually chosen heuristically, to match the parametric form of the modified Omori law for aftershock decay. However, recent work has questioned whether this is an appropriate choice. Since the choice of kernel has a large impact on the predictions made by the ETAS model, avoiding misspecification is crucially important. We present a novel nonparametric version of ETAS which avoids making parametric assumptions, and instead learns the correct specification from the data itself. Our approach is based on the Dirichlet process, a modern class of Bayesian prior distributions that allows for efficient inference over an infinite-dimensional space of functions. We show how our nonparametric ETAS model can be fit to data, and present results demonstrating that the fit is greatly improved compared to the standard parametric specification. Additionally, we explain how our model can be used to perform probabilistic declustering of earthquake catalogs, to classify earthquakes as being either aftershocks or mainshocks, and to learn the causal relations between pairs of earthquakes.
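
The parametric baseline the abstract argues against can be written down directly: the ETAS conditional intensity is a background rate plus a sum of modified-Omori triggering terms, one per past earthquake. The parameter values below are illustrative, and it is precisely this triggering kernel that the paper replaces with a Dirichlet-process nonparametric form.

```python
import numpy as np

def etas_intensity(t, event_times, event_mags, mu=0.5, K=0.02,
                   alpha=1.0, c=0.01, p=1.1, m0=3.0):
    """Standard ETAS conditional intensity at time t:
    lambda(t) = mu + sum over past events of
                K * exp(alpha*(m_i - m0)) / (t - t_i + c)^p
    (modified Omori decay). Parameter values are illustrative.
    """
    past = event_times < t
    trig = (K * np.exp(alpha * (event_mags[past] - m0))
            / (t - event_times[past] + c) ** p)
    return mu + trig.sum()

# Three past events, in days and magnitudes
times = np.array([0.0, 1.0, 1.5])
mags = np.array([5.0, 4.0, 3.5])
lam = etas_intensity(2.0, times, mags)   # intensity shortly after the sequence
```

Shortly after the sequence the intensity is elevated above the background rate mu, and it decays back toward mu as t grows, which is the aftershock-clustering behavior ETAS is built to capture.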