WorldWideScience

Sample records for Bayesian multi-tissue approach

  1. Bayesian Approach for Inconsistent Information.

    Science.gov (United States)

    Stein, M; Beer, M; Kreinovich, V

    2013-10-01

    In engineering situations, we usually have a large amount of prior knowledge that needs to be taken into account when processing data. Traditionally, the Bayesian approach is used to process data in the presence of prior knowledge. Sometimes, when we apply the traditional Bayesian techniques to engineering data, we get inconsistencies between the data and prior knowledge. These inconsistencies are usually caused by the fact that in the traditional approach, we assume that we know the exact sample values, that the prior distribution is exactly known, etc. In reality, the data is imprecise due to measurement errors, the prior knowledge is only approximately known, etc. So, a natural way to deal with the seemingly inconsistent information is to take this imprecision into account in the Bayesian approach - e.g., by using fuzzy techniques. In this paper, we describe several possible scenarios for fuzzifying the Bayesian approach. Particular attention is paid to the interaction between the estimated imprecise parameters. In this paper, to implement the corresponding fuzzy versions of the Bayesian formulas, we use straightforward computations of the related expression - which makes our computations reasonably time-consuming. Computations in the traditional (non-fuzzy) Bayesian approach are much faster - because they use algorithmically efficient reformulations of the Bayesian formulas. We expect that similar reformulations of the fuzzy Bayesian formulas will also drastically decrease the computation time and thus, enhance the practical use of the proposed methods.
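As a rough illustration of how imprecision can be propagated through a Bayesian update, the sketch below uses an interval-valued prior rather than the full fuzzy formalism of the paper: a Beta-Bernoulli model whose prior pseudo-counts are only known to lie within ranges, with the posterior mean evaluated over the corner combinations. All model choices and numbers here are illustrative assumptions, not the authors' construction.

```python
# Sketch: propagating prior imprecision through a Bayesian update.
# Assumption (not from the paper): a Beta-Bernoulli model where the
# prior pseudo-counts (a, b) are only known up to an interval, a simple
# interval-valued stand-in for imprecise/fuzzy priors.

def posterior_mean_interval(successes, failures, a_range, b_range):
    """Range of the Beta posterior mean over all corner combinations
    of the imprecise prior parameters."""
    means = [
        (a + successes) / (a + successes + b + failures)
        for a in a_range
        for b in b_range
    ]
    return min(means), max(means)

# 7 successes, 3 failures; prior pseudo-counts each somewhere in [1, 3].
lo, hi = posterior_mean_interval(7, 3, a_range=(1.0, 3.0), b_range=(1.0, 3.0))
```

The width of the resulting interval shows how much of the posterior uncertainty is driven by prior imprecision rather than by the data.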

  2. Bayesian approach to rough set

    CERN Document Server

    Marwala, Tshilidzi

    2007-01-01

This paper proposes an approach to training rough set models within a Bayesian framework, using the Markov Chain Monte Carlo (MCMC) method. The prior probabilities are constructed from the prior knowledge that good rough set models have fewer rules. Markov Chain Monte Carlo sampling is conducted by sampling in the rough set granule space, with the Metropolis algorithm used as the acceptance criterion. The proposed method is tested on estimating the risk of HIV given demographic data. The results obtained show that the proposed approach achieves an average accuracy of 58%, with the accuracy varying up to 66%. In addition, the Bayesian rough set gives the probabilities of the estimated HIV status as well as the linguistic rules describing how the demographic parameters drive the risk of HIV.
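The Metropolis acceptance step at the core of such MCMC sampling can be sketched as follows. The target density here is a stand-in 1-D Gaussian for illustration, not the paper's rough-set granule space:

```python
import math
import random

# Sketch of the Metropolis acceptance criterion used in MCMC sampling.
# The target is a hypothetical stand-in: an unnormalised N(0, 1) density.

def log_posterior(x):
    return -0.5 * x * x  # log of the unnormalised N(0, 1) density

def metropolis(n_steps, step_size=1.0, seed=0):
    rng = random.Random(seed)
    x, chain = 0.0, []
    for _ in range(n_steps):
        proposal = x + rng.uniform(-step_size, step_size)
        # Accept with probability min(1, p(proposal) / p(x)).
        if math.log(rng.random()) < log_posterior(proposal) - log_posterior(x):
            x = proposal
        chain.append(x)
    return chain

chain = metropolis(5000)
```

Because only the ratio of densities enters the acceptance test, the normalising constant of the posterior never needs to be computed, which is what makes the method practical for spaces like the rough-set granule space.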

  3. A Bayesian Concept Learning Approach to Crowdsourcing

    DEFF Research Database (Denmark)

    Viappiani, Paolo Renato; Zilles, Sandra; Hamilton, Howard J.;

    2011-01-01

We develop a Bayesian approach to concept learning for crowdsourcing applications. A probabilistic belief over possible concept definitions is maintained and updated according to (noisy) observations from experts, whose behaviors are modeled using discrete types. We propose recommendation techniques, inference methods, and query selection strategies to assist a user charged with choosing a configuration that satisfies some (partially known) concept. Our model is able to simultaneously learn the concept definition and the types of the experts. We evaluate our model with simulations, showing that our Bayesian strategies are effective even in large concept spaces with many uninformative experts.

  4. A Bayesian approach to person perception.

    Science.gov (United States)

    Clifford, C W G; Mareschal, I; Otsuka, Y; Watson, T L

    2015-11-01

    Here we propose a Bayesian approach to person perception, outlining the theoretical position and a methodological framework for testing the predictions experimentally. We use the term person perception to refer not only to the perception of others' personal attributes such as age and sex but also to the perception of social signals such as direction of gaze and emotional expression. The Bayesian approach provides a formal description of the way in which our perception combines current sensory evidence with prior expectations about the structure of the environment. Such expectations can lead to unconscious biases in our perception that are particularly evident when sensory evidence is uncertain. We illustrate the ideas with reference to our recent studies on gaze perception which show that people have a bias to perceive the gaze of others as directed towards themselves. We also describe a potential application to the study of the perception of a person's sex, in which a bias towards perceiving males is typically observed.
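The combination of sensory evidence with a prior expectation described above can be sketched with Gaussian forms, where the posterior is the precision-weighted average of the two. The prior centred on direct gaze and all numbers below are illustrative assumptions, not parameters from the cited studies:

```python
# Sketch of Bayesian cue combination for gaze perception: a Gaussian
# sensory likelihood is combined with a Gaussian prior centred on
# direct gaze (0 degrees). All numbers are illustrative.

def combine(sensory_deg, sensory_sd, prior_mean=0.0, prior_sd=4.0):
    """Posterior mean and sd for two Gaussians (precision-weighted average)."""
    w_s = 1.0 / sensory_sd ** 2   # precision of the sensory evidence
    w_p = 1.0 / prior_sd ** 2     # precision of the prior
    mean = (w_s * sensory_deg + w_p * prior_mean) / (w_s + w_p)
    sd = (w_s + w_p) ** -0.5
    return mean, sd

# Reliable evidence: the percept stays close to the stimulus.
precise, _ = combine(sensory_deg=8.0, sensory_sd=1.0)
# Noisy evidence: the prior pulls the percept toward direct gaze.
noisy, _ = combine(sensory_deg=8.0, sensory_sd=8.0)
```

The second case reproduces the qualitative finding: when sensory evidence is uncertain, the prior dominates and the perceived gaze direction shifts toward the observer.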

  5. Particle identification in ALICE: a Bayesian approach

    CERN Document Server

    Adam, Jaroslav; Aggarwal, Madan Mohan; Aglieri Rinella, Gianluca; Agnello, Michelangelo; Agrawal, Neelima; Ahammed, Zubayer; Ahmad, Shakeel; Ahn, Sang Un; Aiola, Salvatore; Akindinov, Alexander; Alam, Sk Noor; Silva De Albuquerque, Danilo; Aleksandrov, Dmitry; Alessandro, Bruno; Alexandre, Didier; Alfaro Molina, Jose Ruben; Alici, Andrea; Alkin, Anton; Millan Almaraz, Jesus Roberto; Alme, Johan; Alt, Torsten; Altinpinar, Sedat; Altsybeev, Igor; Alves Garcia Prado, Caio; Andrei, Cristian; Andronic, Anton; Anguelov, Venelin; Anticic, Tome; Antinori, Federico; Antonioli, Pietro; Aphecetche, Laurent Bernard; Appelshaeuser, Harald; Arcelli, Silvia; Arnaldi, Roberta; Arnold, Oliver Werner; Arsene, Ionut Cristian; Arslandok, Mesut; Audurier, Benjamin; Augustinus, Andre; Averbeck, Ralf Peter; Azmi, Mohd Danish; Badala, Angela; Baek, Yong Wook; Bagnasco, Stefano; Bailhache, Raphaelle Marie; Bala, Renu; Balasubramanian, Supraja; Baldisseri, Alberto; Baral, Rama Chandra; Barbano, Anastasia Maria; Barbera, Roberto; Barile, Francesco; Barnafoldi, Gergely Gabor; Barnby, Lee Stuart; Ramillien Barret, Valerie; Bartalini, Paolo; Barth, Klaus; Bartke, Jerzy Gustaw; Bartsch, Esther; Basile, Maurizio; Bastid, Nicole; Basu, Sumit; Bathen, Bastian; Batigne, Guillaume; Batista Camejo, Arianna; Batyunya, Boris; Batzing, Paul Christoph; Bearden, Ian Gardner; Beck, Hans; Bedda, Cristina; Behera, Nirbhay Kumar; Belikov, Iouri; Bellini, Francesca; Bello Martinez, Hector; Bellwied, Rene; Belmont Iii, Ronald John; Belmont Moreno, Ernesto; Belyaev, Vladimir; Benacek, Pavel; Bencedi, Gyula; Beole, Stefania; Berceanu, Ionela; Bercuci, Alexandru; Berdnikov, Yaroslav; Berenyi, Daniel; Bertens, Redmer Alexander; Berzano, Dario; Betev, Latchezar; Bhasin, Anju; Bhat, Inayat Rasool; Bhati, Ashok Kumar; Bhattacharjee, Buddhadeb; Bhom, Jihyun; Bianchi, Livio; Bianchi, Nicola; Bianchin, Chiara; Bielcik, Jaroslav; Bielcikova, Jana; Bilandzic, Ante; Biro, Gabor; Biswas, Rathijit; Biswas, Saikat; 
Bjelogrlic, Sandro; Blair, Justin Thomas; Blau, Dmitry; Blume, Christoph; Bock, Friederike; Bogdanov, Alexey; Boggild, Hans; Boldizsar, Laszlo; Bombara, Marek; Book, Julian Heinz; Borel, Herve; Borissov, Alexander; Borri, Marcello; Bossu, Francesco; Botta, Elena; Bourjau, Christian; Braun-Munzinger, Peter; Bregant, Marco; Breitner, Timo Gunther; Broker, Theo Alexander; Browning, Tyler Allen; Broz, Michal; Brucken, Erik Jens; Bruna, Elena; Bruno, Giuseppe Eugenio; Budnikov, Dmitry; Buesching, Henner; Bufalino, Stefania; Buncic, Predrag; Busch, Oliver; Buthelezi, Edith Zinhle; Bashir Butt, Jamila; Buxton, Jesse Thomas; Cabala, Jan; Caffarri, Davide; Cai, Xu; Caines, Helen Louise; Calero Diaz, Liliet; Caliva, Alberto; Calvo Villar, Ernesto; Camerini, Paolo; Carena, Francesco; Carena, Wisla; Carnesecchi, Francesca; Castillo Castellanos, Javier Ernesto; Castro, Andrew John; Casula, Ester Anna Rita; Ceballos Sanchez, Cesar; Cepila, Jan; Cerello, Piergiorgio; Cerkala, Jakub; Chang, Beomsu; Chapeland, Sylvain; Chartier, Marielle; Charvet, Jean-Luc Fernand; Chattopadhyay, Subhasis; Chattopadhyay, Sukalyan; Chauvin, Alex; Chelnokov, Volodymyr; Cherney, Michael Gerard; Cheshkov, Cvetan Valeriev; Cheynis, Brigitte; Chibante Barroso, Vasco Miguel; Dobrigkeit Chinellato, David; Cho, Soyeon; Chochula, Peter; Choi, Kyungeon; Chojnacki, Marek; Choudhury, Subikash; Christakoglou, Panagiotis; Christensen, Christian Holm; Christiansen, Peter; Chujo, Tatsuya; Chung, Suh-Urk; Cicalo, Corrado; Cifarelli, Luisa; Cindolo, Federico; Cleymans, Jean Willy Andre; Colamaria, Fabio Filippo; Colella, Domenico; Collu, Alberto; Colocci, Manuel; Conesa Balbastre, Gustavo; Conesa Del Valle, Zaida; Connors, Megan Elizabeth; Contreras Nuno, Jesus Guillermo; Cormier, Thomas Michael; Corrales Morales, Yasser; Cortes Maldonado, Ismael; Cortese, Pietro; Cosentino, Mauro Rogerio; Costa, Filippo; Crochet, Philippe; Cruz Albino, Rigoberto; Cuautle Flores, Eleazar; Cunqueiro Mendez, Leticia; Dahms, Torsten; 
Dainese, Andrea; Danisch, Meike Charlotte; Danu, Andrea; Das, Debasish; Das, Indranil; Das, Supriya; Dash, Ajay Kumar; Dash, Sadhana; De, Sudipan; De Caro, Annalisa; De Cataldo, Giacinto; De Conti, Camila; De Cuveland, Jan; De Falco, Alessandro; De Gruttola, Daniele; De Marco, Nora; De Pasquale, Salvatore; Deisting, Alexander; Deloff, Andrzej; Denes, Ervin Sandor; Deplano, Caterina; Dhankher, Preeti; Di Bari, Domenico; Di Mauro, Antonio; Di Nezza, Pasquale; Diaz Corchero, Miguel Angel; Dietel, Thomas; Dillenseger, Pascal; Divia, Roberto; Djuvsland, Oeystein; Dobrin, Alexandru Florin; Domenicis Gimenez, Diogenes; Donigus, Benjamin; Dordic, Olja; Drozhzhova, Tatiana; Dubey, Anand Kumar; Dubla, Andrea; Ducroux, Laurent; Dupieux, Pascal; Ehlers Iii, Raymond James; Elia, Domenico; Endress, Eric; Engel, Heiko; Epple, Eliane; Erazmus, Barbara Ewa; Erdemir, Irem; Erhardt, Filip; Espagnon, Bruno; Estienne, Magali Danielle; Esumi, Shinichi; Eum, Jongsik; Evans, David; Evdokimov, Sergey; Eyyubova, Gyulnara; Fabbietti, Laura; Fabris, Daniela; Faivre, Julien; Fantoni, Alessandra; Fasel, Markus; Feldkamp, Linus; Feliciello, Alessandro; Feofilov, Grigorii; Ferencei, Jozef; Fernandez Tellez, Arturo; Gonzalez Ferreiro, Elena; Ferretti, Alessandro; Festanti, Andrea; Feuillard, Victor Jose Gaston; Figiel, Jan; Araujo Silva Figueredo, Marcel; Filchagin, Sergey; Finogeev, Dmitry; Fionda, Fiorella; Fiore, Enrichetta Maria; Fleck, Martin Gabriel; Floris, Michele; Foertsch, Siegfried Valentin; Foka, Panagiota; Fokin, Sergey; Fragiacomo, Enrico; Francescon, Andrea; Frankenfeld, Ulrich Michael; Fronze, Gabriele Gaetano; Fuchs, Ulrich; Furget, Christophe; Furs, Artur; Fusco Girard, Mario; Gaardhoeje, Jens Joergen; Gagliardi, Martino; Gago Medina, Alberto Martin; Gallio, Mauro; Gangadharan, Dhevan Raja; Ganoti, Paraskevi; Gao, Chaosong; Garabatos Cuadrado, Jose; Garcia-Solis, Edmundo Javier; Gargiulo, Corrado; Gasik, Piotr Jan; Gauger, Erin Frances; Germain, Marie; Gheata, Andrei George; 
Gheata, Mihaela; Ghosh, Premomoy; Ghosh, Sanjay Kumar; Gianotti, Paola; Giubellino, Paolo; Giubilato, Piero; Gladysz-Dziadus, Ewa; Glassel, Peter; Gomez Coral, Diego Mauricio; Gomez Ramirez, Andres; Sanchez Gonzalez, Andres; Gonzalez, Victor; Gonzalez Zamora, Pedro; Gorbunov, Sergey; Gorlich, Lidia Maria; Gotovac, Sven; Grabski, Varlen; Grachov, Oleg Anatolievich; Graczykowski, Lukasz Kamil; Graham, Katie Leanne; Grelli, Alessandro; Grigoras, Alina Gabriela; Grigoras, Costin; Grigoryev, Vladislav; Grigoryan, Ara; Grigoryan, Smbat; Grynyov, Borys; Grion, Nevio; Gronefeld, Julius Maximilian; Grosse-Oetringhaus, Jan Fiete; Grosso, Raffaele; Guber, Fedor; Guernane, Rachid; Guerzoni, Barbara; Gulbrandsen, Kristjan Herlache; Gunji, Taku; Gupta, Anik; Gupta, Ramni; Haake, Rudiger; Haaland, Oystein Senneset; Hadjidakis, Cynthia Marie; Haiduc, Maria; Hamagaki, Hideki; Hamar, Gergoe; Hamon, Julien Charles; Harris, John William; Harton, Austin Vincent; Hatzifotiadou, Despina; Hayashi, Shinichi; Heckel, Stefan Thomas; Hellbar, Ernst; Helstrup, Haavard; Herghelegiu, Andrei Ionut; Herrera Corral, Gerardo Antonio; Hess, Benjamin Andreas; Hetland, Kristin Fanebust; Hillemanns, Hartmut; Hippolyte, Boris; Horak, David; Hosokawa, Ritsuya; Hristov, Peter Zahariev; Humanic, Thomas; Hussain, Nur; Hussain, Tahir; Hutter, Dirk; Hwang, Dae Sung; Ilkaev, Radiy; Inaba, Motoi; Incani, Elisa; Ippolitov, Mikhail; Irfan, Muhammad; Ivanov, Marian; Ivanov, Vladimir; Izucheev, Vladimir; Jacazio, Nicolo; Jacobs, Peter Martin; Jadhav, Manoj Bhanudas; Jadlovska, Slavka; Jadlovsky, Jan; Jahnke, Cristiane; Jakubowska, Monika Joanna; Jang, Haeng Jin; Janik, Malgorzata Anna; Pahula Hewage, Sandun; Jena, Chitrasen; Jena, Satyajit; Jimenez Bustamante, Raul Tonatiuh; Jones, Peter Graham; Jusko, Anton; Kalinak, Peter; Kalweit, Alexander Philipp; Kamin, Jason Adrian; Kang, Ju Hwan; Kaplin, Vladimir; Kar, Somnath; Karasu Uysal, Ayben; Karavichev, Oleg; Karavicheva, Tatiana; Karayan, Lilit; Karpechev, Evgeny; 
Kebschull, Udo Wolfgang; Keidel, Ralf; Keijdener, Darius Laurens; Keil, Markus; Khan, Mohammed Mohisin; Khan, Palash; Khan, Shuaib Ahmad; Khanzadeev, Alexei; Kharlov, Yury; Kileng, Bjarte; Kim, Do Won; Kim, Dong Jo; Kim, Daehyeok; Kim, Hyeonjoong; Kim, Jinsook; Kim, Minwoo; Kim, Se Yong; Kim, Taesoo; Kirsch, Stefan; Kisel, Ivan; Kiselev, Sergey; Kisiel, Adam Ryszard; Kiss, Gabor; Klay, Jennifer Lynn; Klein, Carsten; Klein, Jochen; Klein-Boesing, Christian; Klewin, Sebastian; Kluge, Alexander; Knichel, Michael Linus; Knospe, Anders Garritt; Kobdaj, Chinorat; Kofarago, Monika; Kollegger, Thorsten; Kolozhvari, Anatoly; Kondratev, Valerii; Kondratyeva, Natalia; Kondratyuk, Evgeny; Konevskikh, Artem; Kopcik, Michal; Kostarakis, Panagiotis; Kour, Mandeep; Kouzinopoulos, Charalampos; Kovalenko, Oleksandr; Kovalenko, Vladimir; Kowalski, Marek; Koyithatta Meethaleveedu, Greeshma; Kralik, Ivan; Kravcakova, Adela; Krivda, Marian; Krizek, Filip; Kryshen, Evgeny; Krzewicki, Mikolaj; Kubera, Andrew Michael; Kucera, Vit; Kuhn, Christian Claude; Kuijer, Paulus Gerardus; Kumar, Ajay; Kumar, Jitendra; Kumar, Lokesh; Kumar, Shyam; Kurashvili, Podist; Kurepin, Alexander; Kurepin, Alexey; Kuryakin, Alexey; Kweon, Min Jung; Kwon, Youngil; La Pointe, Sarah Louise; La Rocca, Paola; Ladron De Guevara, Pedro; Lagana Fernandes, Caio; Lakomov, Igor; Langoy, Rune; Lara Martinez, Camilo Ernesto; Lardeux, Antoine Xavier; Lattuca, Alessandra; Laudi, Elisa; Lea, Ramona; Leardini, Lucia; Lee, Graham Richard; Lee, Seongjoo; Lehas, Fatiha; Lemmon, Roy Crawford; Lenti, Vito; Leogrande, Emilia; Leon Monzon, Ildefonso; Leon Vargas, Hermes; Leoncino, Marco; Levai, Peter; Li, Shuang; Li, Xiaomei; Lien, Jorgen Andre; Lietava, Roman; Lindal, Svein; Lindenstruth, Volker; Lippmann, Christian; Lisa, Michael Annan; Ljunggren, Hans Martin; Lodato, Davide Francesco; Lonne, Per-Ivar; Loginov, Vitaly; Loizides, Constantinos; Lopez, Xavier Bernard; Lopez Torres, Ernesto; Lowe, Andrew John; Luettig, Philipp Johannes; 
Lunardon, Marcello; Luparello, Grazia; Lutz, Tyler Harrison; Maevskaya, Alla; Mager, Magnus; Mahajan, Sanjay; Mahmood, Sohail Musa; Maire, Antonin; Majka, Richard Daniel; Malaev, Mikhail; Maldonado Cervantes, Ivonne Alicia; Malinina, Liudmila; Mal'Kevich, Dmitry; Malzacher, Peter; Mamonov, Alexander; Manko, Vladislav; Manso, Franck; Manzari, Vito; Marchisone, Massimiliano; Mares, Jiri; Margagliotti, Giacomo Vito; Margotti, Anselmo; Margutti, Jacopo; Marin, Ana Maria; Markert, Christina; Marquard, Marco; Martin, Nicole Alice; Martin Blanco, Javier; Martinengo, Paolo; Martinez Hernandez, Mario Ivan; Martinez-Garcia, Gines; Martinez Pedreira, Miguel; Mas, Alexis Jean-Michel; Masciocchi, Silvia; Masera, Massimo; Masoni, Alberto; Mastroserio, Annalisa; Matyja, Adam Tomasz; Mayer, Christoph; Mazer, Joel Anthony; Mazzoni, Alessandra Maria; Mcdonald, Daniel; Meddi, Franco; Melikyan, Yuri; Menchaca-Rocha, Arturo Alejandro; Meninno, Elisa; Mercado-Perez, Jorge; Meres, Michal; Miake, Yasuo; Mieskolainen, Matti Mikael; Mikhaylov, Konstantin; Milano, Leonardo; Milosevic, Jovan; Mischke, Andre; Mishra, Aditya Nath; Miskowiec, Dariusz Czeslaw; Mitra, Jubin; Mitu, Ciprian Mihai; Mohammadi, Naghmeh; Mohanty, Bedangadas; Molnar, Levente; Montano Zetina, Luis Manuel; Montes Prado, Esther; Moreira De Godoy, Denise Aparecida; Perez Moreno, Luis Alberto; Moretto, Sandra; Morreale, Astrid; Morsch, Andreas; Muccifora, Valeria; Mudnic, Eugen; Muhlheim, Daniel Michael; Muhuri, Sanjib; Mukherjee, Maitreyee; Mulligan, James Declan; Gameiro Munhoz, Marcelo; Munzer, Robert Helmut; Murakami, Hikari; Murray, Sean; Musa, Luciano; Musinsky, Jan; Naik, Bharati; Nair, Rahul; Nandi, Basanta Kumar; Nania, Rosario; Nappi, Eugenio; Naru, Muhammad Umair; Ferreira Natal Da Luz, Pedro Hugo; Nattrass, Christine; Rosado Navarro, Sebastian; Nayak, Kishora; Nayak, Ranjit; Nayak, Tapan Kumar; Nazarenko, Sergey; Nedosekin, Alexander; Nellen, Lukas; Ng, Fabian; Nicassio, Maria; Niculescu, Mihai; Niedziela, Jeremi; 
Nielsen, Borge Svane; Nikolaev, Sergey; Nikulin, Sergey; Nikulin, Vladimir; Noferini, Francesco; Nomokonov, Petr; Nooren, Gerardus; Cabanillas Noris, Juan Carlos; Norman, Jaime; Nyanin, Alexander; Nystrand, Joakim Ingemar; Oeschler, Helmut Oskar; Oh, Saehanseul; Oh, Sun Kun; Ohlson, Alice Elisabeth; Okatan, Ali; Okubo, Tsubasa; Olah, Laszlo; Oleniacz, Janusz; Oliveira Da Silva, Antonio Carlos; Oliver, Michael Henry; Onderwaater, Jacobus; Oppedisano, Chiara; Orava, Risto; Oravec, Matej; Ortiz Velasquez, Antonio; Oskarsson, Anders Nils Erik; Otwinowski, Jacek Tomasz; Oyama, Ken; Ozdemir, Mahmut; Pachmayer, Yvonne Chiara; Pagano, Davide; Pagano, Paola; Paic, Guy; Pal, Susanta Kumar; Pan, Jinjin; Pandey, Ashutosh Kumar; Papikyan, Vardanush; Pappalardo, Giuseppe; Pareek, Pooja; Park, Woojin; Parmar, Sonia; Passfeld, Annika; Paticchio, Vincenzo; Patra, Rajendra Nath; Paul, Biswarup; Pei, Hua; Peitzmann, Thomas; Pereira Da Costa, Hugo Denis Antonio; Peresunko, Dmitry Yurevich; Perez Lara, Carlos Eugenio; Perez Lezama, Edgar; Peskov, Vladimir; Pestov, Yury; Petracek, Vojtech; Petrov, Viacheslav; Petrovici, Mihai; Petta, Catia; Piano, Stefano; Pikna, Miroslav; Pillot, Philippe; Ozelin De Lima Pimentel, Lais; Pinazza, Ombretta; Pinsky, Lawrence; Piyarathna, Danthasinghe; Ploskon, Mateusz Andrzej; Planinic, Mirko; Pluta, Jan Marian; Pochybova, Sona; Podesta Lerma, Pedro Luis Manuel; Poghosyan, Martin; Polishchuk, Boris; Poljak, Nikola; Poonsawat, Wanchaloem; Pop, Amalia; Porteboeuf, Sarah Julie; Porter, R Jefferson; Pospisil, Jan; Prasad, Sidharth Kumar; Preghenella, Roberto; Prino, Francesco; Pruneau, Claude Andre; Pshenichnov, Igor; Puccio, Maximiliano; Puddu, Giovanna; Pujahari, Prabhat Ranjan; Punin, Valery; Putschke, Jorn Henning; Qvigstad, Henrik; Rachevski, Alexandre; Raha, Sibaji; Rajput, Sonia; Rak, Jan; Rakotozafindrabe, Andry Malala; Ramello, Luciano; Rami, Fouad; Raniwala, Rashmi; Raniwala, Sudhir; Rasanen, Sami Sakari; Rascanu, Bogdan Theodor; Rathee, Deepika; 
Read, Kenneth Francis; Redlich, Krzysztof; Reed, Rosi Jan; Rehman, Attiq Ur; Reichelt, Patrick Simon; Reidt, Felix; Ren, Xiaowen; Renfordt, Rainer Arno Ernst; Reolon, Anna Rita; Reshetin, Andrey; Reygers, Klaus Johannes; Riabov, Viktor; Ricci, Renato Angelo; Richert, Tuva Ora Herenui; Richter, Matthias Rudolph; Riedler, Petra; Riegler, Werner; Riggi, Francesco; Ristea, Catalin-Lucian; Rocco, Elena; Rodriguez Cahuantzi, Mario; Rodriguez Manso, Alis; Roeed, Ketil; Rogochaya, Elena; Rohr, David Michael; Roehrich, Dieter; Ronchetti, Federico; Ronflette, Lucile; Rosnet, Philippe; Rossi, Andrea; Roukoutakis, Filimon; Roy, Ankhi; Roy, Christelle Sophie; Roy, Pradip Kumar; Rubio Montero, Antonio Juan; Rui, Rinaldo; Russo, Riccardo; Ryabinkin, Evgeny; Ryabov, Yury; Rybicki, Andrzej; Saarinen, Sampo; Sadhu, Samrangy; Sadovskiy, Sergey; Safarik, Karel; Sahlmuller, Baldo; Sahoo, Pragati; Sahoo, Raghunath; Sahoo, Sarita; Sahu, Pradip Kumar; Saini, Jogender; Sakai, Shingo; Saleh, Mohammad Ahmad; Salzwedel, Jai Samuel Nielsen; Sambyal, Sanjeev Singh; Samsonov, Vladimir; Sandor, Ladislav; Sandoval, Andres; Sano, Masato; Sarkar, Debojit; Sarkar, Nachiketa; Sarma, Pranjal; Scapparone, Eugenio; Scarlassara, Fernando; Schiaua, Claudiu Cornel; Schicker, Rainer Martin; Schmidt, Christian Joachim; Schmidt, Hans Rudolf; Schuchmann, Simone; Schukraft, Jurgen; Schulc, Martin; Schutz, Yves Roland; Schwarz, Kilian Eberhard; Schweda, Kai Oliver; Scioli, Gilda; Scomparin, Enrico; Scott, Rebecca Michelle; Sefcik, Michal; Seger, Janet Elizabeth; Sekiguchi, Yuko; Sekihata, Daiki; Selyuzhenkov, Ilya; Senosi, Kgotlaesele; Senyukov, Serhiy; Serradilla Rodriguez, Eulogio; Sevcenco, Adrian; Shabanov, Arseniy; Shabetai, Alexandre; Shadura, Oksana; Shahoyan, Ruben; Shahzad, Muhammed Ikram; Shangaraev, Artem; Sharma, Ankita; Sharma, Mona; Sharma, Monika; Sharma, Natasha; Sheikh, Ashik Ikbal; Shigaki, Kenta; Shou, Qiye; Shtejer Diaz, Katherin; Sibiryak, Yury; Siddhanta, Sabyasachi; Sielewicz, Krzysztof 
Marek; Siemiarczuk, Teodor; Silvermyr, David Olle Rickard; Silvestre, Catherine Micaela; Simatovic, Goran; Simonetti, Giuseppe; Singaraju, Rama Narayana; Singh, Ranbir; Singha, Subhash; Singhal, Vikas; Sinha, Bikash; Sarkar - Sinha, Tinku; Sitar, Branislav; Sitta, Mario; Skaali, Bernhard; Slupecki, Maciej; Smirnov, Nikolai; Snellings, Raimond; Snellman, Tomas Wilhelm; Song, Jihye; Song, Myunggeun; Song, Zixuan; Soramel, Francesca; Sorensen, Soren Pontoppidan; Derradi De Souza, Rafael; Sozzi, Federica; Spacek, Michal; Spiriti, Eleuterio; Sputowska, Iwona Anna; Spyropoulou-Stassinaki, Martha; Stachel, Johanna; Stan, Ionel; Stankus, Paul; Stenlund, Evert Anders; Steyn, Gideon Francois; Stiller, Johannes Hendrik; Stocco, Diego; Strmen, Peter; Alarcon Do Passo Suaide, Alexandre; Sugitate, Toru; Suire, Christophe Pierre; Suleymanov, Mais Kazim Oglu; Suljic, Miljenko; Sultanov, Rishat; Sumbera, Michal; Sumowidagdo, Suharyo; Szabo, Alexander; Szanto De Toledo, Alejandro; Szarka, Imrich; Szczepankiewicz, Adam; Szymanski, Maciej Pawel; Tabassam, Uzma; Takahashi, Jun; Tambave, Ganesh Jagannath; Tanaka, Naoto; Tarhini, Mohamad; Tariq, Mohammad; Tarzila, Madalina-Gabriela; Tauro, Arturo; Tejeda Munoz, Guillermo; Telesca, Adriana; Terasaki, Kohei; Terrevoli, Cristina; Teyssier, Boris; Thaeder, Jochen Mathias; Thakur, Dhananjaya; Thomas, Deepa; Tieulent, Raphael Noel; Timmins, Anthony Robert; Toia, Alberica; Trogolo, Stefano; Trombetta, Giuseppe; Trubnikov, Victor; Trzaska, Wladyslaw Henryk; Tsuji, Tomoya; Tumkin, Alexandr; Turrisi, Rosario; Tveter, Trine Spedstad; Ullaland, Kjetil; Uras, Antonio; Usai, Gianluca; Utrobicic, Antonija; Vala, Martin; Valencia Palomo, Lizardo; Vallero, Sara; Van Der Maarel, Jasper; Van Hoorne, Jacobus Willem; Van Leeuwen, Marco; Vanat, Tomas; Vande Vyvre, Pierre; Varga, Dezso; Vargas Trevino, Aurora Diozcora; Vargyas, Marton; Varma, Raghava; Vasileiou, Maria; Vasiliev, Andrey; Vauthier, Astrid; Vechernin, Vladimir; Veen, Annelies Marianne; Veldhoen, 
Misha; Velure, Arild; Vercellin, Ermanno; Vergara Limon, Sergio; Vernet, Renaud; Verweij, Marta; Vickovic, Linda; Viesti, Giuseppe; Viinikainen, Jussi Samuli; Vilakazi, Zabulon; Villalobos Baillie, Orlando; Villatoro Tello, Abraham; Vinogradov, Alexander; Vinogradov, Leonid; Vinogradov, Yury; Virgili, Tiziano; Vislavicius, Vytautas; Viyogi, Yogendra; Vodopyanov, Alexander; Volkl, Martin Andreas; Voloshin, Kirill; Voloshin, Sergey; Volpe, Giacomo; Von Haller, Barthelemy; Vorobyev, Ivan; Vranic, Danilo; Vrlakova, Janka; Vulpescu, Bogdan; Wagner, Boris; Wagner, Jan; Wang, Hongkai; Wang, Mengliang; Watanabe, Daisuke; Watanabe, Yosuke; Weber, Michael; Weber, Steffen Georg; Weiser, Dennis Franz; Wessels, Johannes Peter; Westerhoff, Uwe; Whitehead, Andile Mothegi; Wiechula, Jens; Wikne, Jon; Wilk, Grzegorz Andrzej; Wilkinson, Jeremy John; Williams, Crispin; Windelband, Bernd Stefan; Winn, Michael Andreas; Yang, Hongyan; Yang, Ping; Yano, Satoshi; Yasin, Zafar; Yin, Zhongbao; Yokoyama, Hiroki; Yoo, In-Kwon; Yoon, Jin Hee; Yurchenko, Volodymyr; Yushmanov, Igor; Zaborowska, Anna; Zaccolo, Valentina; Zaman, Ali; Zampolli, Chiara; Correia Zanoli, Henrique Jose; Zaporozhets, Sergey; Zardoshti, Nima; Zarochentsev, Andrey; Zavada, Petr; Zavyalov, Nikolay; Zbroszczyk, Hanna Paulina; Zgura, Sorin Ion; Zhalov, Mikhail; Zhang, Haitao; Zhang, Xiaoming; Zhang, Yonghong; Chunhui, Zhang; Zhang, Zuman; Zhao, Chengxin; Zhigareva, Natalia; Zhou, Daicui; Zhou, You; Zhou, Zhuo; Zhu, Hongsheng; Zhu, Jianhui; Zichichi, Antonino; Zimmermann, Alice; Zimmermann, Markus Bernhard; Zinovjev, Gennady; Zyzak, Maksym

    2016-01-01

We present a Bayesian approach to particle identification (PID) within the ALICE experiment. The aim is to more effectively combine the particle identification capabilities of its various detectors. After a brief explanation of the adopted methodology and formalism, the performance of the Bayesian PID approach for charged pions, kaons and protons in the central barrel of ALICE is studied. PID is performed via measurements of specific energy loss (dE/dx) and time-of-flight. PID efficiencies and misidentification probabilities are extracted and compared with Monte Carlo simulations using high-purity samples of identified particles in the decay channels ${\rm K}_{\rm S}^{0}\rightarrow \pi^+\pi^-$, $\phi\rightarrow {\rm K}^-{\rm K}^+$ and $\Lambda\rightarrow{\rm p}\pi^-$ in p–Pb collisions at $\sqrt{s_{\rm NN}} = 5.02$ TeV. In order to thoroughly assess the validity of the Bayesian approach, this methodology was used to obtain corrected $p_{\rm T}$ spectra of pions, kaons, protons, and D$^0$ mesons in pp coll...
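The core of a Bayesian PID combination is a product of per-detector likelihoods weighted by species priors, normalised over the hypotheses. The sketch below shows that structure; the likelihood and prior values are illustrative, not ALICE numbers:

```python
# Sketch of a Bayesian PID combination: per-detector likelihoods for
# each species hypothesis are multiplied together and weighted by
# priors (expected particle abundances). All numbers are illustrative.

def bayes_pid(likelihoods_per_detector, priors):
    """Posterior probability for each species given detector responses.

    likelihoods_per_detector: list of {species: P(signal | species)}
    priors: {species: prior abundance}
    """
    posterior = dict(priors)
    for det in likelihoods_per_detector:
        for species in posterior:
            posterior[species] *= det[species]
    norm = sum(posterior.values())
    return {s: p / norm for s, p in posterior.items()}

post = bayes_pid(
    [
        {"pion": 0.60, "kaon": 0.25, "proton": 0.15},  # e.g. dE/dx response
        {"pion": 0.70, "kaon": 0.20, "proton": 0.10},  # e.g. time-of-flight
    ],
    priors={"pion": 0.80, "kaon": 0.15, "proton": 0.05},
)
```

Multiplying likelihoods assumes the detector responses are conditionally independent given the species, which is the usual working assumption in such combinations.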

  6. Bayesian approach to avoiding track seduction

    Science.gov (United States)

    Salmond, David J.; Everett, Nicholas O.

    2002-08-01

The problem of maintaining track on a primary target in the presence of spurious objects is addressed. Recursive and batch filtering approaches are developed. For the recursive approach, a Bayesian track splitting filter is derived which spawns candidate tracks if there is a possibility of measurement misassociation. The filter evaluates the probability of each candidate track being associated with the primary target. The batch filter is a Markov-chain Monte Carlo (MCMC) algorithm which fits the observed data sequence to models of target dynamics and measurement-track association. Simulation results are presented.

  7. Radioactive Contraband Detection: A Bayesian Approach

    Energy Technology Data Exchange (ETDEWEB)

    Candy, J; Breitfeller, E; Guidry, B; Manatt, D; Sale, K; Chambers, D; Axelrod, M; Meyer, A

    2009-03-16

Radionuclide emissions from nuclear contraband challenge both detection and measurement technologies to capture and record each event. The development of a sequential Bayesian processor incorporating both the physics of gamma-ray emissions and the measurement of photon energies offers a physics-based approach to attack this challenging problem. It is shown that a 'physics-based' structure not only can be used to develop an effective detection technique, but also motivates the implementation of this approach using particle filters to enhance and extract the required information.
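The sequential Bayesian idea can be sketched as a recursive update of the probability that a source is present, given successive photon counts. The Poisson rates and counts below are hypothetical, not values from the report:

```python
import math

# Sketch of a sequential Bayesian detector: after each measured photon
# count, update the posterior probability that a source is present.
# Background and source rates are illustrative assumptions.

def sequential_detection(counts, bg_rate=2.0, src_rate=5.0, p_source=0.5):
    """Recursive Bayes update of P(source present) from Poisson counts."""
    def poisson(k, lam):
        return math.exp(-lam) * lam ** k / math.factorial(k)

    for k in counts:
        num = p_source * poisson(k, src_rate)
        den = num + (1.0 - p_source) * poisson(k, bg_rate)
        p_source = num / den
    return p_source

# A run of high counts drives the posterior toward "source present".
p = sequential_detection([5, 6, 4, 7])
```

Each measurement sharpens the posterior, so a decision threshold on `p_source` yields a sequential detection rule rather than a fixed-sample test.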

  8. A Bayesian Nonparametric Approach to Test Equating

    Science.gov (United States)

    Karabatsos, George; Walker, Stephen G.

    2009-01-01

    A Bayesian nonparametric model is introduced for score equating. It is applicable to all major equating designs, and has advantages over previous equating models. Unlike the previous models, the Bayesian model accounts for positive dependence between distributions of scores from two tests. The Bayesian model and the previous equating models are…

  9. A Bayesian Approach to Learning Scoring Systems.

    Science.gov (United States)

    Ertekin, Şeyda; Rudin, Cynthia

    2015-12-01

We present a Bayesian method for building scoring systems, which are linear models with coefficients that have very few significant digits. Usually the construction of scoring systems involves manual effort: humans invent the full scoring system without using data, or they choose how logistic regression coefficients should be scaled and rounded to produce a scoring system. These kinds of heuristics lead to suboptimal solutions. Our approach is different in that humans need only specify the prior over what the coefficients should look like, and the scoring system is learned from data. For this approach, we provide a Metropolis-Hastings sampler that tends to pull the coefficient values toward their "natural scale." Empirically, the proposed method achieves a high degree of interpretability of the models while maintaining competitive generalization performance.
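The pull toward a "natural scale" can be illustrated with a toy Metropolis-Hastings chain over a single coefficient whose prior rewards integer values. This is a hypothetical sketch, not the authors' sampler; the likelihood centre and prior strength are made-up numbers:

```python
import math
import random

# Hypothetical sketch: a Metropolis-Hastings chain over one coefficient.
# The data (likelihood) favour c near 2.3, while the prior rewards
# values close to integers, pulling the posterior toward c = 2.

def log_target(c):
    log_lik = -((c - 2.3) ** 2) / (2 * 0.2 ** 2)  # evidence centred at 2.3
    log_prior = -5.0 * abs(c - round(c))          # prior favours integers
    return log_lik + log_prior

def mh_sample(n_steps, step=0.2, seed=1):
    rng = random.Random(seed)
    c, samples = 2.0, []
    for _ in range(n_steps):
        prop = c + rng.uniform(-step, step)
        if math.log(rng.random()) < log_target(prop) - log_target(c):
            c = prop
        samples.append(c)
    return samples

samples = mh_sample(20000)
posterior_mean = sum(samples[2000:]) / len(samples[2000:])
```

With the prior switched off, the posterior mean sits at 2.3; with it on, the chain settles between 2 and 2.3, closer to the integer, which is the qualitative behaviour the abstract describes.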

  10. Modeling Social Annotation: a Bayesian Approach

    CERN Document Server

    Plangprasopchok, Anon

    2008-01-01

    Collaborative tagging systems, such as del.icio.us, CiteULike, and others, allow users to annotate objects, e.g., Web pages or scientific papers, with descriptive labels called tags. The social annotations, contributed by thousands of users, can potentially be used to infer categorical knowledge, classify documents or recommend new relevant information. Traditional text inference methods do not make best use of socially-generated data, since they do not take into account variations in individual users' perspectives and vocabulary. In a previous work, we introduced a simple probabilistic model that takes interests of individual annotators into account in order to find hidden topics of annotated objects. Unfortunately, our proposed approach had a number of shortcomings, including overfitting, local maxima and the requirement to specify values for some parameters. In this paper we address these shortcomings in two ways. First, we extend the model to a fully Bayesian framework. Second, we describe an infinite ver...

  11. A Bayesian Approach to Network Modularity

    CERN Document Server

    Hofman, Jake M

    2007-01-01

    We present an efficient, principled, and interpretable technique for inferring module assignments and identifying the optimal number of modules in a given network. We show how several existing methods for finding modules can be described as variant, special, or limiting cases of our work, and how related methods for complexity control -- identification of the true number of modules -- are outperformed. Our approach is based on Bayesian methods for model selection which have been used with success for almost a century, implemented using a variational technique developed only in the past decade. We apply the technique to synthetic and real networks, including networks of up to $10^4$ nodes, and outline how the method naturally allows model selection among generative models.

  12. A Bayesian Shrinkage Approach for AMMI Models.

    Science.gov (United States)

    da Silva, Carlos Pereira; de Oliveira, Luciano Antonio; Nuvunga, Joel Jorge; Pamplona, Andrezza Kéllen Alves; Balestre, Marcio

    2015-01-01

Linear-bilinear models, especially the additive main effects and multiplicative interaction (AMMI) model, are widely applicable to genotype-by-environment interaction (GEI) studies in plant breeding programs. These models allow a parsimonious modeling of GE interactions, retaining a small number of principal components in the analysis. However, one aspect of the AMMI model that is still debated is the selection criteria for determining the number of multiplicative terms required to describe the GE interaction pattern. Shrinkage estimators have been proposed as selection criteria for the GE interaction components. In this study, a Bayesian approach was combined with the AMMI model with shrinkage estimators for the principal components. A total of 55 maize genotypes were evaluated in nine different environments using a complete blocks design with three replicates. The results show that the traditional Bayesian AMMI model produces low shrinkage of singular values but avoids the usual pitfalls in determining the credible intervals in the biplot. On the other hand, Bayesian shrinkage AMMI models have difficulty with the credible interval for model parameters, but produce stronger shrinkage of the principal components, converging to GE matrices that have more shrinkage than those obtained using mixed models. This characteristic allowed more parsimonious models to be chosen, with more of the GEI pattern retained in the first two components, and resulted in the selection of models similar to those obtained by the Cornelius F-test (α = 0.05) in traditional AMMI models and by leave-one-out cross-validation. The resulting model chosen by the posterior distribution of singular values was also similar to those produced by the cross-validation approach in traditional AMMI models. Our method enables the estimation of credible intervals for the AMMI biplot plus the choice of AMMI model based on direct posterior
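The multiplicative decomposition at the heart of AMMI can be sketched as follows: the two-way table is double-centred to isolate the interaction residuals, and the leading multiplicative term is the first singular component of that residual matrix. This is an illustrative classical computation (with a hypothetical 2x2 table), not the authors' Bayesian shrinkage sampler:

```python
# Sketch of the AMMI decomposition: remove additive genotype and
# environment effects (double-centering), then extract the leading
# singular value of the interaction residuals by power iteration.
# The input table is a made-up genotype-by-environment mean table.

def ammi_first_component(table):
    g, e = len(table), len(table[0])
    grand = sum(map(sum, table)) / (g * e)
    g_mean = [sum(row) / e for row in table]
    e_mean = [sum(table[i][j] for i in range(g)) / g for j in range(e)]
    # Interaction residuals after removing the additive effects.
    z = [[table[i][j] - g_mean[i] - e_mean[j] + grand for j in range(e)]
         for i in range(g)]
    # Power iteration for the leading singular value of z.
    v = [float(j + 1) for j in range(e)]
    u = [0.0] * g
    for _ in range(200):
        u = [sum(z[i][j] * v[j] for j in range(e)) for i in range(g)]
        nu = sum(x * x for x in u) ** 0.5
        u = [x / nu for x in u]
        w = [sum(z[i][j] * u[i] for i in range(g)) for j in range(e)]
        nw = sum(x * x for x in w) ** 0.5
        v = [x / nw for x in w]
    return sum(u[i] * z[i][j] * v[j] for i in range(g) for j in range(e))

sigma1 = ammi_first_component([[10.0, 12.0],
                               [11.0, 9.0]])
```

The shrinkage estimators discussed in the abstract act on singular values like `sigma1`, deciding how many such multiplicative terms to retain.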

  13. A Bayesian Shrinkage Approach for AMMI Models.

    Directory of Open Access Journals (Sweden)

    Carlos Pereira da Silva

    Full Text Available Linear-bilinear models, especially the additive main effects and multiplicative interaction (AMMI) model, are widely applicable to genotype-by-environment interaction (GEI) studies in plant breeding programs. These models allow a parsimonious modeling of GE interactions, retaining a small number of principal components in the analysis. However, one aspect of the AMMI model that is still debated is the selection criterion for determining the number of multiplicative terms required to describe the GE interaction pattern. Shrinkage estimators have been proposed as selection criteria for the GE interaction components. In this study, a Bayesian approach was combined with the AMMI model with shrinkage estimators for the principal components. A total of 55 maize genotypes were evaluated in nine different environments using a complete blocks design with three replicates. The results show that the traditional Bayesian AMMI model produces little shrinkage of the singular values but avoids the usual pitfalls in determining the credible intervals in the biplot. On the other hand, Bayesian shrinkage AMMI models have difficulty with the credible intervals for the model parameters, but produce stronger shrinkage of the principal components, converging to GE matrices that have more shrinkage than those obtained using mixed models. This characteristic allowed more parsimonious models to be chosen, with more of the GEI pattern retained in the first two components; the selected models were similar to those obtained by the Cornelius F-test (α = 0.05) and leave-one-out cross-validation in traditional AMMI models. The model chosen by the posterior distribution of the singular values was also similar to those produced by the cross-validation approach in traditional AMMI models. Our method enables the estimation of credible intervals for the AMMI biplot plus the choice of the AMMI model based on direct

  14. MERGING DIGITAL SURFACE MODELS IMPLEMENTING BAYESIAN APPROACHES

    Directory of Open Access Journals (Sweden)

    H. Sadeq

    2016-06-01

    Full Text Available In this research, DSMs from different sources have been merged. The merging is based on a probabilistic model using a Bayesian approach. The data have been sourced from very high resolution satellite imagery sensors (e.g. WorldView-1 and Pleiades). A Bayesian approach is preferable when the data obtained from the sensors are limited and obtaining many measurements is difficult or very costly; the lack of data can then be mitigated by introducing a priori estimates. To infer the prior data, the roofs of the buildings are assumed to be smooth, and for that purpose local entropy has been implemented. In addition to the a priori estimates, GNSS RTK measurements have been collected in the field and used as check points to assess the quality of the DSMs and to validate the merging result. The model has been applied in the West End of Glasgow, which contains different kinds of buildings, such as flat-roofed and hipped-roofed buildings. Both quantitative and qualitative methods have been employed to validate the merged DSM. The validation results show that the model successfully improved the quality of the DSMs and some of their characteristics, such as the roof surfaces, which consequently led to better representations. In addition, the developed model has been compared with the well-established Maximum Likelihood model and showed similar quantitative statistical results and better qualitative results. Although the proposed model has been applied to DSMs derived from satellite imagery, it can be applied to DSMs from any other source.

  15. Merging Digital Surface Models Implementing Bayesian Approaches

    Science.gov (United States)

    Sadeq, H.; Drummond, J.; Li, Z.

    2016-06-01

    In this research, DSMs from different sources have been merged. The merging is based on a probabilistic model using a Bayesian approach. The data have been sourced from very high resolution satellite imagery sensors (e.g. WorldView-1 and Pleiades). A Bayesian approach is preferable when the data obtained from the sensors are limited and obtaining many measurements is difficult or very costly; the lack of data can then be mitigated by introducing a priori estimates. To infer the prior data, the roofs of the buildings are assumed to be smooth, and for that purpose local entropy has been implemented. In addition to the a priori estimates, GNSS RTK measurements have been collected in the field and used as check points to assess the quality of the DSMs and to validate the merging result. The model has been applied in the West End of Glasgow, which contains different kinds of buildings, such as flat-roofed and hipped-roofed buildings. Both quantitative and qualitative methods have been employed to validate the merged DSM. The validation results show that the model successfully improved the quality of the DSMs and some of their characteristics, such as the roof surfaces, which consequently led to better representations. In addition, the developed model has been compared with the well-established Maximum Likelihood model and showed similar quantitative statistical results and better qualitative results. Although the proposed model has been applied to DSMs derived from satellite imagery, it can be applied to DSMs from any other source.

  16. The Bayesian Revolution Approaches Psychological Development

    Science.gov (United States)

    Shultz, Thomas R.

    2007-01-01

    This commentary reviews five articles that apply Bayesian ideas to psychological development, some with psychology experiments, some with computational modeling, and some with both experiments and modeling. The reviewed work extends the current Bayesian revolution into tasks often studied in children, such as causal learning and word learning, and…

  17. A new approach for Bayesian model averaging

    Institute of Scientific and Technical Information of China (English)

    TIAN XiangJun; XIE ZhengHui; WANG AiHui; YANG XiaoChun

    2012-01-01

    Bayesian model averaging (BMA) is a recently proposed statistical method for calibrating forecast ensembles from numerical weather models. However, successful implementation of BMA requires accurate estimates of the weights and variances of the individual competing models in the ensemble. Two methods, namely the Expectation-Maximization (EM) and the Markov Chain Monte Carlo (MCMC) algorithms, are widely used for BMA model training. Both methods have their own respective strengths and weaknesses. In this paper, we first modify the BMA log-likelihood function to remove the constraint that the BMA weights must sum to one, and then use a limited-memory quasi-Newton algorithm to solve the resulting nonlinear optimization problem, thereby formulating a new approach to BMA (referred to as BMA-BFGS). Several groups of multi-model soil moisture simulation experiments with three land surface models show that the performance of BMA-BFGS is similar to that of the MCMC method in terms of simulation accuracy, and that both are superior to the EM algorithm. On the other hand, the computational cost of the BMA-BFGS algorithm is substantially less than that of MCMC and is almost equivalent to that of EM.
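The core idea above can be sketched numerically: parameterize the BMA weights through a softmax so the sum-to-one constraint disappears, then maximize the ensemble log-likelihood with a quasi-Newton method. The synthetic data, member count, and Gaussian kernels below are illustrative assumptions, not the paper's setup.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
y = rng.normal(0.3, 1.0, size=200)                       # "observations"
# Three ensemble members with increasing error (member 0 is the best)
forecasts = np.column_stack([y + rng.normal(0, s, 200) for s in (0.5, 1.0, 2.0)])

def neg_log_lik(theta):
    a, log_var = theta[:3], theta[3]
    w = np.exp(a - a.max()); w /= w.sum()                # softmax -> weights on simplex
    var = np.exp(log_var)                                # variance kept positive
    dens = np.exp(-(y[:, None] - forecasts) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
    return -np.sum(np.log(dens @ w + 1e-300))

# Unconstrained quasi-Newton optimization, in the spirit of BMA-BFGS
res = minimize(neg_log_lik, x0=np.zeros(4), method="L-BFGS-B")
w_hat = np.exp(res.x[:3]); w_hat /= w_hat.sum()
print("estimated weights:", np.round(w_hat, 3))
```

With this construction the most accurate member should receive the largest weight, without ever imposing the simplex constraint explicitly.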

  18. Bayesian approach to decompression sickness model parameter estimation.

    Science.gov (United States)

    Howle, L E; Weber, P W; Nichols, J M

    2017-03-01

    We examine both maximum likelihood and Bayesian approaches for estimating probabilistic decompression sickness model parameters. Maximum likelihood estimation treats parameters as fixed values and determines the best estimate through repeated trials, whereas the Bayesian approach treats parameters as random variables and determines the parameter probability distributions. We would ultimately like to know the probability that a parameter lies in a certain range rather than simply make statements about the repeatability of our estimator. Although both represent powerful methods of inference, for models with complex or multi-peaked likelihoods, maximum likelihood parameter estimates can prove more difficult to interpret than the estimates of the parameter distributions provided by the Bayesian approach. For models of decompression sickness, we show that while these two estimation methods are complementary, the credible intervals generated by the Bayesian approach are more naturally suited to quantifying uncertainty in the model parameters.
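The contrast between the two estimation styles can be shown with a toy Bernoulli "DCS event" rate rather than a full decompression model (an illustrative simplification, not the authors' model): the MLE is a single point, while the Bayesian posterior is a whole distribution from which a credible interval follows directly.

```python
from scipy.stats import beta

events, trials = 7, 100                    # hypothetical dive outcomes
p_mle = events / trials                    # maximum likelihood point estimate

# Beta(1, 1) flat prior -> Beta(1 + events, 1 + trials - events) posterior
post = beta(1 + events, 1 + trials - events)
ci_lo, ci_hi = post.ppf(0.025), post.ppf(0.975)

print(f"MLE point estimate: {p_mle:.3f}")
print(f"95% credible interval: ({ci_lo:.3f}, {ci_hi:.3f})")
```

The credible interval answers the question the abstract poses: the probability that the parameter lies in a given range, rather than a statement about estimator repeatability.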

  19. On an Approach to Bayesian Sample Sizing in Clinical Trials

    CERN Document Server

    Muirhead, Robb J

    2012-01-01

    This paper explores an approach to Bayesian sample size determination in clinical trials. The approach falls into the category of what is often called "proper Bayesian", in that it does not mix frequentist concepts with Bayesian ones. A criterion for a "successful trial" is defined in terms of a posterior probability; the probability of success is then assessed using the marginal distribution of the data, and this probability forms the basis for choosing sample sizes. We illustrate with a standard problem in clinical trials, that of establishing the superiority of a new drug over a control.
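The recipe above can be sketched by Monte Carlo: define success as a posterior probability exceeding a threshold, then average that success indicator over the prior predictive distribution of the data. The design priors, threshold, and superiority setup below are illustrative assumptions, not the paper's worked example.

```python
import numpy as np

rng = np.random.default_rng(1)

def prob_success(n, sims=2000, threshold=0.975):
    """P(trial 'succeeds') with n patients per arm, under the design priors."""
    wins = 0
    for _ in range(sims):
        p_new, p_ctl = rng.beta(6, 4), rng.beta(4, 6)   # design priors (assumed)
        x_new = rng.binomial(n, p_new)                  # prior predictive data
        x_ctl = rng.binomial(n, p_ctl)
        # Posterior P(p_new > p_ctl) by sampling the two Beta posteriors
        post_new = rng.beta(1 + x_new, 1 + n - x_new, size=500)
        post_ctl = rng.beta(1 + x_ctl, 1 + n - x_ctl, size=500)
        if (post_new > post_ctl).mean() > threshold:
            wins += 1
    return wins / sims

results = {n: prob_success(n) for n in (25, 100)}
print(results)
```

Scanning this success probability over n gives the sample-size curve from which a design can be chosen.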

  20. A Bayesian approach to particle identification in ALICE

    CERN Document Server

    CERN. Geneva

    2016-01-01

    Among the LHC experiments, ALICE has unique particle identification (PID) capabilities exploiting different types of detectors. During Run 1, a Bayesian approach to PID was developed and intensively tested. It facilitates the combination of information from different sub-systems. The adopted methodology and formalism as well as the performance of the Bayesian PID approach for charged pions, kaons and protons in the central barrel of ALICE will be reviewed. Results are presented with PID performed via measurements of specific energy loss (dE/dx) and time-of-flight using information from the TPC and TOF detectors, respectively. Methods to extract priors from data and to compare PID efficiencies and misidentification probabilities in data and Monte Carlo using high-purity samples of identified particles will be presented. Bayesian PID results were found consistent with previous measurements published by ALICE. The Bayesian PID approach gives a higher signal-to-background ratio and a similar or larger statist...
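The combination logic described here can be illustrated in miniature: each detector contributes a likelihood for each species hypothesis, the likelihoods multiply, and species priors reweight the product. The expected signals, resolutions, and priors below are made-up toy numbers, not ALICE calibrations.

```python
import numpy as np

species = ["pion", "kaon", "proton"]
priors = np.array([0.80, 0.12, 0.08])              # assumed relative abundances

def gauss(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def pid_posterior(dedx, tof_beta):
    # Expected TPC dE/dx signal and TOF velocity per hypothesis (toy values)
    mu_dedx = np.array([50.0, 75.0, 110.0])
    mu_beta = np.array([0.99, 0.93, 0.85])
    like = gauss(dedx, mu_dedx, 5.0) * gauss(tof_beta, mu_beta, 0.02)
    post = priors * like                           # Bayes: prior x likelihood
    return post / post.sum()

post = pid_posterior(dedx=78.0, tof_beta=0.92)
for s, p in zip(species, post):
    print(f"{s}: {p:.3f}")
```

Even with a large pion prior, measurements close to the kaon hypothesis in both detectors pull the posterior toward the kaon, which is the point of combining sub-systems.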

  1. Group sequential control of overall toxicity incidents in clinical trials - non-Bayesian and Bayesian approaches.

    Science.gov (United States)

    Yu, Jihnhee; Hutson, Alan D; Siddiqui, Adnan H; Kedron, Mary A

    2016-02-01

    In some small clinical trials, toxicity is not a primary endpoint; however, it often has dire effects on patients' quality of life and is even life-threatening. For such clinical trials, rigorous control of the overall incidence of adverse events is desirable, while simultaneously collecting safety information. In this article, we propose group sequential toxicity monitoring strategies to control overall toxicity incidents below a certain level as opposed to performing hypothesis testing, which can be incorporated into an existing study design based on the primary endpoint. We consider two sequential methods: a non-Bayesian approach in which stopping rules are obtained based on the 'future' probability of an excessive toxicity rate; and a Bayesian adaptation modifying the proposed non-Bayesian approach, which can use the information obtained at interim analyses. Through an extensive Monte Carlo study, we show that the Bayesian approach often provides better control of the overall toxicity rate than the non-Bayesian approach. We also investigate adequate toxicity estimation after the studies. We demonstrate the applicability of our proposed methods in controlling the symptomatic intracranial hemorrhage rate for treating acute ischemic stroke patients.
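A Bayesian stopping rule of the kind described can be sketched with a conjugate Beta-binomial model (a hedged illustration in the spirit of the abstract, not the authors' exact design): after each interim look, stop if the posterior probability that the toxicity rate exceeds an acceptable level is too high.

```python
from scipy.stats import beta

def stop_for_toxicity(events, n, rate_limit=0.20, cutoff=0.90, a=1, b=1):
    """True if P(rate > rate_limit | data) > cutoff under a Beta(a, b) prior."""
    post_exceed = 1 - beta.cdf(rate_limit, a + events, b + n - events)
    return bool(post_exceed > cutoff)

# Interim looks: (toxicity events, patients enrolled so far); illustrative
for events, n in [(1, 10), (4, 20), (10, 30)]:
    print(f"n={n}, events={events}: stop = {stop_for_toxicity(events, n)}")
```

The rate limit, cutoff, and prior are design choices; in practice they would be calibrated by simulation, as the authors do with their Monte Carlo study.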

  2. Bayesian network approach to spatial data mining: a case study

    Science.gov (United States)

    Huang, Jiejun; Wan, Youchuan

    2006-10-01

    Spatial data mining is a process of discovering interesting, novel, and potentially useful information or knowledge hidden in spatial data sets. It involves different techniques and methods from various areas of research. A Bayesian network is a graphical model that encodes causal probabilistic relationships among variables of interest; it has powerful representation and reasoning capabilities and provides an effective approach to spatial data mining. In this paper we give an introduction to Bayesian networks and discuss their use for spatial data mining. We propose a framework for spatial data mining based on Bayesian networks. We then present a case study and use the experimental results to validate the practical viability of the proposed approach. Finally, the paper gives a summary and some remarks.

  3. Bayesian approach to inverse statistical mechanics.

    Science.gov (United States)

    Habeck, Michael

    2014-05-01

    Inverse statistical mechanics aims to determine particle interactions from ensemble properties. This article looks at this inverse problem from a Bayesian perspective and discusses several statistical estimators to solve it. In addition, a sequential Monte Carlo algorithm is proposed that draws the interaction parameters from their posterior probability distribution. The posterior probability involves an intractable partition function that is estimated along with the interactions. The method is illustrated for inverse problems of varying complexity, including the estimation of a temperature, the inverse Ising problem, maximum entropy fitting, and the reconstruction of molecular interaction potentials.
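The simplest inverse problem mentioned, estimating a temperature, can be made concrete for a 1-D Ising chain, where the partition function is known in closed form: with coupling J = 1, Z(b) = 2^N cosh(b)^(N-1). The grid posterior below (flat prior, illustrative sizes) avoids the intractable-partition-function issue the abstract addresses for harder models.

```python
import numpy as np

rng = np.random.default_rng(2)
N, b_true = 50, 0.7                       # chain length, true inverse temperature

def sample_chain(b):
    # Exact sequential sampling: each bond agrees with prob e^b / (e^b + e^-b)
    s = np.empty(N); s[0] = 1.0
    p_same = np.exp(b) / (np.exp(b) + np.exp(-b))
    for i in range(1, N):
        s[i] = s[i - 1] if rng.random() < p_same else -s[i - 1]
    return s

data = [sample_chain(b_true) for _ in range(200)]
E = sum((s[:-1] * s[1:]).sum() for s in data)      # sufficient statistic

grid = np.linspace(0.01, 2.0, 400)
# log-likelihood: b*E - (chains)*(N-1)*log cosh(b), constants dropped (flat prior)
log_post = E * grid - 200 * (N - 1) * np.log(np.cosh(grid))
b_map = grid[np.argmax(log_post)]
print(f"true b = {b_true}, MAP estimate = {b_map:.2f}")
```

For interactions beyond this exactly solvable case the normalizer must itself be estimated, which is exactly what the sequential Monte Carlo scheme in the article is for.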

  4. Analysis of COSIMA spectra: Bayesian approach

    Directory of Open Access Journals (Sweden)

    H. J. Lehto

    2014-11-01

    Full Text Available We describe the use of Bayesian analysis methods applied to TOF-SIMS spectra. The method finds the probability density functions of the measured line parameters (number of lines, and their widths, peak amplitudes, integrated amplitudes and positions) in mass intervals over the whole spectrum. We discuss the results we can expect from this analysis. We discuss the effects the instrument dead time causes in the COSIMA TOF-SIMS, and we address this issue in a new way. The derived line parameters can be used to further calibrate the mass scaling of TOF-SIMS and to feed the results into other analysis methods such as multivariate analyses of spectra. We intend to use the method in two ways: first as a comprehensive tool to perform quantitative analysis of spectra, and second as a fast tool for identifying interesting targets for additional TOF-SIMS measurements of the sample, a property unique to COSIMA. Finally, we point out that the Bayesian method can be thought of as a means to solve inverse problems, but with forward calculations only.

  5. A Bayesian Approach for Image Segmentation with Shape Priors

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Hang; Yang, Qing; Parvin, Bahram

    2008-06-20

    Color and texture have been widely used in image segmentation; however, their performance is often hindered by scene ambiguities, overlapping objects, or missing parts. In this paper, we propose an interactive image segmentation approach with shape prior models within a Bayesian framework. Interactive features, through mouse strokes, reduce ambiguities, and the incorporation of shape priors enhances the quality of the segmentation where color and/or texture are not solely adequate. The novelties of our approach are in (i) formulating the segmentation problem in a well-defined Bayesian framework with multiple shape priors, (ii) efficiently estimating the parameters of the Bayesian model, and (iii) multi-object segmentation through user-specified priors. We demonstrate the effectiveness of our method on a set of natural and synthetic images.

  6. Sequential Bayesian technique: An alternative approach for software reliability estimation

    Indian Academy of Sciences (India)

    S Chatterjee; S S Alam; R B Misra

    2009-04-01

    This paper proposes a sequential Bayesian approach, similar to a Kalman filter, for estimating the reliability growth or decay of software. The main advantage of the proposed method is that it shows the variation of the parameter over time as new failure data become available. The usefulness of the method is demonstrated with real-life data.
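The sequential flavor of such methods can be illustrated with a conjugate toy model (not the authors' Kalman-like filter): treat times between software failures as Exponential(λ) and update a Gamma prior on the failure rate after every new failure, so the estimate evolves as data arrive.

```python
# Gamma(shape, rate) prior on the failure rate; illustrative numbers
a, b = 1.0, 10.0                                          # rate in 1/hours
inter_failure_hours = [2.0, 3.5, 5.0, 9.0, 14.0, 30.0]    # growing gaps

history = []
for t in inter_failure_hours:
    a, b = a + 1.0, b + t              # conjugate Gamma-Exponential update
    history.append(a / b)              # posterior mean failure rate
    print(f"after gap {t:5.1f} h: rate ~ {a / b:.3f} failures/h")
```

The growing inter-failure gaps pull the posterior mean rate downward over time, which is the "reliability growth" signal the sequential estimate is meant to expose.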

  7. A Bayesian Approach for Analyzing Longitudinal Structural Equation Models

    Science.gov (United States)

    Song, Xin-Yuan; Lu, Zhao-Hua; Hser, Yih-Ing; Lee, Sik-Yum

    2011-01-01

    This article considers a Bayesian approach for analyzing a longitudinal 2-level nonlinear structural equation model with covariates, and mixed continuous and ordered categorical variables. The first-level model is formulated for measures taken at each time point nested within individuals for investigating their characteristics that are dynamically…

  8. Probabilistic Damage Characterization Using the Computationally-Efficient Bayesian Approach

    Science.gov (United States)

    Warner, James E.; Hochhalter, Jacob D.

    2016-01-01

    This work presents a computationally-efficient approach for damage determination that quantifies uncertainty in the provided diagnosis. Given strain sensor data that are polluted with measurement errors, Bayesian inference is used to estimate the location, size, and orientation of damage. This approach uses Bayes' Theorem to combine any prior knowledge an analyst may have about the nature of the damage with information provided implicitly by the strain sensor data to form a posterior probability distribution over possible damage states. The unknown damage parameters are then estimated based on samples drawn numerically from this distribution using a Markov Chain Monte Carlo (MCMC) sampling algorithm. Several modifications are made to the traditional Bayesian inference approach to provide significant computational speedup. First, an efficient surrogate model is constructed using sparse grid interpolation to replace a costly finite element model that must otherwise be evaluated for each sample drawn with MCMC. Next, the standard Bayesian posterior distribution is modified using a weighted likelihood formulation, which is shown to improve the convergence of the sampling process. Finally, a robust MCMC algorithm, Delayed Rejection Adaptive Metropolis (DRAM), is adopted to sample the probability distribution more efficiently. Numerical examples demonstrate that the proposed framework effectively provides damage estimates with uncertainty quantification and can yield orders of magnitude speedup over standard Bayesian approaches.

  9. Remotely sensed monitoring of small reservoir dynamics: a Bayesian approach

    NARCIS (Netherlands)

    Eilander, D.M.; Annor, F.O.; Iannini, L.; Van de Giesen, N.C.

    2014-01-01

    Multipurpose small reservoirs are important for livelihoods in rural semi-arid regions. To manage and plan these reservoirs and to assess their hydrological impact at a river basin scale, it is important to monitor their water storage dynamics. This paper introduces a Bayesian approach for monitoring small reservoirs with radar satellite images.

  10. A Bayesian approach to combining animal abundance and demographic data

    Directory of Open Access Journals (Sweden)

    Brooks, S. P.

    2004-06-01

    Full Text Available In studies of wild animals, one frequently encounters both count and mark-recapture-recovery data. Here, we consider an integrated Bayesian analysis of ring-recovery and count data using a state-space model. We then impose a Leslie-matrix-based model on the true population counts describing the natural birth-death and age transition processes. We focus upon the analysis of both count and recovery data collected on British lapwings (Vanellus vanellus) combined with records of the number of frost days each winter. We demonstrate how the combined analysis of these data provides a more robust inferential framework and discuss how the Bayesian approach using MCMC allows us to remove the potentially restrictive normality assumptions commonly assumed for analyses of this sort. It is shown how WinBUGS may be used to perform the Bayesian analysis. WinBUGS code is provided and its performance is critically discussed.

  11. The subjectivity of scientists and the Bayesian approach

    CERN Document Server

    Press, James S

    2016-01-01

    "Press and Tanur argue that subjectivity has not only played a significant role in the advancement of science but that science will advance more rapidly if the modern methods of Bayesian statistical analysis replace some of the more classical twentieth-century methods." — SciTech Book News. "An insightful work." ― Choice. "Compilation of interesting popular problems … this book is fascinating." — Short Book Reviews, International Statistical Institute. Subjectivity ― including intuition, hunches, and personal beliefs ― has played a key role in scientific discovery. This intriguing book illustrates subjective influences on scientific progress with historical accounts and biographical sketches of more than a dozen luminaries, including Aristotle, Galileo, Newton, Darwin, Pasteur, Freud, Einstein, Margaret Mead, and others. The treatment also offers a detailed examination of the modern Bayesian approach to data analysis, with references to the Bayesian theoretical and applied literature. Suitable for...

  12. Analysis of COSIMA spectra: Bayesian approach

    Directory of Open Access Journals (Sweden)

    H. J. Lehto

    2015-06-01

    secondary ion mass spectrometer (TOF-SIMS) spectra. The method is applied to the COmetary Secondary Ion Mass Analyzer (COSIMA) TOF-SIMS mass spectra, where the analysis can be broken into subgroups of lines close to integer mass values. The effects of the instrumental dead time are discussed in a new way. The method finds the joint probability density functions of the measured line parameters (number of lines, and their widths, peak amplitudes, integrated amplitudes and positions). In the case of two or more lines, these distributions can take complex forms. The derived line parameters can be used to further calibrate the mass scaling of TOF-SIMS and to feed the results into other analysis methods such as multivariate analyses of spectra. We intend to use the method, first as a comprehensive tool to perform quantitative analysis of spectra, and second as a fast tool for identifying interesting targets for additional TOF-SIMS measurements of the sample, a property unique to COSIMA. Finally, we point out that the Bayesian method can be thought of as a means to solve inverse problems, but with forward calculations only, with no iterative corrections or other manipulation of the observed data.

  13. A Bayesian approach to simultaneously quantify assignments and linguistic uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Chavez, Gregory M [Los Alamos National Laboratory; Booker, Jane M [BOOKER SCIENTIFIC FREDERICKSBURG; Ross, Timothy J [UNM

    2010-10-07

    Subject matter expert assessments can include both assignment and linguistic uncertainty. This paper examines assessments containing linguistic uncertainty associated with a qualitative description of a specific state of interest and the assignment uncertainty associated with assigning a qualitative value to that state. A Bayesian approach is examined to simultaneously quantify both assignment and linguistic uncertainty in the posterior probability. The approach is applied to a simplified damage assessment model involving both assignment and linguistic uncertainty. The utility of the approach and the conditions under which the approach is feasible are examined and identified.

  14. Dichroic polarization at mid-infrared wavelengths: a Bayesian approach

    CERN Document Server

    Lopez-Rodriguez, E

    2015-01-01

    A fast and general Bayesian inference framework to infer the physical properties of dichroic polarization using mid-infrared imaging- and spectro-polarimetric observations is presented. The Bayesian approach is based on hierarchical regression and the No-U-Turn Sampler method. This approach simultaneously infers the normalized Stokes parameters to find the full family of solutions that best describe the observations. In comparison with previous methods, the developed Bayesian approach allows the user to introduce a customized absorptive polarization component based on the dust composition and the appropriate extinction curve of the object. This approach allows the user to obtain more precise estimates of the magnetic field strength and geometry for tomographic studies, and information about the dominant polarization components of the object. Based on this model, imaging-polarimetric observations using two or three filters located in the central 9.5-10.5 μm, and the edges 8-9 μm and/or 11-13 μm, o...

  15. Approach to the Correlation Discovery of Chinese Linguistic Parameters Based on Bayesian Method

    Institute of Scientific and Technical Information of China (English)

    WANG Wei(王玮); CAI LianHong(蔡莲红)

    2003-01-01

    The Bayesian approach is an important method in statistics. The Bayesian belief network is a powerful knowledge representation and reasoning tool under conditions of uncertainty. It is a graphical model that encodes probabilistic relationships among variables of interest. In this paper, an approach to Bayesian network construction is given for discovering the relationships among Chinese linguistic parameters in the corpus.

  16. A Dynamic Bayesian Approach to Computational Laban Shape Quality Analysis

    Directory of Open Access Journals (Sweden)

    Dilip Swaminathan

    2009-01-01

    kinesiology. LMA (especially Effort/Shape) emphasizes how internal feelings and intentions govern the patterning of movement throughout the whole body. As we argue, a complex understanding of intention via LMA is necessary for human-computer interaction to become embodied in ways that resemble interaction in the physical world. We thus introduce a novel, flexible Bayesian fusion approach for identifying LMA Shape qualities from raw motion capture data in real time. The method uses a dynamic Bayesian network (DBN) to fuse movement features across the body and across time and, as we discuss, can be readily adapted for low-cost video. It has delivered excellent performance in preliminary studies comprising improvisatory movements. Our approach has been incorporated in Response, a mixed-reality environment where users interact via natural, full-body human movement and enhance their bodily-kinesthetic awareness through immersive sound and light feedback, with applications to kinesiology training, Parkinson's patient rehabilitation, interactive dance, and many other areas.

  17. A Bayesian sequential processor approach to spectroscopic portal system decisions

    Energy Technology Data Exchange (ETDEWEB)

    Sale, K; Candy, J; Breitfeller, E; Guidry, B; Manatt, D; Gosnell, T; Chambers, D

    2007-07-31

    The development of faster, more reliable techniques to detect radioactive contraband in a portal-type scenario is an extremely important problem, especially in this era of constant terrorist threats. Towards this goal, the development of a model-based Bayesian sequential data processor for the detection problem is discussed. In the sequential processor, each datum (detector energy deposit and pulse arrival time) is used to update the posterior probability distribution over the space of model parameters. The nature of the sequential processor approach is that a detection is produced as soon as it is statistically justified by the data, rather than waiting for a fixed counting interval before any analysis is performed. In this paper, the Bayesian model-based approach, the physics and signal processing models, and the decision functions are discussed along with the first results of our research.
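The declare-as-soon-as-justified idea can be sketched with a deliberately simplified count model (illustrative rates, not the authors' physics or decision functions): counts in successive short time bins are Poisson with a background-only rate or a background-plus-source rate, each bin updates the posterior, and a detection is declared the moment the posterior passes a threshold.

```python
import math

bg_rate, src_rate = 2.0, 5.0          # expected counts per bin (assumed)
prior_source = 0.5
threshold = 0.99

def poisson_pmf(k, lam):
    return math.exp(-lam) * lam ** k / math.factorial(k)

posterior = prior_source
for i, k in enumerate([3, 6, 4, 7, 5], start=1):    # observed counts (toy)
    num = posterior * poisson_pmf(k, src_rate)
    den = num + (1 - posterior) * poisson_pmf(k, bg_rate)
    posterior = num / den                            # sequential Bayes update
    print(f"bin {i}: count {k}, P(source) = {posterior:.4f}")
    if posterior > threshold:
        print(f"detection declared after {i} bins")
        break
```

Note that the loop stops early; a fixed-interval counter would keep accumulating data long after the evidence already sufficed.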

  18. A Bayesian approach to optimizing cryopreservation protocols

    Directory of Open Access Journals (Sweden)

    Sammy Sambu

    2015-06-01

    Full Text Available Cryopreservation is beset with the challenge of protocol alignment across a wide range of cell types and process variables. By taking a cross-sectional assessment of previously published cryopreservation data (sample means and standard errors) as preliminary meta-data, a decision tree learning analysis (DTLA) was performed to develop an understanding of target survival using optimized pruning methods based on different approaches. Briefly, a clear direction on the decision process for the selection of methods was developed, with the key choices being the cooling rate and plunge temperature on the one hand, and the biomaterial choice, the use of composites (sugars and proteins) as additional constituents, the loading procedure and the cell location in 3D scaffolding on the other. Secondly, using machine learning and generalized approaches via the Naïve Bayes Classification (NBC) method, these meta-data were used to develop posterior probabilities for combinatorial approaches that were implicitly recorded in the meta-data. These latter results showed that newer protocol choices developed using probability elicitation techniques can unearth improved protocols consistent with multiple unidimensionally-optimized physical protocols. In conclusion, this article proposes the use of DTLA models and subsequently NBC for the improvement of modern cryopreservation techniques through an integrative approach.
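A minimal categorical Naïve Bayes classifier, in the spirit of the NBC step described, can be written in a few lines. The protocol records below (cooling-rate class, cryoprotectant, survival outcome) are invented for illustration and are not the paper's meta-data.

```python
from collections import Counter, defaultdict

records = [
    ("slow", "DMSO", "high"), ("slow", "DMSO", "high"),
    ("slow", "glycerol", "high"), ("fast", "DMSO", "low"),
    ("fast", "glycerol", "low"), ("fast", "glycerol", "low"),
    ("slow", "glycerol", "low"), ("fast", "DMSO", "high"),
]

classes = Counter(r[-1] for r in records)
counts = defaultdict(Counter)            # (feature index, class) -> value counts
for rate, cpa, outcome in records:
    counts[(0, outcome)][rate] += 1
    counts[(1, outcome)][cpa] += 1

def posterior(rate, cpa):
    scores = {}
    for c, n in classes.items():
        p = n / len(records)                          # class prior
        # Laplace smoothing over the 2 values each feature can take
        p *= (counts[(0, c)][rate] + 1) / (n + 2)
        p *= (counts[(1, c)][cpa] + 1) / (n + 2)
        scores[c] = p
    z = sum(scores.values())
    return {c: s / z for c, s in scores.items()}

print(posterior("slow", "DMSO"))    # slow cooling + DMSO favours "high" here
```

The conditional-independence assumption is what lets sparse published meta-data support posterior probabilities for protocol combinations that were never tested jointly.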

  19. Bayesian Approach to the Best Estimate of the Hubble Constant

    Institute of Scientific and Technical Information of China (English)

    王晓峰; 陈黎; 李宗伟

    2001-01-01

    A Bayesian approach is used to derive the probability distribution (PD) of the Hubble constant H0 from recent measurements including supernovae Ia, the Tully-Fisher relation, population II and physical methods. The discrepancies among these PDs are briefly discussed. The combined value of all the measurements is obtained, with a 95% confidence interval of 58.7 < H0 < 67.3 km·s^-1·Mpc^-1.
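The combination step can be illustrated in the simplest case: if each method's PD is approximated as Gaussian, the combined PD is Gaussian with precision equal to the sum of the individual precisions. The means and uncertainties below are invented for illustration; they are not the paper's inputs.

```python
import math

# (mean, sigma) for three hypothetical H0 estimates, km/s/Mpc
measurements = [(65.0, 5.0), (60.0, 4.0), (63.0, 6.0)]

w = [1.0 / s ** 2 for _, s in measurements]           # precisions as weights
h0 = sum(wi * m for wi, (m, _) in zip(w, measurements)) / sum(w)
sigma = math.sqrt(1.0 / sum(w))

print(f"combined H0 = {h0:.1f} +/- {sigma:.1f} km/s/Mpc")
```

The combined uncertainty is necessarily smaller than the best individual one, which is why pooling discrepant but independent methods still tightens the interval.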

  20. A bayesian approach to laboratory utilization management

    Directory of Open Access Journals (Sweden)

    Ronald G Hauser

    2015-01-01

    Full Text Available Background: Laboratory utilization management describes a process designed to increase healthcare value by altering requests for laboratory services. A typical approach to monitor and prioritize interventions involves audits of laboratory orders against specific criteria, defined as rule-based laboratory utilization management. This approach has inherent limitations. First, rules are inflexible: they adapt poorly to the ambiguity of medical decision-making. Second, rules judge the context of a decision instead of the patient outcome, allowing an order to simultaneously save a life and break a rule. Third, rules can threaten physician autonomy when used in a performance evaluation. Methods: We developed an alternative to rule-based laboratory utilization management. The core idea comes from a formula used in epidemiology to estimate disease prevalence. The equation relates four terms: the prevalence of disease, the proportion of positive tests, test sensitivity and test specificity. When applied to a laboratory utilization audit, the formula estimates the prevalence of disease (pretest probability, PTP) in the patients tested. The comparison of PTPs among different providers, provider groups, or patient cohorts produces an objective evaluation of laboratory requests. We demonstrate the model in a review of tests for enterovirus (EV) meningitis. Results: The model identified subpopulations within the cohort with a low prevalence of disease. These low-prevalence groups shared demographic and seasonal factors known to protect against EV meningitis. This suggests that too many orders came from patients at low risk for EV. Conclusion: We introduce a new method for laboratory utilization management programs to audit laboratory services.
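The prevalence relation in the Methods section rearranges directly: the observed positive fraction equals PTP·sensitivity + (1 − PTP)·(1 − specificity), so PTP can be solved for (the Rogan-Gladen estimator). The test characteristics and order counts below are hypothetical, not the paper's audit data.

```python
def pretest_probability(positives, orders, sensitivity, specificity):
    """Estimate PTP in the tested population from audit counts."""
    pos_rate = positives / orders
    ptp = (pos_rate + specificity - 1) / (sensitivity + specificity - 1)
    return min(max(ptp, 0.0), 1.0)        # clip to a valid probability

# Two provider groups ordering the same (hypothetical) enterovirus PCR test
ptp_a = pretest_probability(18, 100, sensitivity=0.95, specificity=0.97)
ptp_b = pretest_probability(4, 100, sensitivity=0.95, specificity=0.97)
print(f"group A PTP: {ptp_a:.3f}, group B PTP: {ptp_b:.3f}")
```

A markedly lower PTP for one group flags testing of low-risk patients, which is the audit signal the article proposes in place of rule violations.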

  1. Overlapping community detection in weighted networks via a Bayesian approach

    Science.gov (United States)

    Chen, Yi; Wang, Xiaolong; Xiang, Xin; Tang, Buzhou; Chen, Qingcai; Fan, Shixi; Bu, Junzhao

    2017-02-01

    Complex networks, as a powerful way to represent complex systems, have been widely studied during the past several years. One of the most important tasks of complex network analysis is to detect communities embedded in networks. In the real world, weighted networks are very common and may contain overlapping communities, where a node is allowed to belong to multiple communities. In this paper, we propose a novel Bayesian approach, called the Bayesian mixture network (BMN) model, to detect overlapping communities in weighted networks. The advantages of our method are (i) providing soft-partition solutions for weighted networks; (ii) providing soft memberships, which quantify how strongly a node belongs to a community. Experiments on a large number of real and synthetic networks show that our model is able to detect overlapping communities in weighted networks and is competitive with other state-of-the-art models at shedding light on community partitions.

  2. Remotely Sensed Monitoring of Small Reservoir Dynamics: A Bayesian Approach

    Directory of Open Access Journals (Sweden)

    Dirk Eilander

    2014-01-01

    Full Text Available Multipurpose small reservoirs are important for livelihoods in rural semi-arid regions. To manage and plan these reservoirs and to assess their hydrological impact at a river basin scale, it is important to monitor their water storage dynamics. This paper introduces a Bayesian approach for monitoring small reservoirs with radar satellite images. The newly developed growing Bayesian classifier has a high degree of automation, can readily be extended with auxiliary information and reduces the confusion error to the land-water boundary pixels. A case study has been performed in the Upper East Region of Ghana, based on Radarsat-2 data from November 2012 until April 2013. Results show that the growing Bayesian classifier can deal with the spatial and temporal variability in synthetic aperture radar (SAR backscatter intensities from small reservoirs. Due to its ability to incorporate auxiliary information, the algorithm is able to delineate open water from SAR imagery with a low land-water contrast in the case of wind-induced Bragg scattering or limited vegetation on the land surrounding a small reservoir.
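    The Bayesian pixel classification step underlying such a method can be illustrated as follows. This is a minimal sketch, not the authors' growing classifier; the Gaussian backscatter statistics (in dB) and the priors are invented for illustration, with auxiliary information entering through the prior.

```python
import math

def gaussian_pdf(x, mean, std):
    """Density of a normal distribution, used as a per-class likelihood."""
    return math.exp(-0.5 * ((x - mean) / std) ** 2) / (std * math.sqrt(2 * math.pi))

def water_posterior(backscatter_db, prior_water=0.5,
                    water=(-18.0, 3.0), land=(-8.0, 4.0)):
    """Posterior P(water | backscatter) for one SAR pixel via Bayes' rule.

    `water` and `land` are assumed (mean, std) backscatter statistics in dB;
    auxiliary information (e.g. a previous water mask) enters via `prior_water`.
    """
    lw = gaussian_pdf(backscatter_db, *water) * prior_water
    ll = gaussian_pdf(backscatter_db, *land) * (1 - prior_water)
    return lw / (lw + ll)

# A dark pixel is classified as water with high probability; a strong prior
# can rescue ambiguous pixels near the land-water boundary.
p_dark = water_posterior(-20.0)
p_edge = water_posterior(-13.0, prior_water=0.9)
```

This shows why a prior helps when land-water contrast is low (wind roughening, bare soil): an ambiguous backscatter value is pushed toward the class the prior favours.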

  3. A Bayesian approach to mitigation of publication bias.

    Science.gov (United States)

    Guan, Maime; Vandekerckhove, Joachim

    2016-02-01

    The reliability of published research findings in psychology has been a topic of rising concern. Publication bias, or treating positive findings differently from negative findings, is a contributing factor to this "crisis of confidence," in that it likely inflates the number of false-positive effects in the literature. We demonstrate a Bayesian model averaging approach that takes into account the possibility of publication bias and allows for a better estimate of true underlying effect size. Accounting for the possibility of bias leads to a more conservative interpretation of published studies as well as meta-analyses. We provide mathematical details of the method and examples.
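    The model-averaging step can be sketched in a toy form: weight each candidate model's effect-size estimate by its posterior model probability. The two-model setup and all numbers below are illustrative, not the paper's actual models or data.

```python
def model_average(estimates, marginal_likelihoods, prior_probs=None):
    """Bayesian model averaging: weight each model's effect-size estimate
    by its posterior model probability p(M | data) ∝ p(data | M) p(M)."""
    n = len(estimates)
    priors = prior_probs or [1.0 / n] * n
    unnorm = [ml * p for ml, p in zip(marginal_likelihoods, priors)]
    z = sum(unnorm)
    weights = [u / z for u in unnorm]
    return sum(w * e for w, e in zip(weights, estimates)), weights

# Hypothetical meta-analysis: a publication-bias model shrinks the effect
# to 0.10, a no-bias model estimates 0.45; marginal likelihoods favour bias.
effect, w = model_average(estimates=[0.10, 0.45],
                          marginal_likelihoods=[0.008, 0.002])
```

When the data favour the bias model, the averaged effect lands much closer to its conservative estimate, which is the "more conservative interpretation" the abstract refers to.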

  4. Bayesian ensemble approach to error estimation of interatomic potentials

    DEFF Research Database (Denmark)

    Frederiksen, Søren Lund; Jacobsen, Karsten Wedel; Brown, K.S.;

    2004-01-01

    Using a Bayesian approach, a general method is developed to assess error bars on predictions made by models fitted to data. The error bars are estimated from fluctuations in ensembles of models sampling the model-parameter space with a probability density set by the minimum cost. The method is applied to the development of interatomic potentials for molybdenum using various potential forms and databases based on atomic forces. The calculated error bars on elastic constants, gamma-surface energies, structural energies, and dislocation properties are shown to provide realistic estimates of the actual errors for the potentials.
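    The fluctuation-based error bar can be illustrated with a toy one-parameter ensemble sampled by a Metropolis walk whose density is set by the cost function. The data, cost and temperature choice below are invented for illustration; this is not the molybdenum-potential fit of the paper.

```python
import math
import random

random.seed(0)

# Toy data: y ≈ a * x with noise; the "model" has one parameter a.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [0.1, 0.9, 2.1, 2.9, 4.2]

def cost(a):
    """Sum-of-squares cost of the one-parameter model y = a * x."""
    return sum((y - a * x) ** 2 for x, y in zip(xs, ys))

# Sample an ensemble with density ∝ exp(-cost(a) / (2 * T)); here the
# temperature T is tied to the cost per data point, one common choice.
T = cost(1.0) / len(xs)
a, samples = 1.0, []
for step in range(20000):
    prop = a + random.gauss(0.0, 0.1)
    if random.random() < math.exp(min(0.0, (cost(a) - cost(prop)) / (2 * T))):
        a = prop  # Metropolis accept
    if step >= 2000:  # discard burn-in
        samples.append(a)

# Ensemble fluctuations give an error bar on any prediction, e.g. y(x = 10):
preds = [s * 10.0 for s in samples]
mean = sum(preds) / len(preds)
err = (sum((p - mean) ** 2 for p in preds) / len(preds)) ** 0.5
```

The spread of the ensemble's predictions, not a formula for parameter variance, supplies the error bar, which is the key idea of the paper.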

  5. A Bayesian experimental design approach to structural health monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Farrar, Charles [Los Alamos National Laboratory; Flynn, Eric [UCSD; Todd, Michael [UCSD

    2010-01-01

    Optimal system design for SHM involves two primary challenges. The first is the derivation of a proper performance function for a given system design. The second is the development of an efficient optimization algorithm for choosing a design that maximizes, or nearly maximizes, the performance function. In this paper we outline how an SHM practitioner can construct the proper performance function by casting the entire design problem into a framework of Bayesian experimental design. The approach demonstrates how the design problem necessarily ties together all steps of the SHM process.

  6. The subjectivity of scientists and the Bayesian statistical approach

    CERN Document Server

    Press, James S

    2001-01-01

    Comparing and contrasting the reality of subjectivity in the work of history's great scientists and the modern Bayesian approach to statistical analysis. Scientists and researchers are taught to analyze their data from an objective point of view, allowing the data to speak for themselves rather than assigning them meaning based on expectations or opinions. But scientists have never behaved fully objectively. Throughout history, some of our greatest scientific minds have relied on intuition, hunches, and personal beliefs to make sense of empirical data - and these subjective influences have often a

  7. A Bayesian Model Committee Approach to Forecasting Global Solar Radiation

    CERN Document Server

    Lauret, Philippe; Muselli, Marc; David, Mathieu; Diagne, Hadja; Voyant, Cyril

    2012-01-01

    This paper proposes to use a rather new modelling approach in the realm of solar radiation forecasting. In this work, two forecasting models, an Autoregressive Moving Average (ARMA) model and a Neural Network (NN) model, are combined to form a model committee. Bayesian inference is used to assign a probability to each model in the committee. Hence, each model's predictions are weighted by its respective probability. The models are fitted to one year of hourly Global Horizontal Irradiance (GHI) measurements. Another year (the test set) is used for making genuine one hour ahead (h+1) out-of-sample forecast comparisons. The proposed approach is benchmarked against the persistence model. Initial results show an improvement brought by this approach.
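    The committee combination itself is a probability-weighted sum of the member forecasts. A minimal sketch, with hypothetical forecasts and posterior model probabilities (the numbers are invented, not the paper's results):

```python
def committee_forecast(predictions, model_probs):
    """Combine member forecasts, weighting each by its posterior model
    probability; the weights are normalised in case they do not sum to 1."""
    z = sum(model_probs)
    return sum(p * w for p, w in zip(predictions, model_probs)) / z

# Hypothetical h+1 GHI forecasts (W/m^2) from the ARMA and NN members,
# with posterior probabilities inferred from past predictive performance.
ghi = committee_forecast(predictions=[510.0, 540.0], model_probs=[0.3, 0.7])
```

With these weights the committee forecast is 531 W/m², pulled toward the member the data trust more.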

  8. A Bayesian Sampling Approach to Exploration in Reinforcement Learning

    CERN Document Server

    Asmuth, John; Littman, Michael L; Nouri, Ali; Wingate, David

    2012-01-01

    We present a modular approach to reinforcement learning that uses a Bayesian representation of the uncertainty over models. The approach, BOSS (Best of Sampled Set), drives exploration by sampling multiple models from the posterior and selecting actions optimistically. It extends previous work by providing a rule for deciding when to resample and how to combine the models. We show that our algorithm achieves near-optimal reward with high probability with a sample complexity that is low relative to the speed at which the posterior distribution converges during learning. We demonstrate that BOSS performs quite favorably compared to state-of-the-art reinforcement-learning approaches and illustrate its flexibility by pairing it with a non-parametric model that generalizes across states.
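    The "best of sampled set" idea (act optimistically on the best of several posterior samples) can be shown in a bandit-flavoured toy, far simpler than the authors' MDP setting; the arm statistics and the choice of K below are invented.

```python
import random

random.seed(1)

def boss_action(successes, failures, k=5):
    """Pick an arm optimistically from K posterior samples per arm.

    Each arm's unknown success rate has a Beta(1 + s, 1 + f) posterior;
    drawing K samples and keeping the best per arm yields optimism that
    drives exploration of poorly known arms.
    """
    best_arm, best_value = None, -1.0
    for arm, (s, f) in enumerate(zip(successes, failures)):
        optimistic = max(random.betavariate(1 + s, 1 + f) for _ in range(k))
        if optimistic > best_value:
            best_arm, best_value = arm, optimistic
    return best_arm

# Arm 0 is well known and mediocre; arm 1 is unexplored, so its optimistic
# posterior sample is usually higher and the agent tries it.
arm = boss_action(successes=[40, 0], failures=[60, 0])
```

In the full algorithm the sampled objects are MDP models rather than arm rates, and resampling is triggered once enough new data has accumulated, but the optimism mechanism is the same.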

  9. Bayesian Approach for Reliability Assessment of Sunshield Deployment on JWST

    Science.gov (United States)

    Kaminskiy, Mark P.; Evans, John W.; Gallo, Luis D.

    2013-01-01

    Deployable subsystems are essential to mission success of most spacecraft. These subsystems enable critical functions including power, communications and thermal control. The loss of any of these functions will generally result in loss of the mission. These subsystems and their components often consist of unique designs and applications, for which various standardized data sources are not applicable for estimating reliability and for assessing risks. In this study, a Bayesian approach for reliability estimation of spacecraft deployment was developed for this purpose. This approach was then applied to the James Webb Space Telescope (JWST) Sunshield subsystem, a unique design intended for thermal control of the observatory's telescope and science instruments. In order to collect the prior information on deployable systems, detailed studies of "heritage information" were conducted, extending over 45 years of spacecraft launches. The NASA Goddard Space Flight Center (GSFC) Spacecraft Operational Anomaly and Reporting System (SOARS) data were then used to estimate the parameters of the conjugate beta prior distribution for anomaly and failure occurrence, as the most consistent set of available data that could be matched to launch histories. This allows for an empirical Bayesian prediction for the risk of an anomaly occurrence of the complex Sunshield deployment, with credibility limits, using prior deployment data and test information.
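    The conjugate beta-binomial update at the core of such an analysis is a one-liner. A sketch with invented heritage counts and prior; the study's actual SOARS-derived parameters are not reproduced here.

```python
def beta_update(prior_a, prior_b, anomalies, trials):
    """Conjugate update for a per-deployment anomaly probability.

    Beta(a, b) prior + binomial data -> Beta(a + anomalies, b + successes);
    returns the posterior parameters plus the posterior mean and variance.
    """
    post_a = prior_a + anomalies
    post_b = prior_b + (trials - anomalies)
    mean = post_a / (post_a + post_b)
    var = post_a * post_b / ((post_a + post_b) ** 2 * (post_a + post_b + 1))
    return post_a, post_b, mean, var

# Hypothetical heritage data: 12 anomalies in 400 deployment events,
# starting from a weakly informative Beta(1, 19) prior (~5% prior rate).
a, b, mean, var = beta_update(1, 19, anomalies=12, trials=400)
```

Credibility limits with the stated coverage would come from quantiles of the resulting Beta(a, b) posterior.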

  10. Generalized linear models with coarsened covariates: a practical Bayesian approach.

    Science.gov (United States)

    Johnson, Timothy R; Wiest, Michelle M

    2014-06-01

    Coarsened covariates are a common and sometimes unavoidable phenomenon encountered in statistical modeling. Covariates are coarsened when their values or categories have been grouped. This may be done to protect privacy or to simplify data collection or analysis when researchers are not aware of their drawbacks. Analyses with coarsened covariates based on ad hoc methods can compromise the validity of inferences. One valid method for accounting for a coarsened covariate is to use a marginal likelihood derived by summing or integrating over the unknown realizations of the covariate. However, algorithms for estimation based on this approach can be tedious to program and can be computationally expensive. These are significant obstacles to their use in practice. To overcome these limitations, we show that when expressed as a Bayesian probability model, a generalized linear model with a coarsened covariate can be posed as a tractable missing data problem where the missing data are due to censoring. We also show that this model is amenable to widely available general-purpose software for simulation-based inference for Bayesian probability models, providing researchers a very practical approach for dealing with coarsened covariates.

  11. Robust adaptive beamforming algorithm based on Bayesian approach

    Institute of Scientific and Technical Information of China (English)

    Xin SONG; Jinkuan WANG; Yinghua HAN; Han WANG

    2008-01-01

    The performance of adaptive array beamforming algorithms substantially degrades in practice because of a slight mismatch between actual and presumed array responses to the desired signal. A novel robust adaptive beamforming algorithm based on a Bayesian approach is therefore proposed. The algorithm responds to the current environment by estimating the direction of arrival (DOA) of the actual signal from observations. Computational complexity of the proposed algorithm can thus be reduced compared with other algorithms since a recursive method is used to obtain the inverse matrix. In addition, it has strong robustness to the uncertainty of the actual signal DOA and makes the mean output array signal-to-interference-plus-noise ratio (SINR) consistently approach the optimum. Simulation results show that the proposed algorithm is better in performance than conventional adaptive beamforming algorithms.

  12. A Bayesian approach to traffic light detection and mapping

    Science.gov (United States)

    Hosseinyalamdary, Siavash; Yilmaz, Alper

    2017-03-01

    Automatic traffic light detection and mapping is an open research problem. Traffic lights vary in color, shape, geolocation, activation pattern, and installation, which complicates their automated detection. In addition, images of traffic lights may be noisy, overexposed, underexposed, or occluded. In order to address this problem, we propose a Bayesian inference framework to detect and map traffic lights. In addition to the spatio-temporal consistency constraint, traffic light characteristics such as color, shape and height are shown to further improve the accuracy of the proposed approach. The proposed approach has been evaluated on two benchmark datasets and has been shown to outperform earlier studies. The results show that the precision and recall rates for the KITTI benchmark are 95.78% and 92.95% respectively, and the precision and recall rates for the LARA benchmark are 98.66% and 94.65%.

  13. Bayesian approach for near-duplicate image detection

    CERN Document Server

    Bueno, Lucas Moutinho; Torres, Ricardo da Silva

    2011-01-01

    In this paper we propose a Bayesian approach for near-duplicate image detection, and investigate how different probabilistic models affect the performance obtained. The task of identifying an image whose metadata are missing is often demanded for a myriad of applications: metadata retrieval in cultural institutions, detection of copyright violations, investigation of latent cross-links in archives and libraries, duplicate elimination in storage management, etc. The majority of current solutions are based either on voting algorithms, which are very precise but expensive, or on the use of visual dictionaries, which are efficient but less precise. Our approach uses local descriptors in a novel way which, by a careful application of decision theory, allows a very fine control of the compromise between precision and efficiency. In addition, the method attains a great compromise between those two axes, with more than 99% accuracy with fewer than 10 database operations.

  14. A Bayesian Approach for Sensor Optimisation in Impact Identification

    Directory of Open Access Journals (Sweden)

    Vincenzo Mallardo

    2016-11-01

    Full Text Available This paper presents a Bayesian approach for optimizing the position of sensors aimed at impact identification in composite structures under operational conditions. The uncertainty in the sensor data has been represented by statistical distributions of the recorded signals. An optimisation strategy based on a genetic algorithm is proposed to find the best sensor combination for locating impacts on composite structures. A Bayesian-based objective function is adopted in the optimisation procedure as an indicator of the performance of meta-models developed for different sensor combinations to locate various impact events. To represent a real structure under operational load and to increase the reliability of the Structural Health Monitoring (SHM) system, the probability of malfunctioning sensors is included in the optimisation. The reliability and the robustness of the procedure are tested with experimental and numerical examples. Finally, the proposed optimisation algorithm is applied to a composite stiffened panel for both uniform and non-uniform probabilities of impact occurrence.

  15. Detecting Threat E-mails using Bayesian Approach

    CERN Document Server

    Banday, M Tariq; Jan, Tariq R; Shah, Nisar A

    2011-01-01

    Fraud and terrorism have a close connection in terms of the processes that enable and promote them. In the era of the Internet, its various services that include the Web, e-mail, social networks, blogs, instant messaging, chats, etc. are used in terrorism not only for communication but also for i) creation of ideology, ii) resource gathering, iii) recruitment, indoctrination and training, iv) creation of terror networks, and v) information gathering. A major challenge for law enforcement and intelligence agencies is efficient and accurate gathering of relevant and growing volumes of crime data. This paper reports on the use of an established Naïve Bayesian filter for classification of threat e-mails. Efficiency in filtering threat e-mail by use of three different Naïve Bayesian filter approaches, i.e. single keywords, weighted multiple keywords and weighted multiple keywords with keyword context matching, is evaluated on a threat e-mail corpus created by extracting data from sources that are very close to terrorism.
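    A single-keyword Naïve Bayesian filter of the kind evaluated here can be sketched with add-one (Laplace) smoothing; the tiny training corpus below is invented for illustration and stands in for the paper's threat e-mail corpus.

```python
import math

def train(emails, labels):
    """Fit per-word counts for a two-class Naive Bayes filter."""
    counts = {True: {}, False: {}}
    totals = {True: 0, False: 0}
    vocab = set()
    for text, label in zip(emails, labels):
        for word in text.lower().split():
            counts[label][word] = counts[label].get(word, 0) + 1
            totals[label] += 1
            vocab.add(word)
    prior = {c: labels.count(c) / len(labels) for c in (True, False)}
    return counts, totals, vocab, prior

def is_threat(text, model):
    """Classify by comparing smoothed log-posterior scores of the two classes."""
    counts, totals, vocab, prior = model
    score = {c: math.log(prior[c]) for c in (True, False)}
    for word in text.lower().split():
        for c in (True, False):
            score[c] += math.log((counts[c].get(word, 0) + 1)
                                 / (totals[c] + len(vocab)))
    return score[True] > score[False]

model = train(
    ["attack the convoy tomorrow", "bomb the bridge at dawn",
     "lunch meeting tomorrow", "quarterly report attached"],
    [True, True, False, False])
```

The weighted-keyword and context-matching variants the paper evaluates refine the per-word scores, but the log-posterior comparison stays the same.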

  16. A Bayesian Approach to Real-Time Earthquake Phase Association

    Science.gov (United States)

    Benz, H.; Johnson, C. E.; Earle, P. S.; Patton, J. M.

    2014-12-01

    Real-time location of seismic events requires a robust and extremely efficient means of associating and identifying seismic phases with hypothetical sources. An association algorithm converts a series of phase arrival times into a catalog of earthquake hypocenters. The classical approach, based on time-space stacking of the locus of possible hypocenters for each phase arrival using the principle of acoustic reciprocity, has been in use now for many years. One of the most significant problems that has emerged over time with this approach is related to the extreme variations in seismic station density throughout the global seismic network. To address this problem we have developed a novel Bayesian association algorithm, which looks at the association problem as a dynamically evolving complex system of "many to many relationships". While the end result must be an array of one-to-many relations (one earthquake, many phases), during the association process the situation is quite different: both the evolving possible hypocenters and the relationships between phases and all nascent hypocenters are many to many (many earthquakes, many phases). The computational framework we are using to address this is a responsive, NoSQL graph database where the earthquake-phase associations are represented as intersecting Bayesian Learning Networks. The approach directly addresses the network inhomogeneity issue while at the same time allowing the inclusion of other kinds of data (e.g., seismic beams, station noise characteristics, priors on the estimated location of the seismic source) by representing the locus of intersecting hypothetical loci for a given datum as joint probability density functions.

  17. A Robust Obstacle Avoidance for Service Robot Using Bayesian Approach

    Directory of Open Access Journals (Sweden)

    Widodo Budiharto

    2011-03-01

    Full Text Available The objective of this paper is to propose a robust obstacle avoidance method for a service robot in an indoor environment. The method for obstacle avoidance uses information about static obstacles on the landmark using edge detection. The speed and direction of people walking as moving obstacles are obtained by a single camera using a tracking and recognition system, with distance measurement using 3 ultrasonic sensors. A new geometrical model and maneuvering method for moving obstacle avoidance are introduced and combined with a Bayesian approach for state estimation. The obstacle avoidance problem is formulated using decision theory, with prior and posterior distributions and a loss function used to determine an optimal response based on inaccurate sensor data. Algorithms for the moving obstacle avoidance method are proposed, and experimental results of their implementation on the service robot are presented. Various experiments show that our proposed method is very fast and robust, and was successfully implemented on the service robot called Srikandi II, which is equipped with a 4 DOF arm robot developed in our laboratory.

  18. A Full Bayesian Approach for Boolean Genetic Network Inference

    Science.gov (United States)

    Han, Shengtong; Wong, Raymond K. W.; Lee, Thomas C. M.; Shen, Linghao; Li, Shuo-Yen R.; Fan, Xiaodan

    2014-01-01

    Boolean networks are a simple but efficient model for describing gene regulatory systems. A number of algorithms have been proposed to infer Boolean networks. However, these methods do not take full consideration of the effects of noise and model uncertainty. In this paper, we propose a full Bayesian approach to infer Boolean genetic networks. Markov chain Monte Carlo algorithms are used to obtain the posterior samples of both the network structure and the related parameters. In addition to regular link addition and removal moves, which can guarantee the irreducibility of the Markov chain for traversing the whole network space, carefully constructed mixture proposals are used to improve the Markov chain Monte Carlo convergence. Both simulations and a real application on cell-cycle data show that our method is more powerful than existing methods for the inference of both the topology and logic relations of the Boolean network from observed data. PMID:25551820
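    On a toy scale the posterior over network logic can even be computed exactly: for one target gene with two regulators there are only 16 Boolean functions to score. This enumeration replaces the paper's MCMC over full networks and uses an invented flip-noise model and data.

```python
from itertools import product

def boolean_posterior(inputs, outputs, noise=0.05):
    """Posterior over all 16 two-input Boolean functions for one target gene,
    assuming each observed output flips with probability `noise` and a
    uniform prior over functions."""
    posts = {}
    for truth_table in product((0, 1), repeat=4):  # f(0,0), f(0,1), f(1,0), f(1,1)
        like = 1.0
        for (a, b), y in zip(inputs, outputs):
            pred = truth_table[2 * a + b]
            like *= (1 - noise) if pred == y else noise
        posts[truth_table] = like
    z = sum(posts.values())
    return {f: p / z for f, p in posts.items()}

# Noisy observations of y = a AND b (the last observation is corrupted).
inputs = [(0, 0), (0, 1), (1, 0), (1, 1), (1, 1), (1, 1)]
outputs = [0, 0, 0, 1, 1, 0]
post = boolean_posterior(inputs, outputs)
best = max(post, key=post.get)  # the AND truth table (0, 0, 0, 1)
```

With many genes this enumeration blows up, which is exactly why the paper samples network structures with MCMC instead.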

  20. Estimating parameters in stochastic systems: A variational Bayesian approach

    Science.gov (United States)

    Vrettas, Michail D.; Cornford, Dan; Opper, Manfred

    2011-11-01

    This work is concerned with approximate inference in dynamical systems, from a variational Bayesian perspective. When modelling real world dynamical systems, stochastic differential equations appear as a natural choice, mainly because of their ability to model the noise of the system by adding a variation of some stochastic process to the deterministic dynamics. Hence, inference in such processes has drawn much attention. Here a new extended framework is derived that is based on a local polynomial approximation of a recently proposed variational Bayesian algorithm. The paper begins by showing that the new extension of this variational algorithm can be used for state estimation (smoothing) and converges to the original algorithm. However, the main focus is on estimating the (hyper-) parameters of these systems (i.e. drift parameters and diffusion coefficients). The new approach is validated on a range of different systems which vary in dimensionality and non-linearity. These are the Ornstein-Uhlenbeck process, the exact likelihood of which can be computed analytically, the univariate and highly non-linear stochastic double well, and the multivariate chaotic stochastic Lorenz ’63 (3D) model. As a special case the algorithm is also applied to the 40-dimensional stochastic Lorenz ’96 system. In our investigation we compare this new approach with a variety of other well-known methods, such as the hybrid Monte Carlo, the dual unscented Kalman filter and the full weak-constraint 4D-Var algorithm, and empirically analyse their asymptotic behaviour as the observation density or the length of the time window increases. In particular we show that we are able to estimate parameters in both the drift (deterministic) and the diffusion (stochastic) parts of the model evolution equations using our new methods.
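    The Ornstein-Uhlenbeck benchmark is attractive precisely because its parameters can be recovered by simple estimators from a discretely observed path. A sketch: Euler-Maruyama simulation plus textbook regression estimators, with invented parameter values; this is the benchmark setup, not the paper's variational algorithm.

```python
import math
import random

random.seed(42)

# True parameters of dx = -theta * x dt + sigma dW (values invented).
theta, sigma, dt, n = 2.0, 0.5, 0.01, 100_000

# Simulate a path with the Euler-Maruyama scheme.
x = [0.0]
for _ in range(n):
    x.append(x[-1] - theta * x[-1] * dt
             + sigma * math.sqrt(dt) * random.gauss(0, 1))

# Drift: regressing x[k+1] on x[k] gives slope (1 - theta * dt) in closed form.
sxx = sum(a * a for a in x[:-1])
sxy = sum(a * b for a, b in zip(x[:-1], x[1:]))
theta_hat = (1 - sxy / sxx) / dt

# Diffusion: quadratic variation of the one-step residuals.
res2 = sum((b - a * (1 - theta_hat * dt)) ** 2 for a, b in zip(x[:-1], x[1:]))
sigma_hat = math.sqrt(res2 / (n * dt))
```

With enough data both estimates land close to the true values, which is why the OU process serves as the ground-truth case against which the variational approximation is checked.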

  1. Bayesian network approach for modeling local failure in lung cancer

    Science.gov (United States)

    Oh, Jung Hun; Craft, Jeffrey; Al-Lozi, Rawan; Vaidya, Manushka; Meng, Yifan; Deasy, Joseph O; Bradley, Jeffrey D; Naqa, Issam El

    2011-01-01

    Locally advanced non-small cell lung cancer (NSCLC) patients suffer from a high local failure rate following radiotherapy. Despite many efforts to develop new dose-volume models for early detection of tumor local failure, no significant improvement has been reported in their prospective application. Based on recent studies of biomarker proteins’ role in hypoxia and inflammation in predicting tumor response to radiotherapy, we hypothesize that combining physical and biological factors with a suitable framework could improve the overall prediction. To test this hypothesis, we propose a graphical Bayesian network framework for predicting local failure in lung cancer. The proposed approach was tested using two different datasets of locally advanced NSCLC patients treated with radiotherapy. The first dataset was collected retrospectively and comprises clinical and dosimetric variables only. The second dataset was collected prospectively; in addition to clinical and dosimetric information, blood was drawn from the patients at various time points to extract candidate biomarkers as well. Our preliminary results show that the proposed method can be used as an efficient method to develop predictive models of local failure in these patients and to interpret relationships among the different variables in the models. We also demonstrate the potential use of heterogeneous physical and biological variables to improve the model prediction. With the first dataset, we achieved better performance compared with competing Bayesian-based classifiers. With the second dataset, the combined model had a slightly higher performance compared to individual physical and biological models, with the biological variables making the largest contribution. Our preliminary results highlight the potential of the proposed integrated approach for predicting post-radiotherapy local failure in NSCLC patients. PMID:21335651

  2. A Variational Bayesian Approach to Multiframe Image Restoration.

    Science.gov (United States)

    Sonogashira, Motoharu; Funatomi, Takuya; Iiyama, Masaaki; Minoh, Michihiko

    2017-03-06

    Image restoration is a fundamental problem in the field of image processing. The key objective of image restoration is to recover clean images from images degraded by noise and blur. Recently, a family of new statistical techniques called variational Bayes (VB) has been introduced to image restoration, which enables us to automatically tune parameters that control restoration. While information from one image is often insufficient for high-quality restoration, current state-of-the-art methods of image restoration via VB approaches use only a single degraded image to recover a clean image. In this paper, we propose a novel method of multiframe image restoration via a VB approach, which can achieve higher image quality while tuning parameters automatically. Given multiple degraded images, this method jointly estimates a clean image and other parameters, including an image warping parameter introduced for the use of multiple images, through Bayesian inference that we enable by making full use of VB techniques. Through various experiments, we demonstrate the effectiveness of our multiframe method by comparing it with a single-frame one, and also show the advantages of our VB approach over non-VB approaches.

  3. Accurate characterization of weak neutron fields by using a Bayesian approach.

    Science.gov (United States)

    Medkour Ishak-Boushaki, G; Allab, M

    2017-04-01

    A Bayesian analysis of data derived from neutron spectrometric measurements provides the advantage of determining rigorously integral physical quantities characterizing the neutron field and their respective related uncertainties. The first and essential step in a Bayesian approach is the parameterization of the investigated neutron spectrum. The aim of this paper is to investigate the sensitivity of the Bayesian results, mainly the neutron dose H*(10) required for radiation protection purposes and its correlated uncertainty, to the selected neutron spectrum parameterization.

  4. Reliability assessment using degradation models: bayesian and classical approaches

    Directory of Open Access Journals (Sweden)

    Marta Afonso Freitas

    2010-04-01

    Full Text Available Traditionally, reliability assessment of devices has been based on (accelerated) life tests. However, for highly reliable products, little information about reliability is provided by life tests in which few or no failures are typically observed. Since most failures arise from a degradation mechanism at work for which there are characteristics that degrade over time, one alternative is to monitor the device for a period of time and assess its reliability from the changes in performance (degradation) observed during that period. The goal of this article is to illustrate how degradation data can be modeled and analyzed by using "classical" and Bayesian approaches. Four methods of data analysis based on classical inference are presented. Next we show how Bayesian methods can also be used to provide a natural approach to analyzing degradation data. The approaches are applied to a real data set regarding train wheel degradation.

  5. Bayesian Approach to Neuro-Rough Models for Modelling HIV

    CERN Document Server

    Marwala, Tshilidzi

    2007-01-01

    This paper proposes a new neuro-rough model for modelling the risk of HIV from demographic data. The model is formulated using a Bayesian framework and trained using the Markov Chain Monte Carlo method and the Metropolis criterion. When the model was tested to estimate the risk of HIV infection given the demographic data, it was found to give an accuracy of 62%, as opposed to 58% obtained from a Bayesian formulated rough set model trained using the Markov chain Monte Carlo method and 62% obtained from a Bayesian formulated multi-layered perceptron (MLP) model trained using hybrid Monte Carlo. The proposed model is able to combine the accuracy of the Bayesian MLP model and the transparency of the Bayesian rough set model.

  6. Point and Interval Estimation on the Degree and the Angle of Polarization. A Bayesian approach

    CERN Document Server

    Maier, Daniel; Santangelo, Andrea

    2014-01-01

    Linear polarization measurements provide access to two quantities, the degree (DOP) and the angle of polarization (AOP). The aim of this work is to give a complete and concise overview of how to analyze polarimetric measurements. We review interval estimations for the DOP with a frequentist and a Bayesian approach. Point estimations for the DOP and interval estimations for the AOP are further investigated with a Bayesian approach to match observational needs. Point and interval estimations are calculated numerically for frequentist and Bayesian statistics. Monte Carlo simulations are performed to clarify the meaning of the calculations. Under observational conditions, the true DOP and AOP are unknown, so that classical statistical considerations - based on true values - are not directly usable. In contrast, Bayesian statistics handles unknown true values very well and produces point and interval estimations for DOP and AOP, directly. Using a Bayesian approach, we show how to choose DOP point estimations based...

  7. Modelling of population dynamics of red king crab using Bayesian approach

    Directory of Open Access Journals (Sweden)

    Bakanev Sergey ...

    2012-10-01

Modeling population dynamics within a Bayesian framework makes it possible to resolve these issues successfully. Integrating data from various studies into a unified model, with parameters estimated by Bayesian methods, provides a much more detailed description of the processes occurring in the population.

  8. A Bayesian approach to extracting meaning from system behavior

    Energy Technology Data Exchange (ETDEWEB)

    Dress, W.B.

    1998-08-01

The modeling relation, and its reformulation to include the semiotic hierarchy, is essential for the understanding, control, and successful re-creation of natural systems. This presentation argues for a careful application of Rosen's modeling relationship to the problems of intelligence and autonomy in natural and artificial systems. To this end, the authors discuss the essential need for a correct theory of induction, learning, and probability, and suggest that modern Bayesian probability theory, developed by Cox, Jaynes, and others, can adequately meet such demands, especially at the operational level of extracting meaning from observations. The methods of Bayesian and maximum-entropy parameter estimation have been applied to measurements of system observables to directly infer the underlying differential equations generating system behavior. This approach bypasses the usual method of parameter estimation, which assumes a functional form for the observable and then estimates the parameters that would lead to the particular observed behavior. The computational savings are great, since only location parameters enter into the maximum-entropy calculations; this innovation finesses the need for nonlinear parameters altogether. Such an approach more directly extracts the semantics inherent in a given system by going to the root of system meaning as expressed by abstract form or shape, rather than in syntactic particulars such as signal amplitude and phase. Examples show how the form of a system can be followed while ignoring unnecessary details. In this sense, the authors are observing the meaning of the words rather than being concerned with their particular expression or language. For the present discussion, empirical models are embodied by the differential equations underlying, producing, or describing the behavior of a process as measured or tracked by a particular variable set: the observables. The a priori models are probability structures that...
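The idea of inferring the generating differential equation directly from observed data can be illustrated with a toy grid posterior. This is a hedged sketch, not the authors' maximum-entropy implementation: the decay model, noise level, and grid are invented for illustration.

```python
import numpy as np

# Toy illustration: infer the rate k of dx/dt = -k*x from noisy samples of
# x(t) by scanning a grid posterior (flat prior, known Gaussian noise).
# Only the location parameter k is scanned; no nonlinear fitting is needed.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 4.0, 25)
k_true, sigma = 0.8, 0.05
x_obs = np.exp(-k_true * t) + rng.normal(0.0, sigma, t.size)

k_grid = np.linspace(0.1, 2.0, 400)
log_like = np.array([-0.5 * np.sum((x_obs - np.exp(-k * t)) ** 2) / sigma**2
                     for k in k_grid])
post = np.exp(log_like - log_like.max())
post /= post.sum()                      # normalize over the grid

k_map = k_grid[np.argmax(post)]
print(f"MAP estimate of k: {k_map:.2f}")
```

The maximum of the grid posterior recovers a rate close to the true value 0.8 that generated the data.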

  9. Diagnosing Hybrid Systems: a Bayesian Model Selection Approach

    Science.gov (United States)

    McIlraith, Sheila A.

    2005-01-01

In this paper we examine the problem of monitoring and diagnosing noisy complex dynamical systems that are modeled as hybrid systems: models of continuous behavior interleaved by discrete transitions. In particular, we examine continuous systems with embedded supervisory controllers that experience abrupt, partial, or full failure of component devices. Building on our previous work in this area (MBCG99; MBCG00), our specific focus in this paper is on the mathematical formulation of the hybrid monitoring and diagnosis task as a Bayesian model tracking algorithm. The nonlinear dynamics of many hybrid systems present challenges to probabilistic tracking. Further, probabilistic tracking of a system for the purposes of diagnosis is problematic because the models of the system corresponding to failure modes are numerous and generally very unlikely. To focus tracking on these unlikely models and to reduce the number of potential models under consideration, we exploit logic-based techniques for qualitative model-based diagnosis to conjecture a limited initial set of consistent candidate models. In this paper we discuss alternative tracking techniques that are relevant to different classes of hybrid systems, focusing specifically on a method for tracking multiple models of nonlinear behavior simultaneously using factored sampling and conditional density propagation. To illustrate and motivate the approach described in this paper we examine the problem of monitoring and diagnosing NASA's Sprint AERCam, a small spherical robotic camera unit with 12 thrusters that enable both linear and rotational motion.

  10. A Bayesian decision approach to rainfall thresholds based flood warning

    Directory of Open Access Journals (Sweden)

    M. L. V. Martina

    2006-01-01

Operational real-time flood forecasting systems generally require a hydrological model to run in real time, as well as a series of hydro-informatics tools to transform the flood forecast into relatively simple and clear messages for the decision makers involved in flood defense. The scope of this paper is to set forth the possibility of providing flood warnings at given river sections based on the direct comparison of the quantitative precipitation forecast with critical rainfall threshold values, without the need for an on-line real-time forecasting system. This approach leads to an extremely simplified alert system that can be used by non-technical stakeholders, and could also supplement traditional flood forecasting systems in case of system failures. The critical rainfall threshold values, which incorporate the initial soil moisture conditions, result from statistical analyses of long hydrological time series combined with a Bayesian utility function minimization. In the paper, results of an application of the proposed methodology to the Sieve river, a tributary of the Arno river in Italy, are given to exemplify its practical applicability.
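The utility-minimization logic behind such a threshold can be sketched in a few lines. The costs below are invented for illustration; the paper's actual utility function is derived from long simulated series and the soil moisture state.

```python
# Illustrative costs only, not the paper's calibrated values.
COST_ALARM = 1.0     # cost of mobilizing flood defenses on a warning
LOSS_FLOOD = 20.0    # loss if a flood arrives with no warning issued

def should_warn(p_flood):
    """Warn iff the expected loss of staying silent exceeds the alarm cost."""
    return p_flood * LOSS_FLOOD > COST_ALARM

# The break-even probability is COST_ALARM / LOSS_FLOOD = 0.05; a rainfall
# threshold is the forecast rainfall at which P(flood) crosses this value.
print(should_warn(0.02), should_warn(0.10))   # False True
```

Everything else in the method (the statistical analysis of long time series) serves to map forecast rainfall onto that flood probability.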

  11. A Bayesian Approach for Structural Learning with Hidden Markov Models

    Directory of Open Access Journals (Sweden)

    Cen Li

    2002-01-01

Hidden Markov Models (HMMs) have proved to be a successful modeling paradigm for dynamic and spatial processes in many domains, such as speech recognition, genomics, and general sequence alignment. Typically, in these applications, the model structures are predefined by domain experts. Therefore, the HMM learning problem focuses on learning the parameter values of the model to fit the given data sequences. However, in other domains, such as economics and physiology, a model structure capturing the system's dynamic behavior is not available. In order to successfully apply the HMM methodology in these domains, it is important that a mechanism be available for automatically deriving the model structure from the data. This paper presents an HMM learning procedure that simultaneously learns the model structure and the maximum-likelihood parameter values of an HMM from data. The HMM model structures are derived based on the Bayesian model selection methodology. In addition, we introduce a new initialization procedure for HMM parameter value estimation based on the K-means clustering method. Experimental results with artificially generated data show the effectiveness of the approach.
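The K-means initialization step can be sketched as follows. This is a minimal 1-D version with invented two-state data, intended only to show how cluster centers would seed the HMM emission means.

```python
import numpy as np

def kmeans_1d(x, k, iters=25, seed=0):
    """Plain 1-D k-means, used here only to initialize emission means."""
    rng = np.random.default_rng(seed)
    centers = rng.choice(x, size=k, replace=False)
    for _ in range(iters):
        labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
        centers = np.array([x[labels == j].mean() if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return np.sort(centers)

# Synthetic observations from a hypothetical two-state process.
rng = np.random.default_rng(1)
obs = np.concatenate([rng.normal(0.0, 0.3, 200), rng.normal(5.0, 0.3, 200)])
centers = kmeans_1d(obs, 2)
print(centers)   # close to the true emission means 0 and 5
```

The recovered centers would serve as starting values for the emission distributions before likelihood-based refinement of the full HMM.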

  12. A Bayesian approach to classification criteria for spectacled eiders

    Science.gov (United States)

    Taylor, B.L.; Wade, P.R.; Stehn, R.A.; Cochrane, J.F.

    1996-01-01

    To facilitate decisions to classify species according to risk of extinction, we used Bayesian methods to analyze trend data for the Spectacled Eider, an arctic sea duck. Trend data from three independent surveys of the Yukon-Kuskokwim Delta were analyzed individually and in combination to yield posterior distributions for population growth rates. We used classification criteria developed by the recovery team for Spectacled Eiders that seek to equalize errors of under- or overprotecting the species. We conducted both a Bayesian decision analysis and a frequentist (classical statistical inference) decision analysis. Bayesian decision analyses are computationally easier, yield basically the same results, and yield results that are easier to explain to nonscientists. With the exception of the aerial survey analysis of the 10 most recent years, both Bayesian and frequentist methods indicated that an endangered classification is warranted. The discrepancy between surveys warrants further research. Although the trend data are abundance indices, we used a preliminary estimate of absolute abundance to demonstrate how to calculate extinction distributions using the joint probability distributions for population growth rate and variance in growth rate generated by the Bayesian analysis. Recent apparent increases in abundance highlight the need for models that apply to declining and then recovering species.
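The core of such a Bayesian decision analysis is reading a classification straight off the posterior for the growth rate. The numbers and the 95% cutoff below are invented for illustration and are not the recovery team's actual criteria.

```python
import math

def normal_cdf(x, mu, sd):
    """CDF of a normal distribution via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sd * math.sqrt(2.0))))

# Hypothetical normal posterior for the population growth rate r.
post_mean, post_sd = -0.05, 0.02
p_declining = normal_cdf(0.0, post_mean, post_sd)   # P(r < 0 | data)
endangered = p_declining > 0.95                     # invented decision cutoff
print(f"P(declining) = {p_declining:.3f} -> endangered: {endangered}")
```

Because the classification is a posterior probability statement, it is straightforward to explain to nonscientists, which is one of the advantages the authors highlight.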

  13. A Bayesian approach to estimating causal vaccine effects on binary post-infection outcomes.

    Science.gov (United States)

    Zhou, Jincheng; Chu, Haitao; Hudgens, Michael G; Halloran, M Elizabeth

    2016-01-15

    To estimate causal effects of vaccine on post-infection outcomes, Hudgens and Halloran (2006) defined a post-infection causal vaccine efficacy estimand VEI based on the principal stratification framework. They also derived closed forms for the maximum likelihood estimators of the causal estimand under some assumptions. Extending their research, we propose a Bayesian approach to estimating the causal vaccine effects on binary post-infection outcomes. The identifiability of the causal vaccine effect VEI is discussed under different assumptions on selection bias. The performance of the proposed Bayesian method is compared with the maximum likelihood method through simulation studies and two case studies - a clinical trial of a rotavirus vaccine candidate and a field study of pertussis vaccination. For both case studies, the Bayesian approach provided similar inference as the frequentist analysis. However, simulation studies with small sample sizes suggest that the Bayesian approach provides smaller bias and shorter confidence interval length.

  14. Prediction of road accidents: A Bayesian hierarchical approach

    DEFF Research Database (Denmark)

    Deublein, Markus; Schubert, Matthias; Adey, Bryan T.;

    2013-01-01

In this paper a novel methodology for the prediction of the occurrence of road accidents is presented. The methodology utilizes a combination of three statistical methods: (1) gamma-updating of the occurrence rates of injury accidents and injured road users; (2) hierarchical multivariate Poisson-lognormal regression analysis, taking into account correlations amongst multiple dependent model response variables and effects of discrete accident count data, e.g. over-dispersion; and (3) Bayesian inference algorithms, applied by means of data mining techniques supported by Bayesian Probabilistic Networks, in order to represent non-linearity between risk-indicating and model response variables, as well as different types of uncertainties which might be present in the development of the specific models. Prior Bayesian Probabilistic Networks are first established by means of multivariate regression analysis...
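The gamma-updating in step (1) can be sketched in closed form, assuming the usual Gamma-Poisson conjugacy for an occurrence rate; the prior values and counts below are invented for illustration.

```python
# Hedged sketch: a Gamma(a, b) prior on a Poisson accident rate updates in
# closed form after observing `count` events over `exposure` time units.
def gamma_update(a, b, count, exposure):
    """Return the Gamma posterior parameters for a Poisson rate."""
    return a + count, b + exposure

a0, b0 = 2.0, 1.0                       # prior mean rate: a0/b0 = 2 per year
a1, b1 = gamma_update(a0, b0, count=12, exposure=5.0)
print(f"posterior mean rate: {a1 / b1:.2f} accidents/year")   # 2.33
```

The posterior mean sits between the prior mean (2) and the observed rate (12/5 = 2.4), pulled further toward the data as exposure accumulates.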

  15. Nursing Home Care Quality: Insights from a Bayesian Network Approach

    Science.gov (United States)

    Goodson, Justin; Jang, Wooseung; Rantz, Marilyn

    2008-01-01

    Purpose: The purpose of this research is twofold. The first purpose is to utilize a new methodology (Bayesian networks) for aggregating various quality indicators to measure the overall quality of care in nursing homes. The second is to provide new insight into the relationships that exist among various measures of quality and how such measures…

  16. A Bayesian network approach to coastal storm impact modeling

    NARCIS (Netherlands)

    Jäger, W.S.; Den Heijer, C.; Bolle, A.; Hanea, A.M.

    2015-01-01

In this paper we develop a Bayesian network (BN) that relates offshore storm conditions to their accompanying flood characteristics and damages to residential buildings, following the trend of integrated flood impact modeling. It is based on data from hydrodynamic storm simulations, information...

  17. Multisensor-multitarget sensor management: a unified Bayesian approach

    Science.gov (United States)

    Mahler, Ronald P. S.

    2003-08-01

Multisensor-multitarget sensor management is at root a problem in nonlinear control theory. This paper develops a potentially computationally tractable approximation of an earlier (1996) Bayesian control-theoretic foundation for sensor management based on "finite-set statistics" (FISST) and the Bayes recursive filter for the entire multisensor-multitarget system. I analyze possible Bayesian control-theoretic objective functions: Csiszar information-theoretic functionals (which generalize Kullback-Leibler discrimination) and "geometric" functionals. I show that some of these objective functions lead to potentially tractable sensor management algorithms when used in conjunction with MHC (multi-hypothesis correlator)-like algorithms. I also take this opportunity to comment on recent misrepresentations of FISST involving so-called "joint multitarget probabilities" (JMP).

  18. Bayesian penalized log-likelihood ratio approach for dose response clinical trial studies.

    Science.gov (United States)

    Tang, Yuanyuan; Cai, Chunyan; Sun, Liangrui; He, Jianghua

    2017-02-13

In the literature, there are a few unified approaches to test proof of concept and estimate a target dose, including the multiple comparison procedure using a modeling approach, and the permutation approach proposed by Klingenberg. We discuss and compare the operating characteristics of these unified approaches and further develop an alternative approach in a Bayesian framework based on the posterior distribution of a penalized log-likelihood ratio test statistic. Our Bayesian approach is much more flexible in handling linear or nonlinear dose-response relationships and is more efficient than the permutation approach. The operating characteristics of our Bayesian approach are comparable to, and sometimes better than, both approaches across a wide range of dose-response relationships. It yields credible intervals as well as a predictive distribution for the response rate at a specific dose level for the target dose estimation. Our Bayesian approach can be easily extended to continuous, categorical, and time-to-event responses. We illustrate the performance of our proposed method with extensive simulations and Phase II clinical trial data examples.

  19. A Bayesian-style approach to estimating LISA science capability

    Science.gov (United States)

    Baker, John; Marsat, Sylvain

    2017-01-01

    A full understanding of LISA's science capability will require accurate models of incident waveform signals and the instrumental response. While Fisher matrix analysis is useful for some estimates, a Bayesian characterization of simulated probability distributions is needed for understanding important cases at the limit of LISA's capability. We apply fast analysis algorithms enabling accurate treatment using EOB waveforms with relevant higher modes and the full-featured LISA response to study these aspects of LISA science capability. Supported by NASA grant 11-ATP-046.

  20. A Bayesian Approach for Segmentation in Stereo Image Sequences

    Directory of Open Access Journals (Sweden)

    Tzovaras Dimitrios

    2002-01-01

Stereoscopic image sequence processing has been the focus of considerable attention in the recent literature for videoconference applications. A novel Bayesian scheme is proposed in this paper for the segmentation of a noisy stereoscopic image sequence. More specifically, occlusions and visible foreground and background regions are detected between the left and the right frame, while the uncovered-background areas are identified between two successive frames of the sequence. Combined hypotheses are used for the formulation of the Bayes decision rule, which employs a single intensity-difference measurement at each pixel. Experimental results illustrating the performance of the proposed technique are presented and evaluated in videoconference applications.

  1. Bayesian approach to cyclic activity of CF Oct

    CERN Document Server

    Borisova, Ana P; Innis, John L

    2011-01-01

Bayesian statistical methods of Gregory-Loredo and the Bretthorst generalization of the Lomb-Scargle periodogram have been applied to study the activity cycles of the early K-type subgiant star CF Oct. We have used a ~45-year dataset derived from archival photographic observations, published photoelectric photometry, the Hipparcos data series, and the All Sky Automated Survey archive. We have confirmed the already known rotational period of 20.16 d for the star and have shown evidence that it has exhibited changes from 19.90 d to 20.45 d, an indication of stellar surface differential rotation. The Bayesian magnitude and time-residual analysis clearly reveals at least one long-term cycle. The cycle length's posterior distributions appear to be multimodal, with a pronounced peak at a period of 7.1 y with FWHM of 54 d for the time residuals and at a period of 9.8 y with FWHM of 184 d for the magnitude data. These results are consistent with the previously postulated cycle of 9 +/- 3 years.

  2. MODELING INFORMATION SYSTEM AVAILABILITY BY USING BAYESIAN BELIEF NETWORK APPROACH

    Directory of Open Access Journals (Sweden)

    Semir Ibrahimović

    2016-03-01

Modern information systems are expected to be always-on, providing services to end-users regardless of time and location. This is particularly important for organizations and industries where information systems support real-time operations and mission-critical applications that need to be available on a 24/7/365 basis. Examples of such entities include process industries, telecommunications, healthcare, energy, banking, electronic commerce, and a variety of cloud services. This article presents a modified Bayesian Belief Network model for predicting information system availability, introduced initially by Franke, U. and Johnson, P. in "Availability of enterprise IT systems – an expert-based Bayesian model" (Software Quality Journal 20(2), 369-394, 2012). Based on a thorough review of several dimensions of information system availability, we propose a modified set of determinants. The model is parameterized using a probability elicitation process with the participation of experts from the financial sector of Bosnia and Herzegovina. The model validation was performed using Monte Carlo simulation.
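At its core, a Bayesian Belief Network of this kind computes availability by summing conditional probability tables over the states of the determinants. A two-determinant toy version, with invented probabilities rather than the paper's elicited ones, might look like:

```python
from itertools import product

# Toy two-determinant sketch (not the paper's actual model): availability
# depends on hardware state and change-management quality, and P(available)
# is obtained by enumerating the parent states.
p_hw = {True: 0.99, False: 0.01}                 # P(hardware ok)
p_cm = {True: 0.95, False: 0.05}                 # P(change management ok)
p_avail = {(True, True): 0.999, (True, False): 0.97,
           (False, True): 0.90, (False, False): 0.60}

p_available = sum(p_hw[h] * p_cm[c] * p_avail[(h, c)]
                  for h, c in product([True, False], repeat=2))
print(f"P(system available) = {p_available:.4f}")   # 0.9964
```

Expert elicitation fills in the conditional tables; Monte Carlo simulation then propagates uncertainty about the elicited numbers through the same sum.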

  3. A Bayesian approach to matched field processing in uncertain ocean environments

    Institute of Scientific and Technical Information of China (English)

    LI Jianlong; PAN Xiang

    2008-01-01

An approach to Bayesian Matched Field Processing (MFP) is discussed for the uncertain ocean environment. In this approach, uncertainty knowledge is modeled, and the spatial and temporal data received by the array are fully used. A mechanism for MFP is thus found which combines model-based and data-driven methods of uncertain field processing. Through theoretical derivation, simulation analysis, and validation against experimental array data at sea, we find that (1) the basic components of Bayesian matched field processors are the corresponding sets of the Bartlett matched field processor, the MVDR (minimum variance distortionless response) matched field processor, etc.; (2) Bayesian MVDR/Bartlett MFP is the weighted sum of the MVDR/Bartlett MFPs, where the weighting coefficients are the values of the a posteriori probability; (3) in an uncertain ocean environment, Bayesian MFP can locate the source more correctly than MVDR MFP or Bartlett MFP; and (4) Bayesian MFP can better suppress sidelobes of the ambiguity surfaces.

  4. A Bayesian approach to linear regression in astronomy

    CERN Document Server

    Sereno, Mauro

    2015-01-01

Linear regression is common in astronomical analyses. I discuss a Bayesian hierarchical modeling of data with heteroscedastic and possibly correlated measurement errors and intrinsic scatter. The method fully accounts for time evolution. The slope, the normalization, and the intrinsic scatter of the relation can evolve with redshift. The intrinsic distribution of the independent variable is approximated using a mixture of Gaussian distributions whose means and standard deviations depend on time. The method can address scatter in the measured independent variable (a kind of Eddington bias), selection effects in the response variable (Malmquist bias), and departure from linearity in the form of a knee. I tested the method with toy models and simulations and quantified the effect of biases and inefficient modeling. The R package LIRA (LInear Regression in Astronomy) is made available to perform the regression.

  5. Equifinality of formal (DREAM) and informal (GLUE) bayesian approaches in hydrologic modeling?

    Energy Technology Data Exchange (ETDEWEB)

    Vrugt, Jasper A [Los Alamos National Laboratory; Robinson, Bruce A [Los Alamos National Laboratory; Ter Braak, Cajo J F [NON LANL; Gupta, Hoshin V [NON LANL

    2008-01-01

    In recent years, a strong debate has emerged in the hydrologic literature regarding what constitutes an appropriate framework for uncertainty estimation. Particularly, there is strong disagreement whether an uncertainty framework should have its roots within a proper statistical (Bayesian) context, or whether such a framework should be based on a different philosophy and implement informal measures and weaker inference to summarize parameter and predictive distributions. In this paper, we compare a formal Bayesian approach using Markov Chain Monte Carlo (MCMC) with generalized likelihood uncertainty estimation (GLUE) for assessing uncertainty in conceptual watershed modeling. Our formal Bayesian approach is implemented using the recently developed differential evolution adaptive metropolis (DREAM) MCMC scheme with a likelihood function that explicitly considers model structural, input and parameter uncertainty. Our results demonstrate that DREAM and GLUE can generate very similar estimates of total streamflow uncertainty. This suggests that formal and informal Bayesian approaches have more common ground than the hydrologic literature and ongoing debate might suggest. The main advantage of formal approaches is, however, that they attempt to disentangle the effect of forcing, parameter and model structural error on total predictive uncertainty. This is key to improving hydrologic theory and to better understand and predict the flow of water through catchments.
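The formal side of this comparison rests on MCMC sampling of a posterior under an explicit likelihood. A minimal random-walk Metropolis sketch, standing in for DREAM with a toy recession model and an invented noise level, is:

```python
import numpy as np

# Synthetic "streamflow" from a toy storage model Q(t) = exp(-k*t), k = 0.3.
rng = np.random.default_rng(0)
t = np.arange(20.0)
q_obs = np.exp(-0.3 * t) + rng.normal(0.0, 0.02, t.size)

def log_post(k):
    """Log-posterior: flat prior on (0, 2) plus Gaussian likelihood."""
    if not 0.0 < k < 2.0:
        return -np.inf
    return -0.5 * np.sum((q_obs - np.exp(-k * t)) ** 2) / 0.02**2

k, lp, chain = 1.0, log_post(1.0), []
for _ in range(5000):
    k_new = k + rng.normal(0.0, 0.05)          # random-walk proposal
    lp_new = log_post(k_new)
    if np.log(rng.uniform()) < lp_new - lp:    # Metropolis acceptance
        k, lp = k_new, lp_new
    chain.append(k)
samples = np.array(chain[1000:])               # discard burn-in
print(f"posterior mean k = {samples.mean():.2f} (true 0.3)")
```

GLUE would instead score many random parameter draws with an informal likelihood and keep the "behavioural" ones; the paper's point is that the resulting predictive bounds can look much the same.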

  6. A Robust Bayesian Approach for Structural Equation Models with Missing Data

    Science.gov (United States)

    Lee, Sik-Yum; Xia, Ye-Mao

    2008-01-01

    In this paper, normal/independent distributions, including but not limited to the multivariate t distribution, the multivariate contaminated distribution, and the multivariate slash distribution, are used to develop a robust Bayesian approach for analyzing structural equation models with complete or missing data. In the context of a nonlinear…

  7. A Bayesian Approach for Nonlinear Structural Equation Models with Dichotomous Variables Using Logit and Probit Links

    Science.gov (United States)

    Lee, Sik-Yum; Song, Xin-Yuan; Cai, Jing-Heng

    2010-01-01

    Analysis of ordered binary and unordered binary data has received considerable attention in social and psychological research. This article introduces a Bayesian approach, which has several nice features in practical applications, for analyzing nonlinear structural equation models with dichotomous data. We demonstrate how to use the software…

  8. A Bayesian hierarchical diffusion model decomposition of performance in Approach-Avoidance Tasks.

    Science.gov (United States)

    Krypotos, Angelos-Miltiadis; Beckers, Tom; Kindt, Merel; Wagenmakers, Eric-Jan

    2015-01-01

    Common methods for analysing response time (RT) tasks, frequently used across different disciplines of psychology, suffer from a number of limitations such as the failure to directly measure the underlying latent processes of interest and the inability to take into account the uncertainty associated with each individual's point estimate of performance. Here, we discuss a Bayesian hierarchical diffusion model and apply it to RT data. This model allows researchers to decompose performance into meaningful psychological processes and to account optimally for individual differences and commonalities, even with relatively sparse data. We highlight the advantages of the Bayesian hierarchical diffusion model decomposition by applying it to performance on Approach-Avoidance Tasks, widely used in the emotion and psychopathology literature. Model fits for two experimental data-sets demonstrate that the model performs well. The Bayesian hierarchical diffusion model overcomes important limitations of current analysis procedures and provides deeper insight in latent psychological processes of interest.

  9. Poor-data and data-poor species stock assessment using a Bayesian hierarchical approach.

    Science.gov (United States)

    Jiao, Yan; Cortés, Enric; Andrews, Kate; Guo, Feng

    2011-10-01

    Appropriate inference for stocks or species with low-quality data (poor data) or limited data (data poor) is extremely important. Hierarchical Bayesian methods are especially applicable to small-area, small-sample-size estimation problems because they allow poor-data species to borrow strength from species with good-quality data. We used a hammerhead shark complex as an example to investigate the advantages of using hierarchical Bayesian models in assessing the status of poor-data and data-poor exploited species. The hammerhead shark complex (Sphyrna spp.) along the Atlantic and Gulf of Mexico coasts of the United States is composed of three species: the scalloped hammerhead (S. lewini), the great hammerhead (S. mokarran), and the smooth hammerhead (S. zygaena) sharks. The scalloped hammerhead comprises 70-80% of the catch and has catch and relative abundance data of good quality, whereas great and smooth hammerheads have relative abundance indices that are both limited and of low quality presumably because of low stock density and limited sampling. Four hierarchical Bayesian state-space surplus production models were developed to simulate variability in population growth rates, carrying capacity, and catchability of the species. The results from the hierarchical Bayesian models were considerably more robust than those of the nonhierarchical models. The hierarchical Bayesian approach represents an intermediate strategy between traditional models that assume different population parameters for each species and those that assume all species share identical parameters. Use of the hierarchical Bayesian approach is suggested for future hammerhead shark stock assessments and for modeling fish complexes with species-specific data, because the poor-data species can borrow strength from the species with good data, making the estimation more stable and robust.
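The "borrowing strength" mechanism can be illustrated with a normal-normal shrinkage sketch: noisy species-level estimates are pulled toward a complex-wide mean, with the weakest data shrunk the most. All numbers below are invented, and this empirical-Bayes toy is far simpler than the paper's state-space surplus production models.

```python
import numpy as np

# Hypothetical growth-rate estimates for three species in a complex;
# the first has good data (small standard error), the others poor data.
est = np.array([0.02, -0.08, 0.15])
se = np.array([0.01, 0.10, 0.20])
tau = 0.05                                # assumed between-species spread

mu = np.average(est, weights=1.0 / (se**2 + tau**2))  # complex-level mean
w = tau**2 / (tau**2 + se**2)             # shrinkage weight per species
post = w * est + (1 - w) * mu             # partially pooled estimates
print(np.round(post, 3))
```

The good-data species keeps essentially its own estimate, while the poor-data species are pulled toward the complex mean, which is what stabilizes the assessment.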

  10. A Bayesian approach to deformed pattern matching of iris images.

    Science.gov (United States)

    Thornton, Jason; Savvides, Marios; Vijaya Kumar, B V K

    2007-04-01

    We describe a general probabilistic framework for matching patterns that experience in-plane nonlinear deformations, such as iris patterns. Given a pair of images, we derive a maximum a posteriori probability (MAP) estimate of the parameters of the relative deformation between them. Our estimation process accomplishes two things simultaneously: It normalizes for pattern warping and it returns a distortion-tolerant similarity metric which can be used for matching two nonlinearly deformed image patterns. The prior probability of the deformation parameters is specific to the pattern-type and, therefore, should result in more accurate matching than an arbitrary general distribution. We show that the proposed method is very well suited for handling iris biometrics, applying it to two databases of iris images which contain real instances of warped patterns. We demonstrate a significant improvement in matching accuracy using the proposed deformed Bayesian matching methodology. We also show that the additional computation required to estimate the deformation is relatively inexpensive, making it suitable for real-time applications.

  11. A Bayesian Approach to Period Searching in Solar Coronal Loops

    Science.gov (United States)

    Scherrer, Bryan; McKenzie, David

    2017-03-01

    We have applied a Bayesian generalized Lomb–Scargle period searching algorithm to movies of coronal loop images obtained with the Hinode X-ray Telescope (XRT) to search for evidence of periodicities that would indicate resonant heating of the loops. The algorithm makes as its only assumption that there is a single sinusoidal signal within each light curve of the data. Both the amplitudes and noise are taken as free parameters. It is argued that this procedure should be used alongside Fourier and wavelet analyses to more accurately extract periodic intensity modulations in coronal loops. The data analyzed are from XRT Observation Program #129C: “MHD Wave Heating (Thin Filters),” which occurred during 2006 November 13 and focused on active region 10293, which included coronal loops. The first data set spans approximately 10 min with an average cadence of 2 s, 2″ per pixel resolution, and used the Al-mesh analysis filter. The second data set spans approximately 4 min with a 3 s average cadence, 1″ per pixel resolution, and used the Al-poly analysis filter. The final data set spans approximately 22 min at a 6 s average cadence, and used the Al-poly analysis filter. In total, 55 periods of sinusoidal coronal loop oscillations between 5.5 and 59.6 s are discussed, supporting proposals in the literature that resonant absorption of magnetic waves is a viable mechanism for depositing energy in the corona.
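A least-squares period scan in the spirit of the generalized Lomb-Scargle search can be sketched as follows. This is simplified: the Bayesian marginalization over amplitude and noise used in the paper is omitted, and the light curve is synthetic rather than XRT data.

```python
import numpy as np

# Synthetic irregularly sampled light curve with a single 42 s sinusoid.
rng = np.random.default_rng(2)
t = np.sort(rng.uniform(0.0, 600.0, 200))        # times in seconds
y = 0.5 * np.sin(2 * np.pi * t / 42.0) + rng.normal(0.0, 0.2, t.size)

periods = np.linspace(5.0, 60.0, 1000)
score = []
for p in periods:
    # Fit A*sin + B*cos + C at this trial period by linear least squares.
    X = np.column_stack([np.sin(2 * np.pi * t / p),
                         np.cos(2 * np.pi * t / p),
                         np.ones_like(t)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    score.append(-np.sum((y - X @ beta) ** 2))   # higher = better fit
best = periods[int(np.argmax(score))]
print(f"best-fit period: {best:.1f} s")
```

The full Bayesian version turns this residual score into a posterior over period, which is what yields credible intervals rather than a single peak.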

  12. Sparsely Sampling the Sky: A Bayesian Experimental Design Approach

    CERN Document Server

    Paykari, P

    2012-01-01

The next generation of galaxy surveys will observe millions of galaxies over large volumes of the universe. These surveys are expensive both in time and cost, raising questions regarding the optimal investment of this time and money. In this work we investigate criteria for selecting amongst observing strategies for constraining the galaxy power spectrum and a set of cosmological parameters. Depending on the parameters of interest, it may be more efficient to observe a larger, but sparsely sampled, area of sky instead of a smaller contiguous area. By making use of the principles of Bayesian Experimental Design, we investigate the advantages and disadvantages of sparse sampling of the sky and discuss the circumstances in which a sparse survey is indeed the most efficient strategy. For the Dark Energy Survey (DES), we find that by sparsely observing the same area in a smaller amount of time, we only increase the errors on the parameters by a maximum of 0.45%. Conversely, investing the sam...

  13. Improving standard practices for prediction in ungauged basins: Bayesian approach

    Science.gov (United States)

    Prieto, Cristina; Le-Vine, Nataliya; García, Eduardo; Medina, Raúl

    2015-04-01

In hydrological modelling, the prediction of flows in ungauged basins remains a challenge. Among the different alternatives to quantify and reduce the uncertainty in the predictions, a Bayesian framework has proven to be advantageous. This framework allows flow prediction in ungauged basins based on regionalised hydrological indices. Being grounded in probability theory, the procedure requires a number of assumptions and decisions to be made. Among the most important are (1) the selection of representative hydrological signatures, (2) the selection of the regionalization model's functional form, and (3) a 'perfect' model/input assumption. The contribution of this research is to address these three assumptions. First, to reduce an extensive set of available hydrological signatures, we select a compact orthogonal set of information pieces using Principal Component Analysis. This advances the standard practice of semi-empirical selection of individual hydrological signatures. Second, we use functional-form-assumption-free Random Forests to regionalize the selected information. This allows the traditional assumption of linear regression between catchment properties and characteristics of hydrological response to be relaxed. Third, we propose utilizing non-traditional metrics to flag up possible model/input errors: the Bayes factor and a newly proposed 'Suitability' test. This addresses the typically unrealistic assumption that the model is 'perfect' and the input is noise-free. The proposed methodological developments are illustrated for the empirical challenge of flow prediction in rivers in Northern Spain.

  14. Finding Clocks in Genes: A Bayesian Approach to Estimate Periodicity

    Directory of Open Access Journals (Sweden)

    Yan Ren

    2016-01-01

Full Text Available Identification of rhythmic gene expression, from metabolic cycles to circadian rhythms, is crucial for understanding the gene regulatory networks and functions of these biological processes. Recently, two algorithms, JTK_CYCLE and ARSER, have been developed to estimate the periodicity of rhythmic gene expression. JTK_CYCLE performs well for long or less noisy time series, while ARSER performs well for detecting a single rhythmic category. However, observing gene expression at high temporal resolution is not always feasible, and many scientists are interested in exploring both ultradian and circadian rhythmic categories simultaneously. In this paper, a new algorithm, named autoregressive Bayesian spectral regression (ABSR), is proposed. It estimates the period of time-course experimental data and classifies gene expression profiles into multiple rhythmic categories simultaneously. Simulation studies show that ABSR substantially improves the accuracy of periodicity estimation and the clustering of rhythmic categories compared to JTK_CYCLE and ARSER for data with low temporal resolution. Moreover, ABSR is insensitive to rhythmic patterns. This new scheme is applied to existing time-course mouse liver data to estimate the periods of rhythms and classify the genes into ultradian, circadian, and arrhythmic categories. It is observed that 49.2% of the circadian profiles detected by JTK_CYCLE with 1-hour resolution are also detected by ABSR with only 4-hour resolution.
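ABSR itself is not reproduced here, but the core task, recovering a period from a coarsely sampled series, can be illustrated with a plain periodogram; the 4-hour sampling mirrors the resolution discussed above and the data are synthetic:

```python
import numpy as np

# Synthetic expression profile: a 24 h rhythm sampled every 4 hours over
# 48 hours (the coarse resolution discussed above), plus noise.
rng = np.random.default_rng(1)
t = np.arange(0, 48, 4.0)                    # hours
y = np.sin(2 * np.pi * t / 24.0) + 0.2 * rng.normal(size=t.size)

# Discrete Fourier periodogram; the dominant frequency gives a crude
# period estimate (ABSR refines this with an autoregressive Bayesian
# spectral model, not implemented here).
y = y - y.mean()
freqs = np.fft.rfftfreq(t.size, d=4.0)           # cycles per hour
power = np.abs(np.fft.rfft(y))**2
period = 1.0 / freqs[1:][np.argmax(power[1:])]   # skip the zero frequency
```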

  15. Approximate Bayesian Computation in hydrologic modeling: equifinality of formal and informal approaches

    Directory of Open Access Journals (Sweden)

    M. Sadegh

    2013-04-01

Full Text Available In recent years, a strong debate has emerged in the hydrologic literature about how to properly treat non-traditional error residual distributions and quantify parameter and predictive uncertainty. In particular, there is strong disagreement whether such an uncertainty framework should have its roots within a proper statistical (Bayesian) context using Markov chain Monte Carlo (MCMC) simulation techniques, or whether such a framework should be based on a quite different philosophy and implement informal likelihood functions and simplistic search methods to summarize parameter and predictive distributions. In this paper we introduce an alternative framework, called Approximate Bayesian Computation (ABC), that bridges the differing viewpoints of formal and informal Bayesian approaches. This methodology has recently emerged in the fields of biology and population genetics and relaxes the need for an explicit likelihood function in favor of one or multiple different summary statistics that measure the distance of each model simulation to the data. This paper is a follow-up to the recent publication of Nott et al. (2012) and further studies the theoretical and numerical equivalence of formal (DREAM) and informal (GLUE) Bayesian approaches using data from different watersheds in the United States. We demonstrate that the limits of acceptability approach of GLUE is a special variant of ABC in which each discharge observation of the calibration data set is used as a summary diagnostic.
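A toy ABC rejection sampler makes the accept/reject mechanics concrete; the Gaussian model, flat prior, and tolerance are illustrative choices, not the hydrologic setup of the paper:

```python
import numpy as np

# ABC rejection sketch for a toy "model": observations are draws from
# Normal(theta, 1); the summary statistic is the sample mean.
rng = np.random.default_rng(2)
theta_true = 1.5
data = rng.normal(theta_true, 1.0, size=100)
obs_summary = data.mean()

# Draw candidate parameters from the prior, simulate, and accept those
# whose summary statistic lies within a tolerance of the observed one --
# the GLUE limits-of-acceptability idea is this acceptance rule applied
# per observation instead of to one summary.
prior_draws = rng.uniform(-5, 5, size=20000)
sims = rng.normal(prior_draws, 1.0, size=(100, 20000)).mean(axis=0)
accepted = prior_draws[np.abs(sims - obs_summary) < 0.05]
posterior_mean = accepted.mean()
```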

  16. A Bayesian approach for characterization of soft tissue viscoelasticity in acoustic radiation force imaging.

    Science.gov (United States)

    Zhao, Xiaodong; Pelegri, Assimina A

    2016-04-01

Biomechanical imaging techniques based on acoustic radiation force (ARF) have been developed to characterize the viscoelasticity of soft tissue by non-invasively measuring the motion excited by ARF. The unknown stress distribution in the region of excitation limits accurate inverse characterization of soft tissue viscoelasticity, so single degree-of-freedom simplified models have been applied to solve the inverse problem approximately. In this study, ARF-induced creep imaging is employed to estimate the time constant of a Voigt viscoelastic tissue model, and an inverse finite element (FE) characterization procedure based on a Bayesian formulation is presented. The Bayesian approach aims to provide a reasonable quantification of the probability distributions of soft tissue mechanical properties in the presence of measurement noise and model parameter uncertainty. Gaussian process metamodeling is applied to provide a fast statistical approximation based on a small number of computationally expensive FE model runs. Numerical simulation results demonstrate that the Bayesian approach provides an efficient and practical estimation of the probability distributions of the time constant in ARF-induced creep imaging. In a comparison study with the single degree-of-freedom models, the Bayesian approach with FE models improves the estimation results even in the presence of large uncertainty in the model parameters.

  17. A Comparison of Hierarchical and Non-Hierarchical Bayesian Approaches for Fitting Allometric Larch (Larix spp.) Biomass Equations

    Directory of Open Access Journals (Sweden)

    Dongsheng Chen

    2016-01-01

Full Text Available Accurate biomass estimations are important for assessing and monitoring forest carbon storage. Bayesian theory has been widely applied to tree biomass models. Recently, a hierarchical Bayesian approach has received increasing attention for improving biomass models. In this study, tree biomass data were obtained by sampling 310 trees from 209 permanent sample plots from larch plantations in six regions across China. Non-hierarchical and hierarchical Bayesian approaches were used to model allometric biomass equations. We found that the total, root, stem wood, stem bark, branch and foliage biomass model relationships were statistically significant (p-values < 0.001) for both the non-hierarchical and hierarchical Bayesian approaches, but the hierarchical Bayesian approach improved the goodness-of-fit statistics over the non-hierarchical Bayesian approach. The R2 values of the hierarchical approach were higher than those of the non-hierarchical approach by 0.008, 0.018, 0.020, 0.003, 0.088 and 0.116 for the total tree, root, stem wood, stem bark, branch and foliage models, respectively. The hierarchical Bayesian approach significantly improved the accuracy of the biomass models (except for the stem bark) and can reflect regional differences by using random parameters to improve model accuracy at the regional scale.

  18. Bayesian Approach in Estimation of Scale Parameter of Nakagami Distribution

    Directory of Open Access Journals (Sweden)

    Azam Zaka

    2014-08-01

Full Text Available The Nakagami distribution is a flexible lifetime distribution that may offer a good fit to some failure data sets. It has applications in the attenuation of wireless signals traversing multiple paths, deriving unit hydrographs in hydrology, medical imaging studies, etc. In this research, we obtain Bayesian estimators of the scale parameter of the Nakagami distribution. For the posterior distribution of this parameter, we consider Uniform, Inverse Exponential and Levy priors. The three loss functions taken up are the Squared Error Loss Function (SELF), the Quadratic Loss Function (QLF) and the Precautionary Loss Function (PLF). The performance of an estimator is assessed on the basis of its relative posterior risk. Monte Carlo simulations are used to compare the performance of the estimators. It is found that the PLF produces the least posterior risk when the Uniform prior is used, while the SELF performs best when the Inverse Exponential and Levy priors are used.
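The three loss functions map to closed-form Bayes estimators given posterior samples: the posterior mean for the SELF, sqrt(E[theta^2]) for the PLF, and E[1/theta]/E[1/theta^2] for the QLF. The sketch below assumes a hypothetical Gamma posterior purely for illustration:

```python
import numpy as np

# Posterior samples of a scale parameter (a hypothetical Gamma posterior,
# mean = 5 * 0.4 = 2.0, stands in for the Nakagami posterior here).
rng = np.random.default_rng(3)
theta = rng.gamma(shape=5.0, scale=0.4, size=100000)

est_self = theta.mean()                            # squared error loss
est_plf = np.sqrt(np.mean(theta**2))               # precautionary loss
est_qlf = np.mean(1/theta) / np.mean(1/theta**2)   # quadratic loss
# The PLF estimator sits above the posterior mean, the QLF estimator below.
```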

  19. A Bayesian network approach to the database search problem in criminal proceedings

    Directory of Open Access Journals (Sweden)

    Biedermann Alex

    2012-08-01

Full Text Available Abstract Background The ‘database search problem’, that is, the strengthening of a case - in terms of probative value - against an individual who is found as a result of a database search, has been approached during the last two decades with substantial mathematical analyses, accompanied by lively debate and centrally opposing conclusions. This represents a challenging obstacle in teaching but also hinders a balanced and coherent discussion of the topic within the wider scientific and legal community. This paper revisits and tracks the associated mathematical analyses in terms of Bayesian networks. Their derivation and discussion for capturing probabilistic arguments that explain the database search problem are outlined in detail. The resulting Bayesian networks offer a distinct view on the main debated issues, along with further clarity. Methods As a general framework for representing and analyzing formal arguments in probabilistic reasoning about uncertain target propositions (that is, whether or not a given individual is the source of a crime stain), this paper relies on graphical probability models, in particular, Bayesian networks. This graphical probability modeling approach is used to capture, within a single model, a series of key variables, such as the number of individuals in a database, the size of the population of potential crime stain sources, and the rarity of the corresponding analytical characteristics in a relevant population. Results This paper demonstrates the feasibility of deriving Bayesian network structures for analyzing, representing, and tracking the database search problem. The output of the proposed models can be shown to agree with existing but exclusively formulaic approaches. Conclusions The proposed Bayesian networks allow one to capture and analyze the currently most well-supported but reputedly counter-intuitive and difficult solution to the database search problem in a way that goes beyond the traditional

  20. Approaches in highly parameterized inversion: bgaPEST, a Bayesian geostatistical approach implementation with PEST: documentation and instructions

    Science.gov (United States)

    Fienen, Michael N.; D'Oria, Marco; Doherty, John E.; Hunt, Randall J.

    2013-01-01

    The application bgaPEST is a highly parameterized inversion software package implementing the Bayesian Geostatistical Approach in a framework compatible with the parameter estimation suite PEST. Highly parameterized inversion refers to cases in which parameters are distributed in space or time and are correlated with one another. The Bayesian aspect of bgaPEST is related to Bayesian probability theory in which prior information about parameters is formally revised on the basis of the calibration dataset used for the inversion. Conceptually, this approach formalizes the conditionality of estimated parameters on the specific data and model available. The geostatistical component of the method refers to the way in which prior information about the parameters is used. A geostatistical autocorrelation function is used to enforce structure on the parameters to avoid overfitting and unrealistic results. Bayesian Geostatistical Approach is designed to provide the smoothest solution that is consistent with the data. Optionally, users can specify a level of fit or estimate a balance between fit and model complexity informed by the data. Groundwater and surface-water applications are used as examples in this text, but the possible uses of bgaPEST extend to any distributed parameter applications.

  1. Bayesian Filtering Approaches for Detecting Anomalies in Environmental Sensor Data

    Science.gov (United States)

    Hill, D. J.; Minsker, B. S.

    2006-12-01

    Recent advances in sensor technology are facilitating the deployment of sensors into the environment that can produce measurements at high spatial and/or temporal resolutions. Not only can these data be used to better characterize the system for improved modeling, but they can also be used to produce better understandings of the mechanisms of environmental processes. One such use of these data is anomaly detection to identify data that deviate from historical patterns. These anomalous data can be caused by sensor or data transmission errors or by infrequent system behaviors that are often of interest to the scientific or public safety communities. Thus, anomaly detection has many practical applications, such as data quality assurance and control (QA/QC), where anomalous data are treated as data errors; focused data collection, where anomalous data indicate segments of data that are of interest to researchers; or event detection, where anomalous data signal system behaviors that could result in a natural disaster, for example. Traditionally, most anomaly detection has been carried out manually with the assistance of data visualization tools; however, due to the large volume of data produced by environmental sensors, manual techniques are not always feasible. This study develops an automated anomaly detection method that employs dynamic Bayesian networks (DBNs) to model the states of the environmental system in which the sensors are deployed. The DBN is an artificial intelligence technique that models the evolution of the discrete and/or continuous valued states of a dynamic system by tracking changes in the system states over time. Two commonly used types of DBNs are hidden Markov models and Kalman filters. In this study, DBNs will be used to predict the expected value of unknown system states, as well as the likelihood of particular sensor measurements of those states. Unlikely measurements are then considered anomalous. The performance of the DBN based anomaly
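One of the DBN variants mentioned above, the Kalman filter, can serve as a minimal anomaly detector: flag a measurement when its innovation is improbably large under the filter's own predictive variance. The synthetic stream, noise settings, and 4-sigma rule below are illustrative assumptions, not the study's configuration:

```python
import numpy as np

# Synthetic sensor stream: a slow random-walk drift with measurement
# noise and one injected spike at index 60.
rng = np.random.default_rng(4)
truth = np.cumsum(rng.normal(0, 0.1, size=100))
obs = truth + rng.normal(0, 0.5, size=100)
obs[60] += 6.0                      # injected anomaly

q, r = 0.01, 0.25                   # assumed process / measurement noise variances
x, p = obs[0], 1.0                  # state estimate and its variance
flags = []
for k, z in enumerate(obs):
    p = p + q                       # predict
    innov = z - x                   # innovation: measurement minus prediction
    s = p + r                       # innovation variance
    if abs(innov) > 4 * np.sqrt(s): # unlikely measurement -> anomalous
        flags.append(k)
        continue                    # skip update so the spike can't corrupt the state
    gain = p / s                    # Kalman update
    x = x + gain * innov
    p = (1 - gain) * p
```

Skipping the update on flagged points keeps one bad reading from dragging the state estimate off track, which mirrors the QA/QC use case described above.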

  2. Bayesian approaches to spatial inference: Modelling and computational challenges and solutions

    Science.gov (United States)

    Moores, Matthew; Mengersen, Kerrie

    2014-12-01

    We discuss a range of Bayesian modelling approaches for spatial data and investigate some of the associated computational challenges. This paper commences with a brief review of Bayesian mixture models and Markov random fields, with enabling computational algorithms including Markov chain Monte Carlo (MCMC) and integrated nested Laplace approximation (INLA). Following this, we focus on the Potts model as a canonical approach, and discuss the challenge of estimating the inverse temperature parameter that controls the degree of spatial smoothing. We compare three approaches to addressing the doubly intractable nature of the likelihood, namely pseudo-likelihood, path sampling and the exchange algorithm. These techniques are applied to satellite data used to analyse water quality in the Great Barrier Reef.

  3. Optimal speech motor control and token-to-token variability: a Bayesian modeling approach.

    Science.gov (United States)

    Patri, Jean-François; Diard, Julien; Perrier, Pascal

    2015-12-01

    The remarkable capacity of the speech motor system to adapt to various speech conditions is due to an excess of degrees of freedom, which enables producing similar acoustical properties with different sets of control strategies. To explain how the central nervous system selects one of the possible strategies, a common approach, in line with optimal motor control theories, is to model speech motor planning as the solution of an optimality problem based on cost functions. Despite the success of this approach, one of its drawbacks is the intrinsic contradiction between the concept of optimality and the observed experimental intra-speaker token-to-token variability. The present paper proposes an alternative approach by formulating feedforward optimal control in a probabilistic Bayesian modeling framework. This is illustrated by controlling a biomechanical model of the vocal tract for speech production and by comparing it with an existing optimal control model (GEPPETO). The essential elements of this optimal control model are presented first. From them the Bayesian model is constructed in a progressive way. Performance of the Bayesian model is evaluated based on computer simulations and compared to the optimal control model. This approach is shown to be appropriate for solving the speech planning problem while accounting for variability in a principled way.

  4. Uncertainty estimation by Bayesian approach in thermochemical conversion of walnut hull and lignite coal blends.

    Science.gov (United States)

    Buyukada, Musa

    2017-05-01

The main purpose of the present study was to incorporate the uncertainties in the thermal behavior of walnut hull (WH), lignite coal, and their various blends using a Bayesian approach. First, the thermal behavior of these materials was investigated under different temperatures, blend ratios, and heating rates. Results of ultimate and proximate analyses showed the main steps of the oxidation mechanism of the (co-)combustion process. Thermal degradation started with the (hemi-)cellulosic compounds and finished with lignin. Finally, a partial sensitivity analysis based on a Bayesian approach (Markov Chain Monte Carlo simulations) was applied to the best-fitting data-driven regression model. The main purpose of the uncertainty analysis was to highlight the importance of the operating conditions (explanatory variables). Another important aspect of the present work is that it is the first performance evaluation of various uncertainty estimation techniques in the (co-)combustion literature.

  5. A Bayesian Approach to Multistage Fitting of the Variation of the Skeletal Age Features

    Directory of Open Access Journals (Sweden)

    Dong Hua

    2009-01-01

Full Text Available Accurate assessment of skeletal maturity is important clinically. Skeletal age assessment is usually based on features encoded in ossification centers. Therefore, it is critical to design a mechanism that captures as many characteristics of the features as possible. We have observed that, for a given feature, there exist stages of skeletal age across which the variation pattern of the feature differs. Based on this observation, we propose a Bayesian cut fitting to describe features as a function of skeletal age. With our approach, appropriate positions for stage separation are determined automatically by a Bayesian approach, and a model is used to fit the variation of a feature within each stage. Our experimental results show that the proposed method surpasses traditional fitting with a single line or curve, not only in fitting efficiency and accuracy but also in global and local feature characterization.
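A simplified stand-in for the stage-separation idea: scan candidate cut points and keep the one minimizing the combined error of two per-stage linear fits (the paper scores cut positions with a Bayesian criterion instead; the data here are synthetic):

```python
import numpy as np

# Synthetic feature with a change in slope at skeletal age 4.
rng = np.random.default_rng(5)
age = np.linspace(0, 10, 80)
feature = np.where(age < 4, 0.5 * age, 2.0 + 1.5 * (age - 4))
feature = feature + rng.normal(0, 0.1, size=age.size)

def sse(x, y):
    """Sum of squared errors of a straight-line fit to (x, y)."""
    coef = np.polyfit(x, y, 1)
    return np.sum((np.polyval(coef, x) - y) ** 2)

# Try every admissible cut index and keep the best two-stage fit.
cuts = range(5, 75)
best = min(cuts, key=lambda c: sse(age[:c], feature[:c]) + sse(age[c:], feature[c:]))
cut_age = age[best]   # recovered stage boundary, near age 4
```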

  6. Bayesian approach increases accuracy when selecting cowpea genotypes with high adaptability and phenotypic stability.

    Science.gov (United States)

    Barroso, L M A; Teodoro, P E; Nascimento, M; Torres, F E; Dos Santos, A; Corrêa, A M; Sagrilo, E; Corrêa, C C G; Silva, F A; Ceccon, G

    2016-03-11

    This study aimed to verify that a Bayesian approach could be used for the selection of upright cowpea genotypes with high adaptability and phenotypic stability, and the study also evaluated the efficiency of using informative and minimally informative a priori distributions. Six trials were conducted in randomized blocks, and the grain yield of 17 upright cowpea genotypes was assessed. To represent the minimally informative a priori distributions, a probability distribution with high variance was used, and a meta-analysis concept was adopted to represent the informative a priori distributions. Bayes factors were used to conduct comparisons between the a priori distributions. The Bayesian approach was effective for selection of upright cowpea genotypes with high adaptability and phenotypic stability using the Eberhart and Russell method. Bayes factors indicated that the use of informative a priori distributions provided more accurate results than minimally informative a priori distributions.

  7. A fuzzy Bayesian approach to flood frequency estimation with imprecise historical information

    Science.gov (United States)

    Salinas, José Luis; Kiss, Andrea; Viglione, Alberto; Viertl, Reinhard; Blöschl, Günter

    2016-09-01

    This paper presents a novel framework that links imprecision (through a fuzzy approach) and stochastic uncertainty (through a Bayesian approach) in estimating flood probabilities from historical flood information and systematic flood discharge data. The method exploits the linguistic characteristics of historical source material to construct membership functions, which may be wider or narrower, depending on the vagueness of the statements. The membership functions are either included in the prior distribution or the likelihood function to obtain a fuzzy version of the flood frequency curve. The viability of the approach is demonstrated by three case studies that differ in terms of their hydromorphological conditions (from an Alpine river with bedrock profile to a flat lowland river with extensive flood plains) and historical source material (including narratives, town and county meeting protocols, flood marks and damage accounts). The case studies are presented in order of increasing fuzziness (the Rhine at Basel, Switzerland; the Werra at Meiningen, Germany; and the Tisza at Szeged, Hungary). Incorporating imprecise historical information is found to reduce the range between the 5% and 95% Bayesian credibility bounds of the 100 year floods by 45% and 61% for the Rhine and Werra case studies, respectively. The strengths and limitations of the framework are discussed relative to alternative (non-fuzzy) methods. The fuzzy Bayesian inference framework provides a flexible methodology that fits the imprecise nature of linguistic information on historical floods as available in historical written documentation.

  8. Peering through a dirty window: A Bayesian approach to making mine detection decisions from noisy data

    Energy Technology Data Exchange (ETDEWEB)

    Kercel, Stephen W.

    1998-10-11

For several reasons, Bayesian parameter estimation is superior to other methods for extracting features of a weak signal from noise. Since it exploits prior knowledge, the analysis begins from a more advantageous starting point than other methods. Also, since "nuisance parameters" can be dropped out of the Bayesian analysis, the description of the model need not be as complete as is necessary for methods such as matched filtering. In the limit of perfectly random noise and a perfect description of the model, the signal-to-noise ratio improves as the square root of the number of samples in the data. Even with the imperfections of real-world data, the Bayesian approach comes closer to this ideal limit of performance than other methods. A major unsolved problem in landmine detection is the fusion of data from multiple sensor types, and Bayesian data fusion is only beginning to be explored as a solution to this problem. In single-sensor processes, Bayesian analysis can sense multiple parameters from the data stream of one sensor by computing a joint probability density function of a set of parameter values from the sensor output. However, there is no inherent requirement that the information must come from a single sensor. If multiple sensors are applied to a single process, where several different parameters are implicit in each sensor's output data stream, the joint probability density function of all the parameters of interest can be computed in exactly the same manner as in the single-sensor case. Thus, it is just as practical to base decisions on multiple sensor outputs as on single sensors. This should provide a practical way to combine the outputs of dissimilar sensors, such as ground penetrating radar and electromagnetic induction devices, producing a better detection decision than could be provided by either sensor alone.
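The joint-posterior fusion idea can be sketched on a one-dimensional grid; the sensor likelihoods, their widths, and the parameter itself are assumed numbers for illustration only:

```python
import numpy as np

# Grid-based Bayesian fusion of two dissimilar sensors measuring the same
# parameter (e.g. a hypothetical burial depth): multiply the likelihoods
# from each sensor with the prior to get a joint posterior that is sharper
# than either sensor's alone.
depth = np.linspace(0.0, 1.0, 1001)            # candidate depths (m)
prior = np.ones_like(depth)                    # flat prior

def gauss(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2)

like_radar = gauss(depth, 0.32, 0.10)          # GPR: noisier, unbiased
like_emi = gauss(depth, 0.28, 0.06)            # EMI: tighter
post = prior * like_radar * like_emi           # joint posterior (unnormalized)
post /= post.sum()
fused_map = depth[np.argmax(post)]             # fused depth estimate
```

The fused estimate lands between the two sensor readings, weighted toward the more precise one, which is the precision-weighting behavior the abstract describes.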

  9. An Application of Bayesian Approach in Modeling Risk of Death in an Intensive Care Unit.

    Directory of Open Access Journals (Sweden)

    Rowena Syn Yin Wong

Full Text Available There are not many studies that attempt to model intensive care unit (ICU) risk of death in developing countries, especially in South East Asia. The aim of this study was to propose and describe the application of a Bayesian approach to modeling in-ICU deaths in a Malaysian ICU. This was a prospective study in a mixed medical-surgical ICU in a multidisciplinary tertiary referral hospital in Malaysia. Data collection included variables defined in the Acute Physiology and Chronic Health Evaluation IV (APACHE IV) model. A Bayesian Markov Chain Monte Carlo (MCMC) simulation approach was applied in the development of four multivariate logistic regression predictive models for the ICU, where the main outcome measure was in-ICU mortality risk. The performance of the models was assessed through overall model fit, discrimination and calibration measures. Results from the Bayesian models were also compared against results obtained using the frequentist maximum likelihood method. The study involved 1,286 consecutive ICU admissions between January 1, 2009 and June 30, 2010, of which 1,111 met the inclusion criteria. Patients admitted to the ICU were generally younger, predominantly male, with a low co-morbidity load and mostly under mechanical ventilation. The overall in-ICU mortality rate was 18.5% and the overall mean Acute Physiology Score (APS) was 68.5. All four models exhibited good discrimination, with area under the receiver operating characteristic curve (AUC) values of approximately 0.8. Calibration was acceptable (Hosmer-Lemeshow p-values > 0.05) for all models except model M3. Model M1 was identified as the model with the best overall performance in this study. Four prediction models were proposed, where the best model was chosen based on its overall performance in this study. This study also demonstrated the promising potential of the Bayesian MCMC approach as an alternative for the analysis and modeling of in-ICU mortality outcomes.

  10. Hybrid ICA-Bayesian network approach reveals distinct effective connectivity differences in schizophrenia.

    Science.gov (United States)

    Kim, D; Burge, J; Lane, T; Pearlson, G D; Kiehl, K A; Calhoun, V D

    2008-10-01

    We utilized a discrete dynamic Bayesian network (dDBN) approach (Burge, J., Lane, T., Link, H., Qiu, S., Clark, V.P., 2007. Discrete dynamic Bayesian network analysis of fMRI data. Hum Brain Mapp.) to determine differences in brain regions between patients with schizophrenia and healthy controls on a measure of effective connectivity, termed the approximate conditional likelihood score (ACL) (Burge, J., Lane, T., 2005. Learning Class-Discriminative Dynamic Bayesian Networks. Proceedings of the International Conference on Machine Learning, Bonn, Germany, pp. 97-104.). The ACL score represents a class-discriminative measure of effective connectivity by measuring the relative likelihood of the correlation between brain regions in one group versus another. The algorithm is capable of finding non-linear relationships between brain regions because it uses discrete rather than continuous values and attempts to model temporal relationships with a first-order Markov and stationary assumption constraint (Papoulis, A., 1991. Probability, random variables, and stochastic processes. McGraw-Hill, New York.). Since Bayesian networks are overly sensitive to noisy data, we introduced an independent component analysis (ICA) filtering approach that attempted to reduce the noise found in fMRI data by unmixing the raw datasets into a set of independent spatial component maps. Components that represented noise were removed and the remaining components reconstructed into the dimensions of the original fMRI datasets. We applied the dDBN algorithm to a group of 35 patients with schizophrenia and 35 matched healthy controls using an ICA filtered and unfiltered approach. We determined that filtering the data significantly improved the magnitude of the ACL score. Patients showed the greatest ACL scores in several regions, most markedly the cerebellar vermis and hemispheres. 
Our findings suggest that schizophrenia patients exhibit weaker connectivity than healthy controls in multiple regions

  11. Probabilistic rainfall thresholds for landslide occurrence using a Bayesian approach

    Science.gov (United States)

    Berti, M.; Martina, M.; Franceschini, S.; Pignone, S.; Simoni, A.; Pizziolo, M.

    2012-04-01

Landslide rainfall thresholds are commonly defined as the critical value of two combined variables (e.g. rainfall duration and rainfall intensity) responsible for the occurrence of landslides in a given area. Various methods have been proposed in the literature to predict the rainfall conditions that are likely to trigger landslides, using for instance physically-based models or statistical analysis of historical catalogues. Most of these methods share an implicit deterministic view: the occurrence of landslides can be predicted by comparing the input value (rainfall conditions) with the threshold, and a single output (landslide or no landslide) is possible for a given input. In practical applications, however, a deterministic approach is not always applicable. Failure conditions are often reached through a unique combination of many relevant factors (hydrologic response, weathering, changes in field stress, anthropic activity), and landslide triggering cannot be predicted by rainfall alone. When different outputs (landslide or no landslide) can be obtained for the same input (rainfall conditions), a deterministic approach is no longer applicable and a probabilistic model is preferable. In this study we propose a new method to evaluate rainfall thresholds based on Bayes' probability. The method is simple, statistically rigorous, and provides a way to define thresholds in complex cases where conventional approaches become highly subjective. Bayes' theorem is a direct application of conditional probabilities, and it allows one to compute the conditional probability of a landslide (A) occurring when a rainfall event of a given magnitude (B) is expected. The fundamental aspect of the Bayes approach is that the landslide probability P(A|B) depends not only on the observed probability of the triggering rainfall P(B|A), but also on the marginal probability of the expected rainfall event P(B).
Therefore, both the rainfall that resulted in landslides and the rainfall that not
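A minimal numeric instance of this conditional-probability calculation, with all three input probabilities assumed purely for illustration:

```python
# Bayes' rule as described above: P(A|B) = P(B|A) * P(A) / P(B).
# All three inputs are illustrative assumptions, not catalogue values.
p_landslide = 0.02          # P(A): marginal landslide probability
p_rain_given_slide = 0.70   # P(B|A): triggering-magnitude rain preceded observed slides
p_rain = 0.10               # P(B): marginal probability of such a rainfall event

p_slide_given_rain = p_rain_given_slide * p_landslide / p_rain  # P(A|B)
# -> 0.14: rainfall of this magnitude raises the landslide probability sevenfold
```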

  12. IONONEST—A Bayesian approach to modeling the lower ionosphere

    Science.gov (United States)

    Martin, Poppy L.; Scaife, Anna M. M.; McKay, Derek; McCrea, Ian

    2016-08-01

    Obtaining high-resolution electron density height profiles for the D region of the ionosphere as a well-sampled function of time is difficult for most methods of ionospheric measurement. Here we present a new method of using multifrequency riometry data for producing D region height profiles via inverse methods. To obtain these profiles, we use the nested sampling technique, implemented through our code, IONONEST. We demonstrate this approach using new data from the Kilpisjärvi Atmospheric Imaging Receiver Array (KAIRA) instrument and consider two electron density models. We compare the recovered height profiles from the KAIRA data with those from incoherent scatter radar using data from the European Incoherent Scatter Facility (EISCAT) instrument and find that there is good agreement between the two techniques, allowing for instrumental differences.

  13. A Defence of the AR4’s Bayesian Approach to Quantifying Uncertainty

    Science.gov (United States)

    Vezer, M. A.

    2009-12-01

    The field of climate change research is a kimberlite pipe filled with philosophic diamonds waiting to be mined and analyzed by philosophers. Within the scientific literature on climate change, there is much philosophical dialogue regarding the methods and implications of climate studies. To date, however, discourse regarding the philosophy of climate science has been confined predominantly to scientific - rather than philosophical - investigations. In this paper, I hope to bring one such issue to the surface for explicit philosophical analysis: The purpose of this paper is to address a philosophical debate pertaining to the expressions of uncertainty in the Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report (AR4), which, as will be noted, has received significant attention in scientific journals and books, as well as sporadic glances from the popular press. My thesis is that the AR4’s Bayesian method of uncertainty analysis and uncertainty expression is justifiable on pragmatic grounds: it overcomes problems associated with vagueness, thereby facilitating communication between scientists and policy makers such that the latter can formulate decision analyses in response to the views of the former. Further, I argue that the most pronounced criticisms against the AR4’s Bayesian approach, which are outlined below, are misguided. §1 Introduction Central to AR4 is a list of terms related to uncertainty that in colloquial conversations would be considered vague. The IPCC attempts to reduce the vagueness of its expressions of uncertainty by calibrating uncertainty terms with numerical probability values derived from a subjective Bayesian methodology. This style of analysis and expression has stimulated some controversy, as critics reject as inappropriate and even misleading the association of uncertainty terms with Bayesian probabilities. [...] The format of the paper is as follows. The investigation begins (§2) with an explanation of

  14. Guided wave-based identification of multiple cracks in beams using a Bayesian approach

    Science.gov (United States)

    He, Shuai; Ng, Ching-Tai

    2017-02-01

    A guided wave damage identification method using a model-based approach is proposed to identify multiple cracks in beam-like structures. The guided wave propagation is simulated using the spectral finite element method, and a crack element is proposed to take into account the mode conversion effect. The Bayesian model class selection algorithm is employed to determine the number of cracks, and the Bayesian statistical framework is then used to identify the crack parameters and the associated uncertainties. In order to improve the efficiency and ensure the reliability of identification, the Transitional Markov Chain Monte Carlo (TMCMC) method is implemented in the Bayesian approach. A series of numerical case studies are carried out to assess the performance of the proposed method, in which the sensitivity of different guided wave modes and the effect of different levels of measurement noise in identifying different numbers of cracks are studied in detail. The proposed method is also experimentally verified using guided wave data obtained from a laser vibrometer. The results show that the proposed method is able to accurately identify the number, locations and sizes of the cracks, and also quantify the associated uncertainties. In addition, the proposed method is robust to measurement noise and to different crack configurations.

  15. A Bayesian Approach for Localization of Acoustic Emission Source in Plate-Like Structures

    Directory of Open Access Journals (Sweden)

    Gang Yan

    2015-01-01

    This paper presents a Bayesian approach for localizing an acoustic emission (AE) source in plate-like structures with consideration of uncertainties from modeling error and measurement noise. A PZT sensor network is deployed to monitor and acquire AE wave signals released by possible damage. By using the continuous wavelet transform (CWT), the time-of-flight (TOF) information of the AE wave signals is extracted. With a theoretical TOF model, a Bayesian parameter identification procedure is developed to obtain the AE source location and the wave velocity at a specific frequency simultaneously, and meanwhile quantify their uncertainties. Based on Bayes’ theorem, the posterior distributions of the parameters describing the AE source location and the wave velocity are obtained by relating their priors and the likelihood of the measured time difference data. A Markov chain Monte Carlo (MCMC) algorithm is employed to draw samples to approximate the posteriors. Also, a data fusion scheme is performed to fuse results identified at multiple frequencies to increase accuracy and reduce uncertainty of the final localization results. Experimental studies on a stiffened aluminum panel with simulated AE events by pencil lead breaks (PLBs) are conducted to validate the proposed Bayesian AE source localization approach.
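
A heavily simplified sketch of this kind of MCMC inference (random-walk Metropolis on a TOF model with a hypothetical four-sensor layout and synthetic data; the paper's actual model, priors and multi-frequency fusion scheme are richer):

```python
import math
import random

random.seed(0)

# Hypothetical four-sensor layout on a unit plate and a "true" AE source;
# the wave speed and noise level are illustrative, not from the paper.
sensors = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
true_src, true_v = (0.3, 0.6), 5.0
sigma = 0.001  # assumed noise on measured time differences

def arrivals(src, v):
    """Theoretical TOF from the source to each sensor."""
    return [math.dist(src, s) / v for s in sensors]

# Synthetic measurements: time differences relative to sensor 0, plus noise.
t_true = arrivals(true_src, true_v)
data = [t_true[i] - t_true[0] + random.gauss(0, sigma) for i in range(1, 4)]

def log_post(x, y, v):
    # Uniform priors: source inside the plate, velocity in a plausible band.
    if not (0.0 <= x <= 1.0 and 0.0 <= y <= 1.0 and 1.0 <= v <= 10.0):
        return -math.inf
    t = arrivals((x, y), v)
    pred = [t[i] - t[0] for i in range(1, 4)]
    return -sum((d - p) ** 2 for d, p in zip(data, pred)) / (2 * sigma**2)

# Random-walk Metropolis over (x, y, v).
theta, lp = (0.5, 0.5, 4.0), log_post(0.5, 0.5, 4.0)
samples = []
for _ in range(20000):
    prop = tuple(p + random.gauss(0, 0.02) for p in theta)
    lp_prop = log_post(*prop)
    if math.log(random.random()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta)

post = samples[5000:]  # discard burn-in
x_hat = sum(s[0] for s in post) / len(post)
y_hat = sum(s[1] for s in post) / len(post)
```

The posterior samples recover the source location and velocity jointly; their spread gives the uncertainty quantification the abstract describes.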

  16. Textual and visual content-based anti-phishing: a Bayesian approach.

    Science.gov (United States)

    Zhang, Haijun; Liu, Gang; Chow, Tommy W S; Liu, Wenyin

    2011-10-01

    A novel framework using a Bayesian approach for content-based phishing web page detection is presented. Our model takes into account textual and visual contents to measure the similarity between the protected web page and suspicious web pages. A text classifier, an image classifier, and an algorithm fusing the results from classifiers are introduced. An outstanding feature of this paper is the exploration of a Bayesian model to estimate the matching threshold. This is required in the classifier for determining the class of the web page and identifying whether the web page is phishing or not. In the text classifier, the naive Bayes rule is used to calculate the probability that a web page is phishing. In the image classifier, the earth mover's distance is employed to measure the visual similarity, and our Bayesian model is designed to determine the threshold. In the data fusion algorithm, the Bayes theory is used to synthesize the classification results from textual and visual content. The effectiveness of our proposed approach was examined in a large-scale dataset collected from real phishing cases. Experimental results demonstrated that the text classifier and the image classifier we designed deliver promising results, the fusion algorithm outperforms either of the individual classifiers, and our model can be adapted to different phishing cases.
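
A toy multinomial naive Bayes classifier of the kind described for the text branch (corpus, tokens and prior are all hypothetical; the paper's classifier, image branch and threshold estimation are more elaborate):

```python
import math
from collections import Counter

# Toy corpus (hypothetical): token lists labelled phishing / legitimate.
phish_docs = [["verify", "account", "password"], ["urgent", "account", "login"]]
legit_docs = [["meeting", "agenda", "monday"], ["project", "update", "agenda"]]

def train(docs):
    counts = Counter(tok for d in docs for tok in d)
    return counts, sum(counts.values())

p_counts, p_total = train(phish_docs)
l_counts, l_total = train(legit_docs)
vocab = set(p_counts) | set(l_counts)

def log_likelihood(tokens, counts, total):
    # Multinomial naive Bayes with Laplace (add-one) smoothing.
    return sum(math.log((counts[t] + 1) / (total + len(vocab))) for t in tokens)

def posterior_phishing(tokens, prior=0.5):
    lp = math.log(prior) + log_likelihood(tokens, p_counts, p_total)
    ll = math.log(1 - prior) + log_likelihood(tokens, l_counts, l_total)
    # Posterior probability that the page is phishing.
    return 1.0 / (1.0 + math.exp(ll - lp))

p = posterior_phishing(["urgent", "password"])
```

A fusion step would then combine this posterior with the image classifier's output, for instance by treating the two scores as conditionally independent evidence in a second Bayes update.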

  17. Modeling uncertainties in estimation of canopy LAI from hyperspectral remote sensing data - A Bayesian approach

    Science.gov (United States)

    Varvia, Petri; Rautiainen, Miina; Seppänen, Aku

    2017-04-01

    Hyperspectral remote sensing data carry information on the leaf area index (LAI) of forests, and thus in principle, LAI can be estimated based on the data by inverting a forest reflectance model. However, LAI is usually not the only unknown in a reflectance model; especially, the leaf spectral albedo and understory reflectance are also not known. If the uncertainties of these parameters are not accounted for, the inversion of a forest reflectance model can lead to biased estimates for LAI. In this paper, we study the effects of reflectance model uncertainties on LAI estimates, and further, investigate whether the LAI estimates could recover from these uncertainties with the aid of Bayesian inference. In the proposed approach, the unknown leaf albedo and understory reflectance are estimated simultaneously with LAI from hyperspectral remote sensing data. The feasibility of the approach is tested with numerical simulation studies. The results show that in the presence of unknown parameters, the Bayesian LAI estimates which account for the model uncertainties outperform the conventional estimates that are based on biased model parameters. Moreover, the results demonstrate that the Bayesian inference can also provide feasible measures for the uncertainty of the estimated LAI.

  18. Understanding the formation and evolution of interstellar ices: a Bayesian approach

    Energy Technology Data Exchange (ETDEWEB)

    Makrymallis, Antonios; Viti, Serena, E-mail: antonios@star.ucl.ac.uk [Department of Physics and Astronomy, University College London, London WC1E 6BT (United Kingdom)

    2014-10-10

    Understanding the physical conditions of dark molecular clouds and star-forming regions is an inverse problem subject to complicated chemistry that varies nonlinearly with both time and the physical environment. In this paper, we apply a Bayesian approach based on a Markov chain Monte Carlo (MCMC) method for solving the nonlinear inverse problems encountered in astrochemical modeling. We use observations for ice and gas species in dark molecular clouds and a time-dependent, gas-grain chemical model to infer the values of the physical and chemical parameters that characterize quiescent regions of molecular clouds. We show evidence that in high-dimensional problems, MCMC algorithms provide a more efficient and complete solution than more classical strategies. The results of our MCMC method enable us to derive statistical estimates and uncertainties for the physical parameters of interest as a result of the Bayesian treatment.

  19. Evaluating a Bayesian approach to improve accuracy of individual photographic identification methods using ecological distribution data

    Directory of Open Access Journals (Sweden)

    Richard Stafford

    2011-04-01

    Photographic identification of individual organisms can be possible from natural body markings. Data from photo-ID can be used to estimate important ecological and conservation metrics such as population sizes, home ranges or territories. However, poor-quality photographs or less well-studied individuals can result in a non-unique ID, potentially confounding several similar-looking individuals. Here we present a Bayesian approach that uses known data about previous sightings of individuals at specific sites as priors to help assess the problems of obtaining a non-unique ID. Using a simulation of individuals with different confidence of correct ID, we evaluate the accuracy of the Bayesian-modified (posterior) probabilities. However, in most cases, the accuracy of identification decreases. Although this technique is unsuccessful, it does demonstrate the importance of computer simulations in testing such hypotheses in ecology.
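
The core Bayesian modification can be sketched as follows, with hypothetical match likelihoods and site priors (not the study's data):

```python
# Hypothetical photo-ID example: a photograph is consistent with three
# candidate individuals; match scores stand in for P(photo | individual),
# and prior sighting frequencies at the site give P(individual).
match_likelihood = {"A": 0.6, "B": 0.3, "C": 0.1}
site_prior = {"A": 0.1, "B": 0.7, "C": 0.2}

unnorm = {c: match_likelihood[c] * site_prior[c] for c in match_likelihood}
z = sum(unnorm.values())
posterior = {c: v / z for c, v in unnorm.items()}
best = max(posterior, key=posterior.get)
```

Here the site prior overturns the raw best match (A) in favour of B, which illustrates both the mechanism and the risk the abstract reports: a strong prior can also entrench a wrong identification.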

  20. A Bayesian approach to the semi-analytic model of galaxy formation: methodology

    CERN Document Server

    Lu, Yu; Weinberg, Martin D; Katz, Neal S

    2010-01-01

    We believe that a wide range of physical processes conspire to shape the observed galaxy population but we remain unsure of their detailed interactions. The semi-analytic model (SAM) of galaxy formation uses multi-dimensional parameterizations of the physical processes of galaxy formation and provides a tool to constrain these underlying physical interactions. Because of the high dimensionality, the parametric problem of galaxy formation may be profitably tackled with a Bayesian-inference based approach, which allows one to constrain theory with data in a statistically rigorous way. In this paper, we develop a generalized SAM using the framework of Bayesian inference. We show that, with a parallel implementation of an advanced Markov-Chain Monte-Carlo algorithm, it is now possible to rigorously sample the posterior distribution of the high-dimensional parameter space of typical SAMs. As an example, we characterize galaxy formation in the current $\\Lambda$CDM cosmology using stellar mass function of galaxies a...

  1. The Appeal to Expert Opinion: Quantitative Support for a Bayesian Network Approach.

    Science.gov (United States)

    Harris, Adam J L; Hahn, Ulrike; Madsen, Jens K; Hsu, Anne S

    2016-08-01

    The appeal to expert opinion is an argument form that uses the verdict of an expert to support a position or hypothesis. A previous scheme-based treatment of the argument form is formalized within a Bayesian network that is able to capture the critical aspects of the argument form, including the central considerations of the expert's expertise and trustworthiness. We propose this as an appropriate normative framework for the argument form, enabling the development and testing of quantitative predictions as to how people evaluate this argument, suggesting that such an approach might be beneficial to argumentation research generally. We subsequently present two experiments as an example of the potential for future research in this vein, demonstrating that participants' quantitative ratings of the convincingness of a proposition that has been supported with an appeal to expert opinion were broadly consistent with the predictions of the Bayesian model.
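
A minimal two-node reduction of such a network can be sketched as follows; the reliability numbers are hypothetical, and the authors' full network models expertise and trustworthiness as separate nodes:

```python
# Minimal two-node reduction: hypothesis H and the expert's assertion E.
# r = P(E | H) and q = P(E | not H) jointly stand in for the expertise and
# trustworthiness nodes of the full network (all numbers hypothetical).
def posterior_h(prior_h, r, q):
    return prior_h * r / (prior_h * r + (1.0 - prior_h) * q)

# A reliable expert's endorsement shifts a 50/50 prior substantially...
p_good = posterior_h(0.5, r=0.9, q=0.2)
# ...while an unreliable one (r close to q) barely moves it.
p_poor = posterior_h(0.5, r=0.55, q=0.5)
```

The qualitative prediction tested in the experiments follows directly: convincingness ratings should track how far r exceeds q, i.e. how diagnostic the expert's verdict is.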

  2. A Bayesian Approach for Parameter Estimation and Prediction using a Computationally Intensive Model

    CERN Document Server

    Higdon, Dave; Schunck, Nicolas; Sarich, Jason; Wild, Stefan M

    2014-01-01

    Bayesian methods have been very successful in quantifying uncertainty in physics-based problems in parameter estimation and prediction. In these cases, physical measurements y are modeled as the best fit of a physics-based model $\eta(\theta)$, where $\theta$ denotes the uncertain, best input setting. Hence the statistical model is of the form $y = \eta(\theta) + \epsilon$, where $\epsilon$ accounts for measurement error, and possibly other error sources. When non-linearity is present in $\eta(\cdot)$, the resulting posterior distribution for the unknown parameters in the Bayesian formulation is typically complex and non-standard, requiring computationally demanding approaches such as Markov chain Monte Carlo (MCMC) to produce multivariate draws from the posterior. While quite generally applicable, MCMC requires thousands, or even millions, of evaluations of the physics model $\eta(\cdot)$. This is problematic if the model takes hours or days to evaluate. To overcome this computational bottleneck, we pr...

  3. Nuclear Mass Predictions for the Crustal Composition of Neutron Stars: A Bayesian Neural Network Approach

    CERN Document Server

    Utama, R; Prosper, H B

    2016-01-01

    Besides their intrinsic nuclear-structure value, nuclear mass models are essential for astrophysical applications, such as r-process nucleosynthesis and neutron-star structure. To overcome the intrinsic limitations of existing "state-of-the-art" mass models, we propose a refinement based on a Bayesian Neural Network (BNN) formalism. A novel BNN approach is implemented with the goal of optimizing mass residuals between theory and experiment. A significant improvement (of about 40%) in the mass predictions of existing models is obtained after BNN refinement. Moreover, these improved results are now accompanied by proper statistical errors. Finally, by constructing a "world average" of these predictions, a mass model is obtained that is used to predict the composition of the outer crust of a neutron star. The power of the Bayesian neural network method has been successfully demonstrated by a systematic improvement in the accuracy of the predictions of nuclear masses. Extension to other nuclear observables is a n...

  4. Communication: A multiscale Bayesian inference approach to analyzing subdiffusion in particle trajectories.

    Science.gov (United States)

    Hinsen, Konrad; Kneller, Gerald R

    2016-10-21

    Anomalous diffusion is characterized by its asymptotic behavior for t → ∞. This makes it difficult to detect and describe in particle trajectories from experiments or computer simulations, which are necessarily of finite length. We propose a new approach using Bayesian inference applied directly to the observed trajectories sampled at different time scales. We illustrate the performance of this approach using random trajectories with known statistical properties and then use it for analyzing the motion of lipid molecules in the plane of a lipid bilayer.

  5. Calibration of crash risk models on freeways with limited real-time traffic data using Bayesian meta-analysis and Bayesian inference approach.

    Science.gov (United States)

    Xu, Chengcheng; Wang, Wei; Liu, Pan; Li, Zhibin

    2015-12-01

    This study aimed to develop a real-time crash risk model with limited data in China by using a Bayesian meta-analysis and Bayesian inference approach. A systematic review was first conducted by using three different Bayesian meta-analyses: the fixed effect meta-analysis, the random effect meta-analysis, and the meta-regression. The meta-analyses provided a numerical summary of the effects of traffic variables on crash risks by quantitatively synthesizing results from previous studies. The random effect meta-analysis and the meta-regression produced more conservative estimates for the effects of traffic variables compared with the fixed effect meta-analysis. Then, the meta-analysis results were used as informative priors for developing crash risk models with limited data. The three different meta-analysis priors significantly affect model fit and prediction accuracy. The model based on the meta-regression can increase the prediction accuracy by about 15% as compared to the model that was directly developed with limited data. Finally, a Bayesian predictive densities analysis was used to identify the outliers in the limited data, which can further improve the prediction accuracy by 5.0%.
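
The use of meta-analysis results as informative priors can be illustrated with a conjugate normal update; all numbers below are hypothetical:

```python
# Hypothetical numbers: a meta-analysis pooled effect of a traffic variable
# on crash risk (the prior) and a noisier estimate from limited local data.
prior_mean, prior_sd = 0.8, 0.3
data_mean, data_sd = 1.4, 0.5

# Conjugate normal update: precision-weighted average of prior and data.
w_prior = 1.0 / prior_sd**2
w_data = 1.0 / data_sd**2
post_mean = (w_prior * prior_mean + w_data * data_mean) / (w_prior + w_data)
post_sd = (w_prior + w_data) ** -0.5
```

The posterior sits between the meta-analysis prior and the local estimate and is more precise than either, which is the mechanism by which informative priors compensate for limited data.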

  6. A Bayesian approach to multiscale inverse problems with on-the-fly scale determination

    Science.gov (United States)

    Ellam, Louis; Zabaras, Nicholas; Girolami, Mark

    2016-12-01

    A Bayesian computational approach is presented to provide a multi-resolution estimate of an unknown spatially varying parameter from indirect measurement data. In particular, we are interested in spatially varying parameters with multiscale characteristics. In our work, we consider the challenge of not knowing the characteristic length scale(s) of the unknown a priori, and present an algorithm for on-the-fly scale determination. Our approach is based on representing the spatial field with a wavelet expansion. Wavelet basis functions are hierarchically structured, localized in both spatial and frequency domains and tend to provide sparse representations in that a large number of wavelet coefficients are approximately zero. For these reasons, wavelet bases are suitable for representing permeability fields with non-trivial correlation structures. Moreover, the intra-scale correlations between wavelet coefficients form a quadtree, and this structure is exploited to identify additional basis functions to refine the model. Bayesian inference is performed using a sequential Monte Carlo (SMC) sampler with a Markov Chain Monte Carlo (MCMC) transition kernel. The SMC sampler is used to move between posterior densities defined on different scales, thereby providing a computationally efficient method for adaptive refinement of the wavelet representation. We gain insight from the marginal likelihoods, by computing Bayes factors, for model comparison and model selection. The marginal likelihoods provide a termination criterion for our scale determination algorithm. The Bayesian computational approach is rather general and applicable to several inverse problems concerning the estimation of a spatially varying parameter. The approach is demonstrated with permeability estimation for groundwater flow using pressure sensor measurements.

  7. A Fully Bayesian Approach to Improved Calibration and Prediction of Groundwater Models With Structure Error

    Science.gov (United States)

    Xu, T.; Valocchi, A. J.

    2014-12-01

    Effective water resource management typically relies on numerical models to analyse groundwater flow and solute transport processes. These models are usually subject to model structure error due to simplification and/or misrepresentation of the real system. As a result, the model outputs may systematically deviate from measurements, thus violating a key assumption for traditional regression-based calibration and uncertainty analysis. On the other hand, the bias induced by model structure error can be described statistically in an inductive, data-driven way based on historical model-to-measurement misfit. We adopt a fully Bayesian approach that integrates a Gaussian process error model, accounting for model structure error, into the calibration, prediction and uncertainty analysis of groundwater models. The posterior distributions of the parameters of the groundwater model and the Gaussian process error model are jointly inferred using DREAM, an efficient Markov chain Monte Carlo sampler. We test the usefulness of the fully Bayesian approach on a synthetic case study of surface-groundwater interaction under changing pumping conditions. We first illustrate through this example that traditional least squares regression without accounting for model structure error yields biased parameter estimates due to parameter compensation, as well as biased predictions. In contrast, the Bayesian approach gives less biased parameter estimates. Moreover, the integration of a Gaussian process error model significantly reduces predictive bias and leads to prediction intervals that are more consistent with observations. The results highlight the importance of explicit treatment of model structure error, especially in circumstances where subsequent decision-making and risk analysis require accurate prediction and uncertainty quantification. In addition, the data-driven error modelling approach is capable of extracting more information from observation data than using a groundwater model alone.

  8. Capturing changes in flood risk with Bayesian approaches for flood damage assessment

    Science.gov (United States)

    Vogel, Kristin; Schröter, Kai; Kreibich, Heidi; Thieken, Annegret; Müller, Meike; Sieg, Tobias; Laudan, Jonas; Kienzler, Sarah; Weise, Laura; Merz, Bruno; Scherbaum, Frank

    2016-04-01

    Flood risk is a function of hazard as well as of exposure and vulnerability. All three components change over space and time and have to be considered for reliable damage estimation and risk analysis, since these are the basis for efficient, adaptable risk management. Hitherto, models for estimating flood damage are comparatively simple and cannot sufficiently account for changing conditions. The Bayesian network approach allows for multivariate modeling of complex systems without relying on expert knowledge about physical constraints. In a Bayesian network, each model component is considered to be a random variable. The interactions between those variables can be learned from observations or be defined by expert knowledge; even a combination of both is possible. Moreover, the probabilistic framework captures uncertainties related to the prediction and provides a probability distribution for the damage instead of a point estimate. The graphical representation of Bayesian networks helps to study the change of probabilities under changing circumstances and may thus simplify the communication between scientists and public authorities. In the framework of the DFG Research Training Group "NatRiskChange" we aim to develop Bayesian networks for flood damage and vulnerability assessments of residential buildings and companies under changing conditions. A Bayesian network learned from data collected over the last 15 years in flooded regions in the Elbe and Danube catchments (Germany) reveals the impact of many variables, such as building characteristics, precaution and warning situation, on flood damage to residential buildings. While the handling of incomplete and hybrid (discrete mixed with continuous) data is the most challenging issue in the study on residential buildings, a similar study that focuses on the vulnerability of small to medium-sized companies bears new challenges. Relying on a much smaller data set for the determination of the model

  9. [Overcoming the limitations of the descriptive and categorical approaches in psychiatric diagnosis: a proposal based on Bayesian networks].

    Science.gov (United States)

    Sorias, Soli

    2015-01-01

    Efforts to overcome the problems of descriptive and categorical approaches have not yielded results. In the present article, psychiatric diagnosis using Bayesian networks is proposed. Instead of a yes/no decision, Bayesian networks give the probability of diagnostic category inclusion, thereby yielding both a graded, i.e., dimensional, diagnosis and a value of the certainty of the diagnosis. With the use of Bayesian networks in the diagnosis of mental disorders, information about etiology, associated features, treatment outcome, and laboratory results may be used in addition to clinical signs and symptoms, with each of these factors contributing proportionally to its own specificity and sensitivity. Furthermore, a diagnosis (albeit one with a lower probability) can be made even with incomplete, uncertain, or partially erroneous information, and patients whose symptoms are below the diagnostic threshold can be evaluated. Lastly, there is no need for NOS or "unspecified" categories, and comorbid disorders become different dimensions of the diagnostic evaluation. Bayesian diagnoses allow the preservation of current categories and assessment methods, and may be used concurrently with criteria-based diagnoses. Users need not put in extra effort except to collect more comprehensive information. Unlike the Research Domain Criteria (RDoC) project, the Bayesian approach neither increases the diagnostic validity of existing categories nor explains the pathophysiological mechanisms of mental disorders. It can, however, be readily integrated into present classification systems. Therefore, the Bayesian approach may be an intermediate phase between criteria-based diagnosis and the RDoC ideal.

  10. True versus apparent malaria infection prevalence: the contribution of a Bayesian approach.

    Directory of Open Access Journals (Sweden)

    Niko Speybroeck

    AIMS: To present a new approach for estimating the "true prevalence" of malaria and apply it to datasets from Peru, Vietnam, and Cambodia. METHODS: Bayesian models were developed for estimating both the malaria prevalence using different diagnostic tests (microscopy, PCR and ELISA), without the need of a gold standard, and the tests' characteristics. Several sources of information, i.e. data, expert opinions and other sources of knowledge, can be integrated into the model. This approach, resulting in an optimal and harmonized estimate of malaria infection prevalence with no conflict between the different sources of information, was tested on data from Peru, Vietnam and Cambodia. RESULTS: Malaria sero-prevalence was relatively low in all sites, with ELISA showing the highest estimates. The sensitivities of microscopy and ELISA were statistically lower in Vietnam than in the other sites. Similarly, the specificities of microscopy, ELISA and PCR were significantly lower in Vietnam than in the other sites. In Vietnam and Peru, microscopy was closer to the "true" estimate than the other two tests, while, as expected, ELISA, with its lower specificity, usually overestimated the prevalence. CONCLUSIONS: Bayesian methods are useful for analyzing prevalence results when no gold standard diagnostic test is available. Though some results are expected, e.g. PCR being more sensitive than microscopy, a standardized and context-independent quantification of the diagnostic tests' characteristics (sensitivity and specificity) and of the underlying malaria prevalence may be useful for comparing different sites. Indeed, the use of a single diagnostic technique could strongly bias the prevalence estimation. This limitation can be circumvented by using a Bayesian framework taking into account the imperfect characteristics of the currently available diagnostic tests. As discussed in the paper, this approach may further support global malaria burden estimation initiatives.
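
The relationship between apparent and true prevalence that such models build on can be illustrated with the (non-Bayesian) Rogan-Gladen correction, using hypothetical test characteristics; the full Bayesian model instead places priors on these quantities and infers them jointly:

```python
# Hypothetical test characteristics and observed (apparent) prevalence.
se, sp = 0.90, 0.95      # sensitivity and specificity
apparent = 0.12          # fraction of positive test results

# Apparent prevalence mixes true positives and false positives:
#   apparent = pi * se + (1 - pi) * (1 - sp)
# Solving for the true prevalence pi gives the Rogan-Gladen correction.
true_prev = (apparent + sp - 1.0) / (se + sp - 1.0)

# Forward check: the corrected prevalence reproduces the apparent one.
back = true_prev * se + (1.0 - true_prev) * (1.0 - sp)
```

With an imperfect specificity, some of the observed positives are false, so the corrected prevalence here is lower than the apparent one.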

  11. Comparison of dynamic Bayesian network approaches for online diagnosis of aircraft system

    Institute of Scientific and Technical Information of China (English)

    于劲松; 冯威; 唐荻音; 刘浩

    2016-01-01

    The online diagnosis of aircraft systems has always been a difficult problem, due to the time evolution of the system, the uncertainty of sensor measurements, and the real-time requirements of diagnostic inference. To address this problem, two dynamic Bayesian network (DBN) approaches are proposed. One approach prunes the DBN of the system and then uses a particle filter (PF) on this pruned DBN (PDBN) to perform online diagnosis. The problem is that estimates from a PF tend to have high variance for small sample sets, while using large sample sets is computationally expensive. The other approach compiles the PDBN into a dynamic arithmetic circuit (DAC) using an offline procedure that is applied only once, and then uses this circuit to provide online diagnosis recursively. This approach shifts most of the computational cost to the offline procedure. The experimental results show that the DAC, compared with the PF on the PDBN, not only provides more reliable online diagnosis, but also offers much faster inference.
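
A minimal bootstrap particle filter for a two-state fault model illustrates the PF side of the comparison (the model, probabilities and observations are entirely hypothetical; the paper's PDBN is far larger):

```python
import random

random.seed(1)

# Hidden state: 0 = healthy, 1 = faulty (absorbing). A faulty component
# produces a "high" sensor reading with higher probability.
P_FAIL = 0.05              # per-step transition probability healthy -> faulty
P_HIGH = {0: 0.2, 1: 0.8}  # P(high reading | state)

def pf_step(particles, high_reading):
    # Propagate each particle through the transition model...
    particles = [1 if s == 1 or random.random() < P_FAIL else 0
                 for s in particles]
    # ...weight by the observation likelihood...
    w = [P_HIGH[s] if high_reading else 1.0 - P_HIGH[s] for s in particles]
    # ...and resample proportionally to the weights.
    return random.choices(particles, weights=w, k=len(particles))

particles = [0] * 2000
for reading in [False, True, True, True]:
    particles = pf_step(particles, reading)

p_fault = sum(particles) / len(particles)  # estimated fault probability
```

Repeating this with a few dozen particles instead of 2000 makes the high-variance problem the abstract mentions directly visible.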

  12. Introducing a Bayesian Approach to Determining Degree of Fit With Existing Rorschach Norms.

    Science.gov (United States)

    Giromini, Luciano; Viglione, Donald J; McCullaugh, Joseph

    2015-01-01

    This article offers a new methodological approach to investigate the degree of fit between an independent sample and 2 existing sets of norms. Specifically, with a new adaptation of a Bayesian method, we developed a user-friendly procedure to compare the mean values of a given sample to those of 2 different sets of Rorschach norms. To illustrate our technique, we used a small, U.S. community sample of 80 adults and tested whether it resembled more closely the standard Comprehensive System norms (CS 600; Exner, 2003), or a recently introduced, internationally based set of Rorschach norms (Meyer, Erdberg, & Shaffer, 2007 ). Strengths and limitations of this new statistical technique are discussed.

  13. A Pseudo-Bayesian Approach to Sign-Compute-Resolve Slotted ALOHA

    DEFF Research Database (Denmark)

    Goseling, Jasper; Stefanovic, Cedomir; Popovski, Petar

    2015-01-01

    Access reservation based on slotted ALOHA is commonly used in wireless cellular access. In this paper we investigate its enhancement based on the use of physical-layer network coding and signature coding, whose main feature is enabling simultaneous resolution of up to K users contending for access, where K ≥ 1. We optimise the slot access probability such that the expected throughput is maximised. In particular, the slot access probability is chosen in line with an estimate of the number of users in the system that is obtained relying on the pseudo-Bayesian approach by Rivest, which we generalise...
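
Rivest's pseudo-Bayesian backlog estimation, which the paper generalises, can be sketched for the classical K = 1 case; the arrival rate and slot-outcome sequence below are assumptions for illustration:

```python
import math

# Assumed expected number of new arrivals per slot (hypothetical value).
LAM = 0.3

def update_backlog(n_hat, outcome):
    """Rivest's pseudo-Bayesian update of the estimated backlog n_hat."""
    if outcome in ("idle", "success"):
        return max(LAM, n_hat + LAM - 1.0)
    # Collision: the backlog estimate grows by LAM plus 1/(e - 2).
    return n_hat + LAM + 1.0 / (math.e - 2.0)

def access_probability(n_hat):
    # Each backlogged user transmits with probability min(1, 1/n_hat),
    # which targets one expected transmission per slot.
    return min(1.0, 1.0 / n_hat)

n_hat = 1.0
for outcome in ["collision", "collision", "success", "idle"]:
    n_hat = update_backlog(n_hat, outcome)
p = access_probability(n_hat)
```

Collisions push the estimate up and idle/success slots pull it down, so the access probability adapts to the (unobserved) contention level slot by slot.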

  14. A new approach for supply chain risk management: Mapping SCOR into Bayesian network

    Directory of Open Access Journals (Sweden)

    Mahdi Abolghasemi

    2015-01-01

    Purpose: Increasing costs and complexity in organizations, together with growing uncertainty and risk, have led managers to use risk management in order to decrease risk-taking and deviation from goals. SCRM has a close relationship with supply chain performance. Over the years, different methods have been used by researchers to manage supply chain risk, but most of them are either qualitative or quantitative. The supply chain operations reference (SCOR) model is a standard model for SCP evaluation which has uncertainty in its metrics. In this paper, by combining qualitative and quantitative metrics of SCOR, supply chain performance is measured with Bayesian networks. Design/methodology/approach: First, a qualitative assessment is done by recognizing uncertain metrics of the SCOR model; then, by quantifying them, supply chain performance is measured by Bayesian networks (BNs) and the SCOR model, in which decisions on uncertain variables are made through predictive and diagnostic capabilities. Findings: After applying the proposed method in one of the biggest automotive companies in Iran, we identified key factors of supply chain performance based on the SCOR model through the predictive and diagnostic capability of Bayesian networks. After sensitivity analysis, we find that ‘Total cost’ and its criteria, which include the costs of labor, warranty, transportation and inventory, have the widest range and the most effect on supply chain performance. So, managers should take their importance into account for decision making. Decisions can be made simply by running the model in different situations. Research limitations/implications: A more precise model would consist of numerous factors, but it is difficult and sometimes impossible to solve big models if we insert all of them in a Bayesian model. We have adapted real-world characteristics to our software and method abilities. On the other hand, fewer data exist for some

  15. An approach based on Hierarchical Bayesian Graphical Models for measurement interpretation under uncertainty

    Science.gov (United States)

    Skataric, Maja; Bose, Sandip; Zeroug, Smaine; Tilke, Peter

    2017-02-01

    It is not uncommon in the field of non-destructive evaluation that multiple measurements encompassing a variety of modalities are available for analysis and interpretation for determining the underlying states of nature of the materials or parts being tested. Despite and sometimes due to the richness of data, significant challenges arise in the interpretation manifested as ambiguities and inconsistencies due to various uncertain factors in the physical properties (inputs), environment, measurement device properties, human errors, and the measurement data (outputs). Most of these uncertainties cannot be described by any rigorous mathematical means, and modeling of all possibilities is usually infeasible for many real time applications. In this work, we will discuss an approach based on Hierarchical Bayesian Graphical Models (HBGM) for the improved interpretation of complex (multi-dimensional) problems with parametric uncertainties that lack usable physical models. In this setting, the input space of the physical properties is specified through prior distributions based on domain knowledge and expertise, which are represented as Gaussian mixtures to model the various possible scenarios of interest for non-destructive testing applications. Forward models are then used offline to generate the expected distribution of the proposed measurements which are used to train a hierarchical Bayesian network. In Bayesian analysis, all model parameters are treated as random variables, and inference of the parameters is made on the basis of posterior distribution given the observed data. Learned parameters of the posterior distribution obtained after the training can therefore be used to build an efficient classifier for differentiating new observed data in real time on the basis of pre-trained models. We will illustrate the implementation of the HBGM approach to ultrasonic measurements used for cement evaluation of cased wells in the oil industry.
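The offline-training / online-classification workflow described above can be sketched in miniature. The two states of nature, their single-Gaussian measurement models (a simplification of the Gaussian mixtures in the paper), and all numbers below are hypothetical:

```python
# Sketch: forward-model samples (here, draws from two hypothetical
# single-Gaussian "scenarios") train class-conditional densities offline;
# new measurements are then classified by posterior probability online.
import math
import random

random.seed(0)

def fit_gaussian(xs):
    m = sum(xs) / len(xs)
    v = sum((x - m) ** 2 for x in xs) / len(xs)
    return m, v

def gauss_pdf(x, m, v):
    return math.exp(-(x - m) ** 2 / (2 * v)) / math.sqrt(2 * math.pi * v)

# Offline: the forward model generates expected measurement distributions.
good_bond = [random.gauss(1.0, 0.2) for _ in range(2000)]  # hypothetical
poor_bond = [random.gauss(2.0, 0.3) for _ in range(2000)]
params = {"good": fit_gaussian(good_bond), "poor": fit_gaussian(poor_bond)}
prior = {"good": 0.5, "poor": 0.5}

def classify(x):
    """Posterior P(state | measurement x), normalised over the two states."""
    post = {s: prior[s] * gauss_pdf(x, *params[s]) for s in params}
    z = sum(post.values())
    return {s: p / z for s, p in post.items()}

post = classify(1.9)          # a new measurement near the "poor" mode
print(max(post, key=post.get))  # prints poor
```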

  16. Study on mapping Quantitative Trait Loci for animal complex binary traits using Bayesian-Markov chain Monte Carlo approach

    Institute of Scientific and Technical Information of China (English)

    LIU; Jianfeng; ZHANG; Yuan; ZHANG; Qin; WANG; Lixian; ZHANG; Jigang

    2006-01-01

    It is a challenging issue to map quantitative trait loci (QTL) underlying complex discrete traits, which usually show discontinuous distributions and carry less information, using conventional statistical methods. The Bayesian-Markov chain Monte Carlo (Bayesian-MCMC) approach is the key procedure in mapping QTL for complex binary traits; it provides a complete posterior distribution for the QTL parameters using all prior information. As a consequence, Bayesian estimates of all variables of interest can be obtained straightforwardly from their posterior samples simulated by the MCMC algorithm. In our study, the utility of Bayesian-MCMC is demonstrated using several simulated outbred full-sib animal families with different family structures for a complex binary trait underlain by both a QTL and polygenes. Under an identity-by-descent-based variance component random model, three MCMC-based samplers (Gibbs sampling, the Metropolis algorithm, and reversible-jump MCMC) were implemented to generate the joint posterior distribution of all unknowns, so that the QTL parameters were obtained by Bayesian statistical inference. The results showed that the Bayesian-MCMC approach works well and is robust under different family structures and QTL effects. As family size increases and the number of families decreases, the accuracy of the parameter estimates improves. When the true QTL has a small effect, an outbred-population experimental design with large family sizes is the optimal mapping strategy.
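Of the three samplers listed, the Metropolis algorithm is the simplest to sketch. The toy target below (a standard normal log-density) stands in for the actual QTL-model posterior; this illustrates the sampler, not the genetic model:

```python
# Sketch: random-walk Metropolis sampling from a toy posterior.
import math
import random

random.seed(1)

def log_post(theta):
    return -0.5 * theta ** 2  # stand-in for the real log-posterior

def metropolis(n_iter=20000, step=1.0):
    theta, chain = 0.0, []
    for _ in range(n_iter):
        prop = theta + random.gauss(0, step)
        # Accept with probability min(1, pi(prop)/pi(theta)).
        if math.log(random.random()) < log_post(prop) - log_post(theta):
            theta = prop
        chain.append(theta)
    return chain

chain = metropolis()
burn = chain[5000:]                       # discard burn-in
print(abs(sum(burn) / len(burn)) < 0.15)  # chain mean near the true mean 0
```

Gibbs sampling and reversible-jump MCMC replace the proposal step with, respectively, draws from full conditionals and moves that change model dimension; the accept/average skeleton stays the same.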

  17. How to interpret the results of medical time series data analysis: Classical statistical approaches versus dynamic Bayesian network modeling

    Science.gov (United States)

    Onisko, Agnieszka; Druzdzel, Marek J.; Austin, R. Marshall

    2016-01-01

    Background: Classical statistics is a well-established approach in the analysis of medical data. While the medical community seems to be familiar with the concept of a statistical analysis and its interpretation, the Bayesian approach, argued by many of its proponents to be superior to the classical frequentist approach, is still not well-recognized in the analysis of medical data. Aim: The goal of this study is to encourage data analysts to use the Bayesian approach, such as modeling with graphical probabilistic networks, as an insightful alternative to classical statistical analysis of medical data. Materials and Methods: This paper offers a comparison of two approaches to the analysis of medical time series data: (1) classical statistical methods, namely the Kaplan–Meier estimator and the Cox proportional hazards regression model, and (2) dynamic Bayesian network modeling. Our comparison is based on time series cervical cancer screening data collected at Magee-Womens Hospital, University of Pittsburgh Medical Center over 10 years. Results: The main outcomes of our comparison are the cervical cancer risk assessments produced by the three models. Our analysis also discusses several aspects of the comparison, such as modeling assumptions, model building, dealing with incomplete data, individualized risk assessment, interpretation of results, and model validation. Conclusion: Our study shows that the Bayesian approach (1) is much more flexible in terms of modeling effort, and (2) offers an individualized risk assessment, which is more cumbersome for classical statistical approaches. PMID:28163973
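On the classical side of the comparison, the Kaplan–Meier product-limit estimator is straightforward to sketch; the event times and censoring flags below are invented for illustration:

```python
# Sketch: a minimal Kaplan-Meier product-limit estimator.
def kaplan_meier(times, events):
    """Return [(t, S(t))] at distinct event times; events[i]=1 means event."""
    data = sorted(zip(times, events))
    n_at_risk, surv, curve = len(data), 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for tt, e in data if tt == t and e == 1)
        n_with_t = sum(1 for tt, _ in data if tt == t)
        if deaths:
            surv *= 1 - deaths / n_at_risk       # product-limit update
            curve.append((t, surv))
        n_at_risk -= n_with_t
        i += n_with_t
    return curve

times = [2, 3, 3, 5, 8, 8, 9]   # illustrative follow-up times
events = [1, 1, 0, 1, 1, 0, 0]  # 0 = censored observation
print(kaplan_meier(times, events))
```

The survival curve drops only at observed event times; censored subjects leave the risk set without producing a step, which is exactly the behavior the dynamic Bayesian network has to reproduce through its incomplete-data handling.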

  18. Uncertainty estimation of a complex water quality model: The influence of Box-Cox transformation on Bayesian approaches and comparison with a non-Bayesian method

    Science.gov (United States)

    Freni, Gabriele; Mannina, Giorgio

    In urban drainage modelling, uncertainty analysis is of undoubted necessity. However, uncertainty analysis in urban water-quality modelling is still in its infancy and only a few studies have been carried out. Several methodological aspects therefore still need to be explored and clarified, especially regarding water quality modelling. The use of the Bayesian approach for uncertainty analysis has been stimulated by its rigorous theoretical framework and by the possibility of evaluating the impact of new knowledge on the modelling predictions. Nevertheless, the Bayesian approach relies on some restrictive hypotheses that are not present in less formal methods such as the Generalised Likelihood Uncertainty Estimation (GLUE). One crucial point in the application of the Bayesian method is the formulation of a likelihood function, which is conditioned by the hypotheses made regarding the model residuals. Statistical transformations, such as the Box-Cox equation, are generally used to ensure the homoscedasticity of the residuals. However, this practice may affect the reliability of the analysis, leading to a wrong uncertainty estimation. The present paper aims to explore the influence of the Box-Cox equation for environmental water quality models. To this end, five cases were considered, one of which was the “real” residual distribution (i.e. drawn from available data). The analysis was applied to the Nocella experimental catchment (Italy), an agricultural and semi-urbanised basin where two sewer systems, two wastewater treatment plants and a river reach were monitored during both dry and wet weather periods. The results show that the uncertainty estimation is greatly affected by the residual transformation and that a wrong assumption may also affect the evaluation of model uncertainty. The less formal methods always provide an overestimation of modelling uncertainty with respect to the Bayesian method, but this effect is reduced if a wrong assumption is made regarding the
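The Box-Cox transformation at the centre of the paper is a one-line function. A sketch of the standard one-parameter form (with λ = 0 reducing to the logarithm) shows how it compresses a heteroscedastic scale; the residual values are illustrative:

```python
# Sketch: the one-parameter Box-Cox transformation applied to residuals
# to stabilise their variance before forming a likelihood.
import math

def box_cox(y, lam):
    """Box-Cox transform; requires y > 0."""
    if lam == 0:
        return math.log(y)
    return (y ** lam - 1) / lam

residual_scale = [1.0, 2.0, 4.0, 8.0]  # illustrative, growing spread
print([round(box_cox(y, 0.0), 3) for y in residual_scale])
# prints [0.0, 0.693, 1.386, 2.079]  (doublings become equal steps)
```

The choice of λ is exactly the "hypothesis made regarding the model residuals" the abstract warns about: a wrong λ changes the likelihood and hence the posterior uncertainty bands.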

  19. Profile-Based LC-MS data alignment--a Bayesian approach.

    Science.gov (United States)

    Tsai, Tsung-Heng; Tadesse, Mahlet G; Wang, Yue; Ressom, Habtom W

    2013-01-01

    A Bayesian alignment model (BAM) is proposed for alignment of liquid chromatography-mass spectrometry (LC-MS) data. BAM belongs to the category of profile-based approaches, which are composed of two major components: a prototype function and a set of mapping functions. Appropriate estimation of these functions is crucial for good alignment results. BAM uses Markov chain Monte Carlo (MCMC) methods to draw inference on the model parameters and improves on existing MCMC-based alignment methods through 1) the implementation of an efficient MCMC sampler and 2) an adaptive selection of knots. A block Metropolis-Hastings algorithm that mitigates the problem of the MCMC sampler getting stuck at local modes of the posterior distribution is used for the update of the mapping function coefficients. In addition, a stochastic search variable selection (SSVS) methodology is used to determine the number and positions of knots. We applied BAM to a simulated data set, an LC-MS proteomic data set, and two LC-MS metabolomic data sets, and compared its performance with the Bayesian hierarchical curve registration (BHCR) model, the dynamic time-warping (DTW) model, and the continuous profile model (CPM). The advantage of applying appropriate profile-based retention time correction prior to performing a feature-based approach is also demonstrated through the metabolomic data sets.

  20. A hierarchical Bayesian approach for reconstructing the initial mass function of single stellar populations

    Science.gov (United States)

    Dries, M.; Trager, S. C.; Koopmans, L. V. E.

    2016-11-01

    Recent studies based on the integrated light of distant galaxies suggest that the initial mass function (IMF) might not be universal. Variations of the IMF with galaxy type and/or formation time may have important consequences for our understanding of galaxy evolution. We have developed a new stellar population synthesis (SPS) code specifically designed to reconstruct the IMF. We implement a novel approach combining regularization with hierarchical Bayesian inference. Within this approach, we use a parametrized IMF prior to regulate a direct inference of the IMF. This direct inference gives more freedom to the IMF and allows the model to deviate from parametrized models when demanded by the data. We use Markov chain Monte Carlo sampling techniques to reconstruct the best parameters for the IMF prior, the age and the metallicity of a single stellar population. We present our code and apply our model to a number of mock single stellar populations with different ages, metallicities and IMFs. When systematic uncertainties are not significant, we are able to reconstruct the input parameters that were used to create the mock populations. Our results show that if systematic uncertainties do play a role, this may introduce a bias on the results. Therefore, it is important to objectively compare different ingredients of SPS models. Through its Bayesian framework, our model is well suited for this.

  1. Denoising peptide tandem mass spectra for spectral libraries: a Bayesian approach.

    Science.gov (United States)

    Shao, Wenguang; Lam, Henry

    2013-07-05

    With the rapid accumulation of data from shotgun proteomics experiments, it has become feasible to build comprehensive and high-quality spectral libraries of tandem mass spectra of peptides. A spectral library condenses experimental data into a retrievable format and can be used to aid peptide identification by spectral library searching. A key step in spectral library building is spectrum denoising, which is best accomplished by merging multiple replicates of the same peptide ion into a consensus spectrum. However, this approach cannot be applied to "singleton spectra," for which only one observed spectrum is available for the peptide ion. We developed a method, based on a Bayesian classifier, for denoising peptide tandem mass spectra. The classifier accounts for relationships between peaks, and can be trained on the fly from consensus spectra and immediately applied to denoise singleton spectra, without hard-coded knowledge about peptide fragmentation. A linear regression model was also trained to predict the number of useful "signal" peaks in a spectrum, thereby obviating the need for arbitrary thresholds for peak filtering. This Bayesian approach accumulates weak evidence systematically to boost the discrimination power between signal and noise peaks, and produces readily interpretable conditional probabilities that offer valuable insights into peptide fragmentation behaviors. By cross validation, spectra denoised by this method were shown to retain more signal peaks, and have higher spectral similarities to replicates, than those filtered by intensity only.
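The idea of accumulating weak evidence systematically can be sketched as a naive-Bayes score over binary peak features; the feature names, likelihoods, and prior below are illustrative stand-ins, not the trained values from the paper:

```python
# Sketch: scoring a spectrum peak as signal vs. noise by multiplying
# weak per-feature evidence (naive-Bayes assumption).
def posterior_signal(features, likelihoods, prior_signal=0.3):
    """P(signal | features) for binary features under naive Bayes."""
    p_s, p_n = prior_signal, 1 - prior_signal
    for f, present in features.items():
        ps, pn = likelihoods[f]           # P(f | signal), P(f | noise)
        p_s *= ps if present else 1 - ps
        p_n *= pn if present else 1 - pn
    return p_s / (p_s + p_n)

likelihoods = {  # hypothetical relationships between peaks
    "has_complement_peak": (0.8, 0.1),    # e.g. matching b/y ion present
    "high_rank_intensity": (0.7, 0.3),
}
feats = {"has_complement_peak": True, "high_rank_intensity": True}
print(round(posterior_signal(feats, likelihoods), 3))  # prints 0.889
```

Each feature alone is weak evidence, but two features together lift a 0.3 prior to nearly 0.9, which is the discrimination-boosting effect the abstract describes.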

  2. A hierarchical Bayesian-MAP approach to inverse problems in imaging

    Science.gov (United States)

    Raj, Raghu G.

    2016-07-01

    We present a novel approach to inverse problems in imaging based on a hierarchical Bayesian-MAP (HB-MAP) formulation. In this paper we specifically focus on the difficult and basic inverse problem of multi-sensor (tomographic) imaging wherein the source object of interest is viewed from multiple directions by independent sensors. Given the measurements recorded by these sensors, the problem is to reconstruct the image (of the object) with a high degree of fidelity. We employ a probabilistic graphical modeling extension of the compound Gaussian distribution as a global image prior into a hierarchical Bayesian inference procedure. Since the prior employed by our HB-MAP algorithm is general enough to subsume a wide class of priors, including those typically employed in compressive sensing (CS) algorithms, the HB-MAP algorithm offers a vehicle to extend the capabilities of current CS algorithms to include truly global priors. After rigorously deriving the regression algorithm for solving our inverse problem from first principles, we demonstrate the performance of the HB-MAP algorithm on Monte Carlo trials and on real empirical data (natural scenes). In all cases we find that our algorithm outperforms previous approaches in the literature, including filtered back-projection and a variety of state-of-the-art CS algorithms. We conclude with directions of future research emanating from this work.

  3. Cassini Radio Occultation of Saturn's Rings: a Bayesian Approach to Particle Size Distribution Recovery

    Science.gov (United States)

    Wong, K. K.; Marouf, E. A.

    2004-12-01

    The radio occultation technique was first used to study Saturn's rings through their effects on quasi-monochromatic radio signals transmitted from Voyager 1 during its flyby of Saturn in 1980. Almost a quarter of a century later, Cassini is planned to conduct a more extensive set of radio occultation experiments during its tour of the Saturn system. Cassini enjoys the advantage of a wide range of ring viewing geometry as well as the unique new capability of simultaneously transmitting 0.94, 3.6 and 13 cm-wavelength coherent radio signals (Ka-, X-, and S-band, respectively). Observed extinction of the direct signal and time-sequence spectra (spectrogram) of the near-forward scattered signal can be used to infer the size distribution of particles of resolved ring features (among other objectives). The inference requires solving three distinct inversion problems to recover from the measurements: i) the multiply-scattered collective diffraction lobe of a resolved ring feature, ii) the first-order scattering contribution to the collective lobe, and iii) the corresponding particle size distribution. Although various classical regularization techniques may be used for this purpose, a subjective valuation of solution smoothness usually needs to be introduced. We investigate an alternative approach based on Bayesian function learning schemes which provides a rigorous probabilistic framework to address the tradeoff between data fit residuals and prior knowledge about the character of the solution. In contrast with the regularization approach, the Bayesian approach provides estimates of confidence intervals for the most-likely solution achieved, an important advantage. The approach is particularly adaptable to some Cassini occultations of relatively unfavorable alignment between contours of constant Doppler shift in the ring plane and circular boundaries of ring features, as the approach naturally "fuses" time-sequence of spectra each containing contributions from adjacent
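The advantage named above, confidence intervals alongside the most-likely solution, already appears in the simplest conjugate Gaussian inversion: the posterior supplies a standard deviation for free, where a classical regularized fit would return only a point estimate. The single-coefficient sketch below uses invented numbers, not Cassini data:

```python
# Sketch: conjugate Bayesian estimation of one linear coefficient
# a in y = a*x + N(0, noise_var), returning mean AND uncertainty.
import math

def bayes_linear_coeff(xs, ys, noise_var=0.25, prior_mean=0.0, prior_var=100.0):
    """Posterior mean and standard deviation of the coefficient a."""
    prec = 1 / prior_var + sum(x * x for x in xs) / noise_var
    mean = (prior_mean / prior_var
            + sum(x * y for x, y in zip(xs, ys)) / noise_var) / prec
    return mean, math.sqrt(1 / prec)

xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.2, 7.8]          # roughly y = 2x, illustrative data
mean, std = bayes_linear_coeff(xs, ys)
print(round(mean, 2), round(std, 3))  # prints 1.99 0.091
```

The same precision-weighting structure, generalized to function learning over a size distribution, is what lets the Bayesian approach quantify the tradeoff between data-fit residuals and prior smoothness.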

  4. Bayesian and Empirical Bayes Approaches to Setting Passing Scores on Mastery Tests. Publication Series in Mastery Testing.

    Science.gov (United States)

    Huynh, Huynh; Saunders, Joseph C., III

    The Bayesian approach to setting passing scores, as proposed by Swaminathan, Hambleton, and Algina, is compared with the empirical Bayes approach to the same problem that is derived from Huynh's decision-theoretic framework. Comparisons are based on simulated data which follow an approximate beta-binomial distribution and on real test results from…

  5. Adaptive Methods within a Sequential Bayesian Approach for Structural Health Monitoring

    Science.gov (United States)

    Huff, Daniel W.

    Structural integrity is an important characteristic of performance for critical components used in applications such as aeronautics, materials, construction and transportation. When appraising the structural integrity of these components, evaluation methods must be accurate. In addition to possessing capability to perform damage detection, the ability to monitor the level of damage over time can provide extremely useful information in assessing the operational worthiness of a structure and in determining whether the structure should be repaired or removed from service. In this work, a sequential Bayesian approach with active sensing is employed for monitoring crack growth within fatigue-loaded materials. The monitoring approach is based on predicting crack damage state dynamics and modeling crack length observations. Since fatigue loading of a structural component can change while in service, an interacting multiple model technique is employed to estimate probabilities of different loading modes and incorporate this information in the crack length estimation problem. For the observation model, features are obtained from regions of high signal energy in the time-frequency plane and modeled for each crack length damage condition. Although this observation model approach exhibits high classification accuracy, the resolution characteristics can change depending upon the extent of the damage. Therefore, several different transmission waveforms and receiver sensors are considered to create multiple modes for making observations of crack damage. Resolution characteristics of the different observation modes are assessed using a predicted mean squared error criterion and observations are obtained using the predicted, optimal observation modes based on these characteristics. Calculation of the predicted mean square error metric can be computationally intensive, especially if performed in real time, and an approximation method is proposed. With this approach, the real time

  6. Précis of bayesian rationality: The probabilistic approach to human reasoning.

    Science.gov (United States)

    Oaksford, Mike; Chater, Nick

    2009-02-01

    According to Aristotle, humans are the rational animal. The borderline between rationality and irrationality is fundamental to many aspects of human life including the law, mental health, and language interpretation. But what is it to be rational? One answer, deeply embedded in the Western intellectual tradition since ancient Greece, is that rationality concerns reasoning according to the rules of logic--the formal theory that specifies the inferential connections that hold with certainty between propositions. Piaget viewed logical reasoning as defining the end-point of cognitive development; and contemporary psychology of reasoning has focussed on comparing human reasoning against logical standards. Bayesian Rationality argues that rationality is defined instead by the ability to reason about uncertainty. Although people are typically poor at numerical reasoning about probability, human thought is sensitive to subtle patterns of qualitative Bayesian, probabilistic reasoning. In Chapters 1-4 of Bayesian Rationality (Oaksford & Chater 2007), the case is made that cognition in general, and human everyday reasoning in particular, is best viewed as solving probabilistic, rather than logical, inference problems. In Chapters 5-7 the psychology of "deductive" reasoning is tackled head-on: It is argued that purportedly "logical" reasoning problems, revealing apparently irrational behaviour, are better understood from a probabilistic point of view. Data from conditional reasoning, Wason's selection task, and syllogistic inference are captured by recasting these problems probabilistically. The probabilistic approach makes a variety of novel predictions which have been experimentally confirmed. The book considers the implications of this work, and the wider "probabilistic turn" in cognitive science and artificial intelligence, for understanding human rationality.

  7. Crash risk analysis for Shanghai urban expressways: A Bayesian semi-parametric modeling approach.

    Science.gov (United States)

    Yu, Rongjie; Wang, Xuesong; Yang, Kui; Abdel-Aty, Mohamed

    2016-10-01

    Urban expressway systems have developed rapidly in recent years in China and have become a key part of city roadway networks, carrying large traffic volumes and providing high travel speeds. Along with the increase in traffic volume, traffic safety has become a major issue for Chinese urban expressways due to frequent crash occurrence and the non-recurrent congestion it causes. To unveil crash occurrence mechanisms and to develop Active Traffic Management (ATM) control strategies that improve traffic safety, this study developed disaggregate crash risk analysis models using loop detector traffic data and historical crash data. Bayesian random effects logistic regression models were utilized because they can account for unobserved heterogeneity among crashes. However, previous crash risk analysis studies formulated the random effects distributions parametrically, assuming they follow normal distributions. Because little is known about the random effects distributions, such a subjective parametric setting may be incorrect. To construct more flexible and robust random effects that capture the unobserved heterogeneity, a Bayesian semi-parametric inference technique was introduced to crash risk analysis in this study. Models with both inference techniques were developed for total crashes; the semi-parametric models provided substantially better goodness-of-fit, while the two models shared consistent coefficient estimates. Bayesian semi-parametric random effects logistic regression models were then developed for weekday peak hour crashes, weekday non-peak hour crashes, and weekend non-peak hour crashes to investigate different crash occurrence scenarios. Significant factors that affect crash risk were revealed and crash occurrence mechanisms were characterized.

  8. Default Bayesian analysis for multi-way tables: a data-augmentation approach

    CERN Document Server

    Polson, Nicholas G

    2011-01-01

    This paper proposes a strategy for regularized estimation in multi-way contingency tables, which are common in meta-analyses and multi-center clinical trials. Our approach is based on data augmentation, and appeals heavily to a novel class of Polya-Gamma distributions. Our main contributions are to build up the relevant distributional theory and to demonstrate three useful features of this data-augmentation scheme. First, it leads to simple EM and Gibbs-sampling algorithms for posterior inference, circumventing the need for analytic approximations, numerical integration, Metropolis-Hastings, or variational methods. Second, it allows modelers much more flexibility when choosing priors, which have traditionally come from the Dirichlet or logistic-normal family. For example, our approach allows users to incorporate Bayesian analogues of classical penalized-likelihood techniques (e.g. the lasso or bridge) in computing regularized estimates for log-odds ratios. Finally, our data-augmentation scheme naturally sugg...

  9. A hierarchical Bayesian approach for reconstructing the Initial Mass Function of Single Stellar Populations

    CERN Document Server

    Dries, M; Koopmans, L V E

    2016-01-01

    Recent studies based on the integrated light of distant galaxies suggest that the initial mass function (IMF) might not be universal. Variations of the IMF with galaxy type and/or formation time may have important consequences for our understanding of galaxy evolution. We have developed a new stellar population synthesis (SPS) code specifically designed to reconstruct the IMF. We implement a novel approach combining regularization with hierarchical Bayesian inference. Within this approach we use a parametrized IMF prior to regulate a direct inference of the IMF. This direct inference gives more freedom to the IMF and allows the model to deviate from parametrized models when demanded by the data. We use Markov Chain Monte Carlo sampling techniques to reconstruct the best parameters for the IMF prior, the age, and the metallicity of a single stellar population. We present our code and apply our model to a number of mock single stellar populations with different ages, metallicities, and IMFs. When systematic unc...

  10. Fast SAR Image Change Detection Using Bayesian Approach Based Difference Image and Modified Statistical Region Merging

    Directory of Open Access Journals (Sweden)

    Han Zhang

    2014-01-01

    Full Text Available A novel fast SAR image change detection method is presented in this paper. Based on a Bayesian approach, the prior information that speckle follows the Nakagami distribution is incorporated into the difference image (DI) generation process. The new DI performs much better than the familiar log-ratio (LR) DI as well as the cumulant-based Kullback-Leibler divergence (CKLD) DI. The statistical region merging (SRM) approach is introduced to the change detection context for the first time. A new clustering procedure with the region variance as the statistical inference variable is presented, tailored to SAR image change detection, with only two classes in the final map: the unchanged and changed classes. The most prominent advantages of the proposed modified SRM (MSRM) method are its ability to cope with noise corruption and its fast implementation. Experimental results show that the proposed method is superior in both change detection accuracy and operational efficiency.
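The log-ratio (LR) difference image that the proposed DI is compared against is simple to sketch; the two 2x2 "images" below are illustrative intensities, not real SAR data:

```python
# Sketch: the baseline log-ratio difference image for a pair of
# co-registered SAR intensity images; large values flag likely change.
import math

def log_ratio_di(img1, img2, eps=1e-6):
    """|log(I2/I1)| per pixel; eps guards against zero intensities."""
    return [[abs(math.log((b + eps) / (a + eps)))
             for a, b in zip(row1, row2)]
            for row1, row2 in zip(img1, img2)]

before = [[10.0, 10.0], [10.0, 10.0]]
after = [[10.0, 80.0], [10.0, 10.0]]   # one changed pixel
di = log_ratio_di(before, after)
print([[round(v, 2) for v in row] for row in di])
# prints [[0.0, 2.08], [0.0, 0.0]]
```

The paper's Bayesian DI replaces this generic ratio with one derived from the Nakagami speckle model, and the MSRM clustering then splits the DI into the two final classes.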

  11. A Bayesian belief network approach for assessing uncertainty in conceptual site models at contaminated sites

    DEFF Research Database (Denmark)

    Thomsen, Nanna Isbak; Binning, Philip John; McKnight, Ursula S.;

    2016-01-01

    to be a major source of model error and it should therefore be accounted for when evaluating uncertainties in risk assessments. We present a Bayesian belief network (BBN) approach for constructing CSMs and assessing their uncertainty at contaminated sites. BBNs are graphical probabilistic models...... that are effective for integrating quantitative and qualitative information, and thus can strengthen decisions when empirical data are lacking. The proposed BBN approach facilitates a systematic construction of multiple CSMs, and then determines the belief in each CSM using a variety of data types and/or expert...... with chlorinated ethenes. Four different CSMs are developed by combining two contaminant source zone interpretations (presence or absence of a separate phase contamination) and two geological interpretations (fractured or unfractured clay till). The beliefs in each of the CSMs are assessed sequentially based...

  12. Heart rate variability estimation in photoplethysmography signals using Bayesian learning approach

    Science.gov (United States)

    Alwosheel, Ahmad; Alasaad, Amr

    2016-01-01

    Heart rate variability (HRV) has become a marker for various health and disease conditions. Photoplethysmography (PPG) sensors integrated in wearable devices such as smart watches and phones are widely used to measure heart activity. HRV requires accurate estimation of the time interval between consecutive peaks in the PPG signal. However, the PPG signal is very sensitive to motion artefacts, which may lead to poor HRV estimation if false peaks are detected. In this Letter, the authors propose a probabilistic approach based on Bayesian learning to better estimate HRV from PPG signals recorded by wearable devices and to enhance the performance of the automatic multiscale-based peak detection (AMPD) algorithm used for peak detection. The authors’ experiments show that their approach enhances the performance of the AMPD algorithm in terms of several HRV-related metrics, such as sensitivity, positive predictive value, and average temporal resolution. PMID:27382483

  13. Integrated survival analysis using an event-time approach in a Bayesian framework

    Science.gov (United States)

    Walsh, Daniel P.; Dreitz, VJ; Heisey, Dennis M.

    2015-01-01

    Event-time or continuous-time statistical approaches have been applied throughout the biostatistical literature and have led to numerous scientific advances. However, these techniques have traditionally relied on knowing failure times. This has limited the application of these analyses, particularly within the ecological field, where the fates of marked animals may be unknown. To address these limitations, we developed an integrated approach within a Bayesian framework to estimate hazard rates in the face of unknown fates. We combine failure/survival times from individuals whose fates are known, including those whose times are interval-censored, with information from those whose fates are unknown, and we model the process of detecting animals with unknown fates. This provides the foundation for our integrated model and permits the necessary parameter estimation. We provide the Bayesian model and its derivation, and use simulation techniques to investigate the properties and performance of our approach under several scenarios. Lastly, we apply our estimation technique using a piecewise-constant hazard function to investigate the effects of year, age, chick size and sex, sex of the tending adult, and nesting habitat on the mortality hazard rates of endangered mountain plover (Charadrius montanus) chicks. Traditional models were inappropriate for this analysis because the fates of some individual chicks were unknown due to failed radio transmitters. Simulations revealed that biases of posterior mean estimates were minimal (≤ 4.95%), and posterior distributions behaved as expected, with the RMSE of the estimates decreasing as sample size, detection probability, and survival increased. We determined that mortality hazard rates for plover chicks were highest at <5 days old and were lower for chicks with larger birth weights and/or whose nest was within agricultural habitats. Based on its performance, our approach greatly expands the range of problems for which event-time analyses can be used by eliminating the
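The piecewise-constant hazard function used in the application can be sketched directly: survival to time t is the exponential of the negative cumulative hazard. The breakpoints and rates below are invented, chosen only to mimic the reported pattern of highest hazard before 5 days of age:

```python
# Sketch: S(t) under a piecewise-constant hazard, where hazard equals
# rates[i] on the interval [breaks[i], breaks[i+1]).
import math

def survival(t, breaks, rates):
    """Survival probability S(t) = exp(-cumulative hazard up to t)."""
    cum = 0.0
    for lo, hi, r in zip(breaks, breaks[1:] + [float("inf")], rates):
        if t <= lo:
            break
        cum += r * (min(t, hi) - lo)
    return math.exp(-cum)

breaks = [0.0, 5.0, 20.0]   # chick age (days); illustrative breakpoints
rates = [0.10, 0.02, 0.01]  # hazard highest before 5 days, as in the study
print(round(survival(10.0, breaks, rates), 3))  # prints 0.549
```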

  14. A unifying probabilistic Bayesian approach to derive electron density from MRI for radiation therapy treatment planning

    Science.gov (United States)

    Sudhan Reddy Gudur, Madhu; Hara, Wendy; Le, Quynh-Thu; Wang, Lei; Xing, Lei; Li, Ruijiang

    2014-11-01

    MRI significantly improves the accuracy and reliability of target delineation in radiation therapy for certain tumors due to its superior soft tissue contrast compared to CT. A treatment planning process with MRI as the sole imaging modality will eliminate systematic CT/MRI co-registration errors, reduce cost and radiation exposure, and simplify clinical workflow. However, MRI lacks the key electron density information necessary for accurate dose calculation and generating reference images for patient setup. The purpose of this work is to develop a unifying method to derive electron density from standard T1-weighted MRI. We propose to combine both intensity and geometry information into a unifying probabilistic Bayesian framework for electron density mapping. For each voxel, we compute two conditional probability density functions (PDFs) of electron density given its: (1) T1-weighted MRI intensity, and (2) geometry in a reference anatomy, obtained by deformable image registration between the MRI of the atlas and test patient. The two conditional PDFs containing intensity and geometry information are combined into a unifying posterior PDF, whose mean value corresponds to the optimal electron density value under the mean-square error criterion. We evaluated the algorithm’s accuracy of electron density mapping and its ability to detect bone in the head for eight patients, using an additional patient as the atlas or template. Mean absolute HU error between the estimated and true CT, as well as receiver operating characteristics for bone detection (HU > 200), were calculated. The performance was compared with a global intensity approach based on T1 and no density correction (set whole head to water). The proposed technique significantly reduced the errors in electron density estimation, with a mean absolute HU error of 126, compared with 139 for deformable registration (p = 2 × 10⁻⁴), 283 for the intensity approach (p = 2 × 10⁻⁶) and 282 without density
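The voxel-wise fusion can be sketched on a discretized grid: multiply the two conditional PDFs (which implicitly assumes conditional independence and a flat prior, a simplification of the paper's framework), normalize, and take the posterior mean as the MSE-optimal estimate. All numbers below are illustrative:

```python
import math

# Hedged sketch: fuse two conditional PDFs of electron density, one from
# MRI intensity and one from atlas geometry, on a grid of HU-like values.

def gauss(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

grid = [10.0 * i for i in range(-20, 121)]               # candidate density values
pdf_intensity = [gauss(x, 300.0, 150.0) for x in grid]   # p(density | T1 intensity)
pdf_geometry = [gauss(x, 500.0, 100.0) for x in grid]    # p(density | atlas geometry)

posterior = [a * b for a, b in zip(pdf_intensity, pdf_geometry)]
norm = sum(posterior)
posterior = [p / norm for p in posterior]

# Posterior mean = optimal estimate under the mean-square-error criterion.
post_mean = sum(x * p for x, p in zip(grid, posterior))
print(round(post_mean, 1))
```

Note the precision-weighted behavior: the fused estimate lies between the two conditional means, pulled toward the sharper (geometry) PDF.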

  15. Dynamic probability evaluation of safety levels of earth-rockfill dams using Bayesian approach

    Directory of Open Access Journals (Sweden)

    Zi-wu FAN

    2009-06-01

    In order to accurately predict and control the aging process of dams, new information should be collected continuously to renew the quantitative evaluation of dam safety levels. Owing to the complex structural characteristics of dams, it is quite difficult to predict the time-varying factors affecting their safety levels, and it is not feasible to employ dynamic reliability indices to evaluate the actual safety levels of dams. Based on the relevant regulations for dam safety classification in China, a dynamic probability description of dam safety levels was developed. Using the Bayesian approach, effective information mining, and real-time information, this study achieved a more rational evaluation and prediction of dam safety levels. With the Bayesian expression of discrete stochastic variables, the a priori probabilities of the dam safety levels determined by experts were combined with the likelihood probability of the real-time check information, and the probability information for the evaluation of dam safety levels was renewed. The probability index was then applied to dam rehabilitation decision-making. This method helps reduce the difficulty and uncertainty of the evaluation of dam safety levels and complies with the current safe decision-making regulations for dams in China. It also enhances the application of current risk analysis methods for dam safety levels.
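The discrete update described above (expert prior combined with the likelihood of new inspection information) can be sketched in a few lines; the safety levels, prior, and likelihood values below are illustrative, not from the paper:

```python
# Minimal sketch of a discrete Bayesian renewal of dam safety-level
# probabilities. Prior and likelihood numbers are hypothetical.

levels = ["A (safe)", "B (basically safe)", "C (unsafe)"]
prior = [0.70, 0.25, 0.05]        # expert-assigned prior P(level)
likelihood = [0.10, 0.45, 0.90]   # P(observed inspection anomaly | level)

joint = [p * l for p, l in zip(prior, likelihood)]
posterior = [j / sum(joint) for j in joint]

for name, p in zip(levels, posterior):
    print(f"{name}: {p:.3f}")
```

The anomalous observation shifts probability mass away from the "safe" level toward the less safe ones, which is exactly the renewal step the abstract describes.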

  16. A Bayesian approach to the semi-analytic model of galaxy formation: methodology

    Science.gov (United States)

    Lu, Yu; Mo, H. J.; Weinberg, Martin D.; Katz, Neal

    2011-09-01

    We believe that a wide range of physical processes conspire to shape the observed galaxy population, but we remain unsure of their detailed interactions. The semi-analytic model (SAM) of galaxy formation uses multidimensional parametrizations of the physical processes of galaxy formation and provides a tool to constrain these underlying physical interactions. Because of the high dimensionality, the parametric problem of galaxy formation may be profitably tackled with a Bayesian-inference-based approach, which allows one to constrain theory with data in a statistically rigorous way. In this paper, we develop a SAM in the framework of Bayesian inference. We show that, with a parallel implementation of an advanced Markov chain Monte Carlo algorithm, it is now possible to rigorously sample the posterior distribution of the high-dimensional parameter space of typical SAMs. As an example, we characterize galaxy formation in the current Λ cold dark matter cosmology using the stellar mass function of galaxies as an observational constraint. We find that the posterior probability distribution is both topologically complex and degenerate in some important model parameters, suggesting that thorough explorations of the parameter space are needed to understand the models. We also demonstrate that because of the model degeneracy, adopting a narrow prior strongly restricts the model. Therefore, the inferences based on SAMs are conditional to the model adopted. Using synthetic data to mimic systematic errors in the stellar mass function, we demonstrate that an accurate observational error model is essential to meaningful inference.
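At the core of such posterior sampling is an accept/reject MCMC step; a minimal Metropolis sampler on a toy one-dimensional posterior (a stand-in for the parallel, advanced MCMC algorithm the authors use) looks like:

```python
import math
import random

# Metropolis sampling of a toy 1-D posterior (standard normal),
# illustrating the accept/reject kernel behind MCMC posterior sampling.

def log_post(theta):
    return -0.5 * theta * theta  # log density up to a constant

random.seed(42)
theta, samples = 0.0, []
for _ in range(20000):
    prop = theta + random.gauss(0.0, 1.0)            # symmetric proposal
    if math.log(random.random()) < log_post(prop) - log_post(theta):
        theta = prop                                  # accept
    samples.append(theta)                             # else keep current state

burned = samples[5000:]                               # discard burn-in
mean = sum(burned) / len(burned)
var = sum((s - mean) ** 2 for s in burned) / len(burned)
print(round(mean, 2), round(var, 2))
```

For a standard-normal target the sample mean and variance should land near 0 and 1; real SAM posteriors are high-dimensional and multimodal, which is why the paper needs parallel, more advanced samplers.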

  17. Forecasting Rainfall Time Series with stochastic output approximated by neural networks Bayesian approach

    Directory of Open Access Journals (Sweden)

    Cristian Rodriguez Rivero

    2014-07-01

    The annual estimate of the amount of water available to the agricultural sector has become vital in places where rainfall is scarce, as is the case in northwestern Argentina. This work models and simulates monthly rainfall time series from one geographical location in Catamarca, Valle El Viejo Portezuelo. The task is the mathematical and computational modelling of monthly cumulative rainfall series, whose stochastic output is approximated by neural networks under a Bayesian approach. We propose an algorithm based on artificial neural networks (ANNs) trained with Bayesian inference. The prediction is validated on 20% of the available data, which span 2000 to 2010. A new analysis for the modelling, simulation and computational prediction of cumulative rainfall at one geographical location is presented. Only the historical time series of daily flows, measured in mmH2O, is used as input data. Preliminary results of the annual forecast in mmH2O with a prediction horizon of one and a half years (18 months) are presented. The methodology employs artificial-neural-network tools, statistical analysis and computation to complete the missing information and to characterize the qualitative and quantitative behaviour. Preliminary results with different prediction horizons of the proposed filter are also shown, together with a comparison against the performance of the Gaussian process filter used in the literature.

  18. Assessment of successful smoking cessation by psychological factors using the Bayesian network approach.

    Science.gov (United States)

    Yang, Xiaorong; Li, Suyun; Pan, Lulu; Wang, Qiang; Li, Huijie; Han, Mingkui; Zhang, Nan; Jiang, Fan; Jia, Chongqi

    2016-07-01

    The association between psychological factors and smoking cessation is complicated and inconsistent in published research, and the joint effect of psychological factors on smoking cessation is unclear. This study explored how psychological factors jointly affect the success of smoking cessation using a Bayesian network approach. A community-based case-control study was designed with 642 adult males who successfully quit smoking as the cases, and 700 adult males who failed to quit as the controls. General self-efficacy (GSE), trait coping style (positive-trait coping style (PTCS) and negative-trait coping style (NTCS)) and self-rating anxiety (SA) were evaluated by the GSE Scale, the Trait Coping Style Questionnaire and the SA Scale, respectively. A Bayesian network was applied to evaluate the relationship between psychological factors and successful smoking cessation. The local conditional probability table for smoking cessation indicated that different joint configurations of psychological factors led to different cessation outcomes. Among smokers with high PTCS, high NTCS and low SA, only 36.40% successfully quit smoking. However, among smokers with low pack-years of smoking, high GSE, high PTCS and high SA, 63.64% successfully quit. Our study indicates that psychological factors jointly influence smoking cessation outcomes. Practical tobacco-control interventions should therefore be tailored to these different joint psychological profiles.

  19. Estimation of under-reported visceral Leishmaniasis (VL) cases in Bihar: a Bayesian approach

    Directory of Open Access Journals (Sweden)

    A Ranjan

    2013-12-01

    Background: Visceral leishmaniasis (VL) is a major health problem in the state of Bihar and adjoining areas in India. In the absence of any active surveillance mechanism for the disease, there appears to be gross under-reporting of VL cases. Objective: The objective of this study was to estimate the extent of under-reporting of VL cases in Bihar using a pooled analysis of published papers. Method: We calculated the pooled common ratio (RRMH) based on three studies and combined it with a prior distribution of the ratio using the inverse-variance weighting method. A Bayesian method was used to estimate the posterior distribution of the “under-reporting factor” (the ratio of unreported to reported cases). Results: The posterior distribution of the ratio of unreported to reported cases yielded a mean of 3.558, with 95% posterior limits of 2.81 and 4.50. Conclusion: The Bayesian approach provides evidence that the total number of VL cases in the state may be more than three times the currently reported figure.
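The combination step can be sketched as a normal-normal update on the log-ratio scale via inverse-variance weighting; every number below (prior mean, standard errors) is hypothetical, chosen only to illustrate the mechanics, and does not reproduce the study's inputs:

```python
import math

# Conjugate normal-normal combination of a prior estimate of the
# under-reporting ratio with a pooled data estimate, by
# inverse-variance weighting on the log scale. Inputs are hypothetical.

def combine(mu_prior, se_prior, mu_data, se_data):
    w1, w2 = 1 / se_prior**2, 1 / se_data**2
    mu = (w1 * mu_prior + w2 * mu_data) / (w1 + w2)
    se = math.sqrt(1 / (w1 + w2))
    return mu, se

log_mu, log_se = combine(math.log(4.0), 0.30, math.log(3.3), 0.15)
ratio = math.exp(log_mu)
lo = math.exp(log_mu - 1.96 * log_se)
hi = math.exp(log_mu + 1.96 * log_se)
print(round(ratio, 2), round(lo, 2), round(hi, 2))  # roughly 3.43 (2.64, 4.46)
```

The posterior standard error is smaller than either input's, as expected when two consistent sources of information are pooled.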

  1. A Bayesian approach to inferring the genetic population structure of sugarcane accessions from INTA (Argentina)

    Directory of Open Access Journals (Sweden)

    Mariana Inés Pocovi

    2015-06-01

    Understanding the population structure and genetic diversity of sugarcane (Saccharum officinarum L.) accessions from the INTA germplasm bank (Argentina) is of great importance for germplasm collection and breeding improvement, as it identifies diverse parental combinations that create segregating progenies with maximum genetic variability for further selection. A Bayesian approach, ordination methods (PCoA, Principal Coordinate Analysis) and clustering analysis (UPGMA, Unweighted Pair Group Method with Arithmetic Mean) were applied to this purpose. Sixty-three INTA sugarcane hybrids were genotyped for 107 Simple Sequence Repeat (SSR) and 136 Amplified Fragment Length Polymorphism (AFLP) loci. Given the low probability values found with AFLP for individual assignment (4.7%), microsatellites performed better (54%) in the STRUCTURE analysis, which revealed that the germplasm falls into five optimal groups partly corresponding to their origin. Although the clusters showed a high degree of admixture, FST values confirmed the existence of differences among groups. Dissimilarity coefficients ranged from 0.079 to 0.651. PCoA separated the sugarcane into groups that did not agree with those identified by STRUCTURE. Clustering of all genotypes likewise showed no resemblance to the populations found by STRUCTURE, but clustering restricted to individuals with a proportional membership > 0.6 in their primary STRUCTURE population showed close similarities. The Bayesian method clearly provided more information on cultivar origins than classical PCoA and hierarchical clustering.

  2. Introduction to Bayesian statistics

    CERN Document Server

    Bolstad, William M

    2017-01-01

    There is a strong upsurge in the use of Bayesian methods in applied statistical analysis, yet most introductory statistics texts only present frequentist methods. Bayesian statistics has many important advantages that students should learn about if they are going into fields where statistics will be used. In this Third Edition, four newly-added chapters address topics that reflect the rapid advances in the field of Bayesian statistics. The author continues to provide a Bayesian treatment of introductory statistical topics, such as scientific data gathering, discrete random variables, robust Bayesian methods, and Bayesian approaches to inference for discrete random variables, binomial proportion, Poisson, normal mean, and simple linear regression. In addition, newly-developing topics in the field are presented in four new chapters: Bayesian inference with unknown mean and variance; Bayesian inference for Multivariate Normal mean vector; Bayesian inference for Multiple Linear Regression Model; and Computati...

  3. Bayesian Compressive Sensing Approaches for Direction of Arrival Estimation With Mutual Coupling Effects

    Science.gov (United States)

    Hawes, Matthew; Mihaylova, Lyudmila; Septier, Francois; Godsill, Simon

    2017-03-01

    The problem of estimating the dynamic direction of arrival of far field signals impinging on a uniform linear array, with mutual coupling effects, is addressed. This work proposes two novel approaches able to provide accurate solutions, including at the endfire regions of the array. Firstly, a Bayesian compressive sensing Kalman filter is developed, which accounts for the predicted estimated signals rather than using the traditional sparse prior. The posterior probability density function of the received source signals and the expression for the related marginal likelihood function are derived theoretically. Next, a Gibbs sampling based approach with indicator variables in the sparsity prior is developed. This allows sparsity to be explicitly enforced in different ways, including when an angle is too far from the previous estimate. The proposed approaches are validated and evaluated over different test scenarios and compared to the traditional relevance vector machine based method. An improved accuracy in terms of average root mean square error values is achieved (up to 73.39% for the modified relevance vector machine based approach and 86.36% for the Gibbs sampling based approach). The proposed approaches prove to be particularly useful for direction of arrival estimation when the angle of arrival moves into the endfire region of the array.

  4. A Bayesian Approach for Apparent Inter-plate Coupling in the Central Andes Subduction Zone

    Science.gov (United States)

    Ortega Culaciati, F. H.; Simons, M.; Genrich, J. F.; Galetzka, J.; Comte, D.; Glass, B.; Leiva, C.; Gonzalez, G.; Norabuena, E. O.

    2010-12-01

    We aim to characterize the extent of apparent plate coupling on the subduction zone megathrust with the eventual goal of understanding spatial variations of fault zone rheology, inferring relationships between apparent coupling and the rupture zone of big earthquakes, as well as the implications for earthquake and tsunami hazard. Unlike previous studies, we approach the problem from a Bayesian perspective, allowing us to completely characterize the model parameter space by searching a posteriori estimates of the range of allowable models instead of seeking a single optimum model. Two important features of the Bayesian approach are the possibility to easily implement any kind of physically plausible a priori information and to perform the inversion without regularization, other than that imposed by the way in which we parameterize the forward model. Adopting a simple kinematic back-slip model and a 3D geometry of the inter-plate contact zone, we can estimate the probability of apparent coupling (Pc) along the plate interface that is consistent with a priori information (e.g., approximate rake of back-slip) and available geodetic measurements. More generally, the Bayesian approach adopted here is applicable to any region and eventually would allow one to evaluate the spatial relationship between various inferred distributions of fault behavior (e.g., seismic rupture, postseismic creep, and apparent interseismic coupling) in a quantifiable manner. We apply this methodology to evaluate the state of apparent inter-seismic coupling in the Chilean-Peruvian subduction margin (12°S-25°S). As observational constraints, we use previously published horizontal velocities from campaign GPS [Kendrick et al., 2001, 2006] as well as 3-component velocities from a recently established continuous GPS network in the region (CAnTO). We compare results from both joint and independent use of these data sets. We obtain patch-like features for Pc with higher values located above 60 km

  5. A Bayesian approach for temporally scaling climate for modeling ecological systems.

    Science.gov (United States)

    Post van der Burg, Max; Anteau, Michael J; McCauley, Lisa A; Wiltermuth, Mark T

    2016-05-01

    With climate change becoming more of a concern, many ecologists are including climate variables in their system and statistical models. The Standardized Precipitation Evapotranspiration Index (SPEI) is a drought index that has potential advantages in modeling ecological response variables, including a flexible computation of the index over different timescales. However, little development has been made in terms of the choice of timescale for SPEI. We developed a Bayesian modeling approach for estimating the timescale for SPEI and demonstrated its use in modeling wetland hydrologic dynamics in two different eras (i.e., historical [pre-1970] and contemporary [post-2003]). Our goal was to determine whether differences in climate between the two eras could explain changes in the amount of water in wetlands. Our results showed that wetland water surface areas tended to be larger in wetter conditions, but also changed less in response to climate fluctuations in the contemporary era. We also found that the average timescale parameter was greater in the historical period, compared with the contemporary period. We were not able to determine whether this shift in timescale was due to a change in the timing of wet-dry periods or whether it was due to changes in the way wetlands responded to climate. Our results suggest that perhaps some interaction between climate and hydrologic response may be at work, and further analysis is needed to determine which has a stronger influence. Despite this, we suggest that our modeling approach enabled us to estimate the relevant timescale for SPEI and make inferences from those estimates. Likewise, our approach provides a mechanism for using prior information with future data to assess whether these patterns may continue over time. We suggest that ecologists consider using temporally scalable climate indices in conjunction with Bayesian analysis for assessing the role of climate in ecological systems.
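The timescale notion behind SPEI can be illustrated with a toy standardization: aggregate the monthly climatic water balance (precipitation minus potential evapotranspiration) over a k-month window, then standardize. Real SPEI fits a log-logistic distribution rather than using a plain z-score, and the paper treats k as a parameter to be estimated; here k is fixed and the data are synthetic:

```python
import math
import random

def scaled_index(balance, k):
    """Standardized index at timescale k: rolling k-month sums, z-scored."""
    sums = [sum(balance[i - k + 1 : i + 1]) for i in range(k - 1, len(balance))]
    mu = sum(sums) / len(sums)
    sd = math.sqrt(sum((s - mu) ** 2 for s in sums) / len(sums))
    return [(s - mu) / sd for s in sums]

random.seed(0)
balance = [random.gauss(0.0, 25.0) for _ in range(240)]  # 20 years, monthly P - PET
idx3 = scaled_index(balance, 3)                           # 3-month timescale
print(round(sum(idx3) / len(idx3), 6))                    # standardized: mean ~ 0
```

Changing k changes how much short-term variability the index smooths over, which is exactly the quantity the paper's Bayesian approach estimates from data.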

  7. Characterization of a Saccharomyces cerevisiae fermentation process for production of a therapeutic recombinant protein using a multivariate Bayesian approach.

    Science.gov (United States)

    Fu, Zhibiao; Baker, Daniel; Cheng, Aili; Leighton, Julie; Appelbaum, Edward; Aon, Juan

    2016-05-01

    The principle of quality by design (QbD) has been widely applied to biopharmaceutical manufacturing processes. Process characterization is an essential step to implement the QbD concept to establish the design space and to define the proven acceptable ranges (PAR) for critical process parameters (CPPs). In this study, we present characterization of a Saccharomyces cerevisiae fermentation process using risk assessment analysis, statistical design of experiments (DoE), and the multivariate Bayesian predictive approach. The critical quality attributes (CQAs) and CPPs were identified with a risk assessment. The statistical model for each attribute was established using the results from the DoE study with consideration given to interactions between CPPs. Both the conventional overlapping contour plot and the multivariate Bayesian predictive approaches were used to establish the region of process operating conditions where all attributes met their specifications simultaneously. The quantitative Bayesian predictive approach was chosen to define the PARs for the CPPs, which apply to the manufacturing control strategy. Experience from the 10,000 L manufacturing scale process validation, including 64 continued process verification batches, indicates that the CPPs remain under a state of control and within the established PARs. The end product quality attributes were within their drug substance specifications. The probability generated with the Bayesian approach was also used as a tool to assess CPP deviations. This approach can be extended to characterize other production processes and to quantify a reliable operating region. © 2016 American Institute of Chemical Engineers Biotechnol. Prog., 32:799-812, 2016.
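The Bayesian predictive idea of computing the joint probability that all attributes meet specification can be sketched by Monte Carlo; the predictive distribution, attribute names, and specification limits below are hypothetical stand-ins, not the study's fitted model:

```python
import random

# Hedged sketch: draw quality-attribute values from a (hypothetical,
# pre-fitted) predictive distribution at one candidate operating
# condition, and estimate P(all CQAs in spec) jointly rather than
# attribute-by-attribute.

random.seed(1)

def predictive_draw():
    # Stand-ins for two correlated CQAs (e.g. titer and purity).
    titer = random.gauss(5.0, 0.4)
    purity = random.gauss(97.0, 0.8) - 0.5 * (titer - 5.0)
    return titer, purity

n, ok = 100000, 0
for _ in range(n):
    titer, purity = predictive_draw()
    if titer >= 4.0 and purity >= 95.0:   # hypothetical specification limits
        ok += 1
print(f"P(all CQAs in spec) ~ {ok / n:.3f}")
```

Repeating this over a grid of operating conditions, and keeping those whose joint probability exceeds a target, is the mechanism by which a quantitative predictive approach carves out a reliable operating region.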

  8. Kepler Presearch Data Conditioning II - A Bayesian Approach to Systematic Error Correction

    CERN Document Server

    Smith, Jeffrey C; Van Cleve, Jeffrey E; Jenkins, Jon M; Barclay, Thomas S; Fanelli, Michael N; Girouard, Forrest R; Kolodziejczak, Jeffery J; McCauliff, Sean D; Morris, Robert L; Twicken, Joseph D

    2012-01-01

    With the unprecedented photometric precision of the Kepler Spacecraft, significant systematic and stochastic errors on transit signal levels are observable in the Kepler photometric data. These errors, which include discontinuities, outliers, systematic trends and other instrumental signatures, obscure astrophysical signals. The Presearch Data Conditioning (PDC) module of the Kepler data analysis pipeline tries to remove these errors while preserving planet transits and other astrophysically interesting signals. The completely new noise and stellar variability regime observed in Kepler data poses a significant problem to standard cotrending methods such as SYSREM and TFA. Variable stars are often of particular astrophysical interest so the preservation of their signals is of significant importance to the astrophysical community. We present a Bayesian Maximum A Posteriori (MAP) approach where a subset of highly correlated and quiet stars is used to generate a cotrending basis vector set which is in turn used t...

  9. An efficient multiple particle filter based on the variational Bayesian approach

    KAUST Repository

    Ait-El-Fquih, Boujemaa

    2015-12-07

    This paper addresses the filtering problem in large-dimensional systems, in which conventional particle filters (PFs) remain computationally prohibitive owing to the large number of particles needed to obtain reasonable performances. To overcome this drawback, a class of multiple particle filters (MPFs) has been recently introduced in which the state-space is split into low-dimensional subspaces, and then a separate PF is applied to each subspace. In this paper, we adopt the variational Bayesian (VB) approach to propose a new MPF, the VBMPF. The proposed filter is computationally more efficient since the propagation of each particle requires generating one (new) particle only, while in the standard MPFs a set of (children) particles needs to be generated. In a numerical test, the proposed VBMPF behaves better than the PF and MPF.
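For context, the baseline that MPFs build on is the standard (bootstrap) particle filter; a minimal sketch on a one-dimensional toy model follows (the subspace-splitting and variational refinements of the paper are not shown):

```python
import math
import random

# Bootstrap particle filter for a 1-D random walk observed in noise:
#   x_t = x_{t-1} + process noise,  y_t = x_t + observation noise.

random.seed(7)
N = 2000
x_true = 0.0
particles = [random.gauss(0.0, 1.0) for _ in range(N)]
errs = []
for _ in range(50):
    x_true += random.gauss(0.0, 0.5)                    # true state transition
    y = x_true + random.gauss(0.0, 0.5)                 # noisy observation
    particles = [p + random.gauss(0.0, 0.5) for p in particles]   # propagate
    w = [math.exp(-0.5 * ((y - p) / 0.5) ** 2) for p in particles]  # weight
    tot = sum(w)
    w = [wi / tot for wi in w]
    est = sum(wi * p for wi, p in zip(w, particles))    # weighted estimate
    errs.append(abs(est - x_true))
    particles = random.choices(particles, weights=w, k=N)  # multinomial resample
print(round(sum(errs) / len(errs), 3))
```

In high dimensions the weights of such a filter degenerate rapidly, which is the motivation for splitting the state-space into low-dimensional subspaces as in the MPF/VBMPF.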

  10. A Bayesian Approach to the Orientations of Central Alentejo Megalithic Enclosures

    Science.gov (United States)

    Pimenta, Fernando; Tirapicos, Luís; Smith, Andrew

    2009-12-01

    In this work we have conducted a study of the landscape orientations of twelve megalithic enclosures in the Alentejo region of southern Portugal. Some of these sites date back to the sixth or fifth millennium B.C. and are among the oldest stone enclosures in Europe. The results of the survey show a pattern of eastern rising orientations. We used dedicated GIS software developed by one of the authors to produce horizon profiles and applied a statistical Bayesian approach in an attempt to check how well the data fit different models. In particular, we tested our results for a possible ritual interest in the Autumn or Harvest Full Moon and discuss previous studies by Michael Hoskin and colleagues on the orientations of seven stone dolmens of this area, which have shown the existence of a possible custom of orientation toward the sunrise.

  11. Spectro-photometric distances to stars: a general-purpose Bayesian approach

    CERN Document Server

    Santiago, Basílio X; Anders, Friedrich; Chiappini, Cristina; Girardi, Léo; Rocha-Pinto, Helio J; Balbinot, Eduardo; da Costa, Luiz N; Maia, Marcio A G; Schultheis, Mathias; Steinmetz, Matthias; Miglio, Andrea; Montalbán, Josefina; Schneider, Donald P; Beers, Timothy C; Frinchaboy, Peter M; Lee, Young Sun; Zasowski, Gail

    2016-01-01

    We have developed a procedure that estimates distances to stars using measured spectroscopic and photometric quantities. It employs a Bayesian approach to build the probability distribution function over stellar evolutionary models given the data, delivering estimates of expected distance for each star individually. Our method provides several alternative distance estimates for each star in the output, along with their associated uncertainties. The code was first tested on simulations, successfully recovering input distances to mock stars with errors that scale with the uncertainties in the adopted spectro-photometric parameters, as expected. The code was then validated by comparing our distance estimates to parallax measurements from the Hipparcos mission for nearby stars (< 60 pc), to asteroseismic distances of CoRoT red giant stars, and to known distances of well-studied open and globular clusters. The photometric data of these reference samples cover both the optical and near infra-red wavelengths. The...

  12. Posterior Consistency of the Bayesian Approach to Linear Ill-Posed Inverse Problems

    CERN Document Server

    Agapiou, Sergios; Stuart, Andrew M

    2012-01-01

    We consider a Bayesian nonparametric approach to a family of linear inverse problems in a separable Hilbert space setting, with Gaussian prior and noise distribution. A method of identifying the posterior distribution using its precision operator is presented. Working with the unbounded precision operator enables us to use partial differential equations (PDE) methodology to study posterior consistency in a frequentist sense, and in particular to obtain rates of contraction of the posterior distribution to a Dirac measure centered on the true solution. We show how these rates may be optimized by a choice of the scale parameter in the prior covariance operator. Our methods assume a relatively weak relation between the prior covariance operator, the forward operator and the noise covariance operator; more precisely, we assume that appropriate powers of these operators induce equivalent norms. We compare our results to known minimax rates of convergence in the case where the forward operator and the prior and noi...
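For readers wanting the finite-dimensional analogue of the posterior characterized via its precision operator: with data $y = Au + \eta$, Gaussian noise $\eta \sim \mathcal{N}(0, \Sigma)$ and Gaussian prior $u \sim \mathcal{N}(0, \mathcal{C}_0)$, the standard conjugate formulas (generic notation, not the paper's) read:

```latex
% Finite-dimensional sketch of the linear Gaussian posterior; the paper
% works with the corresponding unbounded operators in Hilbert space.
\begin{aligned}
u \mid y &\sim \mathcal{N}(m, \mathcal{C}), \\
\mathcal{C}^{-1} &= A^{*}\Sigma^{-1}A + \mathcal{C}_0^{-1}, \\
m &= \mathcal{C}\, A^{*}\Sigma^{-1} y .
\end{aligned}
```

The precision form $\mathcal{C}^{-1}$ is the object that remains tractable when $A$ and the covariance operators are unbounded, which is what enables the PDE methodology described in the abstract.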

  13. A Bayesian approach to quantifying uncertainty from experimental noise in DEER spectroscopy

    Science.gov (United States)

    Edwards, Thomas H.; Stoll, Stefan

    2016-09-01

    Double Electron-Electron Resonance (DEER) spectroscopy is a solid-state pulse Electron Paramagnetic Resonance (EPR) experiment that measures distances between unpaired electrons, most commonly between protein-bound spin labels separated by 1.5-8 nm. From the experimental data, a distance distribution P(r) is extracted using Tikhonov regularization. The disadvantage of this method is that it does not directly provide error bars for the resulting P(r), rendering correct interpretation difficult. Here we introduce a Bayesian statistical approach that quantifies uncertainty in P(r) arising from noise and numerical regularization. This method provides credible intervals (error bars) of P(r) at each r. This allows practitioners to answer whether or not small features are significant, whether or not apparent shoulders are significant, and whether or not two distance distributions are significantly different from each other. In addition, the method quantifies uncertainty in the regularization parameter.
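The Tikhonov step itself can be sketched as a regularized least-squares solve; the kernel below is a generic smoothing toy kernel, not the actual DEER kernel, and alpha is chosen by hand rather than by the uncertainty-aware selection the paper addresses:

```python
import numpy as np

# Recover a distribution P(r) from a noisy blurred signal S = K P + noise
# by minimizing ||K P - S||^2 + alpha * ||L P||^2, with L a
# second-difference (smoothness) operator. Toy problem, not DEER.

rng = np.random.default_rng(0)
r = np.linspace(1.5, 8.0, 80)                      # distance grid, nm
dr = r[1] - r[0]
K = np.exp(-0.5 * ((r[:, None] - r[None, :]) / 0.4) ** 2) * dr  # toy blur kernel
P_true = np.exp(-0.5 * ((r - 4.0) / 0.3) ** 2)     # true distance distribution
S = K @ P_true + rng.normal(0.0, 0.01, r.size)     # noisy observed signal

L = np.diff(np.eye(r.size), n=2, axis=0)           # second-difference operator
alpha = 1e-2
# Normal equations of the Tikhonov functional:
P_hat = np.linalg.solve(K.T @ K + alpha * L.T @ L, K.T @ S)

peak = float(r[np.argmax(P_hat)])
print(round(peak, 2))
```

The point estimate P_hat carries no error bars; the paper's Bayesian approach supplies credible intervals at each r and treats alpha itself as uncertain.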

  14. A Bayesian Approach Accounting for Stochastic Fluctuations in Stellar Cluster Properties

    CERN Document Server

    Fouesneau, M

    2009-01-01

    The integrated spectro-photometric properties of star clusters are subject to large cluster-to-cluster variations. They are distributed in non-trivial ways around the average properties predicted by standard population synthesis models. This results from the stochastic mass distribution of the finite (small) number of luminous stars in each cluster, stars which may be either particularly blue or particularly red. The color distributions are broad and usually far from Gaussian, especially for young and intermediate age clusters, as found in interacting galaxies. When photometric measurements of clusters are used to estimate ages and masses in conjunction with standard models, biases are to be expected. We present a Bayesian approach that explicitly accounts for stochasticity when estimating ages and masses of star clusters that cannot be resolved into stars. Based on Monte-Carlo simulations, we are starting to explore the probability distributions of star cluster properties obtained given a set of multi-wavele...

  15. A Bayesian Approach for Calibration of TRMM 3B42 over North Amazonia

    Science.gov (United States)

    Linguet, L.; Marie-Joseph, I.; Becker, M.; Seyler, F.

    2013-12-01

    Northern Amazonian regions experience extreme conditions such as floods and droughts. These regions are also characterized by limited spatial coverage of ground-based rain gauges and the unavailability of real-time rainfall data. Satellite-based rainfall estimates (SRE) may be one of the most appropriate approaches for detecting rainfall distribution. However, SRE data need specific calibration and validation for use in flood and drought monitoring activities. This study aimed to calibrate TRMM 3B42 RT rainfall products over northern Amazonia with a Bayesian filtering approach [1] [2]. The study area is located north of the Amazon River and includes the three Guianas and the northern states of Brazil. A set of daily satellite rainfall products with a spatial resolution of 0.25°x0.25° (TRMM 3B42 RT) from the years 2000 to 2010 was selected. Ground reference data are located in French Guiana (27 ground stations from the French national meteorological agency) and in the northern Brazilian states (70 ground stations from the Brazilian Agência Nacional de Aguas). Many bias-adjustment methods rely on computing the difference between satellite and gauge-based precipitation [3] [4]. In this study we defend the idea that an inverse approach based on sequential Monte Carlo filtering helps to calibrate TRMM 3B42 RT rainfall products. The developed method combines a model of the rainfall process at rain gauge locations with a stochastic observation model based on the joint distribution between ground reference data of the state variable (rainfall data) and the observed satellite data. 50% of the total ground-based rainfall measurements were used for the joint distribution and the remaining 50% were used for validation purposes. Validation of the method was done by comparing the corrected satellite data against independent observed data from rain gauges using standard verification techniques: mean bias error, root mean square error, and correlation coefficient.
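The importance-weighting step at the heart of a sequential Monte Carlo filter can be sketched as follows. This is an illustrative single-step example, not the authors' implementation: particles are candidate "true" rainfall values, weighted by how well they explain a satellite estimate under a hypothetical Gaussian observation model.

```python
import math
import random

random.seed(1)
particles = [random.uniform(0.0, 30.0) for _ in range(5000)]  # prior rainfall (mm)
obs, obs_sd = 10.0, 2.0                                       # satellite value (hypothetical)

# Weight each particle by the Gaussian observation likelihood
weights = [math.exp(-0.5 * ((p - obs) / obs_sd) ** 2) for p in particles]
total = sum(weights)

# Posterior-mean rainfall after assimilating the satellite observation
post_mean = sum(p * w for p, w in zip(particles, weights)) / total
```

A full filter would resample the particles by these weights and propagate them through a rainfall-process model before the next observation arrives.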

  16. Applying a Bayesian Approach to Identification of Orthotropic Elastic Constants from Full Field Displacement Measurements

    Directory of Open Access Journals (Sweden)

    Le Riche R.

    2010-06-01

    Full Text Available A major challenge in the identification of material properties is handling different sources of uncertainty in the experiment and the modelling of the experiment for estimating the resulting uncertainty in the identified properties. Numerous improvements in identification methods have provided increasingly accurate estimates of various material properties. However, characterizing the uncertainty in the identified properties is still relatively crude. Different material properties obtained from a single test are not obtained with the same confidence. Typically the highest uncertainty is associated with the properties to which the experiment is least sensitive. In addition, the uncertainty in different properties can be strongly correlated, so that obtaining only variance estimates may be misleading. A possible approach for handling the different sources of uncertainty and estimating the uncertainty in the identified properties is the Bayesian method. This method was introduced in the late 1970s in the context of identification [1] and has been applied since to different problems, notably identification of elastic constants from plate vibration experiments [2]-[4]. The applications of the method to these classical pointwise tests involved only a small number of measurements (typically ten natural frequencies in the previously cited vibration test), which facilitated the application of the Bayesian approach. For identifying elastic constants, full field strain or displacement measurements provide a high number of measured quantities (one measurement per image pixel) and hence a promise of smaller uncertainties in the properties. However, the high number of measurements also represents a major computational challenge in applying the Bayesian approach to full field measurements. To address this challenge we propose an approach based on the proper orthogonal decomposition (POD) of the full fields in order to drastically reduce their

  17. Applying a Bayesian Approach to Identification of Orthotropic Elastic Constants from Full Field Displacement Measurements

    Science.gov (United States)

    Gogu, C.; Yin, W.; Haftka, R.; Ifju, P.; Molimard, J.; Le Riche, R.; Vautrin, A.

    2010-06-01

    A major challenge in the identification of material properties is handling different sources of uncertainty in the experiment and the modelling of the experiment for estimating the resulting uncertainty in the identified properties. Numerous improvements in identification methods have provided increasingly accurate estimates of various material properties. However, characterizing the uncertainty in the identified properties is still relatively crude. Different material properties obtained from a single test are not obtained with the same confidence. Typically the highest uncertainty is associated with the properties to which the experiment is least sensitive. In addition, the uncertainty in different properties can be strongly correlated, so that obtaining only variance estimates may be misleading. A possible approach for handling the different sources of uncertainty and estimating the uncertainty in the identified properties is the Bayesian method. This method was introduced in the late 1970s in the context of identification [1] and has been applied since to different problems, notably identification of elastic constants from plate vibration experiments [2]-[4]. The applications of the method to these classical pointwise tests involved only a small number of measurements (typically ten natural frequencies in the previously cited vibration test) which facilitated the application of the Bayesian approach. For identifying elastic constants, full field strain or displacement measurements provide a high number of measured quantities (one measurement per image pixel) and hence a promise of smaller uncertainties in the properties. However, the high number of measurements also represents a major computational challenge in applying the Bayesian approach to full field measurements. To address this challenge we propose an approach based on the proper orthogonal decomposition (POD) of the full fields in order to drastically reduce their dimensionality. POD is
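The POD reduction idea can be sketched in miniature. This illustrative example (not the authors' code) summarizes many noisy two-point "field" snapshots by their dominant mode, found here by power iteration on the snapshot covariance; a real POD would use an SVD of the full snapshot matrix.

```python
import math
import random

random.seed(2)
true_mode = (0.6, 0.8)   # unit-norm dominant spatial mode (hypothetical)

# Generate noisy snapshots: amplitude * mode + small noise
snapshots = []
for _ in range(300):
    a = random.uniform(-1.0, 1.0)
    snapshots.append((a * true_mode[0] + random.gauss(0, 0.01),
                      a * true_mode[1] + random.gauss(0, 0.01)))

# 2x2 snapshot covariance
c = [[sum(s[i] * s[j] for s in snapshots) / len(snapshots)
      for j in range(2)] for i in range(2)]

# Power iteration for the leading eigenvector (the first POD mode)
v = (1.0, 0.0)
for _ in range(200):
    w = (c[0][0] * v[0] + c[0][1] * v[1], c[1][0] * v[0] + c[1][1] * v[1])
    n = math.hypot(w[0], w[1])
    v = (w[0] / n, w[1] / n)

# Each snapshot is reduced to a single POD coefficient
coeffs = [s[0] * v[0] + s[1] * v[1] for s in snapshots]
```

The Bayesian identification then works with the handful of POD coefficients instead of one measurement per pixel, which is what makes the likelihood evaluations tractable.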

  18. A Bayesian network approach for modeling local failure in lung cancer

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Jung Hun; Craft, Jeffrey; Al Lozi, Rawan; Vaidya, Manushka; Meng, Yifan; Deasy, Joseph O; Bradley, Jeffrey D; El Naqa, Issam, E-mail: elnaqa@wustl.edu [Department of Radiation Oncology, Mallinckrodt Institute of Radiology, Washington University School of Medicine, MO 63110 (United States)

    2011-03-21

    Locally advanced non-small cell lung cancer (NSCLC) patients suffer from a high local failure rate following radiotherapy. Despite many efforts to develop new dose-volume models for early detection of tumor local failure, no significant improvement has been reported in their prospective application. Based on recent studies of the role of biomarker proteins in hypoxia and inflammation in predicting tumor response to radiotherapy, we hypothesize that combining physical and biological factors with a suitable framework could improve the overall prediction. To test this hypothesis, we propose a graphical Bayesian network framework for predicting local failure in lung cancer. The proposed approach was tested using two different datasets of locally advanced NSCLC patients treated with radiotherapy. The first dataset was collected retrospectively and comprises clinical and dosimetric variables only. The second dataset was collected prospectively; in addition to clinical and dosimetric information, blood was drawn from the patients at various time points to extract candidate biomarkers as well. Our preliminary results show that the proposed method can be used as an efficient method to develop predictive models of local failure in these patients and to interpret relationships among the different variables in the models. We also demonstrate the potential use of heterogeneous physical and biological variables to improve the model prediction. With the first dataset, we achieved better performance compared with competing Bayesian-based classifiers. With the second dataset, the combined model had a slightly higher performance compared to individual physical and biological models, with the biological variables making the largest contribution. Our preliminary results highlight the potential of the proposed integrated approach for predicting post-radiotherapy local failure in NSCLC patients.
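The kind of forward and backward inference a Bayesian network supports can be shown with a tiny two-node example. The numbers below are purely hypothetical, not the paper's network: a binary biomarker node feeds a binary local-failure node through a conditional probability table.

```python
# Hypothetical conditional probability table (illustrative only)
p_marker_high = 0.3
p_fail_given = {True: 0.4, False: 0.1}   # P(failure | marker high / low)

# Forward inference: marginalize out the biomarker to get P(failure)
p_failure = (p_fail_given[True] * p_marker_high
             + p_fail_given[False] * (1 - p_marker_high))

# Diagnostic (backward) inference by Bayes' rule:
# P(marker high | failure observed)
p_high_given_fail = p_fail_given[True] * p_marker_high / p_failure
# p_failure = 0.19, p_high_given_fail = 12/19 ≈ 0.632
```

In the paper's setting the network has many such nodes (clinical, dosimetric, biological), and the same marginalization and Bayes-rule machinery propagates evidence through all of them.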

  19. Rigorous Approach in Investigation of Seismic Structure and Source Characteristics in Northeast Asia: Hierarchical and Trans-dimensional Bayesian Inversion

    Science.gov (United States)

    Mustac, M.; Kim, S.; Tkalcic, H.; Rhie, J.; Chen, Y.; Ford, S. R.; Sebastian, N.

    2015-12-01

    Conventional approaches to inverse problems suffer from non-linearity and non-uniqueness in estimations of seismic structures and source properties. Estimated results and associated uncertainties are often biased by applied regularizations and additional constraints, which are commonly introduced to solve such problems. Bayesian methods, however, provide statistically meaningful estimations of models and their uncertainties constrained by data information. In addition, hierarchical and trans-dimensional (trans-D) techniques are inherently implemented in the Bayesian framework to account for involved error statistics and model parameterizations, and, in turn, allow more rigorous estimations of the same. Here, we apply Bayesian methods throughout the entire inference process to estimate seismic structures and source properties in Northeast Asia including east China, the Korean peninsula, and the Japanese islands. Ambient noise analysis is first performed to obtain a base three-dimensional (3-D) heterogeneity model using continuous broadband waveforms from more than 300 stations. As for the tomography of surface wave group and phase velocities in the 5-70 s band, we adopt a hierarchical and trans-D Bayesian inversion method using Voronoi partition. The 3-D heterogeneity model is further improved by joint inversions of teleseismic receiver functions and dispersion data using a newly developed high-efficiency Bayesian technique. The obtained model is subsequently used to prepare 3-D structural Green's functions for the source characterization. A hierarchical Bayesian method for point source inversion using regional complete waveform data is applied to selected events from the region. The seismic structure and source characteristics with rigorously estimated uncertainties from the novel Bayesian methods provide enhanced monitoring and discrimination of seismic events in northeast Asia.

  20. Bayesian data analysis

    CERN Document Server

    Gelman, Andrew; Stern, Hal S; Dunson, David B; Vehtari, Aki; Rubin, Donald B

    2013-01-01

    FUNDAMENTALS OF BAYESIAN INFERENCE: Probability and Inference; Single-Parameter Models; Introduction to Multiparameter Models; Asymptotics and Connections to Non-Bayesian Approaches; Hierarchical Models. FUNDAMENTALS OF BAYESIAN DATA ANALYSIS: Model Checking; Evaluating, Comparing, and Expanding Models; Modeling Accounting for Data Collection; Decision Analysis. ADVANCED COMPUTATION: Introduction to Bayesian Computation; Basics of Markov Chain Simulation; Computationally Efficient Markov Chain Simulation; Modal and Distributional Approximations. REGRESSION MODELS: Introduction to Regression Models; Hierarchical Linear

  1. A Bayesian approach to combine Landsat and ALOS PALSAR time series for near real-time deforestation detection

    NARCIS (Netherlands)

    Reiche, J.; Bruin, de S.; Hoekman, D.H.; Verbesselt, J.; Herold, M.

    2015-01-01

    To address the need for timely information on newly deforested areas at medium resolution scale, we introduce a Bayesian approach to combine SAR and optical time series for near real-time deforestation detection. Once a new image of either of the input time series is available, the conditional proba
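The sequential Bayesian update described above can be sketched per pixel: each newly available SAR or optical observation updates the probability that the pixel was deforested. The likelihood values below are hypothetical, not the paper's calibrated ones.

```python
def bayes_update(prior, lik_deforested, lik_forest):
    # Posterior P(deforested | observation) for one new observation
    num = lik_deforested * prior
    return num / (num + lik_forest * (1.0 - prior))

p = 0.5  # non-informative prior on deforestation
# (P(obs | deforested), P(obs | forest)) for three incoming images
for lik_def, lik_for in [(0.8, 0.3), (0.7, 0.2), (0.9, 0.4)]:
    p = bayes_update(p, lik_def, lik_for)
# after three concordant observations p ≈ 0.95; the pixel is flagged as
# deforested once p exceeds a chosen confidence threshold
```

Because either sensor's observation can trigger an update, the combined SAR-optical time series shortens the delay before the threshold is crossed.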

  2. Psychological Needs, Engagement, and Work Intentions: A Bayesian Multi-Measurement Mediation Approach and Implications for HRD

    Science.gov (United States)

    Shuck, Brad; Zigarmi, Drea; Owen, Jesse

    2015-01-01

    Purpose: The purpose of this study was to empirically examine the utility of self-determination theory (SDT) within the engagement-performance linkage. Design/methodology/approach: Bayesian multi-measurement mediation modeling was used to estimate the relation between SDT, engagement and a proxy measure of performance (e.g. work intentions) (N =…

  3. A Bayesian Network approach to the evaluation of building design and its consequences for employee performance and operational costs

    DEFF Research Database (Denmark)

    Jensen, Kasper Lynge; Toftum, Jørn; Friis-Hansen, Peter

    2009-01-01

    A Bayesian Network approach has been developed that can compare different building designs by estimating the effects of the thermal indoor environment on the mental performance of office workers. A part of this network is based on the compilation of subjective thermal sensation data...... that investments in improved indoor thermal conditions can be justified economically in most cases. The Bayesian Network provides a reliable platform using probabilities for modelling the complexity while estimating the effect of indoor climate factors on human beings, due to the different ways in which humans...

  4. A study of finite mixture model: Bayesian approach on financial time series data

    Science.gov (United States)

    Phoong, Seuk-Yen; Ismail, Mohd Tahir

    2014-07-01

    Recently, statisticians have emphasized fitting finite mixture models using Bayesian methods. A finite mixture model represents a statistical distribution as a mixture of component distributions, and the Bayesian method is used to fit such a model. The Bayesian method is widely used because it has asymptotic properties that provide remarkable results. In addition, the Bayesian method also shows a consistency characteristic, which means the parameter estimates are close to the predictive distributions. In the present paper, the number of components for the mixture model is selected using the Bayesian Information Criterion. Identifying the number of components is important because a wrong choice may lead to invalid results. The Bayesian method is then utilized to fit the k-component mixture model in order to explore the relationship between rubber prices and stock market prices for Malaysia, Thailand, the Philippines and Indonesia. Lastly, the results showed a negative relationship between rubber prices and stock market prices for all selected countries.
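Selecting the number of mixture components with BIC can be sketched on synthetic data. This is an illustrative example, not the paper's code: a minimal EM fit for a two-component univariate Gaussian mixture is compared against a single Gaussian, and the lower BIC picks the component count.

```python
import math
import random

random.seed(3)
# Synthetic data with two well-separated modes
data = ([random.gauss(0.0, 1.0) for _ in range(150)]
        + [random.gauss(6.0, 1.0) for _ in range(150)])
n = len(data)

def normal_pdf(x, mu, var):
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

# k = 1: maximum-likelihood Gaussian fit (2 parameters)
mu1 = sum(data) / n
var1 = sum((x - mu1) ** 2 for x in data) / n
ll1 = sum(math.log(normal_pdf(x, mu1, var1)) for x in data)

# k = 2: EM fit (5 parameters: two means, two variances, one weight)
mu, var, w = [min(data), max(data)], [1.0, 1.0], [0.5, 0.5]
for _ in range(60):
    resp = []
    for x in data:                       # E-step: responsibilities
        p = [w[j] * normal_pdf(x, mu[j], var[j]) for j in range(2)]
        s = p[0] + p[1]
        resp.append((p[0] / s, p[1] / s))
    for j in range(2):                   # M-step: update parameters
        nj = sum(r[j] for r in resp)
        w[j] = nj / n
        mu[j] = sum(r[j] * x for r, x in zip(resp, data)) / nj
        var[j] = sum(r[j] * (x - mu[j]) ** 2 for r, x in zip(resp, data)) / nj
ll2 = sum(math.log(w[0] * normal_pdf(x, mu[0], var[0])
                   + w[1] * normal_pdf(x, mu[1], var[1])) for x in data)

bic1 = -2 * ll1 + 2 * math.log(n)
bic2 = -2 * ll2 + 5 * math.log(n)
# the lower BIC selects the component count; here bic2 < bic1
```

BIC penalizes each extra parameter by log(n), so the two-component model wins only when the second mode genuinely improves the likelihood, which is the safeguard against choosing an invalid k.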

  5. The Bayesian approximation error approach for electrical impedance tomography—experimental results

    Science.gov (United States)

    Nissinen, A.; Heikkinen, L. M.; Kaipio, J. P.

    2008-01-01

    Inverse problems can be characterized as problems that tolerate measurement and modelling errors poorly. While the measurement error issue has been widely considered as a solved problem, the modelling errors have remained largely untreated. The approximation and modelling errors can, however, be argued to dominate the measurement errors in most applications. There are several applications in which the temporal and memory requirements dictate that the computational complexity of the forward solver be radically reduced. For example, in process tomography the reconstructions have to be carried out typically in a few tens of milliseconds. Recently, a Bayesian approach for the treatment of approximation and modelling errors for inverse problems has been proposed. This approach has proven to work well in several classes of problems, but the approach has not been verified in any problem with real data. In this paper, we study two different types of modelling errors in the case of electrical impedance tomography: one related to model reduction and one concerning partially unknown geometry. We show that the approach is also feasible in practice and may facilitate the reduction of the computational complexity of the nonlinear EIT problem at least by an order of magnitude.
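The approximation-error idea can be sketched with scalar stand-in models (illustrative only, not EIT): the discrepancy between an accurate forward model and a cheap reduced one is characterized offline over the prior and then folded into the noise model used online.

```python
import random
import statistics

random.seed(4)
accurate = lambda x: x ** 3 + 0.1 * x   # hypothetical "fine" forward model
reduced = lambda x: x ** 3              # cheap reduced model used online

# Offline: sample the approximation error over the prior on x
xs = [random.gauss(0.0, 1.0) for _ in range(20000)]
errs = [accurate(x) - reduced(x) for x in xs]
err_mean = statistics.fmean(errs)
err_var = statistics.variance(errs)

# Online: inference evaluates reduced(x) + err_mean, with the
# measurement-noise variance inflated by err_var, instead of calling
# the expensive accurate model at every iteration
```

This is what allows the forward solver's complexity to be cut drastically, since the bias introduced by the reduced model is accounted for statistically rather than ignored.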

  6. Controlling the degree of caution in statistical inference with the Bayesian and frequentist approaches as opposite extremes

    CERN Document Server

    Bickel, David R

    2011-01-01

    In statistical practice, whether a Bayesian or frequentist approach is used in inference depends not only on the availability of prior information but also on the attitude taken toward partial prior information, with frequentists tending to be more cautious than Bayesians. The proposed framework defines that attitude in terms of a specified amount of caution, thereby enabling data analysis at the level of caution desired and on the basis of any prior information. The caution parameter represents the attitude toward partial prior information in much the same way as a loss function represents the attitude toward risk. When there is very little prior information and nonzero caution, the resulting inferences correspond to those of the candidate confidence intervals and p-values that are most similar to the credible intervals and hypothesis probabilities of the specified Bayesian posterior. On the other hand, in the presence of a known physical distribution of the parameter, inferences are based only on the corres...

  7. Bayesian Recovery of Clipped OFDM Signals: A Receiver-based Approach

    KAUST Repository

    Al-Rabah, Abdullatif R.

    2013-05-01

    Recently, orthogonal frequency-division multiplexing (OFDM) has been adopted for high-speed wireless communications due to its robustness against multipath fading. However, one of the main fundamental drawbacks of OFDM systems is the high peak-to-average-power ratio (PAPR). Several techniques have been proposed for PAPR reduction. Most of these techniques require transmitter-based (pre-compensated) processing. On the other hand, receiver-based alternatives would save power and reduce transmitter complexity. With this in mind, a possible approach is to limit the amplitude of the OFDM signal to a predetermined threshold, which is equivalent to adding a sparse clipping signal; this clipping signal is then estimated at the receiver to recover the original signal. In this work, we propose a Bayesian receiver-based low-complexity clipping signal recovery method for PAPR reduction. The method is able to i) effectively reduce the PAPR via a simple clipping scheme at the transmitter side, ii) use a Bayesian recovery algorithm to reconstruct the clipping signal at the receiver side by measuring part of the subcarriers, iii) perform well in the absence of statistical information about the signal (e.g. clipping level) and the noise (e.g. noise variance), and at the same time iv) be energy efficient due to its low complexity. Specifically, the proposed recovery technique is implemented in a data-aided manner. The data-aided method collects clipping information by measuring reliable data subcarriers, thus making full use of the spectrum for data transmission without the need for tone reservation. The study is extended further to discuss how to improve the recovery of the clipping signal utilizing some features of practical OFDM systems, i.e., oversampling and the presence of multiple receivers. Simulation results demonstrate the superiority of the proposed technique over other recovery algorithms. The overall objective is to show that the receiver-based Bayesian technique is highly
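Why the clipping signal is sparse, and hence a good target for Bayesian sparse recovery, can be shown with a toy example. The numbers are hypothetical, and real OFDM samples are complex-valued; a real Gaussian stand-in keeps the sketch short.

```python
import random

random.seed(5)
signal = [random.gauss(0.0, 1.0) for _ in range(256)]  # stand-in time-domain samples
threshold = 2.5                                        # hypothetical clipping level

# Transmitter: clip the amplitude to the threshold
clipped = [max(-threshold, min(threshold, s)) for s in signal]

# The equivalent additive clipping signal the receiver must reconstruct
clip_signal = [s - c for s, c in zip(signal, clipped)]

# clip_signal is nonzero only where |signal| exceeded the threshold,
# so it is sparse: only rare high peaks contribute
nonzero = sum(1 for x in clip_signal if x != 0.0)
```

Because only a few samples exceed the threshold, the receiver needs to estimate only a handful of unknowns from the measured subcarriers, which is what keeps the recovery low-complexity.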

  8. On merging rainfall data from diverse sources using a Bayesian approach

    Science.gov (United States)

    Bhattacharya, Biswa; Tarekegn, Tegegne

    2014-05-01

    Numerous studies have presented comparisons of satellite rainfall products, such as from the Tropical Rainfall Measuring Mission (TRMM), with rain gauge data and have concluded, in general, that the two sources of data are comparable at suitable space and time scales. The comparison is not a straightforward one as they employ different measurement techniques and are dependent on very different space-time scales of measurement. The number of available gauges in a catchment also influences the comparability and thus adds to the complexity. TRMM rainfall data have also been used directly in hydrological modelling. As the space-time scale reduces, so does the accuracy of these models. It seems that combining the two, or more, sources of rainfall data can enormously benefit hydrological studies. Due to the differences in their space-time structure, each rainfall product contains information about the spatio-temporal distribution of rainfall that is not available to a single source of data. In order to harness this benefit we have developed a method of merging these two (or more) rainfall products under the framework of the Bayesian Data Fusion (BDF) principle. By applying this principle the rainfall data from the various sources can be combined into a single time series of rainfall data. The usefulness of the approach has been explored in a case study on the Lake Tana Basin of the Upper Blue Nile Basin in Ethiopia. A 'leave one rain gauge out' cross-validation technique was employed for evaluating the accuracy of the rainfall time series with rainfall interpolated from rain gauge data using Inverse Distance Weighting (referred to as IDW), TRMM and the fused data (BDF). The results showed that the BDF prediction was better compared to TRMM and IDW. Further evaluation of the three rainfall estimates was done by evaluating their capability in predicting observed stream flow using the lumped conceptual rainfall-runoff model NAM. Visual inspection of the
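The simplest instance of Gaussian data fusion, combining two independent estimates by precision (inverse-variance) weighting, conveys the spirit of the BDF merging step. The rainfall values and variances below are illustrative.

```python
def fuse(m1, v1, m2, v2):
    # Precision-weighted combination of two independent Gaussian estimates
    w1, w2 = 1.0 / v1, 1.0 / v2
    return (w1 * m1 + w2 * m2) / (w1 + w2), 1.0 / (w1 + w2)

gauge_mm, gauge_var = 12.0, 1.0   # interpolated gauge estimate (more precise)
trmm_mm, trmm_var = 9.0, 4.0      # satellite estimate (less certain)

fused_mm, fused_var = fuse(gauge_mm, gauge_var, trmm_mm, trmm_var)
# fused_mm = 11.4 lies between the inputs, nearer the precise gauge,
# and fused_var = 0.8 is smaller than either input variance
```

The key property, that the fused variance is smaller than either input's, is why merging sources improves on any single product.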

  9. Bayesian SPLDA

    OpenAIRE

    Villalba, Jesús

    2015-01-01

    In this document we are going to derive the equations needed to implement a Variational Bayes estimation of the parameters of the simplified probabilistic linear discriminant analysis (SPLDA) model. This can be used to adapt SPLDA from one database to another with few development data or to implement the fully Bayesian recipe. Our approach is similar to Bishop's VB PPCA.

  10. A user-friendly forest model with a multiplicative mathematical structure: a Bayesian approach to calibration

    Directory of Open Access Journals (Sweden)

    M. Bagnara

    2014-10-01

    Full Text Available Forest models are being increasingly used to study ecosystem functioning, through the reproduction of carbon fluxes and productivity in very different forests all over the world. Over the last two decades, the need for simple and "easy to use" models for practical applications, characterized by few parameters and equations, has become clear, and some have been developed for this purpose. These models aim to represent the main drivers underlying forest ecosystem processes while being applicable to the widest possible range of forest ecosystems. Recently, it has also become clear that model performance should be assessed not only in terms of accuracy of estimations and predictions, but also in terms of estimates of model uncertainties. Therefore, the Bayesian approach has increasingly been applied to calibrate forest models, with the aim of estimating the uncertainty of their results, and of comparing their performances. Some forest models, considered to be user-friendly, rely on a multiplicative or quasi-multiplicative mathematical structure, which is known to cause problems during the calibration process, mainly due to high correlations between parameters. In a Bayesian framework using Markov Chain Monte Carlo sampling, this is likely to impair the reaching of a proper convergence of the chains and the sampling from the correct posterior distribution. Here we show two methods to reach proper convergence when using a forest model with a multiplicative structure, applying different algorithms with different numbers of iterations during the Markov Chain Monte Carlo or a two-step calibration. The results showed that recently proposed algorithms for adaptive calibration do not confer a clear advantage over the Metropolis–Hastings Random Walk algorithm for the forest model used here. Moreover, the calibration remains time consuming and mathematically difficult, so the advantages of using a fast and user-friendly model can be lost due to the calibration
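The baseline sampler mentioned above, random-walk Metropolis–Hastings, fits in a few lines. This is an illustrative sketch targeting a 1-D Gaussian "posterior", not the forest-model calibration itself.

```python
import math
import random

random.seed(6)

def log_post(x):
    # Unnormalized log-posterior: Gaussian with mean 3 (illustrative target)
    return -0.5 * (x - 3.0) ** 2

x, chain = 0.0, []
for _ in range(30000):
    proposal = x + random.gauss(0.0, 1.0)        # symmetric random-walk proposal
    # Accept with probability min(1, post(proposal) / post(x))
    if math.log(random.random()) < log_post(proposal) - log_post(x):
        x = proposal
    chain.append(x)

burn_in = 5000
mean_est = sum(chain[burn_in:]) / len(chain[burn_in:])  # approaches 3
```

With strongly correlated parameters, as in multiplicative models, this sampler mixes slowly along the correlated directions, which is exactly the convergence difficulty the paper addresses.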

  11. Application of Bayesian Approach to Cost-Effectiveness Analysis of Antiviral Treatments in Chronic Hepatitis B

    Science.gov (United States)

    Zhang, Hua; Huo, Mingdong; Chao, Jianqian; Liu, Pei

    2016-01-01

    Background Hepatitis B virus (HBV) infection is a major public health problem; timely antiviral treatment can significantly prevent the progression of liver damage from HBV by slowing down or stopping the virus from reproducing. In this study we applied a Bayesian approach to cost-effectiveness analysis, using Markov Chain Monte Carlo (MCMC) simulation methods for the relevant evidence input into the model, to evaluate the cost-effectiveness of entecavir (ETV) and lamivudine (LVD) therapy for chronic hepatitis B (CHB) in Jiangsu, China, thus providing information to the public health system on CHB therapy. Methods An eight-stage Markov model was developed, and a hypothetical cohort of 35-year-old HBeAg-positive patients with CHB was entered into the model. Treatment regimens were LVD 100 mg daily and ETV 0.5 mg daily. The transition parameters were derived either from systematic reviews of the literature or from previous economic studies. The outcome measures were life-years, quality-adjusted life-years (QALYs), and expected costs associated with the treatments and disease progression. For the Bayesian models all the analysis was implemented using WinBUGS version 1.4. Results Expected cost, life expectancy, and QALYs decreased with age, while cost-effectiveness increased with age. The expected cost of ETV was less than that of LVD, while its life expectancy and QALYs were higher; the ETV strategy was therefore more cost-effective. Costs and benefits of the Monte Carlo simulation were very close to the results of the exact form among the group, but the standard deviation of each group indicated a large difference between individual patients. Conclusions Compared with lamivudine, entecavir is the more cost-effective option. CHB patients should accept antiviral treatment as early as possible, since the lower the age, the more cost-effective the treatment. The Monte Carlo simulation obtained the cost and effectiveness distributions, indicating that our Markov model is robust. PMID:27574976
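A Markov cohort model of the kind described above can be sketched with a toy state space. The states and annual transition probabilities below are hypothetical, not the study's eight-stage model.

```python
states = ["CHB", "cirrhosis", "death"]
trans = {  # hypothetical annual transition probabilities
    "CHB":       {"CHB": 0.90, "cirrhosis": 0.08, "death": 0.02},
    "cirrhosis": {"CHB": 0.00, "cirrhosis": 0.90, "death": 0.10},
    "death":     {"CHB": 0.00, "cirrhosis": 0.00, "death": 1.00},
}

cohort = {"CHB": 1.0, "cirrhosis": 0.0, "death": 0.0}
life_years = 0.0
for _ in range(40):                      # 40 annual cycles
    life_years += 1.0 - cohort["death"]  # alive fraction accrues one year
    cohort = {t: sum(cohort[s] * trans[s][t] for s in states)
              for t in states}
# attaching per-state costs and utilities to each cycle would accumulate
# expected costs and QALYs in the same loop
```

In the Bayesian version, the transition probabilities are themselves uncertain and sampled via MCMC, so the loop is rerun per posterior draw to obtain distributions of costs and effectiveness rather than point values.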

  12. Estimating epidemiological parameters for bovine tuberculosis in British cattle using a Bayesian partial-likelihood approach.

    Science.gov (United States)

    O'Hare, A; Orton, R J; Bessell, P R; Kao, R R

    2014-05-22

    Fitting models with Bayesian likelihood-based parameter inference is becoming increasingly important in infectious disease epidemiology. Detailed datasets present the opportunity to identify subsets of these data that capture important characteristics of the underlying epidemiology. One such dataset describes the epidemic of bovine tuberculosis (bTB) in British cattle, which is also an important exemplar of a disease with a wildlife reservoir (the Eurasian badger). Here, we evaluate a set of nested dynamic models of bTB transmission, including individual- and herd-level transmission heterogeneity and assuming minimal prior knowledge of the transmission and diagnostic test parameters. We performed a likelihood-based bootstrapping operation on the model to infer parameters based only on the recorded numbers of cattle testing positive for bTB at the start of each herd outbreak considering high- and low-risk areas separately. Models without herd heterogeneity are preferred in both areas though there is some evidence for super-spreading cattle. Similar to previous studies, we found low test sensitivities and high within-herd basic reproduction numbers (R0), suggesting that there may be many unobserved infections in cattle, even though the current testing regime is sufficient to control within-herd epidemics in most cases. Compared with other, more data-heavy approaches, the summary data used in our approach are easily collected, making our approach attractive for other systems.

  13. A fuzzy Bayesian network approach to quantify the human behaviour during an evacuation

    Science.gov (United States)

    Ramli, Nurulhuda; Ghani, Noraida Abdul; Ahmad, Nazihah

    2016-06-01

    Bayesian Network (BN) has been regarded as a successful representation of inter-relationship of factors affecting human behavior during an emergency. This paper is an extension of earlier work of quantifying the variables involved in the BN model of human behavior during an evacuation using a well-known direct probability elicitation technique. To overcome judgment bias and reduce the expert's burden in providing precise probability values, a new approach for the elicitation technique is required. This study proposes a new fuzzy BN approach for quantifying human behavior during an evacuation. Three major phases of methodology are involved, namely 1) development of qualitative model representing human factors during an evacuation, 2) quantification of BN model using fuzzy probability and 3) inferencing and interpreting the BN result. A case study of three inter-dependencies of human evacuation factors such as danger assessment ability, information about the threat and stressful conditions are used to illustrate the application of the proposed method. This approach will serve as an alternative to the conventional probability elicitation technique in understanding the human behavior during an evacuation.

  14. Ensemble forecasting of sub-seasonal to seasonal streamflow by a Bayesian joint probability modelling approach

    Science.gov (United States)

    Zhao, Tongtiegang; Schepen, Andrew; Wang, Q. J.

    2016-10-01

    The Bayesian joint probability (BJP) modelling approach is used operationally to produce seasonal (three-month-total) ensemble streamflow forecasts in Australia. However, water resource managers are calling for more informative sub-seasonal forecasts. Taking advantage of BJP's capability of handling multiple predictands, ensemble forecasting of sub-seasonal to seasonal streamflows is investigated for 23 catchments around Australia. Using antecedent streamflow and climate indices as predictors, monthly forecasts are developed for the three-month period ahead. Forecast reliability and skill are evaluated for the period 1982-2011 using a rigorous leave-five-years-out cross validation strategy. BJP ensemble forecasts of monthly streamflow volumes are generally reliable in ensemble spread. Forecast skill, relative to climatology, is positive in 74% of cases in the first month, decreasing to 57% and 46% respectively for streamflow forecasts for the final two months of the season. As forecast skill diminishes with increasing lead time, the monthly forecasts approach climatology. Seasonal forecasts accumulated from monthly forecasts are found to be similarly skilful to forecasts from BJP models based on seasonal totals directly. The BJP modelling approach is demonstrated to be a viable option for producing ensemble time-series sub-seasonal to seasonal streamflow forecasts.

  15. A Bayesian Network Approach to Modeling Learning Progressions and Task Performance. CRESST Report 776

    Science.gov (United States)

    West, Patti; Rutstein, Daisy Wise; Mislevy, Robert J.; Liu, Junhui; Choi, Younyoung; Levy, Roy; Crawford, Aaron; DiCerbo, Kristen E.; Chappel, Kristina; Behrens, John T.

    2010-01-01

    A major issue in the study of learning progressions (LPs) is linking student performance on assessment tasks to the progressions. This report describes the challenges faced in making this linkage using Bayesian networks to model LPs in the field of computer networking. The ideas are illustrated with exemplar Bayesian networks built on Cisco…

  16. Cyclist activity and injury risk analysis at signalized intersections: a Bayesian modelling approach.

    Science.gov (United States)

    Strauss, Jillian; Miranda-Moreno, Luis F; Morency, Patrick

    2013-10-01

    This study proposes a two-equation Bayesian modelling approach to simultaneously study cyclist injury occurrence and bicycle activity at signalized intersections as joint outcomes. This approach deals with the potential presence of endogeneity and unobserved heterogeneities and is used to identify factors associated with both cyclist injuries and volumes. Its application to identify high-risk corridors is also illustrated. Montreal, Quebec, Canada is the application environment, using an extensive inventory of a large sample of signalized intersections containing disaggregate motor-vehicle traffic volumes and bicycle flows, geometric design, traffic control and built environment characteristics in the vicinity of the intersections. Cyclist injury data for the period of 2003-2008 is used in this study. Also, manual bicycle counts were standardized using temporal and weather adjustment factors to obtain average annual daily volumes. Results confirm and quantify the effects of both bicycle and motor-vehicle flows on cyclist injury occurrence. Accordingly, more cyclists at an intersection translate into more cyclist injuries but lower injury rates due to the non-linear association between bicycle volume and injury occurrence. Furthermore, the results emphasize the importance of turning motor-vehicle movements. The presence of bus stops and total crosswalk length increase cyclist injury occurrence whereas the presence of a raised median has the opposite effect. Bicycle activity through intersections was found to increase as employment, number of metro stations, land use mix, area of commercial land use type, length of bicycle facilities and the presence of schools within 50-800 m of the intersection increase. Intersections with three approaches are expected to have fewer cyclists than those with four. Using Bayesian analysis, expected injury frequency and injury rates were estimated for each intersection and used to rank corridors. Corridors with high bicycle volumes
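    The non-linear association between bicycle volume and injury occurrence described above (more cyclists, more injuries, but a lower per-cyclist rate) is often summarized as a "safety in numbers" power relationship. The sketch below uses hypothetical coefficients, not the paper's estimates.

```python
import numpy as np

# Hypothetical power model for the safety-in-numbers effect: expected
# injuries grow with bicycle flow F as a * F**b with b < 1, so the injury
# *rate* (injuries per cyclist) falls as flow increases.
a, b = 0.05, 0.6  # illustrative coefficients only

flows = np.array([100.0, 500.0, 1000.0, 5000.0])  # daily bicycle volumes
injuries = a * flows ** b   # expected injury frequency
rates = injuries / flows    # injuries per cyclist = a * F**(b - 1)
```

    With b < 1 the two series move in opposite directions, which is exactly the pattern the abstract reports.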

  17. A Bayesian approach to utilizing prior data in new drug development.

    Science.gov (United States)

    Shen, Larry Z; Coffey, Todd; Deng, Wei

    2008-01-01

    In this paper we propose a Bayesian method to combine safety data collected from two separate drug development programs using the same active drug substance but for different indications, formulations, or patient populations. The objective of combining the data across the programs is to better define the level of safety risk associated with the new indication or target population. There may be adverse events (AEs) observed in the new program that represent new safety signals. Our method is to explore the AEs using data from both development programs. Our approach utilizes data collected previously to assist in analyzing safety data from the new program. It is assumed that the frequency of a certain AE follows a distribution with a parameter that characterizes the safety risk level. The parameter is assumed to follow a distribution function. In the Bayesian framework, this distribution function is called a prior distribution in the absence of data and posterior distribution when updated by real data. The key concept behind our method is to use data from the previous program to construct a posterior distribution that will in turn serve as a prior distribution for the new program. The construction of this updated prior down weights data from the previous program to emphasize the new program and thus avoids simple pooling of the data across programs. Such "soft use" of previous information minimizes the potential for undue influence of previous data on the analysis. Data from the new program are used to update the prior distribution and compute the posterior distribution for the new program. Key statistics are then calculated from the posterior distribution to quantify the risk level for the new program. We have tested the proposed approach using data from a real Phase 2 study that was conducted as part of a clinical development program for a new indication of an approved drug. The results indicate that the estimated risk level was affected both by the observed event
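    The down-weighted-prior construction can be sketched with a beta-binomial model for a single AE rate. This is a generic power-prior-style illustration with made-up counts and weight, not the paper's model or data.

```python
# Sketch of the prior-construction idea using a beta-binomial model for an
# AE rate; counts and the down-weighting factor w are hypothetical.
def downweighted_posterior(x_prev, n_prev, x_new, n_new, w, a0=1.0, b0=1.0):
    """Previous-program data enter the prior with weight w in (0, 1]."""
    # Prior for the new program: previous data down-weighted by w.
    a_prior = a0 + w * x_prev
    b_prior = b0 + w * (n_prev - x_prev)
    # Posterior after observing the new program's data at full weight.
    a_post = a_prior + x_new
    b_post = b_prior + (n_new - x_new)
    return a_post / (a_post + b_post)  # posterior mean AE rate

# Previous program: 4 events in 200 subjects; new program: 3 in 100.
rate = downweighted_posterior(x_prev=4, n_prev=200, x_new=3, n_new=100, w=0.3)
```

    As w shrinks toward zero the estimate moves toward the new-program-only rate, and as w approaches one it moves toward the simply pooled rate, which is precisely the "soft use" behaviour described above.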

  18. A Bayesian Game-Theoretic Approach for Distributed Resource Allocation in Fading Multiple Access Channels

    Directory of Open Access Journals (Sweden)

    Gaoning He

    2010-01-01

    Full Text Available A Bayesian game-theoretic model is developed to design and analyze the resource allocation problem in K-user fading multiple access channels (MACs), where the users are assumed to selfishly maximize their average achievable rates with incomplete information about the fading channel gains. In such a game-theoretic study, the central question is whether a Bayesian equilibrium exists, and if so, whether the network operates efficiently at the equilibrium point. We prove that there exists exactly one Bayesian equilibrium in our game. Furthermore, we study the network sum-rate maximization problem by assuming that the users coordinate according to a symmetric strategy profile. This result also serves as an upper bound for the Bayesian equilibrium. Finally, simulation results are provided to show the network efficiency at the unique Bayesian equilibrium and to compare it with other strategies.

  19. Optimization of a Saccharomyces cerevisiae fermentation process for production of a therapeutic recombinant protein using a multivariate Bayesian approach.

    Science.gov (United States)

    Fu, Zhibiao; Leighton, Julie; Cheng, Aili; Appelbaum, Edward; Aon, Juan C

    2012-07-01

    Various approaches have been applied to optimize biological product fermentation processes and define design space. In this article, we present a stepwise approach to optimizing a Saccharomyces cerevisiae fermentation process through risk assessment analysis, statistical design of experiments (DoE), and a multivariate Bayesian predictive approach. The critical process parameters (CPPs) were first identified through a risk assessment. The response surface for each attribute was modeled using the results from the DoE study, with consideration given to interactions between CPPs. A multivariate Bayesian predictive approach was then used to identify the region of process operating conditions where all attributes met their specifications simultaneously. The model prediction was verified by twelve consistency runs, in which all batches achieved a broth titer of more than 1.53 g/L and quality attributes within the expected ranges. The calculated probability was used to define the reliable operating region. To our knowledge, this is the first case study to implement the multivariate Bayesian predictive approach for industrial process optimization, with corresponding verification at two different production scales. This approach can be extended to other fermentation process optimizations and to quantifying reliable operating regions.

  20. A multinomial logit model-Bayesian network hybrid approach for driver injury severity analyses in rear-end crashes.

    Science.gov (United States)

    Chen, Cong; Zhang, Guohui; Tarefder, Rafiqul; Ma, Jianming; Wei, Heng; Guan, Hongzhi

    2015-07-01

    Rear-end crashes are among the most common types of traffic crashes in the U.S. A good understanding of their characteristics and contributing factors is of practical importance. Previously, both multinomial logit models and Bayesian network methods have been used in crash modeling and analysis, although each has its own application restrictions and limitations. In this study, a hybrid approach is developed that combines multinomial logit models and Bayesian network methods for comprehensively analyzing driver injury severities in rear-end crashes, based on state-wide crash data collected in New Mexico from 2010 to 2011. A multinomial logit model is developed to investigate and identify significant contributing factors for rear-end crash driver injury severities, classified into three categories: no injury, injury, and fatality. The identified significant factors are then utilized to establish a Bayesian network that explicitly formulates statistical associations between injury severity outcomes and explanatory attributes, including driver behavior, demographic features, vehicle factors, and geometric and environmental characteristics. The test results demonstrate that the proposed hybrid approach performs reasonably well. The Bayesian network inference analyses indicate that factors including truck involvement, inferior lighting conditions, windy weather conditions, and the number of vehicles involved could significantly increase driver injury severity in rear-end crashes. The developed methodology and estimation results provide insights for developing effective countermeasures to reduce rear-end crash injury severities and improve traffic system safety performance.
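    The multinomial logit component above assigns each severity category a linear predictor and converts the predictors into probabilities via the softmax. The sketch below uses hypothetical coefficients and features, not the paper's estimates.

```python
import math

# Minimal sketch of a multinomial logit severity model: each category k has
# a linear predictor beta_k . x, and P(k) is the softmax of those utilities.
# Coefficients and features here are hypothetical.
def mnl_probs(x, betas):
    utilities = [sum(b * xi for b, xi in zip(beta, x)) for beta in betas]
    m = max(utilities)                       # subtract max for stability
    exps = [math.exp(u - m) for u in utilities]
    s = sum(exps)
    return [e / s for e in exps]

# Features: [intercept, truck involved, dark lighting] (illustrative)
x = [1.0, 1.0, 0.0]
betas = [[0.0, 0.0, 0.0],    # no injury (reference category)
         [-1.0, 0.8, 0.5],   # injury
         [-3.0, 1.5, 0.9]]   # fatality
p_none, p_injury, p_fatal = mnl_probs(x, betas)
```

    The reference category's coefficients are pinned at zero so the remaining coefficients are interpreted relative to it.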

  1. Using Bayesian network and AHP method as a marketing approach tools in defining tourists’ preferences

    Directory of Open Access Journals (Sweden)

    Nataša Papić-Blagojević

    2012-04-01

    Full Text Available A marketing approach is associated with market conditions and with achieving the long-term profitability of a company by satisfying consumers' needs. In tourism, this approach need not be related only to promoting one tourist destination; it also concerns the relationship between a travel agency and its clients, with agencies adjusting their offers to clients' needs. In that sense, it is important to analyze the behavior of tourists in earlier periods, taking their preferences into consideration. Using a Bayesian network, the connections between tourists with similar tastes, and the relationships between them, can be displayed graphically. The analytic hierarchy process (AHP), on the other hand, is used to rank tourist attractions, also relying on past experience. In this paper we examine possible applications of these two models to tourism in Serbia. The example is hypothetical, but it will serve as a base for future research. Three types of tourism are chosen as representative of Vojvodina: cultural, rural and business tourism, because they are the bright spots of tourism development in this area. Applied to these forms, the analytic hierarchy process has shown its strength in predicting tourists' preferences.
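    The AHP ranking step can be sketched with the geometric-mean method for deriving priorities from a pairwise comparison matrix. The Saaty-scale judgments below are hypothetical, not taken from the paper.

```python
import math

# Sketch of the AHP priority calculation using the geometric-mean method;
# the pairwise comparison values are hypothetical Saaty-scale judgments for
# cultural, rural and business tourism.
def ahp_weights(pairwise):
    n = len(pairwise)
    gm = [math.prod(row) ** (1.0 / n) for row in pairwise]  # row geo-means
    total = sum(gm)
    return [g / total for g in gm]  # normalized priority vector

pairwise = [
    [1.0, 3.0, 5.0],   # cultural vs (cultural, rural, business)
    [1/3, 1.0, 3.0],   # rural
    [1/5, 1/3, 1.0],   # business
]
weights = ahp_weights(pairwise)
```

    The geometric-mean method is a common alternative to the principal-eigenvector calculation and gives identical rankings for consistent matrices.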

  2. Robust modeling of differential gene expression data using normal/independent distributions: a Bayesian approach.

    Directory of Open Access Journals (Sweden)

    Mojtaba Ganjali

    Full Text Available In this paper, the problem of identifying differentially expressed genes under different conditions using gene expression microarray data, in the presence of outliers, is discussed. For this purpose, the robust modeling of gene expression data using some powerful distributions known as normal/independent distributions is considered. These distributions include the Student's t and normal distributions which have been used previously, but also include extensions such as the slash, the contaminated normal and the Laplace distributions. The purpose of this paper is to identify differentially expressed genes by considering these distributional assumptions instead of the normal distribution. A Bayesian approach using the Markov Chain Monte Carlo method is adopted for parameter estimation. Two publicly available gene expression data sets are analyzed using the proposed approach. The use of the robust models for detecting differentially expressed genes is investigated. This investigation shows that the choice of model for differentiating gene expression data is very important. This is due to the small number of replicates for each gene and the existence of outlying data. Comparison of the performance of these models is made using different statistical criteria and the ROC curve. The method is illustrated using some simulation studies. We demonstrate the flexibility of these robust models in identifying differentially expressed genes.
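    The normal/independent family mentioned above is built by dividing a normal variate by the square root of a positive mixing variable; different mixing laws give the Student's t, slash and other heavy-tailed members. The sketch below illustrates the construction with hypothetical parameter values.

```python
import numpy as np

# Sketch of the normal/independent construction: Y = Z / sqrt(U), with Z
# standard normal and U a positive mixing variable. U ~ Gamma(nu/2, rate
# nu/2) gives Student's t with nu degrees of freedom; a uniform power gives
# a slash-type distribution. Parameters are illustrative.
rng = np.random.default_rng(0)
n, nu = 50_000, 3.0

z = rng.standard_normal(n)
u_t = rng.gamma(nu / 2.0, 2.0 / nu, size=n)   # mean-1 mixing variable
t_samples = z / np.sqrt(u_t)                  # Student's t, nu df

slash_samples = z / rng.uniform(size=n) ** (1.0 / nu)  # generalized slash
```

    Both members have heavier tails than the normal (standard deviation above 1 here), which is what makes them robust choices for outlier-prone expression data.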

  3. Applications of Bayesian approach in modelling risk of malaria-related hospital mortality

    Directory of Open Access Journals (Sweden)

    Simbeye Jupiter S

    2008-02-01

    Full Text Available Abstract Background Malaria is a major public health problem in Malawi; however, quantifying its burden in a population is a challenge. Routine hospital data provide a proxy for measuring the incidence of severe malaria and for crudely estimating morbidity rates. Using such data, this paper proposes a method to describe trends, patterns and factors associated with in-hospital mortality attributed to the disease. Methods We develop semiparametric regression models which allow joint analysis of nonlinear effects of calendar time and continuous covariates, spatially structured variation, unstructured heterogeneity, and other fixed covariates. Modelling and inference use the fully Bayesian approach via Markov Chain Monte Carlo (MCMC) simulation techniques. The methodology is applied to analyse data arising from paediatric wards in Zomba district, Malawi, between 2002 and 2003. Results and Conclusion We observe that the risk of dying in hospital is lower in the dry season, and for children who travel a distance of less than 5 km to the hospital, but increases for those who are referred to the hospital. The results also indicate significant differences in both structured and unstructured spatial effects, and the health facility effects reveal considerable differences by type of facility or practice. More importantly, our approach shows non-linearities in the effect of metrical covariates on the probability of dying in hospital. The study emphasizes that the methodological framework used provides a useful tool for analysing the data at hand and data of similar structure.

  4. A 1-step Bayesian predictive approach for evaluating in vitro in vivo correlation (IVIVC).

    Science.gov (United States)

    Gould, A Lawrence; Agrawal, Nancy G B; Goel, Thanh V; Fitzpatrick, Shaun

    2009-10-01

    IVIVC (in vitro-in vivo correlation) methods may support approving a change in formulation of a drug using only in vitro dissolution data, without additional bioequivalence trials in human subjects. Most current IVIVC methods express the in vivo plasma concentration of a drug formulation as a function of the cumulative in vivo absorption. The absorption is not directly observable, so it is estimated by the cumulative dissolution of the drug formulation in in vitro dissolution trials. The calculations conventionally entail the complex and potentially unstable mathematical operations of convolution and deconvolution, or approximations aimed at avoiding the need for them. This paper describes, and illustrates with data on a controlled-release formulation, a Bayesian approach to evaluating IVIVC that does not require convolution, deconvolution or approximation. This approach incorporates between- and within-subject (or replicate) variability without assuming asymptotic normality. The plasma concentration curve is expressed in terms of the in vitro dissolution percentage instead of time, recognizing that this correspondence may be noisy because of the various sources of error. All conventional functions of the concentration curve, such as AUC, Cmax and Tmax, can be expressed in terms of dissolution percentage, with uncertainties arising from variability in measuring absorption and dissolution accounted for explicitly.

  5. Modeling attainment of steady state of drug concentration in plasma by means of a Bayesian approach using MCMC methods.

    Science.gov (United States)

    Jordan, Paul; Brunschwig, Hadassa; Luedin, Eric

    2008-01-01

    Bayesian mixed-effects modeling is an appropriate method for estimating both population-specific and subject-specific times to steady state. In addition to pure estimation, the approach allows one to determine the time until a certain fraction of individuals in a population has reached steady state with a pre-specified certainty. In this paper a mixed-effects model for the parameters of a nonlinear pharmacokinetic model is used within a Bayesian framework. Model fitting by means of Markov Chain Monte Carlo methods as implemented in the Gibbs sampler, as well as the extraction of estimates and probability statements of interest, are described. Finally, the proposed approach is illustrated by application to trough data from a multiple-dose clinical trial.
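    The deterministic core of the time-to-steady-state question can be made concrete for first-order accumulation. This is a generic pharmacokinetic identity with illustrative numbers, not the paper's nonlinear mixed-effects model.

```python
import math

# Under first-order accumulation the trough concentration approaches steady
# state as C(t) = Css * (1 - exp(-k * t)), so the time to reach a fraction f
# of steady state is t_f = -ln(1 - f) / k. Values are illustrative.
def time_to_fraction_of_css(k, f):
    return -math.log(1.0 - f) / k

k = math.log(2) / 12.0  # elimination rate for a hypothetical 12 h half-life
t90 = time_to_fraction_of_css(k, 0.90)  # ~3.32 half-lives, ~39.9 h

# Check: the accumulation curve indeed reaches 90% of Css at t90.
frac = 1.0 - math.exp(-k * t90)
```

    The Bayesian mixed-effects machinery in the paper essentially places distributions over k (and hence over t_f) across subjects, so probability statements about the population follow from the posterior.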

  6. Comparison of Bayesian and frequentist approaches in modelling risk of preterm birth near the Sydney Tar Ponds, Nova Scotia, Canada

    Directory of Open Access Journals (Sweden)

    Canty Angelo

    2007-09-01

    Full Text Available Abstract Background This study compares the Bayesian and frequentist (non-Bayesian) approaches in the modelling of the association between the risk of preterm birth and maternal proximity to hazardous waste and pollution from the Sydney Tar Ponds site in Nova Scotia, Canada. Methods The data include 1604 observed cases of preterm birth out of a total population of 17559 at risk of preterm birth from 144 enumeration districts in the Cape Breton Regional Municipality. Other covariates include the distance from the Tar Ponds; the unemployment rate; the proportion of persons who are separated, divorced or widowed; the proportion of persons who have no high school diploma; the proportion of persons living alone; the proportion of single-parent families; and average income. Bayesian hierarchical Poisson regression, quasi-likelihood Poisson regression and weighted linear regression models were fitted to the data. Results The results of the analyses were compared together with their limitations. Conclusion The results of the weighted linear regression and the quasi-likelihood Poisson regression agree with the results from the Bayesian hierarchical modelling, which incorporates the spatial effects.

  7. A Parallel and Incremental Approach for Data-Intensive Learning of Bayesian Networks.

    Science.gov (United States)

    Yue, Kun; Fang, Qiyu; Wang, Xiaoling; Li, Jin; Liu, Weiyi

    2015-12-01

    Bayesian network (BN) has been adopted as the underlying model for representing and inferring uncertain knowledge. As the basis of realistic applications centered on probabilistic inferences, learning a BN from data is a critical subject of machine learning, artificial intelligence, and big data paradigms. Currently, it is necessary to extend the classical methods for learning BNs with respect to data-intensive computing or in cloud environments. In this paper, we propose a parallel and incremental approach for data-intensive learning of BNs from massive, distributed, and dynamically changing data by extending the classical scoring and search algorithm and using MapReduce. First, we adopt the minimum description length as the scoring metric and give the two-pass MapReduce-based algorithms for computing the required marginal probabilities and scoring the candidate graphical model from sample data. Then, we give the corresponding strategy for extending the classical hill-climbing algorithm to obtain the optimal structure, as well as that for storing a BN by pairs. Further, in view of the dynamic characteristics of the changing data, we give the concept of influence degree to measure the coincidence of the current BN with new data, and then propose the corresponding two-pass MapReduce-based algorithms for BNs incremental learning. Experimental results show the efficiency, scalability, and effectiveness of our methods.
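    The MDL scoring step described above can be sketched for a single variable and candidate parent set: the score trades off the count-based log-likelihood against a (log2 N)/2 penalty per free parameter. The data and variable names below are illustrative, and the sketch is sequential rather than MapReduce-based.

```python
import math
from collections import Counter

# Sketch of MDL scoring for one variable given a candidate parent set:
# log-likelihood from observed counts minus (log2 N)/2 per free parameter.
def mdl_score(data, child, parents):
    n = len(data)
    joint = Counter(tuple(row[v] for v in parents + [child]) for row in data)
    marg = Counter(tuple(row[v] for v in parents) for row in data)
    # Maximum-likelihood conditional log-likelihood in bits.
    loglik = sum(c * math.log2(c / marg[key[:-1]]) for key, c in joint.items())
    child_states = len({row[child] for row in data})
    parent_configs = max(len(marg), 1)
    n_params = parent_configs * (child_states - 1)
    return loglik - 0.5 * math.log2(n) * n_params  # higher is better

# Y copies X exactly, so {X} should score better than the empty parent set.
data = [{"X": x, "Y": x} for x in [0, 0, 0, 0, 1, 1, 1, 1]]
with_parent = mdl_score(data, "Y", ["X"])
without = mdl_score(data, "Y", [])
```

    The paper's contribution is computing exactly these counts (the marginal probabilities) with two-pass MapReduce jobs over distributed data, then driving a hill-climbing search with the scores.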

  8. Toward a probabilistic acoustic emission source location algorithm: A Bayesian approach

    Science.gov (United States)

    Schumacher, Thomas; Straub, Daniel; Higgins, Christopher

    2012-09-01

    Acoustic emissions (AE) are stress waves initiated by sudden strain releases within a solid body. These can be caused by internal mechanisms such as crack opening or propagation, crushing, or rubbing of crack surfaces. One application for the AE technique in the field of Structural Engineering is Structural Health Monitoring (SHM). With piezo-electric sensors mounted to the surface of the structure, stress waves can be detected, recorded, and stored for later analysis. An important step in quantitative AE analysis is the estimation of the stress wave source locations. Commonly, source location results are presented in a rather deterministic manner as spatial and temporal points, excluding information about uncertainties and errors. Due to variability in the material properties and uncertainty in the mathematical model, measures of uncertainty are needed beyond best-fit point solutions for source locations. This paper introduces a novel holistic framework for the development of a probabilistic source location algorithm. Bayesian analysis methods with Markov Chain Monte Carlo (MCMC) simulation are employed where all source location parameters are described with posterior probability density functions (PDFs). The proposed methodology is applied to an example employing data collected from a realistic section of a reinforced concrete bridge column. The selected approach is general and has the advantage that it can be extended and refined efficiently. Results are discussed and future steps to improve the algorithm are suggested.
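    The posterior-over-source-locations idea can be sketched with a toy two-dimensional example: arrival times at surface sensors are modelled as distance over wave speed plus Gaussian noise, and a random-walk Metropolis sampler draws the source-position posterior. The geometry, wave speed, noise level and known onset time are all simplifying assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical sensor layout on a unit plate; event onset time is assumed
# known (zero) for brevity, and the wave speed v is treated as known.
sensors = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
true_src = np.array([0.3, 0.6])
v, sigma = 1.0, 0.01  # wave speed, arrival-time noise (s)
times = np.linalg.norm(sensors - true_src, axis=1) / v
times = times + rng.normal(0.0, sigma, size=len(sensors))

def log_post(x):
    # Gaussian arrival-time likelihood with a flat prior over the plate.
    pred = np.linalg.norm(sensors - x, axis=1) / v
    return -0.5 * np.sum((times - pred) ** 2) / sigma ** 2

x = np.array([0.5, 0.5])  # start the chain at the plate centre
lp = log_post(x)
samples = []
for _ in range(20_000):
    prop = x + rng.normal(0.0, 0.05, size=2)  # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:  # Metropolis acceptance
        x, lp = prop, lp_prop
    samples.append(x)
post_mean = np.mean(samples[5_000:], axis=0)  # posterior mean after burn-in
```

    The posterior samples, rather than a single best-fit point, carry the location uncertainty the abstract argues for; the paper's framework additionally treats onset time and material properties probabilistically.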

  9. Genetic parameters for carcass traits and body weight using a Bayesian approach in the Canchim cattle.

    Science.gov (United States)

    Meirelles, S L C; Mokry, F B; Espasandín, A C; Dias, M A D; Baena, M M; de A Regitano, L C

    2016-06-10

    Genetic parameters for factors such as backfat thickness (BFT), rib eye area (REA), and body weight (BW), and the correlations between them, were estimated for Canchim beef cattle raised in natural pastures of Brazil. Data from 1648 animals were analyzed using multi-trait (BFT, REA, and BW) animal models by the Bayesian approach. This model included the effects of contemporary group, age, and individual heterozygosity as covariates. In addition, direct additive genetic and random residual effects were also analyzed. Heritabilities estimated for BFT (0.16), REA (0.50), and BW (0.44) indicated their potential for genetic improvement and response to selection. Furthermore, genetic correlations between BW and the remaining traits were high (P > 0.50), suggesting that selection for BW could improve REA and BFT. On the other hand, the genetic correlation between BFT and REA was low (P = 0.39 ± 0.17) and included considerable variation, suggesting that these traits can be jointly included as selection criteria without influencing each other. We found that REA and BFT, as measured by ultrasound, responded to selection. Therefore, selection for yearling weight results in changes in REA and BFT.
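    The reported quantities are simple functions of the estimated variance and covariance components. The sketch below uses hypothetical components chosen only to reproduce magnitudes of the kind reported, not the paper's estimates.

```python
# Heritability and genetic correlation from (hypothetical) variance and
# covariance components of a multi-trait animal model.
def heritability(va, ve):
    """Narrow-sense heritability: additive variance over phenotypic variance."""
    return va / (va + ve)

def genetic_correlation(cov_a, va1, va2):
    """Additive-genetic covariance scaled by the two additive variances."""
    return cov_a / (va1 * va2) ** 0.5

h2_rea = heritability(va=30.0, ve=30.0)  # 0.50, as for REA above
r_bw_rea = genetic_correlation(cov_a=24.0, va1=40.0, va2=30.0)
```

    In the Bayesian setting these ratios are computed per MCMC draw, giving full posterior distributions for h² and the correlations rather than single point estimates.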

  10. Adjusting for differential-verification bias in diagnostic-accuracy studies: a Bayesian approach.

    Science.gov (United States)

    de Groot, Joris A H; Dendukuri, Nandini; Janssen, Kristel J M; Reitsma, Johannes B; Bossuyt, Patrick M M; Moons, Karel G M

    2011-03-01

    In studies of diagnostic accuracy, the performance of an index test is assessed by verifying its results against those of a reference standard. If verification of index-test results by the preferred reference standard can be performed only in a subset of subjects, an alternative reference test could be given to the remainder. The drawback of this so-called differential-verification design is that the second reference test is often of lesser quality, or defines the target condition in a different way. Incorrectly treating results of the 2 reference standards as equivalent will lead to differential-verification bias. The Bayesian methods presented in this paper use a single model to (1) acknowledge the different nature of the 2 reference standards and (2) make simultaneous inferences about the population prevalence and the sensitivity, specificity, and predictive values of the index test with respect to both reference tests, in relation to latent disease status. We illustrate this approach using data from a study on the accuracy of the elbow extension test for diagnosis of elbow fractures in patients with elbow injury, using either radiography or follow-up as reference standards.
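    The link between sensitivity, specificity, prevalence and the predictive values mentioned above is a direct application of Bayes' theorem. The numbers below are illustrative, not the elbow-extension study's estimates.

```python
# Bayes' theorem turning sensitivity, specificity and prevalence into
# predictive values; the inputs are illustrative.
def predictive_values(sens, spec, prev):
    ppv = sens * prev / (sens * prev + (1 - spec) * (1 - prev))
    npv = spec * (1 - prev) / (spec * (1 - prev) + (1 - sens) * prev)
    return ppv, npv

ppv, npv = predictive_values(sens=0.97, spec=0.69, prev=0.24)
```

    The paper's Bayesian model estimates these same quantities jointly with the prevalence and the latent disease status, so the uncertainty in each input propagates into the predictive values.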

  11. A Bayesian Inferential Approach to Quantify the Transmission Intensity of Disease Outbreak

    Directory of Open Access Journals (Sweden)

    Adiveppa S. Kadi

    2015-01-01

    Full Text Available Background. The emergence of infectious diseases such as the influenza pandemic (H1N1) 2009 has become a great concern and has posed new challenges to health authorities worldwide. To control these diseases, various studies have been developed in the field of mathematical modelling, a useful tool for understanding epidemiological dynamics and their dependence on social mixing patterns. Method. We used a Bayesian approach to quantify the disease outbreak through the key epidemiological parameter, the basic reproduction number (R0), using effective contacts, defined as the sum of the product of incidence cases and the probability of the generation time distribution. We estimated R0 from daily case incidence data for pandemic influenza A/H1N1 2009 in India, for the initial phase. Result. The estimated R0, with 95% credible interval, is consistent with several other studies on the same strain. Through sensitivity analysis, our study indicates that infectiousness affects the estimate of R0. Conclusion. The basic reproduction number R0 provides useful information to the public health system for controlling the disease through mitigation strategies such as vaccination and quarantine.
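    The "effective contacts" construction above can be sketched as a standard incidence-based reproduction-number estimator: new cases at day t are compared to past incidence weighted by the generation-time distribution. The data below are toy values.

```python
# Reproduction number at the last day of `incidence`: cases divided by
# "effective contacts", the generation-time-weighted sum of past incidence.
# w[s] is the probability of a generation interval of s + 1 days.
def reproduction_number(incidence, w):
    t = len(incidence) - 1
    effective_contacts = sum(
        w[s] * incidence[t - 1 - s] for s in range(len(w)) if t - 1 - s >= 0
    )
    return incidence[t] / effective_contacts

cases = [1, 2, 4, 8]                 # toy daily incidence
w = [0.5, 0.3, 0.2]                  # generation-time pmf over lags 1..3
r_t = reproduction_number(cases, w)  # 8 / (0.5*4 + 0.3*2 + 0.2*1)
```

    In the Bayesian treatment described above, a prior on R combined with a Poisson likelihood around this expected-case calculation yields the posterior and its 95% credible interval.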

  12. A hybrid dynamic Bayesian network approach for modelling temporal associations of gene expressions for hypertension diagnosis.

    Science.gov (United States)

    Akutekwe, Arinze; Seker, Huseyin

    2014-01-01

    Computational and machine learning techniques have been applied to identifying biomarkers and constructing predictive models for the diagnosis of hypertension. Strategies such as improved classification rules based on decision trees have been proposed. Other techniques such as Fuzzy Expert Systems (FES) and Neuro-Fuzzy Systems (NFS) have also recently been applied. However, these methods lack the ability to detect temporal relationships among biomarker genes that would aid better understanding of the mechanism of hypertension. In this paper we apply a proposed two-stage bio-network construction approach that combines the power and computational efficiency of classification methods with the well-established predictive ability of Dynamic Bayesian Networks. We demonstrate our method on the analysis of a male young-onset hypertension microarray dataset. Four key genes were identified by the Least Angle Shrinkage and Selection Operator (LASSO) and three Support Vector Machine Recursive Feature Elimination (SVM-RFE) methods. Results show that the cell-regulation gene FOXQ1 may inhibit the expression of fucosyltransferase-6 (FUT6), and that ABCG1 (ATP-binding cassette sub-family G) may also play an inhibitory role against NR2E3 (nuclear receptor sub-family 2) and CGB2 (chorionic gonadotropin beta 2).

  13. A Bayesian Approach to the Design and Analysis of Computer Experiments

    Energy Technology Data Exchange (ETDEWEB)

    Currin, C.

    1988-01-01

    We consider the problem of designing and analyzing experiments for prediction of the function y(t), t ∈ T, where y is evaluated by means of a computer code (typically by solving complicated equations that model a physical system), and T represents the domain of inputs to the code. We use a Bayesian approach, in which uncertainty about y is represented by a spatial stochastic process (random function); here we restrict attention to stationary Gaussian processes. The posterior mean function can be used as an interpolating function, with uncertainties given by the posterior standard deviations. Instead of completely specifying the prior process, we consider several families of priors, and suggest some cross-validational methods for choosing one that performs relatively well on the function at hand. As a design criterion, we use the expected reduction in the entropy of the random vector y(T*), where T* ⊂ T is a given finite set of "sites" (input configurations) at which predictions are to be made. We describe an exchange algorithm for constructing designs that are optimal with respect to this criterion. To demonstrate the use of these design and analysis methods, several examples are given, including one experiment on a computer model of a thermal energy storage device and another on an integrated circuit simulator.
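    The prediction step described above (a stationary Gaussian process posterior whose mean interpolates the code runs) can be sketched as follows. The kernel, hyperparameters and test function are illustrative choices, not those of the report.

```python
import numpy as np

# Gaussian-process interpolation of noise-free computer-code output with a
# squared-exponential kernel; the length scale and data are illustrative.
def sq_exp_kernel(a, b, length=0.5):
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

x_train = np.array([0.0, 0.4, 0.7, 1.0])
y_train = np.sin(2 * np.pi * x_train)   # stand-in for computer-code output
x_test = np.array([0.0, 0.2, 0.55])

K = sq_exp_kernel(x_train, x_train) + 1e-10 * np.eye(len(x_train))  # jitter
K_star = sq_exp_kernel(x_test, x_train)
alpha = np.linalg.solve(K, y_train)
post_mean = K_star @ alpha              # interpolates the training runs
post_var = 1.0 - np.sum(K_star * np.linalg.solve(K, K_star.T).T, axis=1)
```

    At the code-run inputs the posterior mean reproduces the observed outputs and the posterior variance collapses to (numerically) zero, exactly the interpolating behaviour the abstract describes; between runs the variance is positive and quantifies prediction uncertainty.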

  14. A Robust Bayesian Approach to an Optimal Replacement Policy for Gas Pipelines

    Directory of Open Access Journals (Sweden)

    José Pablo Arias-Nicolás

    2015-06-01

    Full Text Available In this paper, we address Bayesian sensitivity issues when integrating experts' judgments with available historical data in a case study about strategies for the preventive maintenance of low-pressure cast iron pipelines in an urban gas distribution network. We are interested in replacement priorities, as determined by the failure rates of pipelines deployed under different conditions. We relax the assumptions made in previous papers about the prior distributions on the failure rates and study changes in replacement priorities under different choices of generalized moment-constrained classes of priors. We focus on the set of non-dominated actions, and among them, we propose the least sensitive action as the optimal choice to rank different classes of pipelines, providing a sound approach to the sensitivity problem. Moreover, we are also interested in determining which classes have a failure rate exceeding a given acceptable value, considered as the threshold determining no need for replacement. Graphical tools are introduced to help decision-makers determine whether pipelines are to be replaced and the corresponding priorities.
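    The non-dominance idea can be sketched with a simple Pareto-style filter: if one pipeline class has failure-rate estimates at least as high as another's under every prior in the class, and strictly higher under some prior, the second class cannot have top replacement priority. This is a generic illustration with toy numbers, not the paper's formal decision-theoretic definition.

```python
# Toy Pareto-style non-dominance filter over per-class failure-rate
# estimates computed under several priors (tuples of hypothetical rates).
def dominates(u, v):
    """u at least as large as v everywhere, strictly larger somewhere."""
    return all(a >= b for a, b in zip(u, v)) and any(a > b for a, b in zip(u, v))

def non_dominated(actions):
    return {
        name: rates
        for name, rates in actions.items()
        if not any(dominates(other, rates)
                   for o_name, other in actions.items() if o_name != name)
    }

# Failure-rate estimates per pipeline class under two hypothetical priors.
actions = {"A": (0.9, 0.8), "B": (0.5, 0.4), "C": (0.6, 0.9)}
front = non_dominated(actions)  # B is dominated by both A and C
```

    Among the surviving classes, the paper then proposes the least sensitive action (the one whose ranking changes least across the prior class) as the robust choice.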

  15. Carbon isotope discrimination during branch photosynthesis of Fagus sylvatica: a Bayesian modelling approach.

    Science.gov (United States)

    Gentsch, Lydia; Hammerle, Albin; Sturm, Patrick; Ogée, Jérôme; Wingate, Lisa; Siegwolf, Rolf; Plüss, Peter; Baur, Thomas; Buchmann, Nina; Knohl, Alexander

    2014-07-01

    Field measurements of photosynthetic carbon isotope discrimination (¹³Δ) of Fagus sylvatica, conducted with branch bags and laser spectrometry, revealed a high variability of ¹³Δ, both on diurnal and day-to-day timescales. We tested the prediction capability of three versions of a commonly used model for ¹³Δ (called here the comprehensive (¹³Δcomp), simplified (¹³Δsimple) and revised (¹³Δrevised) versions). A Bayesian approach was used to calibrate major model parameters. Constrained estimates were found for the fractionation during CO₂ fixation in ¹³Δcomp, but not in ¹³Δsimple, and partially for the mesophyll conductance for CO₂ (gi). No constrained estimates were found for fractionations during mitochondrial respiration and photorespiration, or for a diurnally variable apparent fractionation between current assimilates and mitochondrial respiration, specific to ¹³Δrevised. A quantification of parameter estimation uncertainties and interdependencies further helped explore model structure and behaviour. We found that ¹³Δcomp usually outperformed ¹³Δsimple because the explicit consideration of gi and the photorespiratory fractionation in ¹³Δcomp enabled a better description of the large observed diurnal variation (≈9‰) of ¹³Δ. Flux-weighted daily means of ¹³Δ were also better predicted with ¹³Δcomp than with ¹³Δsimple.

  16. Monitoring schistosomiasis risk in East China over space and time using a Bayesian hierarchical modeling approach.

    Science.gov (United States)

    Hu, Yi; Ward, Michael P; Xia, Congcong; Li, Rui; Sun, Liqian; Lynn, Henry; Gao, Fenghua; Wang, Qizhi; Zhang, Shiqing; Xiong, Chenglong; Zhang, Zhijie; Jiang, Qingwu

    2016-04-07

    Schistosomiasis remains a major public health problem and causes substantial economic impact in east China, particularly along the Yangtze River Basin. Disease forecasting and surveillance can assist in the development and implementation of more effective intervention measures to control disease. In this study, we applied a Bayesian hierarchical spatio-temporal model to describe trends in schistosomiasis risk in Anhui Province, China, using annual parasitological and environmental data for the period 1997-2010. A computationally efficient approach, Integrated Nested Laplace Approximation (INLA), was used for model inference. A zero-inflated, negative binomial model best described the spatio-temporal dynamics of schistosomiasis risk. It predicted that the disease risk would generally be low and stable except for some specific, local areas during the period 2011-2014. High-risk counties were identified in the forecasting maps: three in which the risk remained high, and two in which risk would become high. The results indicated that schistosomiasis risk has been reduced to consistently low levels throughout much of this region of China; however, some counties were identified in which progress in schistosomiasis control was less than satisfactory. Whilst maintaining overall control, specific interventions in the future should focus on these refractory counties as part of a strategy to eliminate schistosomiasis from this region.
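    The zero-inflated negative binomial likelihood selected by the study can be sketched directly. The parameterisation below (mixing weight pi for structural zeros, NB size r and success probability p) is one common convention, not necessarily the paper's exact one:

```python
import math

def zinb_pmf(k, pi, r, p):
    """P(K = k) under a zero-inflated negative binomial: with probability
    pi an area contributes a structural zero; otherwise counts follow
    NB(r, p) with pmf C(k+r-1, k) * p^r * (1-p)^k."""
    nb = math.comb(k + r - 1, k) * (p ** r) * ((1 - p) ** k)
    return pi + (1 - pi) * nb if k == 0 else (1 - pi) * nb

# Zero inflation raises the mass at zero above the plain NB value:
p0 = zinb_pmf(0, pi=0.3, r=2, p=0.5)   # 0.3 + 0.7 * 0.25 = 0.475
```

The extra mass at zero is what lets the model accommodate the many counties reporting no cases without distorting the count distribution elsewhere.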

  17. Risk assessment of pre-hospital trauma airway management by anaesthesiologists using the predictive Bayesian approach

    Directory of Open Access Journals (Sweden)

    Nakstad Anders R

    2010-04-01

    Full Text Available Abstract Introduction Endotracheal intubation (ETI) has been considered an essential part of pre-hospital advanced life support. Pre-hospital ETI, however, is a complex intervention even for airway specialists such as anaesthesiologists working as pre-hospital emergency physicians. We therefore wanted to investigate the quality of pre-hospital airway management by anaesthesiologists in severely traumatised patients and identify possible areas for improvement. Method We performed a risk assessment according to the predictive Bayesian approach, in a typical anaesthesiologist-manned Norwegian helicopter emergency medical service (HEMS). The main focus of the risk assessment was the event where a patient arrives in the emergency department without ETI despite a pre-hospital indication for it. Results In the risk assessment, we assigned a high probability (29%) to the event assessed, that a patient arrives without ETI despite a pre-hospital indication. However, several uncertainty factors in the risk assessment were identified related to data quality, indications for use of ETI, patient outcome and the need for special training of ETI providers. Conclusion Our risk assessment indicated a high probability that trauma patients with an indication for pre-hospital ETI do not receive it in the studied HEMS. The uncertainty factors identified in the assessment should be further investigated to better understand the problem assessed and the consequences for patients. Better quality of pre-hospital airway management data could contribute to a reduction of these uncertainties.
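    A predictive Bayesian assignment of an event probability can be illustrated with a Beta-Binomial sketch; the counts below are hypothetical, chosen only to land near the 29% figure reported:

```python
# Hypothetical data: of n = 100 missions with an indication for ETI,
# k = 29 patients arrived without it. Uniform Beta(1, 1) prior.
a0, b0 = 1.0, 1.0
k, n = 29, 100
a, b = a0 + k, b0 + (n - k)        # Beta posterior parameters

# Predictive probability that the event occurs on the next mission:
p_event = a / (a + b)              # (1 + 29) / (2 + 100), about 0.294
```

In the predictive framing the probability statement is about the next observable case, not about an unobservable "true rate", which is the distinguishing feature of the approach the record describes.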

  18. Receiver-based recovery of clipped OFDM signals for PAPR reduction: A Bayesian approach

    KAUST Repository

    Ali, Anum

    2014-01-01

    Clipping is one of the simplest peak-to-average power ratio reduction schemes for orthogonal frequency division multiplexing (OFDM). Deliberately clipping the transmission signal degrades system performance, and clipping mitigation is required at the receiver for information restoration. In this paper, we acknowledge the sparse nature of the clipping signal and propose a low-complexity Bayesian clipping estimation scheme. The proposed scheme utilizes a priori information about the sparsity rate and noise variance for enhanced recovery. At the same time, the proposed scheme is robust against inaccurate estimates of the clipping signal statistics. The undistorted phase property of the clipped signal, as well as the clipping likelihood, is utilized for enhanced reconstruction. Furthermore, motivated by the nature of modern OFDM-based communication systems, we extend our clipping reconstruction approach to multiple antenna receivers and multi-user OFDM. We also address the problem of channel estimation from pilots contaminated by the clipping distortion. Numerical findings are presented that depict favorable results for the proposed scheme compared to the established sparse reconstruction schemes.

  19. Fast Bayesian approach for modal identification using free vibration data, Part I - Most probable value

    Science.gov (United States)

    Zhang, Feng-Liang; Ni, Yan-Chun; Au, Siu-Kui; Lam, Heung-Fai

    2016-03-01

    The identification of modal properties from field testing of civil engineering structures is becoming economically viable, thanks to the advent of modern sensor and data acquisition technology. Its demand is driven by innovative structural designs and increased performance requirements of dynamic-prone structures that call for a close cross-checking or monitoring of their dynamic properties and responses. Existing instrumentation capabilities and modal identification techniques allow structures to be tested under free vibration, forced vibration (known input) or ambient vibration (unknown broadband loading). These tests can be considered complementary rather than competing as they are based on different modeling assumptions in the identification model and have different implications on costs and benefits. Uncertainty arises naturally in the dynamic testing of structures due to measurement noise, sensor alignment error, modeling error, etc. This is especially relevant in field vibration tests because the test condition in the field environment can hardly be controlled. In this work, a Bayesian statistical approach is developed for modal identification using the free vibration response of structures. A frequency domain formulation is proposed that makes statistical inference based on the Fast Fourier Transform (FFT) of the data in a selected frequency band. This significantly simplifies the identification model because only the modes dominating the frequency band need to be included. It also legitimately ignores the information in the excluded frequency bands that are either irrelevant or difficult to model, thereby significantly reducing modeling error risk. The posterior probability density function (PDF) of the modal parameters is derived rigorously from modeling assumptions and Bayesian probability logic. Computational difficulties associated with calculating the posterior statistics, including the most probable value (MPV) and the posterior covariance matrix

  20. An Integrated Approach to Battery Health Monitoring using Bayesian Regression, Classification and State Estimation

    Data.gov (United States)

    National Aeronautics and Space Administration — The application of the Bayesian theory of managing uncertainty and complexity to regression and classification in the form of Relevance Vector Machine (RVM), and to...

  1. Bayesian and L1 Approaches to Sparse Unsupervised Learning

    CERN Document Server

    Mohamed, Shakir; Ghahramani, Zoubin

    2011-01-01

    The use of $L_1$ regularisation for sparse learning has generated immense research interest, with successful application in such diverse areas as signal acquisition, image coding, genomics and collaborative filtering. While existing work highlights the many advantages of $L_1$ methods, in this paper we find that $L_1$ regularisation often dramatically underperforms in terms of predictive performance when compared with other methods for inferring sparsity. We focus on unsupervised latent variable models, and develop $L_1$ minimising factor models, Bayesian variants of "$L_1$", and Bayesian models with a stronger $L_0$-like sparsity induced through spike-and-slab distributions. These spike-and-slab Bayesian factor models encourage sparsity while accounting for uncertainty in a principled manner and avoiding unnecessary shrinkage of non-zero values. We demonstrate on a number of data sets that in practice spike-and-slab Bayesian methods outperform $L_1$ minimisation, even on a computational budget. We thus highl...
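    A spike-and-slab prior mixes a point mass at zero (the spike) with a Gaussian slab, so draws are exactly sparse rather than merely shrunk as under an L1/Laplace prior. A minimal sampling sketch with assumed hyperparameters:

```python
import random

def spike_slab_sample(n, pi=0.2, slab_sd=1.0, seed=0):
    """Draw n coefficients: each takes a Gaussian slab value with
    probability pi, and is exactly 0.0 (the spike) otherwise."""
    rng = random.Random(seed)
    return [rng.gauss(0.0, slab_sd) if rng.random() < pi else 0.0
            for _ in range(n)]

w = spike_slab_sample(1000)
sparsity = sum(1 for x in w if x == 0.0) / len(w)   # ~ 1 - pi
```

Because zeros are exact, non-zero coefficients keep their full magnitude, which is the "avoiding unnecessary shrinkage" property the abstract credits to these models.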

  2. Equifinality of formal (DREAM) and informal (GLUE) Bayesian approaches in hydrologic modeling?

    NARCIS (Netherlands)

    Vrugt, J.A.; Braak, ter C.J.F.; Gupta, H.V.; Robinson, B.A.

    2009-01-01

    In recent years, a strong debate has emerged in the hydrologic literature regarding what constitutes an appropriate framework for uncertainty estimation. In particular, there is strong disagreement about whether an uncertainty framework should have its roots within a proper statistical (Bayesian) context,

  3. Bayesian approach to the analysis of neutron Brillouin scattering data on liquid metals

    Science.gov (United States)

    De Francesco, A.; Guarini, E.; Bafile, U.; Formisano, F.; Scaccia, L.

    2016-08-01

    When the dynamics of liquids and disordered systems at the mesoscopic level is investigated by means of inelastic scattering (e.g., neutron or x-ray), spectra are often characterized by a poor definition of the excitation lines and spectroscopic features in general, and one important issue is to establish how many of these lines need to be included in the modeling function and to estimate their parameters. Furthermore, when strongly damped excitations are present, commonly used and widespread fitting algorithms are particularly affected by the choice of initial values of the parameters. An inadequate choice may lead to an inefficient exploration of the parameter space, resulting in the algorithm getting stuck in a local minimum. In this paper, we present a Bayesian approach to the analysis of neutron Brillouin scattering data in which the number of excitation lines is treated as unknown and estimated along with the other model parameters. We propose a joint estimation procedure based on a reversible-jump Markov chain Monte Carlo algorithm, which efficiently explores the parameter space, producing a probabilistic measure to quantify the uncertainty on the number of excitation lines as well as reliable parameter estimates. The proposed method could prove of great importance in extracting physical information from experimental data, especially when the detection of spectral features is complicated not only by the properties of the sample, but also by the limited instrumental resolution and count statistics. The approach is tested on a generated data set and then applied to real experimental spectra of neutron Brillouin scattering from a liquid metal, previously analyzed in a more traditional way.

  4. A Bayesian belief network approach for assessing uncertainty in conceptual site models at contaminated sites.

    Science.gov (United States)

    Thomsen, Nanna I; Binning, Philip J; McKnight, Ursula S; Tuxen, Nina; Bjerg, Poul L; Troldborg, Mads

    2016-05-01

    A key component in risk assessment of contaminated sites is the formulation of a conceptual site model (CSM). A CSM is a simplified representation of reality and forms the basis for the mathematical modeling of contaminant fate and transport at the site. The CSM should therefore identify the most important site-specific features and processes that may affect the contaminant transport behavior at the site. However, the development of a CSM will always be associated with uncertainties due to limited data and lack of understanding of the site conditions. CSM uncertainty is often found to be a major source of model error and it should therefore be accounted for when evaluating uncertainties in risk assessments. We present a Bayesian belief network (BBN) approach for constructing CSMs and assessing their uncertainty at contaminated sites. BBNs are graphical probabilistic models that are effective for integrating quantitative and qualitative information, and thus can strengthen decisions when empirical data are lacking. The proposed BBN approach facilitates a systematic construction of multiple CSMs, and then determines the belief in each CSM using a variety of data types and/or expert opinion at different knowledge levels. The developed BBNs combine data from desktop studies and initial site investigations with expert opinion to assess which of the CSMs are more likely to reflect the actual site conditions. The method is demonstrated on a Danish field site, contaminated with chlorinated ethenes. Four different CSMs are developed by combining two contaminant source zone interpretations (presence or absence of a separate phase contamination) and two geological interpretations (fractured or unfractured clay till). The beliefs in each of the CSMs are assessed sequentially based on data from three investigation stages (a screening investigation, a more detailed investigation, and an expert consultation) to demonstrate that the belief can be updated as more information becomes available.
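    Sequential belief updating over candidate CSMs is ordinary discrete Bayes. A sketch with four CSMs mirroring the paper's two-by-two structure; the likelihood values are invented for illustration:

```python
# Four CSMs: (separate-phase source present/absent) x (fractured/unfractured till)
csms = ["source+fractured", "source+unfractured",
        "nosource+fractured", "nosource+unfractured"]
belief = {m: 0.25 for m in csms}                      # uniform prior

def update(belief, likelihood):
    """One investigation stage: scale beliefs by the likelihood of the
    new evidence under each CSM, then renormalise."""
    post = {m: belief[m] * likelihood[m] for m in belief}
    z = sum(post.values())
    return {m: v / z for m, v in post.items()}

# Screening-stage evidence (illustrative numbers only):
belief = update(belief, {"source+fractured": 0.60, "source+unfractured": 0.20,
                         "nosource+fractured": 0.15, "nosource+unfractured": 0.05})
```

Calling `update` once per investigation stage reproduces the sequential assessment described in the abstract, with later, more informative stages progressively concentrating belief on one CSM.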

  5. Bayesian Approach for the Estimation of the Transmissivity Spatial Structure from Hydraulic Tomography Data

    Science.gov (United States)

    Demir, M. T.; Copty, N. K.; Trinchero, P.; Sanchez-Vila, X.

    2013-12-01

    Groundwater flow and contaminant transport are strongly influenced by the spatial variability of subsurface flow parameters. However, the interpretation of pumping test data used for subsurface characterization is normally performed using conventional methods that are based on the assumption of aquifer homogeneity. In recent years, hydraulic tomography has been proposed by some researchers to address the limitations of conventional site characterization methods. Hydraulic tomography involves sequential pumping at one of a series of wells and observing the drawdown due to pumping at adjacent wells. The interpretation of the drawdown data from hydraulic tomography has mostly been performed using formal inverse procedures for the estimation of the spatial variability of the flow parameters. The purpose of this study is to develop a method for the estimation of the statistical spatial structure of the transmissivity from hydraulic tomography data. The method relies on the pumping test interpretation procedure of Copty et al. (2011), which uses the time-drawdown data and its time derivative at each observation well to estimate the spatially averaged transmissivity as a function of radial distance from the pumping well. A Bayesian approach is then used to identify the statistical parameters of the transmissivity field (i.e. variance and integral scale). The approach compares the estimated transmissivity as a function of radial distance from the pumping well to the probability density function of the spatially-averaged transmissivity. The method is evaluated using synthetically-generated pumping test data for a range of input parameters. This application demonstrates that, through a relatively simple procedure, additional information on the spatial structure of the transmissivity may be inferred from pumping test data. Results indicate that as the number of available pumping tests increases, the reliability of the estimated transmissivity statistical parameters also increases.

  6. A Bayesian belief network approach for assessing uncertainty in conceptual site models at contaminated sites

    Science.gov (United States)

    Thomsen, Nanna I.; Binning, Philip J.; McKnight, Ursula S.; Tuxen, Nina; Bjerg, Poul L.; Troldborg, Mads

    2016-05-01

    A key component in risk assessment of contaminated sites is the formulation of a conceptual site model (CSM). A CSM is a simplified representation of reality and forms the basis for the mathematical modeling of contaminant fate and transport at the site. The CSM should therefore identify the most important site-specific features and processes that may affect the contaminant transport behavior at the site. However, the development of a CSM will always be associated with uncertainties due to limited data and lack of understanding of the site conditions. CSM uncertainty is often found to be a major source of model error and it should therefore be accounted for when evaluating uncertainties in risk assessments. We present a Bayesian belief network (BBN) approach for constructing CSMs and assessing their uncertainty at contaminated sites. BBNs are graphical probabilistic models that are effective for integrating quantitative and qualitative information, and thus can strengthen decisions when empirical data are lacking. The proposed BBN approach facilitates a systematic construction of multiple CSMs, and then determines the belief in each CSM using a variety of data types and/or expert opinion at different knowledge levels. The developed BBNs combine data from desktop studies and initial site investigations with expert opinion to assess which of the CSMs are more likely to reflect the actual site conditions. The method is demonstrated on a Danish field site, contaminated with chlorinated ethenes. Four different CSMs are developed by combining two contaminant source zone interpretations (presence or absence of a separate phase contamination) and two geological interpretations (fractured or unfractured clay till). The beliefs in each of the CSMs are assessed sequentially based on data from three investigation stages (a screening investigation, a more detailed investigation, and an expert consultation) to demonstrate that the belief can be updated as more information becomes available.

  7. Reconstruction of Layer Densities in a Multilayer Snowpack using a Bayesian Approach to Inverse Modeling

    Science.gov (United States)

    Aguayo, M.; Marshall, H.; McNamara, J. P.; Mead, J.; Flores, A. N.

    2013-12-01

    Estimation of snowpack parameters such as depth, density and grain structure is a central focus of hydrology in seasonally snow-covered lands. These parameters are directly estimated by field observations, indirectly estimated from other parameters using statistical correlations, or simulated with a model. Difficulty in sampling thin layers and uncertainty in the transition between layers can cause significant uncertainty in measurements of these parameters. Snow density is one of the most important parameters to measure because it is closely related to snow water content, an important component of the global water balance. We develop a mathematical framework to estimate snow density from measurements of temperature and thickness of snowpack layers over a particular time period, in conjunction with a physics-based model of snowpack evolution. We formulate a Bayesian approach to estimate the snowpack density profile, using a full range of possible simulations that incorporate key sources of uncertainty to build in prior snowpack knowledge. The posterior probability density function of the snow density, conditioned on snowpack temperature measurements, is computed by multiplying the likelihoods and the assumed prior distribution function. Random sampling is used to generate a range of densities with the same probability when a uniform prior probability function is assumed. A posterior probability density function calculated directly via Bayes' theorem is used to calculate the probability of every sample generated. The forward model is a 1D, multilayer snow energy and mass balance model, which solves for snow temperature, density, and liquid water content on a finite element mesh. The surface and ground temperature data of the snowpack (boundary conditions) are provided by the Center for Snow and Avalanche Studies (CSAS), Silverton CO, from snow pits made at the Swamp Angel and Senator Beck study plot sites. Standard errors between field observations and results computed denote the

  8. A multi-tissue type genome-scale metabolic network for analysis of whole-body systems physiology

    Directory of Open Access Journals (Sweden)

    Feist Adam M

    2011-10-01

    Full Text Available Abstract Background Genome-scale metabolic reconstructions provide a biologically meaningful mechanistic basis for the genotype-phenotype relationship. The global human metabolic network, termed Recon 1, has recently been reconstructed allowing the systems analysis of human metabolic physiology and pathology. Utilizing high-throughput data, Recon 1 has recently been tailored to different cells and tissues, including the liver, kidney, brain, and alveolar macrophage. These models have shown utility in the study of systems medicine. However, no integrated analysis between human tissues has been done. Results To describe tissue-specific functions, Recon 1 was tailored to describe metabolism in three human cells: adipocytes, hepatocytes, and myocytes. These cell-specific networks were manually curated and validated based on known cellular metabolic functions. To study intercellular interactions, a novel multi-tissue type modeling approach was developed to integrate the metabolic functions for the three cell types, and subsequently used to simulate known integrated metabolic cycles. In addition, the multi-tissue model was used to study diabetes: a pathology with systemic properties. High-throughput data was integrated with the network to determine differential metabolic activity between obese and type II diabetic obese gastric bypass patients in a whole-body context. Conclusion The multi-tissue type modeling approach presented provides a platform to study integrated metabolic states. As more cell and tissue-specific models are released, it is critical to develop a framework in which to study their interdependencies.
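    The core constraint in any such genome-scale model is the steady-state flux balance S·v = 0, applied both within each cell type and in the shared blood compartment that couples them. A toy two-tissue check (illustrative only, not Recon 1):

```python
def steady_state_ok(fluxes, stoich, tol=1e-9):
    """Check the flux-balance condition S.v = 0 for every metabolite.
    stoich maps metabolite -> {reaction: stoichiometric coefficient}."""
    return all(abs(sum(c * fluxes[r] for r, c in row.items())) < tol
               for row in stoich.values())

# Blood glucose: produced by hepatocyte release, consumed by myocyte uptake.
stoich = {"glc_blood": {"hep_release": +1.0, "myo_uptake": -1.0}}

balanced = steady_state_ok({"hep_release": 2.0, "myo_uptake": 2.0}, stoich)
unbalanced = steady_state_ok({"hep_release": 2.0, "myo_uptake": 1.0}, stoich)
```

In the full model this balance is imposed over thousands of metabolites and the flux vector is chosen by linear programming; the sketch only shows the coupling constraint that makes the multi-tissue integration possible.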

  9. A New Approach for Time Series Forecasting: Bayesian Enhanced by Fractional Brownian Motion with Application to Rainfall Series

    Directory of Open Access Journals (Sweden)

    Cristian Rodriguez Rivero

    2016-03-01

    Full Text Available A new predictor algorithm based on a Bayesian enhanced approach (BEA) for long-term chaotic time series using artificial neural networks (ANN) is presented. The technique, based on stochastic models, uses Bayesian inference with fractional Brownian motion as the data model and a Beta model as prior information. However, the need for experimental data for specifying and estimating causal models has not changed. Indeed, the Bayes method provides another way to incorporate prior knowledge in forecasting models; the simplest representations of prior knowledge in forecasting models are hard to beat in many forecasting situations, either because prior knowledge is insufficient to improve on models or because prior knowledge leads to the conclusion that the situation is stable. This work contributes to long-term time series prediction, giving forecast horizons up to 18 steps ahead. Thus, the forecasted values and validation data are presented for benchmark chaotic series such as Mackey-Glass, Lorenz, Henon, Logistic, Rössler, Ikeda, Quadratic one-dimensional map series and monthly cumulative rainfall collected from Despeñaderos, Cordoba, Argentina. The computational results are evaluated against several previously proposed non-linear ANN predictors on high-roughness series, and show the better performance of the Bayesian enhanced approach in long-term forecasting.

  10. Structural mapping in statistical word problems: A relational reasoning approach to Bayesian inference.

    Science.gov (United States)

    Johnson, Eric D; Tubau, Elisabet

    2016-09-27

    Presenting natural frequencies facilitates Bayesian inferences relative to using percentages. Nevertheless, many people, including highly educated and skilled reasoners, still fail to provide Bayesian responses to these computationally simple problems. We show that the complexity of relational reasoning (e.g., the structural mapping between the presented and requested relations) can help explain the remaining difficulties. With a non-Bayesian inference that required identical arithmetic but afforded a more direct structural mapping, performance was universally high. Furthermore, reducing the relational demands of the task through questions that directed reasoners to use the presented statistics, as compared with questions that prompted the representation of a second, similar sample, also significantly improved reasoning. Distinct error patterns were also observed between these presented- and similar-sample scenarios, which suggested differences in relational-reasoning strategies. On the other hand, while higher numeracy was associated with better Bayesian reasoning, higher-numerate reasoners were not immune to the relational complexity of the task. Together, these findings validate the relational-reasoning view of Bayesian problem solving and highlight the importance of considering not only the presented task structure, but also the complexity of the structural alignment between the presented and requested relations.
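    The facilitation can be seen in the classic screening example (standard textbook numbers, not data from this study): the natural-frequency framing reduces Bayes' rule to a single division over presented counts:

```python
# Of 1000 people: 10 have the condition, 8 of them test positive;
# of the 990 without it, 95 also test positive (false positives).
true_pos, false_pos = 8, 95
p_freq = true_pos / (true_pos + false_pos)      # natural-frequency answer

# The same posterior via Bayes' rule on percentages:
p_d, sens, fpr = 10 / 1000, 8 / 10, 95 / 990
p_bayes = sens * p_d / (sens * p_d + fpr * (1 - p_d))
```

Both routes give about 7.8%, but the first requires only picking out the two presented counts, which is the more direct structural mapping the article argues drives the facilitation.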

  11. Exploring Neighborhood Influences on Small-Area Variations in Intimate Partner Violence Risk: A Bayesian Random-Effects Modeling Approach

    Directory of Open Access Journals (Sweden)

    Enrique Gracia

    2014-01-01

    Full Text Available This paper uses spatial data of cases of intimate partner violence against women (IPVAW) to examine neighborhood-level influences on small-area variations in IPVAW risk in a police district of the city of Valencia (Spain). To analyze area variations in IPVAW risk and their association with neighborhood-level explanatory variables we use a Bayesian spatial random-effects modeling approach, as well as disease mapping methods to represent risk probabilities in each area. Analyses show that IPVAW cases are more likely in areas of high immigrant concentration, high public disorder and crime, and high physical disorder. Results also show a spatial component indicating remaining variability attributable to spatially structured random effects. Bayesian spatial modeling offers a new perspective to identify IPVAW high- and low-risk areas, and provides a new avenue for the design of better-informed prevention and intervention strategies.

  12. Measurement of the Z and W production cross section in pp collisions at the LHC using a Bayesian approach

    CERN Document Server

    Ragoni, Simone

    The aim of all my work has been to compute the fiducial production cross sections of W± and Z0 bosons in their leptonic (e, µ) decays using the data collected by the ATLAS detector at the LHC with a centre-of-mass energy of √s = 13 TeV during summer 2015. The selected events are exactly the same as those employed by the recently published ATLAS Collaboration article on the same topic, enabling us to compare the obtained results. This comparison is necessary, since the results were obtained with two different procedures: baseline (classical) for the article, Bayesian in this thesis. The Bayesian approach allows for a natural combination among the many channels and a straightforward treatment of the systematic uncertainties. The obtained results are in excellent agreement with the Standard Model predictions and with those published by ATLAS.
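    In the simplest single-channel case, a Bayesian cross-section extraction is a Poisson counting experiment. A sketch with invented numbers (not ATLAS data); with a flat prior the posterior mode sits at (N - b)/s:

```python
import math

# Invented single-channel counting experiment: N observed events,
# expected background b, and s = efficiency x luminosity (events per pb).
N, b, s = 50, 10.0, 4.0

def log_post(sigma):
    """Log posterior for the cross section sigma (pb): Poisson likelihood
    with mean s*sigma + b, flat prior, additive constants dropped."""
    mu = s * sigma + b
    return N * math.log(mu) - mu

grid = [i * 0.01 for i in range(1, 3001)]   # scan 0.01 .. 30.00 pb
sigma_hat = max(grid, key=log_post)         # posterior mode ~ (N - b) / s
```

Combining channels in this framework just multiplies their likelihoods into one posterior for sigma, which is the "natural combination" the abstract refers to.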

  13. WebMOTIFS: automated discovery, filtering and scoring of DNA sequence motifs using multiple programs and Bayesian approaches.

    Science.gov (United States)

    Romer, Katherine A; Kayombya, Guy-Richard; Fraenkel, Ernest

    2007-07-01

    WebMOTIFS provides a web interface that facilitates the discovery and analysis of DNA-sequence motifs. Several studies have shown that the accuracy of motif discovery can be significantly improved by using multiple de novo motif discovery programs and using randomized control calculations to identify the most significant motifs or by using Bayesian approaches. WebMOTIFS makes it easy to apply these strategies. Using a single submission form, users can run several motif discovery programs and score, cluster and visualize the results. In addition, the Bayesian motif discovery program THEME can be used to determine the class of transcription factors that is most likely to regulate a set of sequences. Input can be provided as a list of gene or probe identifiers. Used with the default settings, WebMOTIFS accurately identifies biologically relevant motifs from diverse data in several species. WebMOTIFS is freely available at http://fraenkel.mit.edu/webmotifs.

  14. Dissection of a Complex Disease Susceptibility Region Using a Bayesian Stochastic Search Approach to Fine Mapping.

    Directory of Open Access Journals (Sweden)

    Chris Wallace

    2015-06-01

    Full Text Available Identification of candidate causal variants in regions associated with risk of common diseases is complicated by linkage disequilibrium (LD) and multiple association signals. Nonetheless, accurate maps of these variants are needed, both to fully exploit detailed cell-specific chromatin annotation data to highlight disease causal mechanisms and cells, and for design of the functional studies that will ultimately be required to confirm causal mechanisms. We adapted a Bayesian evolutionary stochastic search algorithm to the fine mapping problem, and demonstrated its improved performance over conventional stepwise and regularised regression through simulation studies. We then applied it to fine map the established multiple sclerosis (MS) and type 1 diabetes (T1D) associations in the IL-2RA (CD25) gene region. For T1D, both stepwise and stochastic search approaches identified four T1D association signals, with the major effect tagged by the single nucleotide polymorphism rs12722496. In contrast, for MS, the stochastic search found two distinct competing models: a single candidate causal variant, tagged by rs2104286 and reported previously using stepwise analysis; and a more complex model with two association signals, one of which was tagged by the major T1D-associated rs12722496 and the other by rs56382813. There is low to moderate LD between rs2104286 and both rs12722496 and rs56382813 (r² ≈ 0.3) and our two-SNP model could not be recovered through a forward stepwise search after conditioning on rs2104286. Both signals in the two-variant model for MS affect CD25 expression on distinct subpopulations of CD4+ T cells, which are key cells in the autoimmune process. The results support a shared causal variant for T1D and MS. Our study illustrates the benefit of using a purposely designed model search strategy for fine mapping and the advantage of combining disease and protein expression data.

  15. A Bayesian kriging approach for blending satellite and ground precipitation observations

    Science.gov (United States)

    Verdin, Andrew; Rajagopalan, Balaji; Kleiber, William; Funk, Chris

    2015-02-01

    Drought and flood management practices require accurate estimates of precipitation. Gauge observations, however, are often sparse in regions with complicated terrain, clustered in valleys, and of poor quality. Consequently, the spatial extent of wet events is poorly represented. Satellite-derived precipitation data are an attractive alternative, though they tend to underestimate the magnitude of wet events due to their dependency on retrieval algorithms and the indirect relationship between satellite infrared observations and precipitation intensities. Here we offer a Bayesian kriging approach for blending precipitation gauge data and the Climate Hazards Group Infrared Precipitation satellite-derived precipitation estimates for Central America, Colombia, and Venezuela. First, the gauge observations are modeled as a linear function of satellite-derived estimates and any number of other variables—for this research we include elevation. Prior distributions are defined for all model parameters and the posterior distributions are obtained simultaneously via Markov chain Monte Carlo sampling. The posterior distributions of these parameters are required for spatial estimation, and thus are obtained prior to implementing the spatial kriging model. This functional framework is applied to model parameters obtained by sampling from the posterior distributions, and the residuals of the linear model are subject to a spatial kriging model. Consequently, the posterior distributions and uncertainties of the blended precipitation estimates are obtained. We demonstrate this method by applying it to pentadal and monthly total precipitation fields during 2009. The model's performance and its inherent ability to capture wet events are investigated. We show that this blending method significantly improves upon the satellite-derived estimates and is also competitive in its ability to represent wet events. This procedure also provides a means to estimate a full conditional distribution
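    The first stage of the blend is a regression of gauge totals on the satellite estimate (plus covariates such as elevation); the residuals then carry the spatial signal that is kriged. A one-covariate least-squares sketch with made-up numbers (the paper estimates such coefficients by MCMC sampling, which also yields their uncertainty):

```python
sat   = [10.0, 20.0, 30.0, 40.0]   # satellite precip at gauge sites (mm)
gauge = [14.0, 26.0, 38.0, 50.0]   # co-located gauge observations (mm)

# Ordinary least squares for gauge = alpha + beta * sat:
n = len(sat)
mx, my = sum(sat) / n, sum(gauge) / n
beta = (sum((x - mx) * (y - my) for x, y in zip(sat, gauge))
        / sum((x - mx) ** 2 for x in sat))
alpha = my - beta * mx

# Residuals of the linear model are what the kriging step would model:
resid = [y - (alpha + beta * x) for x, y in zip(sat, gauge)]
```

A slope above one, as in this toy fit, corresponds to the satellite product underestimating wet events, the bias the blending is designed to correct.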

  16. A Bayesian Network Approach for Offshore Risk Analysis Through Linguistic Variables

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    This paper presents a new approach for offshore risk analysis that is capable of dealing with linguistic probabilities in Bayesian networks (BNs). In this paper, linguistic probabilities are used to describe the occurrence likelihood of hazardous events that may cause possible accidents in offshore operations. In order to use fuzzy information, an f-weighted valuation function is proposed to transform linguistic judgements into crisp probability distributions, which can easily be put into a BN to model causal relationships among risk factors. The use of linguistic variables makes it easier for human experts to express their knowledge, and the transformation of linguistic judgements into crisp probabilities can significantly reduce the cost of computing, modifying, and maintaining a BN model. The flexibility of the method allows multiple forms of information to be used to quantify model relationships, including formally assessed expert opinion when quantitative data are lacking, or when only qualitative or vague statements can be made. The model is a modular representation of uncertain knowledge arising from randomness, vagueness, and ignorance. This makes the risk analysis of offshore engineering systems more functional and easier in many assessment contexts. Specifically, the proposed f-weighted valuation function takes into account not only the dominating values, but also the α-level values that are ignored by conventional valuation methods. A case study of the collision risk between a Floating Production, Storage and Off-loading (FPSO) unit and authorised vessels due to human elements during operation is used to illustrate the application of the proposed model.
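The idea of turning linguistic likelihood judgements into crisp probabilities via α-level sets can be sketched as below. The triangular membership functions and the α-weighting are invented for illustration; they are in the spirit of, but not identical to, the paper's f-weighted valuation function.

```python
# Hypothetical linguistic probability terms as triangular fuzzy numbers (a, b, c)
TERMS = {
    "very low":  (0.0, 0.05, 0.1),
    "low":       (0.05, 0.2, 0.35),
    "medium":    (0.3, 0.5, 0.7),
    "high":      (0.65, 0.8, 0.95),
    "very high": (0.9, 0.95, 1.0),
}

def alpha_cut(tri, alpha):
    """Interval [lo, hi] of a triangular fuzzy number at membership level alpha."""
    a, b, c = tri
    return a + alpha * (b - a), c - alpha * (c - b)

def valuation(term, levels=10):
    """Average the alpha-cut midpoints, weighting higher alpha levels more.

    Unlike a simple peak defuzzification, this keeps a contribution from every
    alpha level rather than only the dominating value.
    """
    tri = TERMS[term]
    num = den = 0.0
    for i in range(1, levels + 1):
        alpha = i / levels
        lo, hi = alpha_cut(tri, alpha)
        num += alpha * 0.5 * (lo + hi)   # weight = alpha
        den += alpha
    return num / den

crisp = {t: round(valuation(t), 3) for t in TERMS}
print(crisp)
```

The resulting crisp values could then be entered as conditional probabilities in a BN node.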

  17. BClass: A Bayesian Approach Based on Mixture Models for Clustering and Classification of Heterogeneous Biological Data

    Directory of Open Access Journals (Sweden)

    Arturo Medrano-Soto

    2004-12-01

    Full Text Available Based on mixture models, we present a Bayesian method (called BClass) to classify biological entities (e.g. genes) when variables of quite heterogeneous nature are analyzed. Various statistical distributions are used to model the continuous/categorical data commonly produced by genetic experiments and large-scale genomic projects. We calculate the posterior probability that each entry belongs to each element (group) in the mixture. In this way, an original set of heterogeneous variables is transformed into a set of purely homogeneous characteristics represented by the probabilities that each entry belongs to the groups. The number of groups in the analysis is controlled dynamically by rendering the groups as 'alive' and 'dormant' depending upon the number of entities classified within them. Using standard Metropolis-Hastings and Gibbs sampling algorithms, we constructed a sampler to approximate posterior moments and grouping probabilities. Since this method does not require the definition of similarity measures, it is especially suitable for data mining and knowledge discovery in biological databases. We applied BClass to classify genes in RegulonDB, a database specialized in information about the transcriptional regulation of gene expression in the bacterium Escherichia coli. The classification obtained is consistent with current knowledge and allowed prediction of missing values for a number of genes. BClass is object-oriented and fully programmed in Lisp-Stat. The output grouping probabilities are analyzed and interpreted using graphical (dynamically linked) plots and query-based approaches. We discuss the advantages of using Lisp-Stat as a programming language as well as the problems we faced when the data volume increased exponentially due to the ever-growing number of genomic projects.
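The core quantity BClass reports, the posterior probability that an entry belongs to each mixture component when variables are of mixed type, reduces to a simple Bayes-rule computation once parameters are fixed. The sketch below hand-fixes two components (one Gaussian variable, one categorical); BClass itself samples these parameters with Metropolis-Hastings/Gibbs, so this is only the membership step.

```python
import math

def norm_pdf(x, mu, sd):
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

# Two illustrative components: mixture weights, Gaussian parameters for the
# continuous variable, and category tables for the categorical variable
weights = [0.6, 0.4]
gauss = [(0.0, 1.0), (4.0, 1.0)]                    # (mean, sd) per component
cat = [{"A": 0.7, "B": 0.3}, {"A": 0.2, "B": 0.8}]  # P(category | component)

def membership(x, c):
    """Posterior P(component | x, c): the 'grouping probabilities' of an entry."""
    joint = [w * norm_pdf(x, mu, sd) * table[c]
             for w, (mu, sd), table in zip(weights, gauss, cat)]
    z = sum(joint)
    return [j / z for j in joint]

probs = membership(3.5, "B")
print([round(p, 3) for p in probs])
```

The heterogeneous pair (3.5, "B") is thereby transformed into a homogeneous vector of probabilities, which is the representation the abstract describes.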

  18. An unbiased Bayesian approach to functional connectomics implicates social-communication networks in autism.

    Science.gov (United States)

    Venkataraman, Archana; Duncan, James S; Yang, Daniel Y-J; Pelphrey, Kevin A

    2015-01-01

    Resting-state functional magnetic resonance imaging (rsfMRI) studies reveal a complex pattern of hyper- and hypo-connectivity in children with autism spectrum disorder (ASD). Whereas rsfMRI findings tend to implicate the default mode network and subcortical areas in ASD, task fMRI and behavioral experiments point to social dysfunction as a unifying impairment of the disorder. Here, we leverage a novel Bayesian framework for whole-brain functional connectomics that aggregates population differences in connectivity to localize a subset of foci that are most affected by ASD. Our approach is entirely data-driven and does not impose spatial constraints on the region foci or dictate the trajectory of altered functional pathways. We apply our method to data from the openly shared Autism Brain Imaging Data Exchange (ABIDE) and pinpoint two intrinsic functional networks that distinguish ASD patients from typically developing controls. One network involves foci in the right temporal pole, left posterior cingulate cortex, left supramarginal gyrus, and left middle temporal gyrus. Automated decoding of this network by the Neurosynth meta-analytic database suggests high-level concepts of "language" and "comprehension" as the likely functional correlates. The second network consists of the left banks of the superior temporal sulcus, right posterior superior temporal sulcus extending into temporo-parietal junction, and right middle temporal gyrus. Associated functionality of these regions includes "social" and "person". The abnormal pathways emanating from the above foci indicate that ASD patients simultaneously exhibit reduced long-range or inter-hemispheric connectivity and increased short-range or intra-hemispheric connectivity. Our findings reveal new insights into ASD and highlight possible neural mechanisms of the disorder.

  19. A New Approach for Obtaining Cosmological Constraints from Type Ia Supernovae using Approximate Bayesian Computation

    Energy Technology Data Exchange (ETDEWEB)

    Jennings, Elise; Wolf, Rachel; Sako, Masao

    2016-11-09

    Cosmological parameter estimation techniques that robustly account for systematic measurement uncertainties will be crucial for the next generation of cosmological surveys. We present a new analysis method, superABC, for obtaining cosmological constraints from Type Ia supernova (SN Ia) light curves using Approximate Bayesian Computation (ABC) without any likelihood assumptions. The ABC method works by using a forward model simulation of the data where systematic uncertainties can be simulated and marginalized over. A key feature of the method presented here is the use of two distinct metrics, the `Tripp' and `Light Curve' metrics, which allow us to compare the simulated data to the observed data set. The Tripp metric takes as input the parameters of models fit to each light curve with the SALT-II method, whereas the Light Curve metric uses the measured fluxes directly without model fitting. We apply the superABC sampler to a simulated data set of $\sim$1000 SNe corresponding to the first season of the Dark Energy Survey Supernova Program. Varying $\Omega_m$, $w_0$, $\alpha$ and $\beta$ and a magnitude offset parameter, with no systematics we obtain $\Delta(w_0) = w_0^{\rm true} - w_0^{\rm best \, fit} = -0.036\pm0.109$ (a $\sim11$% 1$\sigma$ uncertainty) using the Tripp metric and $\Delta(w_0) = -0.055\pm0.068$ (a $\sim7$% 1$\sigma$ uncertainty) using the Light Curve metric. Including 1% calibration uncertainties in four passbands, adding 4 more parameters, we obtain $\Delta(w_0) = -0.062\pm0.132$ (a $\sim14$% 1$\sigma$ uncertainty) using the Tripp metric. Overall we find a $17$% increase in the uncertainty on $w_0$ with systematics compared to without. We contrast this with a MCMC approach where systematic effects are approximately included. We find that the MCMC method slightly underestimates the impact of calibration uncertainties for this simulated data set.
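The ABC mechanism described above (draw parameters from the prior, forward-simulate a data set, keep draws whose summary metric is close to the observed one) can be shown on a toy problem. Estimating a Gaussian mean stands in for the SN Ia light-curve simulation; the prior range, tolerance, and distance metric are all illustrative.

```python
import random, statistics

random.seed(1)

# "Observed" data from an unknown mean; in superABC this would be the real survey
true_mu = -1.0
observed = [random.gauss(true_mu, 1.0) for _ in range(100)]
obs_stat = statistics.mean(observed)            # summary metric of the data

def forward_model(mu, n=100):
    """Forward simulation: generate a data set for a candidate parameter."""
    return [random.gauss(mu, 1.0) for _ in range(n)]

accepted = []
for _ in range(20000):
    mu = random.uniform(-3, 3)                  # draw from the prior
    sim_stat = statistics.mean(forward_model(mu))
    if abs(sim_stat - obs_stat) < 0.05:         # distance metric and tolerance
        accepted.append(mu)

est = statistics.mean(accepted)
print(f"{len(accepted)} accepted draws, approximate posterior mean {est:.2f}")
```

No likelihood is ever evaluated; the accepted draws approximate the posterior, and systematics could be marginalized by simulating them inside `forward_model`.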

  20. A Nonparametric Bayesian Approach to Seismic Hazard Modeling Using the ETAS Framework

    Science.gov (United States)

    Ross, G.

    2015-12-01

    The epidemic-type aftershock sequence (ETAS) model is one of the most popular tools for modeling seismicity and quantifying risk in earthquake-prone regions. Under the ETAS model, the occurrence times of earthquakes are treated as a self-exciting Poisson process where each earthquake briefly increases the probability of subsequent earthquakes occurring soon afterwards, which captures the fact that large mainshocks tend to produce long sequences of aftershocks. A triggering kernel controls the amount by which the probability increases based on the magnitude of each earthquake, and the rate at which it then decays over time. This triggering kernel is usually chosen heuristically, to match the parametric form of the modified Omori law for aftershock decay. However, recent work has questioned whether this is an appropriate choice. Since the choice of kernel has a large impact on the predictions made by the ETAS model, avoiding misspecification is crucially important. We present a novel nonparametric version of ETAS which avoids making parametric assumptions, and instead learns the correct specification from the data itself. Our approach is based on the Dirichlet process, a modern class of Bayesian prior distributions which allows for efficient inference over an infinite dimensional space of functions. We show how our nonparametric ETAS model can be fit to data, and present results demonstrating that the fit is greatly improved compared to the standard parametric specification. Additionally, we explain how our model can be used to perform probabilistic declustering of earthquake catalogs, to classify earthquakes as being either aftershocks or mainshocks, and to learn the causal relations between pairs of earthquakes.
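The standard parametric ETAS conditional intensity that the paper's nonparametric version generalizes is a background rate plus a modified-Omori triggering term from every prior event. The parameter values and catalog below are illustrative, not fitted.

```python
import math

mu0 = 0.1              # background rate (events/day), illustrative
K, alpha = 0.02, 1.0   # productivity and magnitude scaling
c, p = 0.01, 1.1       # modified Omori law parameters
M0 = 3.0               # magnitude cutoff

# Toy catalog: (time in days, magnitude) of past events
catalog = [(0.0, 5.5), (0.4, 4.1), (2.0, 6.2), (2.1, 4.5)]

def intensity(t):
    """Conditional intensity lambda(t) under the parametric ETAS model.

    Each earlier event contributes a term that grows exponentially with its
    magnitude and decays over time following the modified Omori law.
    """
    lam = mu0
    for ti, mi in catalog:
        if ti < t:
            lam += K * math.exp(alpha * (mi - M0)) / (t - ti + c) ** p
    return lam

# Intensity spikes just after the magnitude-6.2 event at t = 2.0,
# then decays back toward the background rate
print(f"lambda(2.05) = {intensity(2.05):.2f}")
print(f"lambda(10.0) = {intensity(10.0):.3f}")
```

The nonparametric approach replaces the fixed Omori form of the decay with a function learned under a Dirichlet process prior; the declustering question is then which term of the sum (background vs. a specific parent event) generated each earthquake.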

  1. Neutron fluence rate measurements at an underground laboratory: A Bayesian approach

    Science.gov (United States)

    Reginatto, Marcel; Kasper, Angelika; Schuhmacher, Helmut; Wiegel, Burkhard; Zimbal, Andreas

    2013-08-01

    We describe the analysis of neutron fluence rate measurements that were carried out at the underground laboratory Felsenkeller, near Dresden, Germany, which is at a depth of 47 m. At this depth, neutrons are mainly produced by natural radioactivity via spontaneous fission and (α, n) reactions, and by reactions induced by cosmic-ray muons. The measurements were made with the NEMUS Bonner sphere spectrometer. This system consists of a set of moderating spheres of different diameters and a 3He-filled proportional counter placed at the center of each sphere. Due to time constraints, it was only possible to use three of the spheres and the "bare detector" (i.e., a 3He-filled proportional counter without a moderating sphere). In addition to the measurements carried out at Felsenkeller, we also made low-level measurements with a set of 3He-filled proportional counters in the UDO underground laboratory at the Asse salt mine, near Braunschweig, Germany, which is at a depth of 490 m. The neutron background at UDO is substantially lower than that at Felsenkeller and these data are useful for setting limits on the background of the 3He-filled proportional counters. To estimate the neutron fluence rate at Felsenkeller, we did an analysis which took into account the measurements at UDO, Felsenkeller, and calibration measurements made at our facility in PTB. The analysis was done using Bayesian parameter estimation. Since the data consisted of low-level measurements, careful attention was given to the modeling of the intrinsic background of the detector and to identifying relevant sources of uncertainty. With the approach developed here, it is possible to estimate the neutron fluence rate with a relatively small uncertainty of the order of 10%. The method should be useful for other underground laboratories.
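A stripped-down version of the kind of Bayesian parameter estimation described above: a detector registers Poisson counts from signal plus intrinsic background, a separate background-constraining run (analogous to the UDO measurements) informs the background rate, and the signal-rate posterior is computed on a grid with the background marginalized. All counts, times, and grids are invented for illustration.

```python
import math

def log_poisson(k, lam):
    """Log of the Poisson probability of k counts with mean lam."""
    return k * math.log(lam) - lam - math.lgamma(k + 1)

# Background-only run (e.g. at the deeper, quieter laboratory): 8 counts in 1000 h
kb, tb = 8, 1000.0
# Signal-plus-background run: 95 counts in 500 h
ks, ts = 95, 500.0

# Grids over the signal rate s and background rate b (counts/hour)
s_grid = [i * 0.002 for i in range(1, 200)]
b_grid = [j * 0.002 for j in range(1, 50)]

post = []
for s in s_grid:
    # marginalize the background rate with a flat prior
    m = sum(math.exp(log_poisson(kb, b * tb) + log_poisson(ks, (s + b) * ts))
            for b in b_grid)
    post.append(m)

z = sum(post)
post = [p / z for p in post]
mean_s = sum(s * p for s, p in zip(s_grid, post))
print(f"posterior mean signal rate: {mean_s:.4f} counts/h")
```

The spread of `post` around its mean plays the role of the ~10% uncertainty quoted in the abstract; the real analysis additionally folds in detector response functions for the different spheres.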

  2. BAYESIAN APPROACH TO THE PROCESS OF IDENTIFICATION OF THE DETERMINANTS OF INNOVATIVENESS

    Directory of Open Access Journals (Sweden)

    Marta Czyżewska

    2014-08-01

    Full Text Available Bayesian belief networks are applied in determining the most important factors of the innovativeness level of national economies. The paper is divided into two parts. The first presents the basic theory of Bayesian networks, whereas in the second, belief networks have been generated by an in-house developed computer system called BeliefSEEKER, which was implemented to generate the determinants influencing the innovativeness level of national economies. Qualitative analysis of the generated belief networks provided a way to define a set of the most important dimensions influencing the innovativeness level of economies, and then the indicators that form these dimensions. It has been proven that Bayesian networks are very effective methods for multidimensional analysis and for forming conclusions and recommendations regarding the strength of each innovative determinant influencing the overall performance of a country’s economy.

  3. Predicting mTOR inhibitors with a classifier using recursive partitioning and Naive Bayesian approaches.

    Directory of Open Access Journals (Sweden)

    Ling Wang

    Full Text Available BACKGROUND: Mammalian target of rapamycin (mTOR) is a central controller of cell growth, proliferation, metabolism, and angiogenesis. Thus, there is a great deal of interest in developing clinical drugs based on mTOR. In this paper, in silico models based on multi-scaffolds were developed to predict mTOR inhibitors or non-inhibitors. METHODS: First, 1,264 diverse compounds were collected and categorized as mTOR inhibitors and non-inhibitors. Two methods, recursive partitioning (RP) and naïve Bayesian (NB), were used to build combinatorial classification models of mTOR inhibitors versus non-inhibitors using physicochemical descriptors, fingerprints, and atom center fragments (ACFs). RESULTS: A total of 253 models were constructed and the overall predictive accuracies of the best models were more than 90% for both the training set of 964 and the external test set of 300 diverse compounds. The scaffold hopping abilities of the best models were successfully evaluated through predicting 37 new recently published mTOR inhibitors. Compared with the best RP and Bayesian models, the classifier based on ACFs and Bayesian shows comparable or slightly better performance and scaffold hopping ability. A web server was developed based on the ACFs and Bayesian method (http://rcdd.sysu.edu.cn/mtor/). This web server can be used to predict whether a compound is an mTOR inhibitor or non-inhibitor online. CONCLUSION: In silico models were constructed to predict mTOR inhibitors using recursive partitioning and naïve Bayesian methods, and a web server (mTOR Predictor) was also developed based on the best model results. Compound prediction or virtual screening can be carried out through our web server. Moreover, the favorable and unfavorable fragments for mTOR inhibitors obtained from Bayesian classifiers will be helpful for lead optimization or the design of new mTOR inhibitors.
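The naive Bayesian half of the pipeline above can be sketched as a Bernoulli naive Bayes over fingerprint bits with Laplace smoothing. The bit vectors below are tiny made-up stand-ins for real molecular fingerprints.

```python
import math

train = [
    # (fingerprint bits, label) -- 1 = inhibitor, 0 = non-inhibitor (toy data)
    ([1, 1, 0, 1], 1), ([1, 0, 0, 1], 1), ([1, 1, 1, 1], 1),
    ([0, 0, 1, 0], 0), ([0, 1, 1, 0], 0), ([0, 0, 0, 0], 0),
]

def fit(data, n_bits=4):
    """Estimate Laplace-smoothed P(bit_i = 1 | class) and class priors."""
    counts = {0: [0] * n_bits, 1: [0] * n_bits}
    totals = {0: 0, 1: 0}
    for bits, y in data:
        totals[y] += 1
        for i, b in enumerate(bits):
            counts[y][i] += b
    theta = {y: [(counts[y][i] + 1) / (totals[y] + 2) for i in range(n_bits)]
             for y in (0, 1)}
    priors = {y: totals[y] / len(data) for y in (0, 1)}
    return theta, priors

def predict(bits, theta, priors):
    """Pick the class with the higher log posterior under the naive assumption."""
    score = {}
    for y in (0, 1):
        s = math.log(priors[y])
        for b, t in zip(bits, theta[y]):
            s += math.log(t if b else 1 - t)
        score[y] = s
    return max(score, key=score.get)

theta, priors = fit(train)
print(predict([1, 0, 1, 1], theta, priors))  # resembles the inhibitor rows -> 1
```

The per-bit log-likelihood ratios in `theta` are exactly the "favorable and unfavorable fragments" the abstract mentions, once the bits correspond to substructures.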

  4. Competing risk models in reliability systems, a weibull distribution model with bayesian analysis approach

    Science.gov (United States)

    Iskandar, Ismed; Satria Gondokaryono, Yudi

    2016-02-01

    In reliability theory, the most important problem is to determine the reliability of a complex system from the reliability of its components. The weakness of most reliability theories is that the systems are described and explained as simply functioning or failed. In many real situations, the failures may be from many causes depending upon the age and the environment of the system and its components. Another problem in reliability theory is that of estimating the parameters of the assumed failure models. The estimation may be based on data collected over censored or uncensored life tests. In many reliability problems, the failure data are simply quantitatively inadequate, especially in engineering design and maintenance systems. Bayesian analyses are more beneficial than classical ones in such cases. Bayesian estimation analyses allow us to combine past knowledge or experience, in the form of an a priori distribution, with life test data to make inferences about the parameter of interest. In this paper, we have investigated the application of Bayesian estimation analyses to competing risk systems. The cases are limited to models with independent causes of failure, using the Weibull distribution as our model. A simulation is conducted for this distribution with the objectives of verifying the models and the estimators and investigating the performance of the estimators for varying sample size. The simulation data are analyzed using Bayesian and maximum likelihood analyses. The simulation results show that changing the true value of one parameter relative to another changes the standard deviation in the opposite direction. Given perfect information on the prior distribution, the estimation methods of the Bayesian analyses are better than those of the maximum likelihood. The sensitivity analyses show some amount of sensitivity over shifts of the prior locations. They also show the robustness of the Bayesian analysis within the range
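The competing-risk setup with independent Weibull causes can be sketched as follows: each unit fails at the minimum of two independent Weibull lifetimes, only the failure time and its cause are observed, and a grid posterior is computed for the scale of cause 1 with the shapes treated as known. All parameter values are illustrative, and the grid posterior is a simplification of the paper's full Bayesian treatment.

```python
import math, random

random.seed(7)

def rweibull(shape, scale):
    """Inverse-transform sample from a Weibull distribution."""
    return scale * (-math.log(random.random())) ** (1.0 / shape)

shape1, scale1 = 1.5, 100.0   # cause 1 (true values)
shape2, scale2 = 2.0, 150.0   # cause 2

# Observed data: failure time = min of the two latent lifetimes, plus the cause
data = []
for _ in range(300):
    t1, t2 = rweibull(shape1, scale1), rweibull(shape2, scale2)
    data.append((min(t1, t2), 1 if t1 < t2 else 2))

def log_lik_scale1(eta):
    """Cause-1 density for cause-1 failures; cause-1 survival otherwise.

    Terms involving cause 2 do not depend on eta and are dropped.
    """
    ll = 0.0
    for t, cause in data:
        z = (t / eta) ** shape1
        if cause == 1:
            ll += math.log(shape1 / eta) + (shape1 - 1) * math.log(t / eta) - z
        else:
            ll += -z
    return ll

grid = [60.0 + i for i in range(100)]        # candidate scales 60..159
logs = [log_lik_scale1(e) for e in grid]     # flat prior over the grid
m = max(logs)
w = [math.exp(l - m) for l in logs]
z = sum(w)
post_mean = sum(e * wi for e, wi in zip(grid, w)) / z
print(f"posterior mean of cause-1 scale: {post_mean:.1f} (true 100)")
```

Note how failures from the competing cause still inform the estimate through the survival term, which is what distinguishes competing-risk likelihoods from naively discarding those observations.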

  5. Bayesian analysis of rotating machines - A statistical approach to estimate and track the fundamental frequency

    DEFF Research Database (Denmark)

    Pedersen, Thorkild Find

    2003-01-01

    Rotating and reciprocating mechanical machines emit acoustic noise and vibrations when they operate. Typically, the noise and vibrations are concentrated in narrow frequency bands related to the running speed of the machine. The frequency of the running speed is referred to as the fundamental...... of an adaptive comb filter is derived for tracking non-stationary signals. The estimation problem is then rephrased in terms of the Bayesian statistical framework. In the Bayesian framework both parameters and observations are considered stochastic processes. The result of the estimation is an expression...

  6. Extracting a Whisper from the DIN: A Bayesian-Inductive Approach to Learning an Anticipatory Model of Cavitation

    Energy Technology Data Exchange (ETDEWEB)

    Kercel, S.W.

    1999-11-07

    For several reasons, Bayesian parameter estimation is superior to other methods for inductively learning a model for an anticipatory system. Since it exploits prior knowledge, the analysis begins from a more advantageous starting point than other methods. Also, since "nuisance parameters" can be removed from the Bayesian analysis, the description of the model need not be as complete as is necessary for such methods as matched filtering. In the limit of perfectly random noise and a perfect description of the model, the signal-to-noise ratio improves as the square root of the number of samples in the data. Even with the imperfections of real-world data, Bayesian methods approach this ideal limit of performance more closely than other methods. These capabilities provide a strategy for addressing a major unsolved problem in pump operation: the identification of precursors of cavitation. Cavitation causes immediate degradation of pump performance and ultimate destruction of the pump. However, the most efficient point to operate a pump is just below the threshold of cavitation. It might be hoped that a straightforward method to minimize pump cavitation damage would be to simply adjust the operating point until the inception of cavitation is detected and then to slightly readjust the operating point to let the cavitation vanish. However, due to the continuously evolving state of the fluid moving through the pump, the threshold of cavitation tends to wander. What is needed is to anticipate cavitation, and this requires the detection and identification of precursor features that occur just before cavitation starts.
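The square-root-of-N claim above can be checked numerically: averaging repeated noisy samples of a fixed signal shrinks the residual error roughly as 1/sqrt(N). The signal level, noise level, and trial counts below are arbitrary.

```python
import math, random, statistics

random.seed(0)

signal = 1.0
noise_sd = 5.0

def avg_error(n_samples, trials=2000):
    """RMS error of the sample mean of n_samples noisy observations."""
    errs = []
    for _ in range(trials):
        est = statistics.mean(signal + random.gauss(0.0, noise_sd)
                              for _ in range(n_samples))
        errs.append((est - signal) ** 2)
    return math.sqrt(statistics.mean(errs))

e100, e400 = avg_error(100), avg_error(400)
print(f"RMS error: N=100 -> {e100:.3f}, N=400 -> {e400:.3f}")
print(f"ratio ~ {e100 / e400:.2f} (sqrt(400/100) = 2)")
```

This is the ideal limit for perfectly random noise; as the abstract notes, real-world data (and imperfect model descriptions) approach but do not reach it.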

  7. Defining the hundred year flood: A Bayesian approach for using historic data to reduce uncertainty in flood frequency estimates

    Science.gov (United States)

    Parkes, Brandon; Demeritt, David

    2016-09-01

    This paper describes a Bayesian statistical model for estimating flood frequency by combining uncertain annual maximum (AMAX) data from a river gauge with estimates of flood peak discharge from various historic sources that predate the period of instrument records. Such historic flood records promise to expand the time series data needed for reducing the uncertainty in return period estimates for extreme events, but the heterogeneity and uncertainty of historic records make them difficult to use alongside Flood Estimation Handbook and other standard methods for generating flood frequency curves from gauge data. Using the flow of the River Eden in Carlisle, Cumbria, UK as a case study, this paper develops a Bayesian model for combining historic flood estimates since 1800 with gauge data since 1967 to estimate the probability of low frequency flood events for the area, taking account of uncertainty in the discharge estimates. Results show a reduction in 95% confidence intervals of roughly 50% for annual exceedance probabilities of less than 0.0133 (return periods over 75 years) compared to standard flood frequency estimation methods using solely systematic data. Sensitivity analysis shows the model is sensitive to two model parameters, both of which concern the historic (pre-systematic) period of the time series. This highlights the importance of adequate consideration of historic channel and floodplain changes or possible bias in estimates of historic flood discharges. The next steps required to roll out this Bayesian approach for operational flood frequency estimation at other sites are also discussed.
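The interval-narrowing effect of historic data can be shown with a deliberately simplified model: treat "flood above a fixed perception threshold" as a Bernoulli event per year, so a Beta posterior gives the annual exceedance probability directly. The year counts are invented, and the paper's model is far richer (it handles discharge-magnitude uncertainty), but the mechanism is the same.

```python
def beta_interval(a, b, lo=0.025, hi=0.975, steps=20000):
    """Central credible interval of a Beta(a, b) posterior via a gridded CDF."""
    xs = [(i + 0.5) / steps for i in range(steps)]
    pdf = [x ** (a - 1) * (1 - x) ** (b - 1) for x in xs]   # unnormalized
    z = sum(pdf)
    cum, bounds = 0.0, []
    for x, p in zip(xs, pdf):
        cum += p / z
        if not bounds and cum >= lo:
            bounds.append(x)
        if len(bounds) == 1 and cum >= hi:
            bounds.append(x)
            break
    return tuple(bounds)

# Gauge record alone: 50 years with 2 threshold exceedances, Beta(1, 1) prior
g_lo, g_hi = beta_interval(1 + 2, 1 + 48)
# Gauge plus historic record: 200 additional years with 6 more exceedances
c_lo, c_hi = beta_interval(1 + 8, 1 + 242)
print(f"gauge only:    ({g_lo:.4f}, {g_hi:.4f})")
print(f"with historic: ({c_lo:.4f}, {c_hi:.4f})")
```

The combined interval is markedly narrower, which is the qualitative result the case study quantifies for return periods over 75 years.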

  8. A Bayesian approach to analyze energy balance data from lactating dairy cows

    NARCIS (Netherlands)

    Strathe, A.B.; Dijkstra, J.; France, J.; Lopez, S.; Yan, T.; Kebreab, E.

    2011-01-01

    The objective of the present investigation was to develop a Bayesian framework for updating and integrating covariate information into key parameters of metabolizable energy (ME) systems for dairy cows. The study addressed specifically the effects of genetic improvements and feed quality on key para

  9. A Bayesian network approach for causal inferences in pesticide risk assessment and management

    Science.gov (United States)

    Pesticide risk assessment and management must balance societal benefits and ecosystem protection, based on quantified risks and the strength of the causal linkages between uses of the pesticide and socioeconomic and ecological endpoints of concern. A Bayesian network (BN) is a gr...

  10. A Bayesian Decision-Theoretic Approach to Logically-Consistent Hypothesis Testing

    Directory of Open Access Journals (Sweden)

    Gustavo Miranda da Silva

    2015-09-01

    Full Text Available This work addresses an important issue regarding the performance of simultaneous test procedures: the construction of multiple tests that at the same time are optimal from a statistical perspective and that also yield logically-consistent results that are easy to communicate to practitioners of statistical methods. For instance, if hypothesis A implies hypothesis B, is it possible to create optimal testing procedures that reject A whenever they reject B? Unfortunately, several standard testing procedures fail in having such logical consistency. Although this has been deeply investigated under a frequentist perspective, the literature lacks analyses under a Bayesian paradigm. In this work, we contribute to the discussion by investigating three rational relationships under a Bayesian decision-theoretic standpoint: coherence, invertibility and union consonance. We characterize and illustrate through simple examples optimal Bayes tests that fulfill each of these requisites separately. We also explore how far one can go by putting these requirements together. We show that although fairly intuitive tests satisfy both coherence and invertibility, no Bayesian testing scheme meets the desiderata as a whole, strengthening the understanding that logical consistency cannot be combined with statistical optimality in general. Finally, we associate Bayesian hypothesis testing with Bayes point estimation procedures. We prove the performance of logically-consistent hypothesis testing by means of a Bayes point estimator to be optimal only under very restrictive conditions.

  11. Understanding the Uncertainty of an Effectiveness-Cost Ratio in Educational Resource Allocation: A Bayesian Approach

    Science.gov (United States)

    Pan, Yilin

    2016-01-01

    Given the necessity to bridge the gap between what happened and what is likely to happen, this paper aims to explore how to apply Bayesian inference to cost-effectiveness analysis so as to capture the uncertainty of a ratio-type efficiency measure. The first part of the paper summarizes the characteristics of the evaluation data that are commonly…

  12. Bayesian networks for clinical decision support : a rational approach to dynamic decision-making under uncertainty

    NARCIS (Netherlands)

    Gerven, M.A.J. van

    2007-01-01

    This dissertation deals with decision support in the context of clinical oncology. (Dynamic) Bayesian networks are used as a framework for (dynamic) decision-making under uncertainty and applied to a variety of diagnostic, prognostic, and treatment problems in medicine. It is shown that the proposed

  13. Bayesian statistics an introduction

    CERN Document Server

    Lee, Peter M

    2012-01-01

    Bayesian Statistics is the school of thought that combines prior beliefs with the likelihood of a hypothesis to arrive at posterior beliefs. The first edition of Peter Lee’s book appeared in 1989, but the subject has moved ever onwards, with increasing emphasis on Monte Carlo based techniques. This new fourth edition looks at recent techniques such as variational methods, Bayesian importance sampling, approximate Bayesian computation and Reversible Jump Markov Chain Monte Carlo (RJMCMC), providing a concise account of the way in which the Bayesian approach to statistics develops as wel

  14. Understanding Computational Bayesian Statistics

    CERN Document Server

    Bolstad, William M

    2011-01-01

    A hands-on introduction to computational statistics from a Bayesian point of view Providing a solid grounding in statistics while uniquely covering the topics from a Bayesian perspective, Understanding Computational Bayesian Statistics successfully guides readers through this new, cutting-edge approach. With its hands-on treatment of the topic, the book shows how samples can be drawn from the posterior distribution when the formula giving its shape is all that is known, and how Bayesian inferences can be based on these samples from the posterior. These ideas are illustrated on common statistic

  15. A Bayesian Approach to Identifying New Risk Factors for Dementia: A Nationwide Population-Based Study.

    Science.gov (United States)

    Wen, Yen-Hsia; Wu, Shihn-Sheng; Lin, Chun-Hung Richard; Tsai, Jui-Hsiu; Yang, Pinchen; Chang, Yang-Pei; Tseng, Kuan-Hua

    2016-05-01

    Dementia is one of the most disabling and burdensome health conditions worldwide. In this study, we identified new potential risk factors for dementia from nationwide longitudinal population-based data by using Bayesian statistics. We first tested the consistency of the results obtained using Bayesian statistics with those obtained using classical frequentist probability for 4 recognized risk factors for dementia, namely severe head injury, depression, diabetes mellitus, and vascular diseases. Then, we used Bayesian statistics to verify 2 new potential risk factors for dementia, namely hearing loss and senile cataract, determined from Taiwan's National Health Insurance Research Database. We included a total of 6546 (6.0%) patients diagnosed with dementia. We observed older age, female sex, and lower income as independent risk factors for dementia. Moreover, we verified the 4 recognized risk factors for dementia in the older Taiwanese population; their odds ratios (ORs) ranged from 3.469 to 1.207. Furthermore, we observed that hearing loss (OR = 1.577) and senile cataract (OR = 1.549) were associated with an increased risk of dementia. We found that the results obtained using Bayesian statistics for assessing risk factors for dementia, such as head injury, depression, diabetes mellitus, and vascular diseases, were consistent with those obtained using classical frequentist probability. Moreover, hearing loss and senile cataract were found to be potential risk factors for dementia in the older Taiwanese population. Bayesian statistics could help clinicians explore other potential risk factors for dementia and develop appropriate treatment strategies for these patients.

  16. Bayesian Mediation Analysis

    Science.gov (United States)

    Yuan, Ying; MacKinnon, David P.

    2009-01-01

    In this article, we propose Bayesian analysis of mediation effects. Compared with conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian…

  17. A Bayesian Approach to Calibrating High-Throughput Virtual Screening Results and Application to Organic Photovoltaic Materials

    CERN Document Server

    Pyzer-Knapp, Edward O; Aspuru-Guzik, Alan

    2015-01-01

    A novel approach for calibrating quantum-chemical properties determined as part of a high-throughput virtual screen to experimental analogs is presented. Information on the molecular graph is extracted through the use of extended connectivity fingerprints and exploited using a Gaussian process to calibrate both electronic properties, such as frontier orbital energies and optical gaps, and device properties, such as short circuit current density, open circuit voltage, and power conversion efficiency. The Bayesian nature of this process affords a value for uncertainty in addition to each calibrated value. This allows the researcher to gain intuition about the model as well as the ability to respect its bounds.
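A minimal Gaussian-process calibration in the spirit described above: a cheap computed property (x) is calibrated against a few "experimental" values (y), and the GP returns an uncertainty alongside each calibrated prediction. The RBF kernel, noise level, and data are all illustrative; the paper operates on fingerprint similarities rather than a 1-D input.

```python
import math

def rbf(a, b, length=1.0):
    return math.exp(-0.5 * ((a - b) / length) ** 2)

def cholesky(A):
    """Lower-triangular Cholesky factor of a symmetric positive-definite matrix."""
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            L[i][j] = (math.sqrt(A[i][i] - s) if i == j
                       else (A[i][j] - s) / L[j][j])
    return L

def solve_chol(L, b):
    """Solve (L L^T) x = b by forward then back substitution."""
    n = len(L)
    y = [0.0] * n
    for i in range(n):
        y[i] = (b[i] - sum(L[i][k] * y[k] for k in range(i))) / L[i][i]
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (y[i] - sum(L[k][i] * x[k] for k in range(i + 1, n))) / L[i][i]
    return x

# Training data: computed value -> experimental value (invented)
xs = [0.0, 1.0, 2.0, 3.0]
ys = [0.1, 1.2, 1.9, 3.2]
noise = 0.01

K = [[rbf(a, b) + (noise if i == j else 0.0)
      for j, b in enumerate(xs)] for i, a in enumerate(xs)]
L = cholesky(K)
alpha = solve_chol(L, ys)

def predict(x):
    """GP posterior mean and standard deviation at x."""
    ks = [rbf(x, xi) for xi in xs]
    mean = sum(k * a for k, a in zip(ks, alpha))
    v = solve_chol(L, ks)                     # K^{-1} k_*
    var = rbf(x, x) + noise - sum(k * vi for k, vi in zip(ks, v))
    return mean, math.sqrt(max(var, 0.0))

m, s = predict(1.5)
print(f"calibrated value at 1.5: {m:.2f} +/- {s:.2f}")
```

The uncertainty grows as the query point moves away from the training data, which is precisely what lets a researcher "respect the bounds" of the calibration.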

  18. In-situ resource utilization for the human exploration of Mars : a Bayesian approach to valuation of precursor missions

    Science.gov (United States)

    Smith, Jeffrey H.

    2006-01-01

    The need for sufficient quantities of oxygen, water, and fuel resources to support a crew on the surface of Mars presents a critical logistical issue of whether to transport such resources from Earth or manufacture them on Mars. An approach based on the classical Wildcat Drilling Problem of Bayesian decision theory was applied to the problem of finding water in order to compute the expected value of precursor mission sample information. An implicit (required) probability of finding water on Mars was derived from the value of sample information using the expected mass savings of alternative precursor missions.
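The Wildcat Drilling logic mentioned above, in miniature: compare the expected value of acting without information against acting after a noisy precursor test; the difference is the expected value of sample information (EVSI). All payoffs, probabilities, and test error rates below are invented for illustration.

```python
p_water = 0.3            # prior probability that water is present
value_found = 100.0      # mass-savings value if ISRU works (water present)
cost_isru = 40.0         # cost of committing to in-situ production
value_transport = 0.0    # baseline: transport everything from Earth

# Decision without a precursor mission: commit to ISRU or not
ev_isru = p_water * value_found - cost_isru          # 30 - 40 = -10
ev_no_info = max(ev_isru, value_transport)           # best is 0: don't commit

# Precursor test with an assumed sensitivity and false-positive rate
sens, fpr = 0.9, 0.2
p_pos = sens * p_water + fpr * (1 - p_water)
p_water_pos = sens * p_water / p_pos                 # posterior after a positive
p_water_neg = (1 - sens) * p_water / (1 - p_pos)     # posterior after a negative

# Best decision under each test outcome, weighted by outcome probability
ev_after = (p_pos * max(p_water_pos * value_found - cost_isru, 0.0)
            + (1 - p_pos) * max(p_water_neg * value_found - cost_isru, 0.0))
evsi = ev_after - ev_no_info
print(f"EVSI of the precursor mission: {evsi:.2f}")
```

A precursor mission costing less than this EVSI is worth flying; the paper inverts the same calculation to derive the implied probability of finding water from the mass savings.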

  19. AGNfitter: A Bayesian MCMC approach to fitting spectral energy distributions of AGN

    CERN Document Server

    Rivera, Gabriela Calistro; Hennawi, Joseph F; Hogg, David W

    2016-01-01

    We present AGNfitter, a publicly available open-source algorithm implementing a fully Bayesian Markov Chain Monte Carlo method to fit the spectral energy distributions (SEDs) of active galactic nuclei (AGN) from the sub-mm to the UV, allowing one to robustly disentangle the physical processes responsible for their emission. AGNfitter makes use of a large library of theoretical, empirical, and semi-empirical models to characterize both the nuclear and host galaxy emission simultaneously. The model consists of four physical emission components: an accretion disk, a torus of AGN heated dust, stellar populations, and cold dust in star forming regions. AGNfitter determines the posterior distributions of numerous parameters that govern the physics of AGN with a fully Bayesian treatment of errors and parameter degeneracies, allowing one to infer integrated luminosities, dust attenuation parameters, stellar masses, and star formation rates. We tested AGNfitter's performance on real data by fitting the SEDs of a sample...

  20. Diagnosis of combined faults in Rotary Machinery by Non-Naive Bayesian approach

    Science.gov (United States)

    Asr, Mahsa Yazdanian; Ettefagh, Mir Mohammad; Hassannejad, Reza; Razavi, Seyed Naser

    2017-02-01

    When combined faults occur in different parts of a rotating machine, their features are strongly interdependent. Experts are familiar with the characteristics of individual faults, and sufficient data are available for single faults; the problem arises when faults combine and the separation of their characteristics becomes complex, so experts cannot give exact information about the symptoms of a combined fault. To overcome this drawback, a novel method is proposed in this paper. The core idea of the method is to diagnose a combined fault without using combined-fault features as a training data set; only individual-fault features are applied in the training step. For this purpose, after acquiring and resampling the vibration signals, Empirical Mode Decomposition (EMD) is used to decompose the multi-component signals into Intrinsic Mode Functions (IMFs). Proper IMFs for feature extraction are selected using the correlation coefficient. In the feature-extraction step, the Shannon energy entropy of the IMFs is extracted along with statistical features. Since most of the extracted features are strongly dependent, a Non-Naive Bayesian Classifier (NNBC) is employed, which relaxes the fundamental assumption of Naive Bayes, i.e., independence among features. To demonstrate the superiority of NNBC, counterpart methods, including the normal Naive Bayesian classifier, the kernel Naive Bayesian classifier, and back-propagation neural networks, were applied and the classification results compared. Experimental vibration signals collected from an automobile gearbox were used to verify the effectiveness of the proposed method. During the classification process, only the features related individually to the healthy state, bearing failure, and gear failures were assigned for training the classifier, while combined-fault features (combined gear and bearing failures) were examined as test data. The achieved
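The two post-decomposition steps this record describes (correlation-based IMF selection and Shannon energy entropy as a feature) can be sketched as follows. The EMD itself is omitted; the synthetic signal, hand-built "IMFs", and the 0.1 threshold are invented stand-ins, not the paper's gearbox data.

```python
import numpy as np

def select_imfs(signal, imfs, threshold=0.1):
    """Keep IMFs whose correlation with the raw signal exceeds a threshold."""
    keep = []
    for imf in imfs:
        r = np.corrcoef(signal, imf)[0, 1]
        if abs(r) > threshold:
            keep.append(imf)
    return keep

def shannon_energy_entropy(imf):
    """Entropy of the normalized energy distribution of the IMF samples."""
    e = imf ** 2
    p = e / e.sum()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

# Synthetic stand-in for a decomposed vibration signal.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 1000)
signal = np.sin(2 * np.pi * 30 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)
imfs = [np.sin(2 * np.pi * 120 * t),          # correlated component
        np.sin(2 * np.pi * 30 * t),           # correlated component
        0.01 * rng.standard_normal(t.size)]   # noise-like residue
kept = select_imfs(signal, imfs)
features = [shannon_energy_entropy(imf) for imf in kept]
```

The `features` vector would then feed a classifier; the non-naive Bayesian step itself depends on modeling the feature dependence structure and is not reproduced here.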

  1. Empirical vs Bayesian approach for estimating haplotypes from genotypes of unrelated individuals

    Directory of Open Access Journals (Sweden)

    Cheng Jacob

    2007-01-01

    Full Text Available Abstract Background The completion of the HapMap project has stimulated further development of haplotype-based methodologies for disease associations. A key aspect of such development is the statistical inference of individual diplotypes from unphased genotypes. Several methodologies for inferring haplotypes have been developed, but they have not been evaluated extensively to determine which method not only performs well, but also can be easily incorporated in downstream haplotype-based association analyses. In this paper, we attempt to do so. Our evaluation was carried out by comparing the two leading Bayesian methods, implemented in PHASE and HAPLOTYPER, and the two leading empirical methods, implemented in PL-EM and HPlus. We used these methods to analyze real data, namely the dense genotypes on the X-chromosome of 30 European and 30 African trios provided by the International HapMap Project, and simulated genotype data. Our conclusions are based on these analyses. Results All programs performed very well on the X-chromosome data, with an average similarity index of 0.99 and an average prediction rate of 0.99 for both the European and African trios. On simulated data, PHASE, which implements a Bayesian method based on the coalescence approximation, outperformed the other programs for small sample sizes. As the sample size increased, the other programs performed as well as PHASE. PL-EM and HPlus, implementing empirical methods, required much less running time than the programs implementing the Bayesian methods. They required only one hundredth to one thousandth of the running time required by PHASE, particularly when analyzing large sample sizes and a large number of SNPs. Conclusion For large sample sizes (hundreds or more), which most association studies require, the two empirical methods might be used since they infer the haplotypes as accurately as any Bayesian method and can be incorporated easily into downstream haplotype

  2. A Bayesian approach for Inter-seismic Inter-plate Coupling Probabilities for the Central Andes Subduction Zone

    Science.gov (United States)

    Ortega Culaciati, F. H.; Simons, M.

    2009-12-01

    We aim to characterize the apparent extent of plate coupling on subduction zone megathrusts with the eventual goal of understanding spatial variations of fault zone rheology. In this study we approach the problem from a Bayesian perspective, asking not for a single optimum model but rather for a posteriori estimates of the range of allowable models, exploiting the full potential of Bayesian methods to completely characterize the model parameter space. Adopting a simple kinematic back-slip model and a 3D geometry of the inter-plate contact zone, we use the Bayesian approach to provide inter-seismic inter-plate coupling probabilities that are consistent with physically plausible a priori information and available geodetic measurements. We highlight the importance of using the vertical component of the velocity field to properly constrain the downdip limit of the coupled zone, and we show how the chosen parameterization of the model plays an important role along with the a priori and a posteriori information on the model parameters. We apply this methodology to the Chilean-Peruvian subduction zone (12°S-24°S) in order to understand the state of inter-seismic coupling on that margin. We obtain patch-like features in the probability of 100% apparent inter-seismic coupling, with higher values located between 15 km and 60 km depth. The larger of these features are located in the regions associated with the rupture processes of the 2001 (Mw 8.4) Arequipa and 2007 (Mw 8.0) Pisco earthquakes, both of which occurred after the period in which the measurements were taken, and in the region identified as the Arica bend seismic gap, which has not experienced a large earthquake since 1877.

  3. Pilot study of dynamic Bayesian networks approach for fault diagnostics and accident progression prediction in HTR-PM

    Energy Technology Data Exchange (ETDEWEB)

    Zhao, Yunfei; Tong, Jiejuan; Zhang, Liguo, E-mail: lgzhang@tsinghua.edu.cn; Zhang, Qin

    2015-09-15

    Highlights: • A dynamic Bayesian network is used to diagnose and predict accident progression in HTR-PM. • The dynamic Bayesian network model of HTR-PM is built on detailed system analysis. • LOCA simulations validate the model even when some sensor readings are lost or faulty. - Abstract: The first high-temperature-reactor pebble-bed demonstration module (HTR-PM) is currently under construction in China. At the same time, development of a system to support nuclear emergency response is in progress. The supporting system is expected to accomplish two tasks. The first is diagnosis of faults in the reactor based on the abnormal sensor measurements obtained. The second is prognosis of the accident progression based on the sensor measurements obtained and on operator actions. Both tasks provide valuable guidance for emergency staff in taking appropriate protective actions. Traditional methods for these two tasks rely heavily on expert judgment and have proven inadequate in some cases, such as the Three Mile Island accident. To better perform the two tasks, dynamic Bayesian networks (DBNs) are introduced in this paper and a pilot study based on the approach is carried out. DBNs are advantageous in representing complex dynamic systems and in taking full account of the evidence obtained when performing diagnostics and prognostics. Pearl's loopy belief propagation (LBP) algorithm is recommended for diagnostics and prognostics in DBNs. The DBN model of HTR-PM is created based on detailed system analysis and accident progression analysis. A small-break loss-of-coolant accident (SBLOCA) is selected to illustrate the application of the DBN model of HTR-PM to fault diagnostics (FD) and accident progression prognostics (APP). Several advantages of the DBN approach compared with other techniques are discussed. The pilot study lays the foundation for developing the nuclear emergency response supporting system (NERSS) for HTR-PM.
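A minimal discrete analogue of the diagnostic task described in this record: exact forward filtering in a two-slice dynamic Bayesian network with a noisy sensor. The paper recommends Pearl's loopy belief propagation over a far richer plant model; this sketch uses exact recursion on an invented two-state system purely to show the predict/update cycle.

```python
import numpy as np

# Toy two-state DBN: hidden plant state (0 = normal, 1 = fault) evolves in
# time; a sensor reports the state with some error. All numbers are invented.
T = np.array([[0.99, 0.01],    # P(next state | current = normal)
              [0.00, 1.00]])   # a fault persists once it occurs
E = np.array([[0.95, 0.05],    # P(reading | state = normal)
              [0.10, 0.90]])   # P(reading | state = fault)

def forward_filter(readings, prior=(0.99, 0.01)):
    """Exact recursive Bayesian filtering: predict with T, update with E."""
    belief = np.array(prior, dtype=float)
    for y in readings:
        belief = belief @ T          # predict one step ahead
        belief *= E[:, y]            # weight by the sensor likelihood
        belief /= belief.sum()       # renormalize
    return belief

# A run of 'fault' readings drives the fault posterior toward 1, even
# though any single reading could be a false alarm.
posterior = forward_filter([1, 1, 1])
```

The same recursion, run forward past the last observation, gives a crude prognosis of how the state distribution evolves.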

  4. A multiscale Bayesian data integration approach for mapping air dose rates around the Fukushima Daiichi Nuclear Power Plant.

    Science.gov (United States)

    Wainwright, Haruko M; Seki, Akiyuki; Chen, Jinsong; Saito, Kimiaki

    2017-02-01

    This paper presents a multiscale data integration method to estimate the spatial distribution of air dose rates at the regional scale around the Fukushima Daiichi Nuclear Power Plant. We integrate various types of datasets, such as ground-based walk and car surveys, and airborne surveys, all of which have different scales, resolutions, spatial coverage, and accuracy. The method is based on geostatistics to represent spatially heterogeneous structures, and on Bayesian hierarchical models to integrate multiscale, multi-type datasets in a consistent manner. The Bayesian method allows us to quantify the uncertainty in the estimates and to provide the confidence intervals that are critical for robust decision-making. Although this approach is primarily data-driven, it has great flexibility to include mechanistic models for representing radiation transport or other complex correlations. We demonstrate our approach using three types of datasets collected at the same time over Fukushima City in Japan: (1) coarse-resolution airborne surveys covering the entire area, (2) car surveys along major roads, and (3) walk surveys in multiple neighborhoods. Results show that the method can successfully integrate the three types of datasets and create an integrated high-resolution map (including the confidence intervals) of air dose rates over the domain. Moreover, this study provides various insights into the characteristics of each dataset, as well as into the radiocaesium distribution. In particular, the urban areas show high heterogeneity in the contaminant distribution due to human activities, as well as large discrepancies among the different surveys caused by this heterogeneity.
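A one-cell caricature of the multiscale fusion idea in this record: sequential conjugate-normal updating, where each survey pulls the estimate toward its value in proportion to its precision. The dose values and variances below are purely illustrative; the paper's actual model is a spatial hierarchical model, not a single-cell update.

```python
def fuse(prior_mean, prior_var, measurements):
    """Sequential conjugate-normal updates: each (value, variance) pair
    shifts the mean by a precision-weighted gain and shrinks the variance."""
    mean, var = prior_mean, prior_var
    for value, meas_var in measurements:
        gain = var / (var + meas_var)
        mean = mean + gain * (value - mean)
        var = var * meas_var / (var + meas_var)
    return mean, var

# One grid cell, dose rates in uSv/h. Walk surveys are treated as the most
# accurate and airborne as the least; all values are illustrative.
surveys = [(0.80, 0.20 ** 2),  # airborne: broad coverage, coarse
           (0.65, 0.10 ** 2),  # car survey along a nearby road
           (0.60, 0.05 ** 2)]  # walk survey within the neighborhood
mean, var = fuse(prior_mean=0.7, prior_var=1.0, measurements=surveys)
ci95 = (mean - 1.96 * var ** 0.5, mean + 1.96 * var ** 0.5)
```

The fused variance ends up smaller than that of the best single survey, which is the mechanism behind the confidence intervals the paper reports.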

  5. Dealing with missing covariates in epidemiologic studies: a comparison between multiple imputation and a full Bayesian approach.

    Science.gov (United States)

    Erler, Nicole S; Rizopoulos, Dimitris; Rosmalen, Joost van; Jaddoe, Vincent W V; Franco, Oscar H; Lesaffre, Emmanuel M E H

    2016-07-30

    Incomplete data are generally a challenge to the analysis of most large studies. The current gold standard to account for missing data is multiple imputation, and more specifically multiple imputation with chained equations (MICE). Numerous studies have been conducted to illustrate the performance of MICE for missing covariate data. The results show that the method works well in various situations. However, less is known about its performance in more complex models, specifically when the outcome is multivariate as in longitudinal studies. In current practice, the multivariate nature of the longitudinal outcome is often neglected in the imputation procedure, or only the baseline outcome is used to impute missing covariates. In this work, we evaluate the performance of MICE using different strategies to include a longitudinal outcome into the imputation models and compare it with a fully Bayesian approach that jointly imputes missing values and estimates the parameters of the longitudinal model. Results from simulation and a real data example show that MICE requires the analyst to correctly specify which components of the longitudinal process need to be included in the imputation models in order to obtain unbiased results. The full Bayesian approach, on the other hand, does not require the analyst to explicitly specify how the longitudinal outcome enters the imputation models. It performed well under different scenarios. Copyright © 2016 John Wiley & Sons, Ltd.
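A toy, cross-sectional version of the chained-equations idea compared in this record: regress the incomplete covariate on the outcome using complete cases, draw the missing values from the fitted model plus residual noise, and pool across repeated imputations. This is a single-covariate sketch with simulated data, not MICE proper and not the paper's longitudinal setting.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated data: covariate x predicts outcome y; ~30% of x is missing.
n = 500
x = rng.normal(size=n)
y = 2.0 * x + rng.normal(scale=0.5, size=n)
miss = rng.random(n) < 0.3
x_obs = np.where(miss, np.nan, x)

def impute_once(x_obs, y, rng):
    """One stochastic regression imputation: regress x on y using complete
    cases, then draw missing x from the fitted model plus residual noise."""
    obs = ~np.isnan(x_obs)
    A = np.column_stack([np.ones(obs.sum()), y[obs]])
    beta, *_ = np.linalg.lstsq(A, x_obs[obs], rcond=None)
    resid_sd = np.std(x_obs[obs] - A @ beta)
    m = np.isnan(x_obs)
    out = x_obs.copy()
    out[m] = beta[0] + beta[1] * y[m] + rng.normal(scale=resid_sd, size=m.sum())
    return out

# "Multiple" imputation: repeat with fresh noise and pool the estimates.
slopes = []
for _ in range(5):
    x_imp = impute_once(x_obs, y, rng)
    A = np.column_stack([np.ones(n), x_imp])
    b, *_ = np.linalg.lstsq(A, y, rcond=None)
    slopes.append(b[1])
pooled_slope = float(np.mean(slopes))
```

Including the outcome in the imputation model, as here, is what keeps the pooled slope near the true value of 2; omitting it is the misspecification the paper warns about.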

  6. Bayesian approach to the assessment of the population-specific risk of inhibitors in hemophilia A patients: a case study

    Directory of Open Access Journals (Sweden)

    Cheng J

    2016-10-01

    significant inhibitor (10/100, 5/100 [high rates], and 1/86 [the Food and Drug Administration mandated cutoff rate in PTPs]) were calculated. The effect of discounting prior information or scaling up the study data was evaluated. Results: Results based on noninformative priors were similar to the classical approach. Using priors from PTPs lowered the point estimate and narrowed the 95% credible intervals (Case 1: from 1.3 [0.5, 2.7] to 0.8 [0.5, 1.1]; Case 2: from 1.9 [0.6, 6.0] to 0.8 [0.5, 1.1]; Case 3: from 2.3 [0.5, 6.8] to 0.7 [0.5, 1.1]). All probabilities of satisfying a threshold of 1/86 were above 0.65. Increasing the number of patients by two and ten times substantially narrowed the credible intervals for the single cohort study (1.4 [0.7, 2.3] and 1.4 [1.1, 1.8], respectively). Increasing the number of studies by two and ten times for the multiple-study scenarios (Case 2: 1.9 [0.6, 4.0] and 1.9 [1.5, 2.6]; Case 3: 2.4 [0.9, 5.0] and 2.6 [1.9, 3.5], respectively) had a similar effect. Conclusion: The Bayesian approach, as a robust, transparent, and reproducible analytic method, can be used efficiently to estimate the inhibitor rate of hemophilia A in complex clinical settings. Keywords: inhibitor rate, meta-analysis, multicentric study, Bayesian, hemophilia A
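A conjugate Beta-Binomial sketch of the kind of calculation this record describes: encode prior evidence from previously treated patients as a Beta distribution, update with new study counts, and report a credible interval and the probability of staying under the 1/86 cutoff. The counts and prior below are invented for illustration and are not the paper's data.

```python
import numpy as np

# Illustrative: prior evidence worth ~200 PTPs with a 1% inhibitor rate,
# updated with a hypothetical new cohort of 250 patients and 3 inhibitors.
prior_a, prior_b = 2.0, 198.0
events, n = 3, 250
post_a, post_b = prior_a + events, prior_b + (n - events)

rng = np.random.default_rng(0)
draws = rng.beta(post_a, post_b, size=100_000)   # posterior samples of the rate

ci = np.percentile(draws, [2.5, 97.5])           # 95% credible interval
p_below_cutoff = float((draws < 1 / 86).mean())  # P(rate < mandated cutoff)
```

Discounting the prior (shrinking `prior_a + prior_b`) or scaling up the study counts changes the interval width in exactly the way the record's sensitivity analyses probe.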

  7. A Bayesian Mean-Value Approach with a Self-Consistently Determined Prior Distribution for the Ranking of College Football Teams

    CERN Document Server

    Ashburn, J R; Ashburn, James R.; Colvert, Paul M.

    2006-01-01

    We introduce a Bayesian mean-value approach for ranking all college football teams using only win-loss data. This approach is unique in that the prior distribution necessary to handle undefeated and winless teams is calculated self-consistently. Furthermore, we will show statistics supporting the validity of the prior distribution. Finally, a brief comparison with other football rankings will be presented.

  8. Combining non-precise historical information with instrumental measurements for flood frequency estimation: a fuzzy Bayesian approach

    Science.gov (United States)

    Salinas, Jose Luis; Kiss, Andrea; Viglione, Alberto; Blöschl, Günter

    2016-04-01

    Efforts of the historical environmental extremes community during recent decades have yielded long time series of historical floods, in some cases extending more than 500 years into the past. In hydrological engineering, historical floods are useful because they provide additional information that improves the estimates of discharges with low annual exceedance probabilities, i.e. with high return periods, and may also reduce the uncertainty in those estimates. In order to use historical floods in formal flood frequency analysis, the precise values of the peak discharges would ideally be known, but in most cases the quantitative information related to historical floods is given in a non-precise manner. This work presents an approach for dealing with non-precise historical floods by linking the descriptions in historical records to fuzzy numbers representing discharges. These fuzzy historical discharges are then introduced into a formal Bayesian inference framework, taking into account the arithmetic of non-precise numbers as modelled by fuzzy logic theory, to obtain a fuzzy version of the flood frequency curve that combines the fuzzy historical flood events and the instrumental data for a given location. Two case studies are selected from the historical literature, representing different facets of the fuzziness present in historical sources. The results of the case studies are given as fuzzy estimates of the flood frequency curves together with fuzzy 5% and 95% Bayesian credibility bounds for these curves. The presented fuzzy Bayesian inference framework provides a flexible methodology for propagating, in an explicit way, the imprecision of the historical records into the flood frequency estimate, which makes it possible to assess the effect that incorporating non-precise historical information can have on the estimated flood frequency regime.
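The "fuzzy discharge" encoding this record describes can be sketched with triangular fuzzy numbers and their alpha-cuts, on which ordinary interval arithmetic operates level by level. The discharge values and the design threshold below are purely illustrative, not taken from the case studies.

```python
def tri_alpha_cut(low, mode, high, alpha):
    """Interval (alpha-cut) of a triangular fuzzy number at level alpha."""
    return (low + alpha * (mode - low), high - alpha * (high - mode))

# A historical account such as "the flood rose well above the 1890 mark"
# might be encoded as a triangular fuzzy discharge in m3/s (illustrative).
low, mode, high = 900.0, 1300.0, 1800.0

# At full membership (alpha = 1) the cut collapses to the modal discharge.
full = tri_alpha_cut(low, mode, high, 1.0)

# A coarse exceedance statement: the fraction of the alpha = 0.5 interval
# lying above a hypothetical design discharge of 1500 m3/s.
a, b = tri_alpha_cut(low, mode, high, 0.5)
frac_above = max(0.0, (b - 1500.0) / (b - a))
```

In the paper's framework such alpha-cut intervals enter the Bayesian likelihood in place of a single precise historical peak discharge.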

  9. Bayesian neural network approach for determining the risk of re-intervention after endovascular aortic aneurysm repair.

    Science.gov (United States)

    Attallah, Omneya; Ma, Xianghong

    2014-09-01

    This article proposes a Bayesian neural network approach to determining the risk of re-intervention after endovascular aortic aneurysm repair surgery. The aim of the proposed technique is to determine which patients have a high chance of re-intervention (high-risk patients) and which do not (low-risk patients) within 5 years of the surgery. Two censored datasets relating to the clinical conditions of aortic aneurysms were collected from two different vascular centers in the United Kingdom. A Bayesian network was first employed to solve the censoring issue in the datasets. Then, a back-propagation neural network model was built using the uncensored data of the first center to predict re-intervention at the second center and to classify the patients into high-risk and low-risk groups. Kaplan-Meier curves were plotted for each group of patients separately to show whether there is a significant difference between the two risk groups. Finally, the logrank test was applied to determine whether the neural network model was capable of predicting and distinguishing between the two risk groups. The results show that the Bayesian network used for uncensoring the data improved the performance of the neural networks built for the two centers separately. More importantly, the neural network trained with the uncensored data of the first center was able to predict and discriminate between groups at low and high risk of re-intervention within 5 years of endovascular aortic aneurysm surgery at the second center (p = 0.0037 in the logrank test).
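The Kaplan-Meier product-limit estimator used in this record to compare the two risk groups can be written compactly. The follow-up times and event flags below are illustrative only, not the UK vascular-centre data.

```python
def kaplan_meier(times, events):
    """Product-limit survival estimate. events[i] = 1 marks a re-intervention,
    0 a censored follow-up. Returns (time, S(t)) pairs at each event time."""
    data = sorted(zip(times, events))
    curve = []
    s = 1.0
    i = 0
    while i < len(data):
        t = data[i][0]
        d = sum(e for tt, e in data if tt == t and e == 1)   # events at t
        n_t = sum(1 for tt, _ in data if tt >= t)            # still at risk
        if d > 0:
            s *= 1.0 - d / n_t
            curve.append((t, s))
        while i < len(data) and data[i][0] == t:  # skip ties at this time
            i += 1
    return curve

# Toy follow-up times in years after repair; flags are invented.
times = [1.0, 2.0, 2.0, 3.0, 4.0, 5.0, 5.0, 5.0]
events = [1, 1, 0, 1, 0, 1, 0, 0]
curve = kaplan_meier(times, events)
```

Running this separately on the predicted high-risk and low-risk groups gives the two curves that the logrank test then compares.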

  10. The influence of baseline marijuana use on treatment of cocaine dependence: application of an informative-priors Bayesian approach.

    Directory of Open Access Journals (Sweden)

    Charles eGreen

    2012-10-01

    Full Text Available Background: Marijuana use is prevalent among patients with cocaine dependence and often non-exclusionary in clinical trials of potential cocaine medications. The dual focus of this study was to (1) examine the moderating effect of baseline marijuana use on response to treatment with levodopa/carbidopa for cocaine dependence; and (2) apply an informative-priors Bayesian approach for estimating the probability of a subgroup-by-treatment interaction effect. Method: A secondary data analysis of two previously published, double-blind, randomized controlled trials provided samples for the historical dataset (Study 1: N = 64 complete observations) and current dataset (Study 2: N = 113 complete observations). Negative binomial regression evaluated Treatment Effectiveness Scores (TES) as a function of medication condition (levodopa/carbidopa vs. placebo), baseline marijuana use (days in the past 30), and their interaction. Results: Bayesian analysis indicated a 96% chance that baseline marijuana use predicts differential response to treatment with levodopa/carbidopa. Simple effects indicated that among participants receiving levodopa/carbidopa, the probability that baseline marijuana use confers harm in terms of reducing TES was 0.981, whereas the probability that marijuana use confers harm within the placebo condition was 0.163. For every additional day of marijuana use reported at baseline, participants in the levodopa/carbidopa condition demonstrated a 5.4% decrease in TES, while participants in the placebo condition demonstrated a 4.9% increase in TES. Conclusion: The potential moderating effect of marijuana use on cocaine treatment response should be considered in future trial designs. Applying Bayesian subgroup analysis proved informative in characterizing this patient-treatment interaction effect.

  11. Identifiability of sorption parameters in stirred flow-through reactor experiments and their identification with a Bayesian approach.

    Science.gov (United States)

    Nicoulaud-Gouin, V; Garcia-Sanchez, L; Giacalone, M; Attard, J C; Martin-Garin, A; Bois, F Y

    2016-10-01

    This paper addresses the methodological conditions, particularly experimental design and statistical inference, that ensure the identifiability of sorption parameters from breakthrough curves measured during stirred flow-through reactor experiments, also known as continuous flow stirred-tank reactor (CSTR) experiments. The equilibrium-kinetic (EK) sorption model was selected as a nonequilibrium parameterization embedding the Kd approach. Parameter identifiability was studied formally on the equations governing outlet concentrations. It was also studied numerically on six simulated CSTR experiments on a soil with known equilibrium-kinetic sorption parameters. EK sorption parameters cannot be identified from a single breakthrough curve of a CSTR experiment, because Kd,1 and k(-) were diagnosed as collinear. For pairs of CSTR experiments, Bayesian inference allowed selection of the correct models of sorption and error among the alternatives. Bayesian inference was conducted with the SAMCAT software (Sensitivity Analysis and Markov Chain simulations Applied to Transfer models), which launched the simulations through the embedded simulation engine GNU-MCSim and automated their configuration and post-processing. Experimental designs consisting of varying flow rates between experiments reaching equilibrium at the contamination stage were found optimal, because they simultaneously gave accurate sorption parameters and predictions. The Bayesian results were comparable to those of the maximum likelihood method but avoided convergence problems; the marginal likelihood allowed all models to be compared, and the credible intervals directly gave the uncertainty of the sorption parameters θ. Although these findings are limited to the specific conditions studied here, in particular the considered sorption model, the chosen parameter values and the error structure, they help in the conception and analysis of future CSTR experiments with radionuclides suspected of kinetic sorption behaviour.

  12. Uncertainty analysis for effluent trading planning using a Bayesian estimation-based simulation-optimization modeling approach.

    Science.gov (United States)

    Zhang, J L; Li, Y P; Huang, G H; Baetz, B W; Liu, J

    2017-03-06

    In this study, a Bayesian estimation-based simulation-optimization modeling approach (BESMA) is developed for identifying effluent trading strategies. BESMA incorporates nutrient fate modeling with the Soil and Water Assessment Tool (SWAT), Bayesian estimation, and probabilistic-possibilistic interval programming with fuzzy random coefficients (PPI-FRC) within a general framework. Based on the water quality protocols provided by SWAT, posterior distributions of parameters can be analyzed through Bayesian estimation, and the stochastic characteristics of nutrient loading can be investigated, providing the inputs for decision making. PPI-FRC can address multiple uncertainties in the form of intervals with fuzzy random boundaries, and the associated system risk, by incorporating the concepts of possibility and necessity measures, which are suited to optimistic and pessimistic decision making, respectively. BESMA is applied to a real case of effluent trading planning in the Xiangxihe watershed, China. A number of decision alternatives can be obtained under different trading ratios and treatment rates. The results not only facilitate identification of optimal effluent-trading schemes, but also give insight into the effects of trading ratio and treatment rate on decision making. The results also reveal that the decision maker's preference towards risk affects the decision alternatives on the trading scheme as well as the system benefit. Compared with conventional optimization methods, BESMA proved advantageous in (i) dealing with multiple uncertainties associated with randomness and fuzziness in effluent-trading planning within a multi-source, multi-reach and multi-period context; (ii) reflecting uncertainties in nutrient transport behaviors to improve the accuracy of water quality prediction; and (iii) supporting pessimistic and optimistic decision making for effluent trading as well as promoting diversity of decision

  13. Bayesian approaches to infer the physical properties of star-forming galaxies at cosmic dawn

    Science.gov (United States)

    Salmon, Brett Weston Killebrew

    In this thesis, I seek to advance our understanding of galaxy formation and evolution in the early universe. Using the largest single project ever conducted by the Hubble Space Telescope (the Cosmic Assembly Near-infrared Deep Extragalactic Legacy Survey, CANDELS), I use deep and wide broadband photometric imaging to infer the physical properties of galaxies from z=8.5 to z=1.5. First, I present a study that extends the relationship between the star-formation rates (SFRs) and stellar masses (M⋆) of galaxies to z ~ 3.5. Next, I examine how light is attenuated in galaxies. I calculate the Bayesian evidence for galaxies under different assumptions of their underlying dust-attenuation law. By modeling galaxy ultraviolet-to-near-IR broadband CANDELS data, I produce Bayesian evidence towards the dust law in individual galaxies that is confirmed by their observed IR luminosities. Moreover, I find a tight correlation between the strength of attenuation in galaxies and their dust law, a relation reinforced by the results of radiative transfer simulations. Finally, I use the Bayesian methods developed in this thesis to study the number density of SFR in galaxies from z=8 to z=4, and resolve the current disconnect between its evolution and that of the stellar mass function. In doing so, I place the first constraints on the dust law of z>4 galaxies, finding that it obeys a similar relation to that found at z ~ 2. I find a clear excess in number density at high SFRs. This new SFR function is in better agreement with the observed stellar mass functions, the few infrared detections to date at high redshifts, and the connection to the observed distribution of lower-redshift infrared sources. Together, these studies greatly improve our understanding of galaxy star-formation histories, the nature of their dust attenuation, and the distribution of SFR among some of the most distant galaxies in the universe.

  14. Air Kerma Rate estimation by means of in-situ gamma spectrometry: a Bayesian approach.

    Science.gov (United States)

    Cabal, Gonzalo; Kluson, Jaroslav

    2010-01-01

    Bayesian inference is used to determine the Air Kerma Rate based on in-situ gamma spectrum measurement performed with an NaI(Tl) scintillation detector. The procedure accounts for uncertainties in the measurement and in the mass energy transfer coefficients needed for the calculation. The WinBUGS program (Spiegelhalter et al., 1999) was used. The results show that the relative uncertainties in the Air Kerma estimate are about 1%, and that the choice of unfolding procedure may lead to a systematic error of 3% in the estimate.
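The deterministic core of the calculation this record describes is a weighted sum over the unfolded spectrum, K = Σᵢ φᵢ Eᵢ (μ_tr/ρ)ᵢ; uncertainty in the coefficients can then be propagated by sampling, loosely mimicking the Bayesian treatment. Every number below (fluences, energies, coefficients, the 1.5% relative uncertainty) is an invented illustration, not data from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy unfolded photon fluence spectrum and mass energy-transfer
# coefficients for air (all values illustrative).
energy_mev = np.array([0.3, 0.6, 1.0, 1.5])             # bin energies, MeV
fluence = np.array([2.0e3, 1.5e3, 8.0e2, 3.0e2])        # photons / (cm^2 s)
mu_tr_rho = np.array([0.0288, 0.0296, 0.0280, 0.0255])  # cm^2 / g

MEV_PER_G_TO_GY = 1.602e-10  # 1 MeV/g deposited = 1.602e-10 Gy

def kerma_rate(fluence, energy, mu):
    """K = sum_i fluence_i * E_i * (mu_tr/rho)_i, converted to Gy/s."""
    return float((fluence * energy * mu).sum() * MEV_PER_G_TO_GY)

nominal = kerma_rate(fluence, energy_mev, mu_tr_rho)

# Propagate an assumed 1.5% relative uncertainty on the coefficients by
# Monte Carlo and summarize the resulting relative uncertainty on K.
samples = np.array([
    kerma_rate(fluence, energy_mev,
               mu_tr_rho * (1 + 0.015 * rng.standard_normal(mu_tr_rho.size)))
    for _ in range(20_000)
])
rel_unc = float(samples.std() / samples.mean())
```

Because the bins are independent here, the combined relative uncertainty comes out below the per-bin 1.5%; a full Bayesian treatment would add the spectrum-unfolding uncertainty as well.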

  15. Air Kerma Rate estimation by means of in-situ gamma spectrometry: A Bayesian approach

    Energy Technology Data Exchange (ETDEWEB)

    Cabal, Gonzalo [Department of Dosimetry and Applications of Ionizing Radiation, Faculty of Nuclear Sciences and Physical Engineering, Czech Technical University in Prague, Brehova 7, 115 19 Prague 1 (Czech Republic); Department of Radiation Dosimetry, Nuclear Physics Institute, Academy of Sciences of the Czech Republic, Na Truhlarce 39/64, 180 86 Prague 8 (Czech Republic)], E-mail: cabal@ujf.cas.cz; Kluson, Jaroslav [Department of Dosimetry and Applications of Ionizing Radiation, Faculty of Nuclear Sciences and Physical Engineering, Czech Technical University in Prague, Brehova 7, 115 19 Prague 1 (Czech Republic)

    2010-04-15

    Bayesian inference is used to determine the Air Kerma Rate based on in-situ gamma spectrum measurement performed with an NaI(Tl) scintillation detector. The procedure accounts for uncertainties in the measurement and in the mass energy transfer coefficients needed for the calculation. The WinBUGS program was used. The results show that the relative uncertainties in the Air Kerma estimate are about 1%, and that the choice of unfolding procedure may lead to a systematic error of 3% in the estimate.

  16. A two step Bayesian approach for genomic prediction of breeding values

    DEFF Research Database (Denmark)

    Mahdi Shariati, Mohammad; Sørensen, Peter; Janss, Luc

    2012-01-01

    Background: In genomic models that assign an individual variance to each marker, the contribution of one marker to the posterior distribution of the marker variance is only one degree of freedom (df), which introduces many variance parameters with only little information per variance parameter...... of predicted breeding values. However, the accuracies of predicted breeding values were lower than Bayesian methods with marker specific variances. Conclusions: Grouping markers is less flexible than allowing each marker to have a specific marker variance but, by grouping, the power to estimate marker...

  17. Bayesian approaches to the value of information: implications for the regulation of new pharmaceuticals.

    Science.gov (United States)

    Claxton, K

    1999-05-01

    The current regulation of new pharmaceuticals is inefficient because it demands arbitrary amounts of information, the type of information demanded is not relevant to decision-makers, and the same standards of evidence are applied across different technologies. Bayesian decision theory and an analysis of the value of both perfect and sample information are used to consider the efficient regulation of new pharmaceuticals. This type of analysis can be used to decide whether the evidence in an economic study provides 'sufficient substantiation' for an economic claim, and to assess whether evidence can be regarded as 'competent and reliable'.
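The expected value of perfect information (EVPI) underlying this record's argument can be computed in a few lines: compare the value of acting now on current evidence with the value of choosing the best action in every state of the world. The two-action, two-state payoffs below are invented for illustration.

```python
# Two actions (approve / reject) and two states of the world (the drug is
# cost-effective or not); payoffs in arbitrary net-benefit units, invented.
p_effective = 0.6
payoff = {
    ("approve", "effective"): 100.0,
    ("approve", "ineffective"): -80.0,
    ("reject", "effective"): 0.0,
    ("reject", "ineffective"): 0.0,
}
actions = ("approve", "reject")
states = ("effective", "ineffective")

def expected(action, p):
    return p * payoff[(action, "effective")] + (1 - p) * payoff[(action, "ineffective")]

# Value of deciding now on current evidence.
v_now = max(expected(a, p_effective) for a in actions)

# With perfect information the best action is picked in each state first,
# then averaged over the states.
v_perfect = (p_effective * max(payoff[(a, "effective")] for a in actions)
             + (1 - p_effective) * max(payoff[(a, "ineffective")] for a in actions))

evpi = v_perfect - v_now  # upper bound on what any further study is worth
```

If EVPI (scaled by the affected population) falls below the cost of further research, demanding more evidence is inefficient, which is the regulatory point the abstract makes.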

  18. A Bayesian Super-Resolution Approach to Demosaicing of Blurred Images

    Directory of Open Access Journals (Sweden)

    Molina Rafael

    2006-01-01

    Full Text Available Most available digital color cameras use a single image sensor with a color filter array (CFA) to acquire an image. In order to produce a visible color image, a demosaicing process must be applied, which introduces undesirable artifacts. An additional problem appears when the observed color image is also blurred. This paper addresses the problem of deconvolving color images observed with a single charge-coupled device (CCD) from the super-resolution point of view. Utilizing the Bayesian paradigm, an estimate of the reconstructed image and of the model parameters is generated. The proposed method is tested on real images.

  19. A population-based Bayesian approach to the minimal model of glucose and insulin homeostasis

    DEFF Research Database (Denmark)

    Andersen, Kim Emil; Højbjerre, Malene

    2005-01-01

    for a whole population. Traditionally it has been analysed in a deterministic set-up with only error terms on the measurements. In this work we adopt a Bayesian graphical model to describe the coupled minimal model that accounts for both measurement and process variability, and the model is extended......-posed estimation problem, where the reconstruction has most often been done by non-linear least squares techniques separately for each entity. The minimal model was originally specified for a single individual and does not combine several individuals with the advantage of estimating the metabolic portrait

  20. Elements of probability and statistics an introduction to probability with De Finetti’s approach and to Bayesian statistics

    CERN Document Server

    Biagini, Francesca

    2016-01-01

    This book provides an introduction to elementary probability and to Bayesian statistics using de Finetti's subjectivist approach. One of the features of this approach is that it does not require the introduction of a sample space – a non-intrinsic concept that makes the treatment of elementary probability unnecessarily complicated – but instead introduces as fundamental the concept of random numbers, directly related to their interpretation in applications. Events become a particular case of random numbers, and probability a particular case of expectation when applied to events. The subjective evaluation of expectation and of conditional expectation is based on an economic choice of an acceptable bet or penalty. The properties of expectation and conditional expectation are derived by applying a coherence criterion that the evaluation has to follow. The book is suitable for all introductory courses in probability and statistics for students in Mathematics, Informatics, Engineering, and Physics.

  1. A hybrid approach to monthly streamflow forecasting: Integrating hydrological model outputs into a Bayesian artificial neural network

    Science.gov (United States)

    Humphrey, Greer B.; Gibbs, Matthew S.; Dandy, Graeme C.; Maier, Holger R.

    2016-09-01

    Monthly streamflow forecasts are needed to support water resources decision making in the South East of South Australia, where baseflow represents a significant proportion of the total streamflow and soil moisture and groundwater are important predictors of runoff. To address this requirement, the utility of a hybrid monthly streamflow forecasting approach is explored, whereby simulated soil moisture from the GR4J conceptual rainfall-runoff model is used to represent initial catchment conditions in a Bayesian artificial neural network (ANN) statistical forecasting model. To assess the performance of this hybrid forecasting method, a comparison is undertaken of the relative performances of the Bayesian ANN, the GR4J conceptual model and the hybrid streamflow forecasting approach for producing 1-month ahead streamflow forecasts at three key locations in the South East of South Australia. Particular attention is paid to the quantification of uncertainty in each of the forecast models and the potential for reducing forecast uncertainty by using the hybrid approach is considered. Case study results suggest that the hybrid models developed in this study are able to take advantage of the complementary strengths of both the ANN models and the GR4J conceptual models. This was particularly the case when forecasting high flows, where the hybrid models were shown to outperform the two individual modelling approaches in terms of the accuracy of the median forecasts, as well as reliability and resolution of the forecast distributions. In addition, the forecast distributions generated by the hybrid models were up to 8 times more precise than those based on climatology; thus, providing a significant improvement on the information currently available to decision makers.

  2. Predictability of extreme weather events for NE U.S.: improvement of the numerical prediction using a Bayesian regression approach

    Science.gov (United States)

    Yang, J.; Astitha, M.; Anagnostou, E. N.; Hartman, B.; Kallos, G. B.

    2015-12-01

    Weather prediction accuracy has become very important for the Northeast U.S. given the devastating effects of extreme weather events in recent years. Weather forecasting systems are used to build strategies that prevent catastrophic losses for human lives and the environment. Concurrently, weather forecast tools and techniques have evolved with improved forecast skill, as numerical prediction techniques are strengthened by increased supercomputing resources. In this study, we examine the combination of two state-of-the-science atmospheric models (WRF and RAMS/ICLAMS) by utilizing a Bayesian regression approach to improve the prediction of extreme weather events for the NE U.S. The basic concept behind the Bayesian regression approach is to take advantage of the strengths of two atmospheric modeling systems and, similar to the multi-model ensemble approach, limit their weaknesses, which are related to systematic and random errors in the numerical prediction of physical processes. The first part of this study is focused on retrospective simulations of seventeen storms that affected the region in the period 2004-2013. Optimal variances are estimated by minimizing the root mean square error and are applied to out-of-sample weather events. The applicability and usefulness of this approach are demonstrated by conducting an error analysis based on in-situ wind speed and wind direction observations from meteorological stations of the National Weather Service (NWS), and on NCEP Stage IV multi-sensor precipitation data. The preliminary results indicate a significant improvement in the statistical metrics of the modeled-observed pairs for meteorological variables using various combinations of sixteen events as predictors of the seventeenth. This presentation will illustrate the implemented methodology and the results obtained for wind speed, wind direction and precipitation, as well as the research steps that will follow.
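    The combination idea described above can be sketched in a few lines. This is a toy illustration with synthetic data, not the authors' implementation: the two "models" below are random perturbations standing in for WRF and RAMS/ICLAMS output, and the inverse-error-variance weights are a simple stand-in for the paper's RMSE-minimising variance estimation, evaluated leave-one-event-out as in the sixteen-predict-the-seventeenth setup.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic "observed" wind speeds for 17 storm events (hypothetical data).
    truth = rng.uniform(5.0, 25.0, size=(17, 50))        # 17 events x 50 stations
    model_a = truth + rng.normal(0.0, 2.0, truth.shape)  # unbiased, moderate noise
    model_b = truth + rng.normal(1.0, 3.0, truth.shape)  # biased, noisier

    def combine(train_idx, test_idx):
        """Weight each model by the inverse of its training error variance
        (a simple stand-in for the paper's variance-optimisation step)."""
        var_a = np.mean((model_a[train_idx] - truth[train_idx]) ** 2)
        var_b = np.mean((model_b[train_idx] - truth[train_idx]) ** 2)
        w_a, w_b = 1.0 / var_a, 1.0 / var_b
        return (w_a * model_a[test_idx] + w_b * model_b[test_idx]) / (w_a + w_b)

    # Leave-one-event-out: 16 events tune the weights, the 17th is predicted.
    rmse_blend, rmse_a = [], []
    for k in range(17):
        train = [i for i in range(17) if i != k]
        blend = combine(train, k)
        rmse_blend.append(np.sqrt(np.mean((blend - truth[k]) ** 2)))
        rmse_a.append(np.sqrt(np.mean((model_a[k] - truth[k]) ** 2)))

    print("blend RMSE:", np.mean(rmse_blend), " best single model RMSE:", np.mean(rmse_a))
    ```

    With these synthetic error statistics, the inverse-variance blend reduces the RMSE below that of the better single model, which mirrors the qualitative result reported in the abstract.
    
    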

  3. FORMALIZATION OF THE DECISION-MAKING PROCESS FOR INFORMATION SECURITY MANAGEMENT IN AUTOMATED SYSTEMS USING THE BAYESIAN APPROACH

    Directory of Open Access Journals (Sweden)

    Stepanov V. V.

    2015-09-01

    The article presents a model for choosing among a set of two or more alternative solutions, based on the Bayesian approach and on the formulated concept of security functions as a priori estimates of the effects of a decision. This reduces the projected parameters and, therefore, increases the security values. The considered indicators of data protection thus reflect the essence of the Bayesian approach to decision making and to the management of GIS, and allow optimal decision rules to be generated.

  4. A novel approach for pilot error detection using Dynamic Bayesian Networks.

    Science.gov (United States)

    Saada, Mohamad; Meng, Qinggang; Huang, Tingwen

    2014-06-01

    In the last decade, Dynamic Bayesian Networks (DBNs) have become one of the most attractive probabilistic modelling extensions of Bayesian Networks (BNs) for working under uncertainty from a temporal perspective. Despite this popularity, few researchers have attempted to study the use of these networks in anomaly detection or the implications of data anomalies for the outcome of such models. An abnormal change in the modelled environment's data at a given time will cause a trailing chain effect on the data of all related environment variables in the current and consecutive time slices. Although this effect fades with time, it can still have an ill effect on the outcome of such models. In this paper we propose an algorithm for pilot error detection, using DBNs as the modelling framework for learning and detecting anomalous data. We base our experiments on the actions of an aircraft pilot, and a flight simulator is created for running the experiments. The proposed anomaly detection algorithm has achieved good results in detecting pilot errors and their effects on the whole system.

  5. A Bayesian approach to estimate the biomass of anchovies off the coast of Perú.

    Science.gov (United States)

    Quiroz, Zaida C; Prates, Marcos O; Rue, Håvard

    2015-03-01

    The Northern Humboldt Current System (NHCS) is the world's most productive ecosystem in terms of fish. In particular, the Peruvian anchovy (Engraulis ringens) is the major prey of the main top predators, such as seabirds, fish, humans, and other mammals. In this context, it is important to understand the dynamics of the anchovy distribution in order to preserve it as well as to exploit its economic capacity. Using the data collected by the "Instituto del Mar del Perú" (IMARPE) during a scientific survey in 2005, we present a statistical analysis with three main goals: (i) to adapt to the characteristics of the sampled data, such as spatial dependence, high proportions of zeros, and large sample sizes; (ii) to provide important insights into the dynamics of the anchovy population; and (iii) to propose a model for estimation and prediction of anchovy biomass in the NHCS offshore from Perú. These data were analyzed in a Bayesian framework using the integrated nested Laplace approximation (INLA) method. Further, to select the best model and to study the predictive power of each model, we performed model comparisons and predictive checks, respectively. Finally, we carried out a Bayesian spatial influence diagnostic for the preferred model.

  6. Gas turbine engine prognostics using Bayesian hierarchical models: A variational approach

    Science.gov (United States)

    Zaidan, Martha A.; Mills, Andrew R.; Harrison, Robert F.; Fleming, Peter J.

    2016-03-01

    Prognostics is an emerging requirement of modern health monitoring that aims to increase the fidelity of failure-time predictions by the appropriate use of sensory and reliability information. In the aerospace industry it is a key technology to reduce life-cycle costs, improve reliability and asset availability for a diverse fleet of gas turbine engines. In this work, a Bayesian hierarchical model is selected to utilise fleet data from multiple assets to perform probabilistic estimation of remaining useful life (RUL) for civil aerospace gas turbine engines. The hierarchical formulation allows Bayesian updates of an individual predictive model to be made, based upon data received asynchronously from a fleet of assets with different in-service lives and for the entry of new assets into the fleet. In this paper, variational inference is applied to the hierarchical formulation to overcome the computational and convergence concerns that are raised by the numerical sampling techniques needed for inference in the original formulation. The algorithm is tested on synthetic data, where the quality of approximation is shown to be satisfactory with respect to prediction performance, computational speed, and ease of use. A case study of in-service gas turbine engine data demonstrates the value of integrating fleet data for accurately predicting degradation trajectories of assets.

  7. A bayesian approach for determining velocity and uncertainty estimates from seismic cone penetrometer testing or vertical seismic profiling data

    Science.gov (United States)

    Pidlisecky, A.; Haines, S.S.

    2011-01-01

    Conventional processing methods for seismic cone penetrometer data present several shortcomings, most notably the absence of a robust velocity model uncertainty estimate. We propose a new seismic cone penetrometer testing (SCPT) data-processing approach that employs Bayesian methods to map measured data errors into quantitative estimates of model uncertainty. We first calculate travel-time differences for all permutations of seismic trace pairs. That is, we cross-correlate each trace at each measurement location with every trace at every other measurement location to determine travel-time differences that are not biased by the choice of any particular reference trace and to thoroughly characterize data error. We calculate a forward operator that accounts for the different ray paths for each measurement location, including refraction at layer boundaries. We then use a Bayesian inversion scheme to obtain the most likely slowness (the reciprocal of velocity) and a distribution of probable slowness values for each model layer. The result is a velocity model that is based on correct ray paths, with uncertainty bounds that are based on the data error. © NRC Research Press 2011.
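    The linear-Gaussian inversion step described above can be sketched as follows. This is a simplified illustration under stated assumptions: vertical ray paths (the actual method includes refraction), a known Gaussian data error, and made-up layer velocities; the real travel-time differences would come from cross-correlating trace pairs rather than being simulated.

    ```python
    import numpy as np
    from itertools import combinations

    rng = np.random.default_rng(1)

    # Hypothetical 1-D layered model: slowness (s/m) of five one-metre layers.
    true_slowness = np.array([1/150, 1/180, 1/220, 1/300, 1/350])
    n_layers = len(true_slowness)

    # Simplified forward operator for vertical rays: row i holds the path length
    # through each layer down to geophone depth i.
    L = np.tril(np.ones((n_layers, n_layers)))

    # Travel-time differences for all trace pairs, with Gaussian measurement noise.
    pairs = list(combinations(range(n_layers), 2))
    G = np.array([L[i] - L[j] for i, j in pairs])
    sigma = 2e-4                                    # data error (s), assumed known
    d = G @ true_slowness + rng.normal(0, sigma, len(pairs))

    # Linear-Gaussian Bayesian inversion with a weak Gaussian prior on slowness.
    prior_mean = np.full(n_layers, 1/200)
    prior_prec = np.linalg.inv(np.eye(n_layers) * (1/100) ** 2)
    post_cov = np.linalg.inv(G.T @ G / sigma**2 + prior_prec)
    post_mean = post_cov @ (G.T @ d / sigma**2 + prior_prec @ prior_mean)
    post_sd = np.sqrt(np.diag(post_cov))            # layer-wise uncertainty bounds

    print("velocity estimates (m/s):", 1 / post_mean)
    ```

    Note that with purely vertical rays the pairwise differences never sample the top layer, so its posterior simply reproduces the prior; this is visible in the inflated uncertainty bound for layer 1, and is one reason the paper's operator traces the actual refracted paths.
    
    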

  8. Interactive Naive Bayesian network: A new approach of constructing gene-gene interaction network for cancer classification.

    Science.gov (United States)

    Tian, Xue W; Lim, Joon S

    2015-01-01

    Naive Bayesian (NB) network classifiers are a simple and well-known type of classifier that can be easily induced from a DNA microarray data set. However, the strong conditional independence assumption of an NB network can sometimes lead to weak classification performance. In this paper, we propose a new interactive naive Bayesian (INB) network approach to weaken the conditional independence of the NB network and classify cancers using DNA microarray data sets. We select differentially expressed genes (DEGs) to reduce the dimension of the microarray data set. Then, for each DEG, an interactive parent, the DEG with the biggest influence on it, is searched for, and a weight is calculated to represent the interactive relationship between the DEG and its parent. Finally, the gene-gene interaction network is constructed. We experimentally test the INB network in terms of classification accuracy using leukemia and colon DNA microarray data sets and compare it with the NB network. The INB network achieves higher classification accuracies than the NB network, and it can show the gene-gene interactions visually.
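    For orientation, the plain NB baseline that INB extends can be sketched as a Bernoulli naive Bayes classifier on hypothetical binarised expression data (1 = gene over-expressed). Everything below is synthetic and illustrative; the INB extension, weighting each gene by its strongest interactive parent, is noted in a comment but not implemented.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Hypothetical data: 40 samples x 8 "DEGs"; class 1 tends to switch on
    # the first four genes, class 0 does not.
    y = np.array([0] * 20 + [1] * 20)
    p_on = np.where(y[:, None] == 1,
                    [0.8, 0.8, 0.7, 0.7, 0.3, 0.3, 0.3, 0.3],
                    [0.2, 0.2, 0.3, 0.3, 0.3, 0.3, 0.3, 0.3])
    X = rng.binomial(1, p_on)

    def fit_bernoulli_nb(X, y, alpha=1.0):
        """Plain naive Bayes with Laplace smoothing; the INB extension would
        additionally weight each gene by its 'interactive parent'."""
        classes = np.unique(y)
        log_prior = np.log([np.mean(y == c) for c in classes])
        theta = np.array([(X[y == c].sum(0) + alpha) / ((y == c).sum() + 2 * alpha)
                          for c in classes])
        return classes, log_prior, theta

    def predict(X, classes, log_prior, theta):
        ll = X @ np.log(theta).T + (1 - X) @ np.log(1 - theta).T
        return classes[np.argmax(ll + log_prior, axis=1)]

    classes, log_prior, theta = fit_bernoulli_nb(X, y)
    acc = np.mean(predict(X, classes, log_prior, theta) == y)
    print("training accuracy:", acc)
    ```
    
    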

  9. Bayesian informative dropout model for longitudinal binary data with random effects using conditional and joint modeling approaches.

    Science.gov (United States)

    Chan, Jennifer S K

    2016-05-01

    Dropouts are common in longitudinal studies. If the dropout probability depends on the missing observations at or after dropout, this type of dropout is called informative (or nonignorable) dropout (ID). Failure to accommodate such a dropout mechanism in the model will bias the parameter estimates. We propose a conditional autoregressive model for longitudinal binary data with an ID model such that the probabilities of positive outcomes, as well as the dropout indicator at each occasion, are logit-linear in some covariates and outcomes. This model, adopting a marginal model for outcomes and a conditional model for dropouts, is called a selection model. To allow for heterogeneity and clustering effects, the outcome model is extended to incorporate mixture and random effects. Lastly, the model is further extended to a novel model that models the outcome and dropout jointly, such that their dependency is formulated through an odds ratio function. Parameters are estimated by a Bayesian approach implemented using the user-friendly Bayesian software WinBUGS. A methadone clinic dataset is analyzed to illustrate the proposed models. Results show that the treatment time effect is still significant, but weaker, after allowing for an ID process in the data. Finally, the effect of dropout on parameter estimates is evaluated through simulation studies.

  10. Uncertainty Reduction using Bayesian Inference and Sensitivity Analysis: A Sequential Approach to the NASA Langley Uncertainty Quantification Challenge

    Science.gov (United States)

    Sankararaman, Shankar

    2016-01-01

    This paper presents a computational framework for uncertainty characterization and propagation, and sensitivity analysis under the presence of aleatory and epistemic uncertainty, and develops a rigorous methodology for efficient refinement of epistemic uncertainty by identifying important epistemic variables that significantly affect the overall performance of an engineering system. The proposed methodology is illustrated using the NASA Langley Uncertainty Quantification Challenge (NASA-LUQC) problem that deals with uncertainty analysis of a generic transport model (GTM). First, Bayesian inference is used to infer subsystem-level epistemic quantities using the subsystem-level model and corresponding data. Second, tools of variance-based global sensitivity analysis are used to identify four important epistemic variables (this limitation specified in the NASA-LUQC is reflective of practical engineering situations where not all epistemic variables can be refined due to time/budget constraints) that significantly affect system-level performance. The most significant contribution of this paper is the development of the sequential refinement methodology, where epistemic variables for refinement are not identified all-at-once. Instead, only one variable is first identified, and then Bayesian inference and global sensitivity calculations are repeated to identify the next important variable. This procedure is continued until all four variables are identified and the refinement in the system-level performance is computed. The advantages of the proposed sequential refinement methodology over the all-at-once uncertainty refinement approach are explained, and then applied to the NASA Langley Uncertainty Quantification Challenge problem.

  11. Development of a fingerprint reduction approach for Bayesian similarity searching based on Kullback-Leibler divergence analysis.

    Science.gov (United States)

    Nisius, Britta; Vogt, Martin; Bajorath, Jürgen

    2009-06-01

    The contribution of individual fingerprint bit positions to similarity search performance is systematically evaluated. A method is introduced to determine bit significance on the basis of Kullback-Leibler divergence analysis of bit distributions in active and database compounds. Bit divergence analysis and Bayesian compound screening share a common methodological foundation. Hence, given the significance ranking of all individual bit positions comprising a fingerprint, subsets of bits are evaluated in the context of Bayesian screening, and minimal fingerprint representations are determined that meet or exceed the search performance of unmodified fingerprints. For fingerprints of different design evaluated on many compound activity classes, we consistently find that small subsets of fingerprint bit positions are responsible for search performance; in some cases these subsets contain only a few bit positions. Structural or pharmacophore patterns captured by preferred bit positions can often be directly associated with characteristic features of active compounds. In some cases, reduced fingerprint representations clearly exceed the search performance of the original fingerprints. Thus, fingerprint reduction likely represents a promising approach for practical applications.
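    The per-bit significance measure described above, the Kullback-Leibler divergence between the Bernoulli bit distributions of active and database compounds, can be sketched as follows. The bit frequencies and "informative" positions here are synthetic stand-ins, not data from the study.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    n_bits = 64
    # Hypothetical bit frequencies: database bits fire ~30% of the time;
    # a handful of bits are strongly enriched in active compounds.
    p_db = np.full(n_bits, 0.3)
    p_act = p_db.copy()
    signal = [3, 17, 42]                      # informative bit positions (made up)
    p_act[signal] = 0.9

    actives = rng.binomial(1, p_act, size=(200, n_bits))
    database = rng.binomial(1, p_db, size=(5000, n_bits))

    def bit_kl(actives, database, eps=1e-6):
        """Per-bit KL divergence between the Bernoulli bit distributions of
        active and database compounds (clipped to avoid log(0))."""
        pa = np.clip(actives.mean(0), eps, 1 - eps)
        pd = np.clip(database.mean(0), eps, 1 - eps)
        return pa * np.log(pa / pd) + (1 - pa) * np.log((1 - pa) / (1 - pd))

    # Rank bits by significance; a reduced fingerprint keeps only the top bits.
    ranking = np.argsort(bit_kl(actives, database))[::-1]
    print("top-ranked bits:", sorted(ranking[:3].tolist()))
    ```

    Ranking bits this way and retaining only a top subset is exactly the kind of reduced fingerprint representation the abstract evaluates in the Bayesian screening context.
    
    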

  12. Non-arbitrage in financial markets: A Bayesian approach for verification

    Science.gov (United States)

    Cerezetti, F. V.; Stern, Julio Michael

    2012-10-01

    The concept of non-arbitrage plays an essential role in finance theory. Under certain regularity conditions, the Fundamental Theorem of Asset Pricing states that, in non-arbitrage markets, prices of financial instruments are martingale processes. In this theoretical framework, the analysis of the statistical distributions of financial assets can assist in understanding how participants behave in the markets, and may or may not engender arbitrage conditions. Assuming an underlying Variance Gamma statistical model, this study aims to test, using the FBST - Full Bayesian Significance Test, if there is a relevant price difference between essentially the same financial asset traded at two distinct locations. Specifically, we investigate and compare the behavior of call options on the BOVESPA Index traded at (a) the Equities Segment and (b) the Derivatives Segment of BM&FBovespa. Our results seem to point out significant statistical differences. To what extent this evidence is actually the expression of perennial arbitrage opportunities is still an open question.

  13. Gaussian process surrogates for failure detection: A Bayesian experimental design approach

    Science.gov (United States)

    Wang, Hongqiao; Lin, Guang; Li, Jinglai

    2016-05-01

    An important task of uncertainty quantification is to identify the probability of undesired events, in particular, system failures, caused by various sources of uncertainties. In this work we consider the construction of Gaussian process surrogates for failure detection and failure probability estimation. In particular, we consider the situation that the underlying computer models are extremely expensive, and in this setting, determining the sampling points in the state space is of essential importance. We formulate the problem as an optimal experimental design for Bayesian inferences of the limit state (i.e., the failure boundary) and propose an efficient numerical scheme to solve the resulting optimization problem. In particular, the proposed limit-state inference method is capable of determining multiple sampling points at a time, and thus it is well suited for problems where multiple computer simulations can be performed in parallel. The accuracy and performance of the proposed method is demonstrated by both academic and practical examples.
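    The overall idea, replacing an expensive model with a Gaussian process surrogate and then estimating the failure probability cheaply on the surrogate, can be sketched as below. This toy uses a simple random design and a fixed RBF kernel rather than the paper's Bayesian optimal experimental design, and the limit-state function is invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    # Hypothetical limit-state function (stands in for an expensive solver):
    # the system "fails" when g(x) < 0.
    def g(x):
        return 1.5 - x[:, 0] ** 2 - 0.5 * x[:, 1] ** 2

    def rbf(A, B, ell=1.0):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-0.5 * d2 / ell ** 2)

    # A small design: only 60 "expensive" evaluations of g.
    X = rng.uniform(-3, 3, size=(60, 2))
    y = g(X)

    # GP posterior mean at new points (zero-mean prior, small jitter).
    K = rbf(X, X) + 1e-6 * np.eye(len(X))
    alpha = np.linalg.solve(K, y)

    def gp_mean(Xs):
        return rbf(Xs, X) @ alpha

    # Estimate the failure probability by cheap Monte Carlo on the surrogate.
    Xmc = rng.normal(0.0, 1.0, size=(20000, 2))   # uncertain inputs ~ N(0, I)
    pf_surrogate = np.mean(gp_mean(Xmc) < 0.0)
    pf_true = np.mean(g(Xmc) < 0.0)               # available only because g is cheap here
    print(pf_surrogate, pf_true)
    ```

    In the setting the abstract targets, the design points would instead be chosen sequentially (possibly in batches, for parallel simulation) to concentrate evaluations near the inferred limit state, rather than drawn uniformly as above.
    
    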

  14. Insurance penetration and economic growth in Africa: Dynamic effects analysis using Bayesian TVP-VAR approach

    Directory of Open Access Journals (Sweden)

    D.O. Olayungbo

    2016-12-01

    This paper examines the dynamic interactions between insurance and economic growth in eight African countries for the period 1970–2013. Insurance demand is measured by insurance penetration, which accounts for income differences across the sample countries. A Bayesian Time-Varying Parameter Vector Autoregression (TVP-VAR) model with stochastic volatility is used to analyze the short-run and long-run relationships among the variables of interest. Using insurance penetration as a measure of insurance's contribution to economic growth, we find a positive relationship for Egypt, while short-run negative and long-run positive effects are found for Kenya, Mauritius, and South Africa. On the contrary, negative effects are found for Algeria, Nigeria, Tunisia, and Zimbabwe. Implementation of sound financial reforms and wide insurance coverage are the proposed recommendations for insurance development in the selected African countries.

  15. Bayesian Framework Approach for Prognostic Studies in Electrolytic Capacitor under Thermal Overstress Conditions

    Science.gov (United States)

    Kulkarni, Chetan S.; Celaya, Jose R.; Goebel, Kai; Biswas, Gautam

    2012-01-01

    Electrolytic capacitors are used in several applications ranging from power supplies for safety-critical avionics equipment to power drivers for electro-mechanical actuators. Past experience shows that capacitors tend to degrade and fail faster when subjected to high electrical or thermal stress conditions during operation. This makes them good candidates for prognostics and health management. Model-based prognostics captures system knowledge in the form of physics-based models of components in order to obtain accurate predictions of end of life based on their current state of health and their anticipated future use and operational conditions. The focus of this paper is on deriving first-principles degradation models for thermal stress conditions and implementing a Bayesian framework for making remaining useful life predictions. Data collected from simultaneous experiments are used to validate the models. Our overall goal is to derive accurate models of capacitor degradation and use them to predict remaining useful life in DC-DC converters.

  16. Analysis and assessment of injury risk in female gymnastics:Bayesian Network approach

    Directory of Open Access Journals (Sweden)

    Lyudmila Dimitrova

    2015-02-01

    This paper presents a Bayesian network (BN) model for estimating injury risk in female artistic gymnastics. The model illustrates the connections between underlying injury risk factors through a series of causal dependencies. The quantitative part of the model, the conditional probability tables, is determined using a TNormal distribution with parameters derived by experts. The injury rates calculated by the network are in agreement with injury statistics and correctly reflect the impact of various risk factors on injury rates. The model is designed to assist coaches and supporting teams in planning training activity so that injuries are minimized. This study provides an important background for further data collection and the research necessary to improve the precision of the model's quantitative predictions.

  17. Seismogenic stress field estimation in the Calabrian Arc region (south Italy) from a Bayesian approach

    Science.gov (United States)

    Totaro, C.; Orecchio, B.; Presti, D.; Scolaro, S.; Neri, G.

    2016-09-01

    A new high-quality waveform inversion focal mechanism database of the Calabrian Arc region has been compiled by integrating 292 mechanisms selected from literature and catalogs with 146 newly computed solutions. The new database has then been used for computation of posterior density distributions of stress tensor components by a Bayesian method never applied in south Italy before the present study. The application of this method to the enhanced database has allowed us to provide a detailed picture of seismotectonic stress regimes in this very complex area where lithospheric unit configuration and geodynamic engines are still strongly debated. Our results well constrain the extensional domain of Calabrian Arc and the compressional one of the southernmost Tyrrhenian Sea. In addition, previously undetected transcurrent regimes have been identified in the Ionian offshore. The new information released here will furnish useful tools and constraints for future geodynamic investigations.

  18. A Bayesian geostatistical approach for evaluating the uncertainty of contaminant mass discharges from point sources

    DEFF Research Database (Denmark)

    Troldborg, Mads; Nowak, Wolfgang; Binning, Philip John

    …compared to existing methods that are either too simple or computationally demanding. The method is based on conditional geostatistical simulation and accounts for i) heterogeneity of both the flow field and the concentration distribution through Bayesian geostatistics, ii) measurement uncertainty, and iii) uncertain source zone and transport parameters. The method generates multiple equally likely realisations of the spatial flow and concentration distribution, which all honour the measured data at the control plane. The flow realisations are generated by co-simulating the hydraulic conductivity… a multilevel control plane. The method decouples the flow and transport simulation and has the advantage of avoiding the heavy computational burden of three-dimensional numerical flow and transport simulation coupled with geostatistical inversion. It may therefore be of practical relevance to practitioners…

  19. Using Data to Tune Nearshore Dynamics Models: A Bayesian Approach with Parametric Likelihood

    CERN Document Server

    Balci, Nusret; Venkataramani, Shankar C

    2013-01-01

    We propose a modification of a maximum likelihood procedure for tuning parameter values in models, based upon the comparison of their output to field data. Our methodology, which uses polynomial approximations of the sample space to increase computational efficiency, differs from similar Bayesian estimation frameworks in its use of an alternative likelihood distribution, and is shown to better address problems in which covariance information is lacking than its more conventional counterpart. Lack of covariance information is a frequent challenge in large-scale geophysical estimation, and this is the case in the geophysical problem considered here. We use a nearshore model for longshore currents, together with observational data of the same, to show the contrast between the two maximum likelihood methodologies. Beyond a methodological comparison, this study gives estimates of the parameter values for bottom drag and surface forcing that make the particular model most consistent with the data; furthermore, we also derive sensitivit…

  20. Replication of breast cancer susceptibility loci in whites and African Americans using a Bayesian approach.

    Science.gov (United States)

    O'Brien, Katie M; Cole, Stephen R; Poole, Charles; Bensen, Jeannette T; Herring, Amy H; Engel, Lawrence S; Millikan, Robert C

    2014-02-01

    Genome-wide association studies (GWAS) and candidate gene analyses have led to the discovery of several dozen genetic polymorphisms associated with breast cancer susceptibility, many of which are considered well-established risk factors for the disease. Despite attempts to replicate these same variant-disease associations in African Americans, the evaluable populations are often too small to produce precise or consistent results. We estimated the associations between 83 previously identified single nucleotide polymorphisms (SNPs) and breast cancer among Carolina Breast Cancer Study (1993-2001) participants using maximum likelihood, Bayesian, and hierarchical methods. The selected SNPs were previous GWAS hits (n = 22), near-hits (n = 19), otherwise well-established risk loci (n = 5), or located in the same genes as selected variants (n = 37). We successfully replicated 18 GWAS-identified SNPs in whites (n = 2,352) and 10 in African Americans (n = 1,447). SNPs in the fibroblast growth factor receptor 2 gene (FGFR2) and the TOX high mobility group box family member 3 gene (TOX3) were strongly associated with breast cancer in both races. SNPs in the mitochondrial ribosomal protein S30 gene (MRPS30), mitogen-activated protein kinase kinase kinase 1 gene (MAP3K1), zinc finger, MIZ-type containing 1 gene (ZMIZ1), and H19, imprinted maternally expressed transcript gene (H19) were associated with breast cancer in whites, and SNPs in the estrogen receptor 1 gene (ESR1) and H19 gene were associated with breast cancer in African Americans. We provide precise and well-informed race-stratified odds ratios for key breast cancer-related SNPs. Our results demonstrate the utility of Bayesian methods in genetic epidemiology and provide support for their application in small, etiologically driven investigations.

  1. AGNfitter: A Bayesian MCMC Approach to Fitting Spectral Energy Distributions of AGNs

    Science.gov (United States)

    Calistro Rivera, Gabriela; Lusso, Elisabetta; Hennawi, Joseph F.; Hogg, David W.

    2016-12-01

    We present AGNfitter, a publicly available open-source algorithm implementing a fully Bayesian Markov Chain Monte Carlo method to fit the spectral energy distributions (SEDs) of active galactic nuclei (AGNs) from the sub-millimeter to the UV, allowing one to robustly disentangle the physical processes responsible for their emission. AGNfitter makes use of a large library of theoretical, empirical, and semi-empirical models to characterize both the nuclear and host galaxy emission simultaneously. The model consists of four physical emission components: an accretion disk, a torus of AGN heated dust, stellar populations, and cold dust in star-forming regions. AGNfitter determines the posterior distributions of numerous parameters that govern the physics of AGNs with a fully Bayesian treatment of errors and parameter degeneracies, allowing one to infer integrated luminosities, dust attenuation parameters, stellar masses, and star-formation rates. We tested AGNfitter’s performance on real data by fitting the SEDs of a sample of 714 X-ray selected AGNs from the XMM-COSMOS survey, spectroscopically classified as Type1 (unobscured) and Type2 (obscured) AGNs by their optical-UV emission lines. We find that two independent model parameters, namely the reddening of the accretion disk and the column density of the dusty torus, are good proxies for AGN obscuration, allowing us to develop a strategy for classifying AGNs as Type1 or Type2, based solely on an SED-fitting analysis. Our classification scheme is in excellent agreement with the spectroscopic classification, giving a completeness fraction of ˜ 86 % and ˜ 70 % , and an efficiency of ˜ 80 % and ˜ 77 % , for Type1 and Type2 AGNs, respectively.

  2. Bayesian estimation of Karhunen-Loève expansions: A random subspace approach

    Science.gov (United States)

    Chowdhary, Kenny; Najm, Habib N.

    2016-08-01

    One of the most widely-used procedures for dimensionality reduction of high dimensional data is Principal Component Analysis (PCA). More broadly, low-dimensional stochastic representation of random fields with finite variance is provided via the well known Karhunen-Loève expansion (KLE). The KLE is analogous to a Fourier series expansion for a random process, where the goal is to find an orthogonal transformation for the data such that the projection of the data onto this orthogonal subspace is optimal in the L2 sense, i.e., which minimizes the mean square error. In practice, this orthogonal transformation is determined by performing an SVD (Singular Value Decomposition) on the sample covariance matrix or on the data matrix itself. Sampling error is typically ignored when quantifying the principal components, or, equivalently, basis functions of the KLE. Furthermore, it is exacerbated when the sample size is much smaller than the dimension of the random field. In this paper, we introduce a Bayesian KLE procedure, allowing one to obtain a probabilistic model on the principal components, which can account for inaccuracies due to limited sample size. The probabilistic model is built via Bayesian inference, from which the posterior becomes the matrix Bingham density over the space of orthonormal matrices. We use a modified Gibbs sampling procedure to sample on this space and then build probabilistic Karhunen-Loève expansions over random subspaces to obtain a set of low-dimensional surrogates of the stochastic process. We illustrate this probabilistic procedure with a finite dimensional stochastic process inspired by Brownian motion.
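    As a point of reference for the Bayesian version, the classical empirical KLE described above can be sketched in a few lines: eigendecomposition of the sample covariance, truncation by explained variance, and projection onto the resulting subspace. The Brownian-motion-like data and the 95% truncation threshold below are illustrative choices; sampling the matrix Bingham posterior itself is beyond a short sketch.

```python
import numpy as np

rng = np.random.default_rng(1)

# Sample paths of a Brownian-motion-like process on a grid (illustrative data).
n_samples, n_grid = 50, 100
paths = np.cumsum(rng.normal(0.0, np.sqrt(1.0 / n_grid), (n_samples, n_grid)), axis=1)

# Empirical KLE: eigendecomposition of the sample covariance matrix.
mean = paths.mean(axis=0)
evals, evecs = np.linalg.eigh(np.cov(paths, rowvar=False))
order = np.argsort(evals)[::-1]                 # sort modes by decreasing variance
evals, evecs = evals[order], evecs[:, order]

# Truncate: keep enough modes to capture 95% of the sampled variance.
k = int(np.argmax(np.cumsum(evals) / evals.sum() >= 0.95)) + 1

# Low-dimensional surrogate of one path: project, then reconstruct.
coeff = (paths[0] - mean) @ evecs[:, :k]
recon = mean + evecs[:, :k] @ coeff
print(k, np.max(np.abs(recon - paths[0])))
```

    With n_samples well below n_grid, the eigenvectors found here carry exactly the sampling error the paper's Bayesian treatment is designed to quantify.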

  3. Hierarchical Bayesian approach for estimating physical properties in spiral galaxies: Age Maps for M74

    Science.gov (United States)

    Sánchez Gil, M. Carmen; Berihuete, Angel; Alfaro, Emilio J.; Pérez, Enrique; Sarro, Luis M.

    2015-09-01

    One of the fundamental goals of modern Astronomy is to estimate the physical parameters of galaxies from images in different spectral bands. We present a hierarchical Bayesian model for obtaining age maps from images in the Hα line (taken with Taurus Tunable Filter (TTF)), ultraviolet band (far UV or FUV, from GALEX) and infrared bands (24, 70 and 160 microns (μm), from Spitzer). As shown in [1], we present the burst ages for young stellar populations in the nearby and nearly face on galaxy M74. As shown in the previous work, the Hα to FUV flux ratio gives a good relative indicator of very recent star formation history (SFH). As a nascent star-forming region evolves, the Hα line emission declines earlier than the UV continuum, leading to a decrease in the Hα/FUV ratio. Through a specific star-forming galaxy model (Starburst 99, SB99), we can obtain the corresponding theoretical ratio Hα/FUV to compare with our observed flux ratios, and thus to estimate the ages of the observed regions. Due to the nature of the problem, it is necessary to propose a model of high complexity to take into account the main uncertainties and the interrelationship between parameters when the Hα/FUV flux ratio mentioned above is obtained. To address the complexity of the model, we propose a Bayesian hierarchical model, where a joint probability distribution is defined to determine the parameters (age, metallicity, IMF) from the observed data, in this case the observed flux ratios Hα/FUV. The joint distribution of the parameters is described through an i.i.d. (independent and identically distributed) sample generated through MCMC (Markov Chain Monte Carlo) techniques.

  4. Intermediate-term forecasting of aftershocks from an early aftershock sequence: Bayesian and ensemble forecasting approaches

    Science.gov (United States)

    Omi, Takahiro; Ogata, Yosihiko; Hirata, Yoshito; Aihara, Kazuyuki

    2015-04-01

    Because aftershock occurrences can cause significant seismic risks for a considerable time after the main shock, prospective forecasting of the intermediate-term aftershock activity as soon as possible is important. The epidemic-type aftershock sequence (ETAS) model with the maximum likelihood estimate effectively reproduces general aftershock activity including secondary or higher-order aftershocks and can be employed for the forecasting. However, because we cannot always expect the accurate parameter estimation from incomplete early aftershock data where many events are missing, such forecasting using only a single estimated parameter set (plug-in forecasting) can frequently perform poorly. Therefore, we here propose Bayesian forecasting that combines the forecasts by the ETAS model with various probable parameter sets given the data. By conducting forecasting tests of 1 month period aftershocks based on the first 1 day data after the main shock as an example of the early intermediate-term forecasting, we show that the Bayesian forecasting performs better than the plug-in forecasting on average in terms of the log-likelihood score. Furthermore, to improve forecasting of large aftershocks, we apply a nonparametric (NP) model using magnitude data during the learning period and compare its forecasting performance with that of the Gutenberg-Richter (G-R) formula. We show that the NP forecast performs better than the G-R formula in some cases but worse in other cases. Therefore, robust forecasting can be obtained by employing an ensemble forecast that combines the two complementary forecasts. Our proposed method is useful for a stable unbiased intermediate-term assessment of aftershock probabilities.
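    The contrast between plug-in and Bayesian forecasting can be illustrated with a simplified aftershock-rate model. The sketch below uses a modified Omori law in place of the full ETAS intensity, and fake posterior draws in place of real MCMC output; the point is only that the Bayesian forecast averages the forecast over many probable parameter sets instead of committing to a single estimate.

```python
import numpy as np

rng = np.random.default_rng(2)

# Modified Omori law, a simplified stand-in for the full ETAS intensity.
def rate(t, K, c, p):
    return K / (t + c) ** p

def expected_count(K, c, p, t0=1.0, t1=30.0, n=3000):
    """Expected number of aftershocks in [t0, t1] days (Riemann sum)."""
    t = np.linspace(t0, t1, n)
    return np.sum(rate(t, K, c, p)) * (t1 - t0) / (n - 1)

# Hypothetical posterior draws for (K, c, p) given one day of data;
# in practice these would come from MCMC on the early, incomplete catalog.
K = rng.lognormal(np.log(100.0), 0.3, 1000)
c = rng.lognormal(np.log(0.1), 0.2, 1000)
p = rng.normal(1.1, 0.05, 1000)

# Plug-in forecast: one "best" parameter set (here, the posterior means).
plug_in = expected_count(K.mean(), c.mean(), p.mean())

# Bayesian forecast: average the forecast over all probable parameter sets.
bayes = np.mean([expected_count(Ki, ci, pi) for Ki, ci, pi in zip(K, c, p)])

print(plug_in, bayes)
```

    When the early data constrain the parameters poorly, the two forecasts can diverge substantially, which is the regime where the paper finds the Bayesian forecast more reliable.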

  5. Bayesian and Geostatistical Approaches to Combining Categorical Data Derived from Visual and Digital Processing of Remotely Sensed Images

    Institute of Scientific and Technical Information of China (English)

    ZHANG Jingxiong; LI Deren

    2005-01-01

    This paper seeks a synthesis of Bayesian and geostatistical approaches to combining categorical data in the context of remote sensing classification. Through experiments with aerial photographs and Landsat TM data, the accuracy of spectral, spatial, and combined classification results was evaluated. It was confirmed that the incorporation of spatial information in spectral classification increases accuracy significantly. Secondly, tests with 5-class and 3-class classification schemes revealed that setting a proper semantic framework for classification is fundamental to any endeavor of categorical mapping and is the most important factor affecting accuracy. Lastly, this paper promotes non-parametric methods both for the definition of class membership profiles based on band-specific histograms of image intensities and for the derivation of spatial probabilities via indicator kriging, a non-parametric geostatistical technique.

  6. Bayesian methods for hackers probabilistic programming and Bayesian inference

    CERN Document Server

    Davidson-Pilon, Cameron

    2016-01-01

    Bayesian methods of inference are deeply natural and extremely powerful. However, most discussions of Bayesian inference rely on intensely complex mathematical analyses and artificial examples, making it inaccessible to anyone without a strong mathematical background. Now, though, Cameron Davidson-Pilon introduces Bayesian inference from a computational perspective, bridging theory to practice–freeing you to get results using computing power. Bayesian Methods for Hackers illuminates Bayesian inference through probabilistic programming with the powerful PyMC language and the closely related Python tools NumPy, SciPy, and Matplotlib. Using this approach, you can reach effective solutions in small increments, without extensive mathematical intervention. Davidson-Pilon begins by introducing the concepts underlying Bayesian inference, comparing it with other techniques and guiding you through building and training your first Bayesian model. Next, he introduces PyMC through a series of detailed examples a...

  7. Nitrate source apportionment using a combined dual isotope, chemical and bacterial property, and Bayesian model approach in river systems

    Science.gov (United States)

    Xia, Yongqiu; Li, Yuefei; Zhang, Xinyu; Yan, Xiaoyuan

    2017-01-01

    Nitrate (NO3-) pollution is a serious problem worldwide, particularly in countries with intensive agricultural and population activities. Previous studies have used δ15N-NO3- and δ18O-NO3- to determine the NO3- sources in rivers. However, this approach is subject to substantial uncertainties and limitations because of the numerous NO3- sources, the wide isotopic ranges, and the existing isotopic fractionations. In this study, we outline a combined procedure for improving the determination of NO3- sources in a paddy agriculture-urban gradient watershed in eastern China. First, the main sources of NO3- in the Qinhuai River were examined by the dual-isotope biplot approach, in which we narrowed the isotope ranges using site-specific isotopic results. Next, the bacterial groups and chemical properties of the river water were analyzed to verify these sources. Finally, we introduced a Bayesian model to apportion the spatiotemporal variations of the NO3- sources. Denitrification was first incorporated into the Bayesian model because denitrification plays an important role in the nitrogen pathway. The results showed that fertilizer contributed large amounts of NO3- to the surface water in traditional agricultural regions, whereas manure effluents were the dominant NO3- source in intensified agricultural regions, especially during the wet seasons. Sewage effluents were important in all three land uses and exhibited great differences between the dry season and the wet season. This combined analysis quantitatively delineates the proportion of NO3- sources from paddy agriculture to urban river water for both dry and wet seasons and incorporates isotopic fractionation and uncertainties in the source compositions.
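    A stripped-down version of the isotope part of such a model: with two sources and two isotope ratios, the posterior over the mixing fraction can be evaluated on a grid. The endmember signatures, uncertainties, and observed value below are illustrative, not those of the Qinhuai River study, and denitrification fractionation is omitted.

```python
import numpy as np

# Hypothetical endmember signatures (δ15N, δ18O of nitrate, in per mil).
sources = {"fertilizer": np.array([0.0, 22.0]),
           "manure/sewage": np.array([12.0, 3.0])}
obs = np.array([7.0, 10.0])          # observed river-water signature
sd = np.array([1.5, 2.0])            # measurement + source spread (assumed)

# Posterior over the fertilizer fraction f on a grid (uniform prior).
f = np.linspace(0, 1, 1001)
mix = f[:, None] * sources["fertilizer"] + (1 - f[:, None]) * sources["manure/sewage"]
loglik = -0.5 * np.sum(((obs - mix) / sd) ** 2, axis=1)
post = np.exp(loglik - loglik.max())
post /= post.sum()

f_mean = np.sum(f * post)
f_sd = np.sqrt(np.sum((f - f_mean) ** 2 * post))
print(f"fertilizer fraction: {f_mean:.2f} ± {f_sd:.2f}")
```

    With more than two sources, the grid is replaced by MCMC over a simplex of proportions, which is where denitrification and source-composition uncertainty enter the full model.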

  8. A Bayesian Approach to a Multiple-Group Latent Class-Profile Analysis: The Timing of Drinking Onset and Subsequent Drinking Behaviors among U.S. Adolescents

    Science.gov (United States)

    Chung, Hwan; Anthony, James C.

    2013-01-01

    This article presents a multiple-group latent class-profile analysis (LCPA) by taking a Bayesian approach in which a Markov chain Monte Carlo simulation is employed to achieve more robust estimates for latent growth patterns. This article describes and addresses a label-switching problem that involves the LCPA likelihood function, which has…

  9. Unsupervised Group Discovery and LInk Prediction in Relational Datasets: a nonparametric Bayesian approach

    Energy Technology Data Exchange (ETDEWEB)

    Koutsourelakis, P

    2007-05-03

    Clustering represents one of the most common statistical procedures and a standard tool for pattern discovery and dimension reduction. Most often the objects to be clustered are described by a set of measurements or observables, e.g. the coordinates of the vectors or the attributes of people. In many cases, however, the available observations appear in the form of links or connections (e.g. communication or transaction networks). This data contains valuable information that can in general be exploited in order to discover groups and better understand the structure of the dataset. Since in most real-world datasets several of these links are missing, it is also useful to develop procedures that can predict those unobserved connections. In this report we address the problem of unsupervised group discovery in relational datasets. A fundamental issue in all clustering problems is that the actual number of clusters is unknown a priori. In most cases this is addressed by running the model several times assuming a different number of clusters each time and selecting the value that provides the best fit based on some criterion (i.e., the Bayes factor in the case of Bayesian techniques). It would clearly be preferable to develop techniques in which the number of clusters is essentially learned from the data along with the rest of the model parameters. For that purpose, we adopt a nonparametric Bayesian framework which provides a very flexible modeling environment in which the size of the model, i.e., the number of clusters, can adapt to the available data and readily accommodate outliers. The latter is particularly important since several groups of interest might consist of a small number of members and would most likely be smeared out by traditional modeling techniques.
Finally, the proposed framework combines all the advantages of standard Bayesian techniques such as integration of prior knowledge in a principled manner, seamless accommodation of missing data
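    The nonparametric prior alluded to here is typically a Dirichlet process, whose clustering behavior is captured by the Chinese restaurant process: the number of occupied "tables" (clusters) is not fixed in advance but grows slowly with the data. A minimal simulation:

```python
import random

def crp(n, alpha, seed=0):
    """Table sizes after seating n customers under a Chinese restaurant process."""
    rng = random.Random(seed)
    counts = []                                  # customers per table (cluster sizes)
    for _ in range(n):
        # Join an existing table with probability proportional to its size,
        # or open a new table with probability proportional to alpha.
        idx = rng.choices(range(len(counts) + 1), weights=counts + [alpha])[0]
        if idx == len(counts):
            counts.append(1)
        else:
            counts[idx] += 1
    return counts

for n in (10, 100, 1000):
    print(n, len(crp(n, alpha=1.0)))             # tables grow roughly like alpha*ln(n)
```

    The "rich get richer" seating rule is also what lets small but genuine groups survive, since a table, once opened, attracts its own members rather than being absorbed.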

  10. A new hierarchical Bayesian approach to analyse environmental and climatic influences on debris flow occurrence

    Science.gov (United States)

    Jomelli, Vincent; Pavlova, Irina; Eckert, Nicolas; Grancher, Delphine; Brunstein, Daniel

    2015-12-01

    How can debris flow occurrences be modelled at regional scale and take both environmental and climatic conditions into account? And, of the two, which has the most influence on debris flow activity? In this paper, we try to answer these questions with an innovative Bayesian hierarchical probabilistic model that simultaneously accounts for how debris flows respond to environmental and climatic variables. In it, full decomposition of space and time effects in occurrence probabilities is assumed, revealing an environmental and a climatic trend shared by all years/catchments, respectively, clearly distinguished from residual "random" effects. The resulting regional and annual occurrence probabilities evaluated as functions of the covariates make it possible to weight the respective contribution of the different terms and, more generally, to check the model performances at different spatio-temporal scales. After suitable validation, the model can be used to make predictions at undocumented sites and could be used in further studies for predictions under future climate conditions. Also, the Bayesian paradigm easily copes with missing data, thus making it possible to account for events that may have been missed during surveys. As a case study, we extract 124 debris flow events triggered between 1970 and 2005 in 27 catchments located in the French Alps from the French national natural hazard survey and model their variability of occurrence considering environmental and climatic predictors at the same time. We document the environmental characteristics of each debris flow catchment (morphometry, lithology, land cover, and the presence of permafrost). We also compute 15 climate variables including mean temperature and precipitation between May and October and the number of rainy days with daily cumulative rainfall greater than 10/15/20/25/30/40 mm day-1.
Application of our model shows that the combination of environmental and climatic predictors explained 77% of the overall

  11. Age estimation in children by measurement of open apices in teeth with Bayesian calibration approach.

    Science.gov (United States)

    Cameriere, R; Pacifici, A; Pacifici, L; Polimeni, A; Federici, F; Cingolani, M; Ferrante, L

    2016-01-01

    Age estimation from teeth by radiological analysis, in both children and adolescents, has wide applications in several scientific and forensic fields. In 2006, Cameriere et al. proposed a regression method to estimate chronological age in children, according to measurements of open apices of permanent teeth. Although several regression models are used to analyze the relationship between age and dental development, one serious limitation is the unavoidable bias in age estimation when regression models are used. The aim of this paper is to develop a full Bayesian calibration method for age estimation in children according to the sum of open apices, S, of the seven left permanent mandibular teeth. This cross-sectional study included 2630 orthopantomographs (OPGs) from healthy living Italian subjects, aged between 4 and 17 years and with no obvious developmental abnormalities. All radiographs were in digital format and were processed by the ImageJ computer-aided drawing program. The distance between the inner side of the open apex was measured for each tooth. Dental maturity was then evaluated according to the sum of normalized open apices (S). Intra- and inter-observer agreement was satisfactory, according to an intra-class correlation coefficient of S on 50 randomly selected OPGs. Mean absolute errors were 0.72 years (standard deviation 0.60) and 0.73 years (standard deviation 0.61) in boys and girls, respectively. The mean interquartile range (MIQR) of the calibrating distribution was 1.37 years (standard deviation 0.46) and 1.51 years (standard deviation 0.52) in boys and girls, respectively. Estimate bias was βERR=-0.005 and 0.003 for boys and girls, corresponding to a bias of a few days for all individuals in the sample. Neither of the βERR values was significantly different from 0 (p>0.682). In conclusion, the Bayesian calibration method overcomes problems of bias in age estimation when regression models are used, and appears to be suitable for assessing both

  12. A Bayesian Belief Network approach to assess the potential of non wood forest products for small scale forest owners

    Science.gov (United States)

    Vacik, Harald; Huber, Patrick; Hujala, Teppo; Kurtilla, Mikko; Wolfslehner, Bernhard

    2015-04-01

    It is an integral element of the European understanding of sustainable forest management to foster the design and marketing of forest products, non-wood forest products (NWFPs) and services that go beyond the production of timber. Despite the relevance of NWFPs in Europe, forest management and planning methods have been traditionally tailored towards wood and wood products, because most forest management models and silviculture techniques were developed to ensure a sustained production of timber. Although several approaches exist which explicitly consider NWFPs as management objectives in forest planning, specific models are needed for the assessment of their production potential in different environmental contexts and for different management regimes. Empirical data supporting a comprehensive assessment of the potential of NWFPs are rare, thus making development of statistical models particularly problematic. However, the complex causal relationships between the sustained production of NWFPs, the available ecological resources, as well as the organizational and the market potential of forest management regimes are well suited for knowledge-based expert models. Bayesian belief networks (BBNs) are a kind of probabilistic graphical model that have become very popular to practitioners and scientists mainly due to the powerful probability theory involved, which makes BBNs suitable to deal with a wide range of environmental problems. In this contribution we present the development of a Bayesian belief network to assess the potential of NWFPs for small scale forest owners. A three stage iterative process with stakeholder and expert participation was used to develop the Bayesian Network within the frame of the StarTree Project. The group of participants varied in the stages of the modelling process. A core team, consisting of one technical expert and two domain experts was responsible for the entire modelling process as well as for the first prototype of the network

  13. A new approach for obtaining cosmological constraints from Type Ia Supernovae using Approximate Bayesian Computation

    CERN Document Server

    Jennings, Elise; Sako, Masao

    2016-01-01

    Cosmological parameter estimation techniques that robustly account for systematic measurement uncertainties will be crucial for the next generation of cosmological surveys. We present a new analysis method, superABC, for obtaining cosmological constraints from Type Ia supernova (SN Ia) light curves using Approximate Bayesian Computation (ABC) without any likelihood assumptions. The ABC method works by using a forward model simulation of the data where systematic uncertainties can be simulated and marginalized over. A key feature of the method presented here is the use of two distinct metrics, the `Tripp' and `Light Curve' metrics, which allow us to compare the simulated data to the observed data set. The Tripp metric takes as input the parameters of models fit to each light curve with the SALT-II method, whereas the Light Curve metric uses the measured fluxes directly without model fitting. We apply the superABC sampler to a simulated data set of $\\sim$1000 SNe corresponding to the first season of the Dark En...
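    The essence of ABC is easy to sketch: draw parameters from the prior, forward-simulate a data set, and keep only the draws whose summary statistic lands within a tolerance of the observed one, with no likelihood ever evaluated. The toy example below recovers a location parameter; superABC's actual forward model and its Tripp and Light Curve metrics are far richer than this distance.

```python
import numpy as np

rng = np.random.default_rng(3)

# "Observed" data from a forward model with unknown location parameter mu.
true_mu = 1.5
obs = rng.normal(true_mu, 1.0, 200)
obs_summary = obs.mean()                     # summary statistic fed to the metric

def forward(mu, n=200):
    """Forward-simulate a data set; systematics could be injected here."""
    return rng.normal(mu, 1.0, n)

# Rejection ABC: accept prior draws whose simulated summary is within
# epsilon of the observed summary. No likelihood is evaluated anywhere.
eps, accepted = 0.05, []
for _ in range(20000):
    mu = rng.uniform(-5, 5)                  # prior draw
    if abs(forward(mu).mean() - obs_summary) < eps:
        accepted.append(mu)

print(len(accepted), np.mean(accepted))      # approximate posterior sample for mu
```

    Shrinking epsilon tightens the approximation to the true posterior at the cost of more rejected simulations, which is the central tuning trade-off in any ABC sampler.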

  14. A Bayesian approach to multi-messenger astronomy: identification of gravitational-wave host galaxies

    Energy Technology Data Exchange (ETDEWEB)

    Fan, XiLong [School of Physics and Electronics Information, Hubei University of Education, 430205 Wuhan (China); Messenger, Christopher; Heng, Ik Siong [SUPA, School of Physics and Astronomy, University of Glasgow, Glasgow, G12 8QQ (United Kingdom)

    2014-11-01

    We present a general framework for incorporating astrophysical information into Bayesian parameter estimation techniques used by gravitational wave data analysis to facilitate multi-messenger astronomy. Since the progenitors of transient gravitational wave events, such as compact binary coalescences, are likely to be associated with a host galaxy, improvements to the source sky location estimates through the use of host galaxy information are explored. To demonstrate how host galaxy properties can be included, we simulate a population of compact binary coalescences and show that for ∼8.5% of simulations within 200 Mpc, the top 10 most likely galaxies account for ∼50% of the total probability of hosting a gravitational wave source. The true gravitational wave source host galaxy is in the top 10 galaxy candidates ∼10% of the time. Furthermore, we show that by including host galaxy information, a better estimate of the inclination angle of a compact binary gravitational wave source can be obtained. We also demonstrate the flexibility of our method by incorporating the use of either the B or K band into our analysis.
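    In its simplest reading, folding a galaxy catalog into a sky posterior amounts to a pointwise product of the GW-only posterior with a catalog-based prior, followed by renormalization. The sketch below uses a made-up one-dimensional "sky" with hypothetical host pixels and luminosity weights, not the paper's actual pipeline:

```python
import numpy as np

rng = np.random.default_rng(4)

n_pix = 1000                           # crude one-dimensional sky pixelization
# Hypothetical GW-only sky posterior: a broad blob of probability.
gw = np.exp(-0.5 * ((np.arange(n_pix) - 480) / 60.0) ** 2)
gw /= gw.sum()

# Hypothetical galaxy catalog: a few pixels host galaxies, weighted e.g.
# by B-band luminosity as a proxy for the chance of hosting a coalescence.
galaxy_weight = np.zeros(n_pix)
hosts = rng.integers(300, 700, 15)
galaxy_weight[hosts] = rng.uniform(0.5, 2.0, 15)

# Posterior with host-galaxy information: pointwise product, renormalized.
post = gw * galaxy_weight
post /= post.sum()

top10 = np.sort(post)[::-1][:10].sum()
print(f"probability in top 10 candidate pixels: {top10:.2f}")
```

    Concentrating the broad GW blob onto a handful of catalog galaxies is what produces the "top 10 candidates carry most of the probability" behavior reported above.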

  15. Lattice NRQCD study on in-medium bottomonium spectra using a novel Bayesian reconstruction approach

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Seyong [Department of Physics, Sejong University, Seoul 143-747 (Korea, Republic of); Petreczky, Peter [Physics Department, Brookhaven National Laboratory, Upton, NY 11973 (United States); Rothkopf, Alexander [Institute for Theoretical Physics, Heidelberg University, Philosophenweg 16, 69120 Heidelberg (Germany)

    2016-01-22

    We present recent results on the in-medium modification of S- and P-wave bottomonium states around the deconfinement transition. Our study uses lattice QCD with N{sub f} = 2 + 1 light quark flavors to describe the non-perturbative thermal QCD medium between 140MeV < T < 249MeV and deploys lattice regularized non-relativistic QCD (NRQCD) effective field theory to capture the physics of heavy quark bound states immersed therein. The spectral functions of the {sup 3}S{sub 1} (ϒ) and {sup 3}P{sub 1} (χ{sub b1}) bottomonium states are extracted from Euclidean time Monte Carlo simulations using a novel Bayesian prescription, which provides higher accuracy than the Maximum Entropy Method. Based on a systematic comparison of interacting and free spectral functions we conclude that the ground states of both the S-wave (ϒ) and P-wave (χ{sub b1}) channel survive up to T = 249MeV. Stringent upper limits on the size of the in-medium modification of bottomonium masses and widths are provided.

  16. Modeling of Radiation Pneumonitis after Lung Stereotactic Body Radiotherapy: A Bayesian Network Approach

    CERN Document Server

    Lee, Sangkyu; Jeyaseelan, Krishinima; Faria, Sergio; Kopek, Neil; Brisebois, Pascale; Vu, Toni; Filion, Edith; Campeau, Marie-Pierre; Lambert, Louise; Del Vecchio, Pierre; Trudel, Diane; El-Sokhn, Nidale; Roach, Michael; Robinson, Clifford; Naqa, Issam El

    2015-01-01

    Background and Purpose: Stereotactic body radiotherapy (SBRT) for lung cancer accompanies a non-negligible risk of radiation pneumonitis (RP). This study presents a Bayesian network (BN) model that connects biological, dosimetric, and clinical RP risk factors. Material and Methods: 43 non-small-cell lung cancer patients treated with SBRT with 5 fractions or less were studied. Candidate RP risk factors included dose-volume parameters, previously reported clinical RP factors, 6 protein biomarkers at baseline and 6 weeks post-treatment. A BN ensemble model was built from a subset of the variables in a training cohort (N=32), and further tested in an independent validation cohort (N=11). Results: Key factors identified in the BN ensemble for predicting RP risk were ipsilateral V5, lung volume receiving more than 105% of prescription, and decrease in angiotensin converting enzyme (ACE) from baseline to 6 weeks. External validation of the BN ensemble model yielded an area under the curve of 0.8. Conclusions: The BN...

  17. Hierarchical Bayesian approach for estimating physical properties in spiral galaxies: Age Maps for M74

    CERN Document Server

    Gil, M Carmen Sánchez; Alfaro, Emilio J; Pérez, Enrique; Sarro, Luis M

    2015-01-01

    One of the fundamental goals of modern Astronomy is to estimate the physical parameters of galaxies from images in different spectral bands. We present a hierarchical Bayesian model for obtaining age maps from images in the Hα line (taken with Taurus Tunable Filter (TTF)), ultraviolet band (far UV or FUV, from GALEX) and infrared bands (24, 70 and 160 microns (μm), from Spitzer). As shown in Sánchez-Gil et al. (2011), we present the burst ages for young stellar populations in the nearby and nearly face on galaxy M74. As shown in the previous work, the Hα to FUV flux ratio gives a good relative indicator of very recent star formation history (SFH). As a nascent star-forming region evolves, the Hα line emission declines earlier than the UV continuum, leading to a decrease in the Hα/FUV ratio. Through a specific star-forming galaxy model (Starburst 99, SB99), we can obtain the corresponding theoretical ratio Hα/FUV to compare with our observed flux ratios, and thus to estimate the ages of...

  18. Paired Comparison Analysis of the van Baaren Model Using Bayesian Approach with Noninformative Prior

    Directory of Open Access Journals (Sweden)

    Saima Altaf

    2012-03-01

    Full Text Available One technique commonly studied these days, because of its attractive applications in the comparison of several objects, is the method of paired comparisons. This technique permits the ranking of the objects by means of a score, which reflects the merit of the items on a linear scale. The present study is concerned with the Bayesian analysis of a paired comparison model, namely the van Baaren model VI, using a noninformative uniform prior. For this purpose, the joint posterior distribution for the parameters of the model, their marginal distributions, posterior estimates (means and modes), the posterior probabilities for comparing the two treatment parameters, and the predictive probabilities are obtained.

  19. A Bayesian approach to the detection of aberrations in public health surveillance data.

    Science.gov (United States)

    Stroup, D F; Thacker, S B

    1993-09-01

    The accurate and timely detection of unusual patterns in data from public health surveillance systems presents an important challenge to health workers interested in early identification of epidemics or clues to important risk factors. We apply the Kalman filter, a Bayesian method, to public health surveillance data collected in the United States to illustrate a methodology used to detect sudden, sustained changes in reported disease occurrence, changes in the rate of change of health event occurrence, as well as unusual reports or outliers. The method allows use of information external to reported data in forecasting expected numbers, information such as expert judgment, changes in case definition or reporting practices, or changes in the health event process. Results show good agreement with epidemiologically established patterns beginning early in the data series and demonstrated usefulness on a relatively short series. Because the method is unfamiliar to most practicing epidemiologists in public health, it should be compared with other techniques and made more "user-friendly" before general application.
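    A local-level Kalman filter of the kind described, applied to simulated weekly counts with a sudden sustained jump, can flag aberrations via standardized innovations. The noise variances q and r below are assumed for illustration, not taken from the paper:

```python
import numpy as np

def kalman_filter(y, q=4.0, r=25.0):
    """Local-level Kalman filter: level_t = level_{t-1} + w_t, y_t = level_t + v_t.
    q, r are the (assumed) state and observation noise variances. Returns
    one-step-ahead predictions and standardized innovations."""
    n = len(y)
    level, p = y[0], r                      # rough initialization
    preds = np.zeros(n)
    z = np.zeros(n)
    for t in range(n):
        p_pred = p + q                      # predict step
        preds[t] = level
        s = p_pred + r                      # innovation variance
        z[t] = (y[t] - level) / np.sqrt(s)
        gain = p_pred / s                   # update step
        level = level + gain * (y[t] - level)
        p = (1 - gain) * p_pred
    return preds, z

# Simulated weekly case counts with a sustained jump at week 30.
rng = np.random.default_rng(5)
y = rng.poisson(20, 50).astype(float)
y[30:] += 30
preds, z = kalman_filter(y)
flags = np.where(np.abs(z) > 3)[0]          # alarm when |innovation| > 3 sd
print(flags)
```

    External information of the kind the paper describes (reporting changes, expert judgment) would enter by adjusting the predicted level or inflating q at known change points.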

  20. Rainfall estimation using raingages and radar — A Bayesian approach: 1. Derivation of estimators

    Science.gov (United States)

    Seo, D.-J.; Smith, J. A.

    1991-03-01

    Procedures for estimating rainfall from radar and raingage observations are constructed in a Bayesian framework. Given that the number of raingage measurements is typically very small, mean and variance of gage rainfall are treated as uncertain parameters. Under the assumption that log gage rainfall and log radar rainfall are jointly multivariate normal, the estimation problem is equivalent to lognormal co-kriging with uncertain mean and variance of the gage rainfall field. The posterior distribution is obtained under the assumption that the prior for the mean and inverse of the variance of log gage rainfall is normal-gamma 2. Estimate and estimation variance do not have closed-form expressions, but can be easily evaluated by numerically integrating two single integrals. To reduce computational burden associated with evaluating sufficient statistics for the likelihood function, an approximate form of parameter updating is given. Also, as a further approximation, the parameters are updated using raingage measurements only, yielding closed-form expressions for estimate and estimation variance in the Gaussian domain. With a reduction in the number of radar rainfall data in constructing covariance matrices, computational requirements for the estimation procedures are not significantly greater than those for simple co-kriging. Given their generality, the estimation procedures constructed in this work are considered to be applicable in various estimation problems involving an undersampled main variable and a densely sampled auxiliary variable.
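    The normal-gamma prior on the mean and precision (inverse variance) of log gage rainfall admits a standard conjugate update, which is the building block behind the parameter updating described above; the full lognormal co-kriging is not reproduced here. Hyperparameter choices and gage values are illustrative:

```python
import numpy as np

def normal_gamma_update(y, m0, k0, a0, b0):
    """Conjugate update for (mu, tau) ~ NormalGamma(m0, k0, a0, b0),
    with observations y_i ~ N(mu, 1/tau). Returns posterior hyperparameters."""
    n = len(y)
    ybar = np.mean(y)
    m = (k0 * m0 + n * ybar) / (k0 + n)
    k = k0 + n
    a = a0 + n / 2.0
    b = (b0 + 0.5 * np.sum((y - ybar) ** 2)
         + (k0 * n * (ybar - m0) ** 2) / (2.0 * (k0 + n)))
    return m, k, a, b

# Log gage rainfall at a handful of gages (illustrative values, mm).
log_rain = np.log(np.array([4.2, 6.1, 3.8, 5.5, 7.0]))

# Vague prior, then update with the few gage measurements.
m, k, a, b = normal_gamma_update(log_rain, m0=1.0, k0=0.1, a0=1.0, b0=1.0)
print(f"posterior mean of mu: {m:.3f}, E[tau] = {a / b:.3f}")
```

    With only a few gages, the posterior keeps substantial uncertainty in both mean and precision, which is exactly why the paper treats them as uncertain parameters rather than plugging in point estimates.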

  1. Using rotation measure grids to detect cosmological magnetic fields: A Bayesian approach

    Science.gov (United States)

    Vacca, V.; Oppermann, N.; Enßlin, T.; Jasche, J.; Selig, M.; Greiner, M.; Junklewitz, H.; Reinecke, M.; Brüggen, M.; Carretti, E.; Feretti, L.; Ferrari, C.; Hales, C. A.; Horellou, C.; Ideguchi, S.; Johnston-Hollitt, M.; Pizzo, R. F.; Röttgering, H.; Shimwell, T. W.; Takahashi, K.

    2016-06-01

    Determining magnetic field properties in different environments of the cosmic large-scale structure as well as their evolution over redshift is a fundamental step toward uncovering the origin of cosmic magnetic fields. Radio observations permit the study of extragalactic magnetic fields via measurements of the Faraday depth of extragalactic radio sources. Our aim is to investigate how much different extragalactic environments contribute to the Faraday depth variance of these sources. We develop a Bayesian algorithm to distinguish statistically Faraday depth variance contributions intrinsic to the source from those due to the medium between the source and the observer. In our algorithm the Galactic foreground and measurement noise are taken into account as the uncertainty correlations of the Galactic model. Additionally, our algorithm allows for the investigation of possible redshift evolution of the extragalactic contribution. This work presents the derivation of the algorithm and tests performed on mock observations. Because cosmic magnetism is one of the key science projects of the new generation of radio interferometers, we have predicted the performance of our algorithm on mock data collected with these instruments. According to our tests, high-quality catalogs of a few thousands of sources should already enable us to investigate magnetic fields in the cosmic structure.

  2. A Bayesian parameter estimation approach to pulsar time-of-arrival analysis

    CERN Document Server

    Messenger, C; Demorest, P; Ransom, S

    2011-01-01

    The increasing sensitivity of pulsar timing arrays to ultra-low frequency (nHz) gravitational waves promises to achieve direct gravitational wave detection within the next 5-10 years. While there are many parallel efforts being made in the improvement of telescope sensitivity, the detection of stable millisecond pulsars and the improvement of the timing software, there are reasons to believe that the methods used to accurately determine the time-of-arrival (TOA) of pulses from radio pulsars can be improved upon. More specifically, the determination of the uncertainties on these TOAs, which strongly affect the ability to detect GWs through pulsar timing, may be unreliable. We propose two Bayesian methods for the generation of pulsar TOAs, starting from pulsar "search-mode" data and pre-folded data respectively. These methods are applied to simulated toy-model examples, and in this initial work we focus on the issue of uncertainties in the folding period. The final results of our analysis are expressed in the form of poster...

  3. Loan Supply Shocks in Macedonia: A Bayesian SVAR Approach with Sign Restrictions

    Directory of Open Access Journals (Sweden)

    Rilind Kabashi

    2016-06-01

    Full Text Available This paper analyzes the effects of loan supply, as well as aggregate demand, aggregate supply and monetary policy shocks between 1998 and 2014 in Macedonia, using a structural vector autoregression model with sign restrictions and Bayesian estimation. The main results indicate that loan supply shocks have no significant effect on loan volumes and lending rates, or on economic activity and prices. The effects of monetary policy on lending activity are fairly limited, although there is some evidence that it affects lending rates more than loan volumes. Monetary policy shocks have strong effects on inflation, while the central bank reacts strongly to adverse shocks hitting the economy. Baseline results are confirmed by several robustness checks. According to the historical decomposition, lending activity supported economic growth before and during the crisis, but its contribution became negative during the recovery and remained a drag on growth until the end of the period. Pre-crisis GDP growth is mostly explained by supportive monetary policy. However, the restrictive monetary policy during the crisis contributed to the fall of GDP, before becoming supportive again during the early stages of the recovery. Policy rates in recent years mostly reflect subdued lending activity and aggregate supply factors, which the central bank tries to counteract with a more accommodative policy.

  4. Bayesian Exploratory Factor Analysis

    DEFF Research Database (Denmark)

    Conti, Gabriella; Frühwirth-Schnatter, Sylvia; Heckman, James J.;

    2014-01-01

    This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study confirms the validity of the approach. The method is used to produce interpretable low dimensional aggregates...

  5. Oil refinery procurement strategies: A Bayesian approach

    Institute of Scientific and Technical Information of China (English)

    谢智雪; 郑力

    2012-01-01

    Current oil refinery procurement strategies do not deal well with the purchase price risk arising from the highly volatile international crude oil spot market. To address this problem, a dynamic programming model was developed, based on the characteristics of crude oil procurement, to minimize the expected present value of the total purchase cost. Stochastic calculus is used to derive the optimal static strategy, and a Bayesian decision framework is then introduced to obtain an adaptive strategy. Nearly 26 years of historical prices for one marker crude on the international spot market were used to verify the feasibility and effectiveness of the model. The results show that the Bayesian-based procurement strategy can significantly reduce total costs compared with current refinery practices and is well suited to the crude oil procurement problem.
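The adaptive element of such a strategy can be illustrated with a toy conjugate update of the unknown drift of log spot prices, followed by a simple purchase rule. This is a hypothetical sketch under a known-variance Gaussian assumption, not the dynamic programming model in the paper:

```python
def update_drift(prior_mean, prior_var, log_returns, obs_var):
    """Conjugate normal update of an unknown drift of log spot prices,
    assuming a known observation variance -- a toy stand-in for the
    Bayesian decision framework described above."""
    n = len(log_returns)
    xbar = sum(log_returns) / n
    post_var = 1.0 / (1.0 / prior_var + n / obs_var)
    post_mean = post_var * (prior_mean / prior_var + n * xbar / obs_var)
    return post_mean, post_var

def buy_now(post_mean, threshold=0.0):
    """Toy decision rule: buy immediately if prices are expected to rise."""
    return post_mean > threshold
```

As new price observations arrive, the posterior drift estimate shifts and the purchase decision adapts, which is the essential mechanism an adaptive strategy exploits.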

  6. A spatiotemporal dengue fever early warning model accounting for nonlinear associations with meteorological factors: a Bayesian maximum entropy approach

    Science.gov (United States)

    Lee, Chieh-Han; Yu, Hwa-Lung; Chien, Lung-Chang

    2014-05-01

    Dengue fever has been identified as one of the most widespread vector-borne diseases in tropical and sub-tropical regions. In the last decade, dengue has been an emerging infectious disease in Taiwan, especially in the southern area, where incidence is high every year. For the purpose of disease prevention and control, an early warning system is urgently needed. Previous studies have shown significant relationships between climate variables, in particular rainfall and temperature, and the temporal epidemic patterns of dengue cases. However, the transmission of dengue fever is a complex interactive process, and most studies have understated its composite space-time effects. This study proposes a one-week-ahead warning system for dengue fever epidemics in southern Taiwan that considers nonlinear associations between weekly dengue cases and meteorological factors across space and time. The early warning system is based on an integration of a distributed lag nonlinear model (DLNM) and stochastic Bayesian Maximum Entropy (BME) analysis. The study identified the most significant meteorological measures, including weekly minimum temperature and maximum 24-hour rainfall with lags of up to 15 weeks, for dengue case variation under conditions of uncertainty. Subsequently, the combination of nonlinear lagged effects of climate variables and a space-time dependence function is implemented via a Bayesian framework to predict dengue fever occurrences in southern Taiwan during 2012. The results show that the early warning system provides useful spatiotemporal predictions of potential dengue fever outbreaks. In conclusion, the proposed approach can provide a practical disease control tool for environmental regulators seeking more effective strategies for dengue fever prevention.

  7. Classification using Bayesian neural nets

    NARCIS (Netherlands)

    J.C. Bioch (Cor); O. van der Meer; R. Potharst (Rob)

    1995-01-01

    Recently, Bayesian methods have been proposed for neural networks to solve regression and classification problems. These methods claim to overcome some difficulties encountered in the standard approach, such as overfitting. However, an implementation of the full Bayesian approach to neura...

  8. A Bayesian Approach to Integrated Ecological and Human Health Risk Assessment for the South River, Virginia Mercury-Contaminated Site.

    Science.gov (United States)

    Harris, Meagan J; Stinson, Jonah; Landis, Wayne G

    2017-01-25

    We conducted a regional-scale integrated ecological and human health risk assessment by applying the relative risk model with Bayesian networks (BN-RRM) to a case study of the South River, Virginia mercury-contaminated site. Risk to four ecological services of the South River (human health, water quality, recreation, and the recreational fishery) was evaluated using a multiple stressor-multiple endpoint approach. These four ecological services were selected as endpoints based on stakeholder feedback and prioritized management goals for the river. The BN-RRM approach allowed for the calculation of relative risk to 14 biotic, human health, recreation, and water quality endpoints from chemical and ecological stressors in five risk regions of the South River. Results indicated that water quality and the recreational fishery were the ecological services at highest risk in the South River. Human health risk for users of the South River was low relative to the risk to other endpoints. Risk to recreation in the South River was moderate with little spatial variability among the five risk regions. Sensitivity and uncertainty analysis identified stressors and other parameters that influence risk for each endpoint in each risk region. This research demonstrates a probabilistic approach to integrated ecological and human health risk assessment that considers the effects of chemical and ecological stressors across the landscape.

  9. Bayesian modeling approach for characterizing groundwater arsenic contamination in the Mekong River basin.

    Science.gov (United States)

    Cha, YoonKyung; Kim, Young Mo; Choi, Jae-Woo; Sthiannopkao, Suthipong; Cho, Kyung Hwa

    2016-01-01

    In the Mekong River basin, groundwater from tube-wells is a major drinking water source. However, arsenic (As) contamination in groundwater resources has become a critical issue in the watershed. In this study, As species, including total As (AsTOT), As(III), and As(V), were monitored across the watershed to investigate their characteristics and inter-relationships with water quality parameters, including pH and redox potential (Eh). The data illustrated a dramatic change in the relationship between AsTOT and Eh over a specific Eh range, suggesting the importance of Eh in predicting AsTOT. Thus, a Bayesian change-point model was developed to predict AsTOT concentrations based on Eh and pH, and to determine changes in the AsTOT-Eh relationship. The model captured the Eh change-point (approximately −100 ± 15 mV), which was compatible with the data. Importantly, the inclusion of this change-point in the model resulted in improved model fit and prediction accuracy; AsTOT concentrations were strongly negatively related to Eh values higher than the change-point. The process underlying this relationship was subsequently posited to be the reductive dissolution of mineral oxides and As release. Overall, AsTOT showed a weak positive relationship with Eh at the lower range, similar to that commonly observed in the Mekong River delta. It is expected that these results will serve as a guide for establishing public health strategies in the Mekong River basin.
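A minimal version of change-point estimation can be illustrated with a profile-likelihood search over candidate change-points in a two-segment regression (e.g., AsTOT on Eh). Under Gaussian errors, minimizing the sum of squared errors maximizes the likelihood for each candidate, so the best candidate is the SSE minimizer. This sketch is far simpler than the full Bayesian model, which places a prior on the change-point and integrates over it:

```python
def sse_two_lines(x, y, cp):
    """Sum of squared errors of a two-segment linear regression split at cp."""
    def fit_sse(xs, ys):
        n = len(xs)
        if n < 2:
            return 0.0  # one point (or none) is fit perfectly
        mx = sum(xs) / n
        my = sum(ys) / n
        sxx = sum((a - mx) ** 2 for a in xs)
        if sxx == 0.0:  # vertical stack of points: fit the mean
            return sum((b - my) ** 2 for b in ys)
        sxy = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
        slope = sxy / sxx
        return sum((b - my - slope * (a - mx)) ** 2 for a, b in zip(xs, ys))

    left = [(a, b) for a, b in zip(x, y) if a <= cp]
    right = [(a, b) for a, b in zip(x, y) if a > cp]
    return (fit_sse([a for a, _ in left], [b for _, b in left]) +
            fit_sse([a for a, _ in right], [b for _, b in right]))

def best_change_point(x, y, candidates):
    """Profile-likelihood change-point: the candidate minimizing total SSE."""
    return min(candidates, key=lambda cp: sse_two_lines(x, y, cp))
```

A hierarchical Bayesian version would report a posterior distribution over the change-point (hence an interval like −100 ± 15 mV) instead of a single best value.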

  10. A Bayesian approach to modeling 2D gravity data using polygon states

    Science.gov (United States)

    Titus, W. J.; Titus, S.; Davis, J. R.

    2015-12-01

    We present a Bayesian Markov chain Monte Carlo (MCMC) method for the 2D gravity inversion of a localized subsurface object with constant density contrast. Our models have four parameters: the density contrast, the number of vertices in a polygonal approximation of the object, an upper bound on the ratio of the perimeter squared to the area, and the vertices of a polygon container that bounds the object. Reasonable parameter values can be estimated prior to inversion using a forward model and geologic information. In addition, we assume that the field data have a common random uncertainty that lies between two bounds but that it has no systematic uncertainty. Finally, we assume that there is no uncertainty in the spatial locations of the measurement stations. For any set of model parameters, we use MCMC methods to generate an approximate probability distribution of polygons for the object. We then compute various probability distributions for the object, including the variance between the observed and predicted fields (an important quantity in the MCMC method), the area, the center of area, and the occupancy probability (the probability that a spatial point lies within the object). In addition, we compare probabilities of different models using parallel tempering, a technique which also mitigates trapping in local optima that can occur in certain model geometries. We apply our method to several synthetic data sets generated from objects of varying shape and location. We also analyze a natural data set collected across the Rio Grande Gorge Bridge in New Mexico, where the object (i.e. the air below the bridge) is known and the canyon is approximately 2D. Although there are many ways to view results, the occupancy probability proves quite powerful. We also find that the choice of the container is important. 
In particular, large containers should be avoided, because the more closely a container confines the object, the better the predictions match the properties of the object.
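The accept/reject core of such an MCMC method is the random-walk Metropolis rule. The sketch below shows it for a single scalar parameter with a user-supplied log-posterior; the authors' sampler operates on polygon states and adds parallel tempering, which this toy version omits:

```python
import math
import random

def metropolis(log_post, x0, n_steps, step=1.0, seed=0):
    """Random-walk Metropolis sampler for a scalar parameter.
    log_post: function returning the unnormalized log posterior."""
    rng = random.Random(seed)
    x = x0
    lp = log_post(x)
    samples = []
    for _ in range(n_steps):
        prop = x + rng.gauss(0.0, step)  # symmetric Gaussian proposal
        lp_prop = log_post(prop)
        delta = lp_prop - lp
        # Accept with probability min(1, exp(delta)).
        if delta >= 0 or rng.random() < math.exp(delta):
            x, lp = prop, lp_prop
        samples.append(x)
    return samples
```

Given samples, posterior summaries such as the occupancy probability in the paper reduce to sample averages of indicator functions over the chain.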

  11. MEG source localization of spatially extended generators of epileptic activity: comparing entropic and hierarchical bayesian approaches.

    Directory of Open Access Journals (Sweden)

    Rasheda Arman Chowdhury

    Full Text Available Localizing the generators of epileptic activity in the brain using Electro-EncephaloGraphy (EEG) or Magneto-EncephaloGraphy (MEG) signals is of particular interest during the pre-surgical investigation of epilepsy. Epileptic discharges are detectable from background brain activity, provided they are associated with spatially extended generators. Using realistic simulations of epileptic activity, this study evaluates the ability of distributed source localization methods to accurately estimate the location of the generators and their sensitivity to the spatial extent of such generators when using MEG data. Source localization methods based on two types of realistic models have been investigated: (i) brain activity may be modeled using cortical parcels and (ii) brain activity is assumed to be locally smooth within each parcel. A Data Driven Parcellization (DDP) method was used to segment the cortical surface into non-overlapping parcels, and diffusion-based spatial priors were used to model local spatial smoothness within parcels. These models were implemented within the Maximum Entropy on the Mean (MEM) and the Hierarchical Bayesian (HB) source localization frameworks. We proposed new methods in this context and compared them with other standard ones using Monte Carlo simulations of realistic MEG data involving sources of several spatial extents and depths. Detection accuracy of each method was quantified using Receiver Operating Characteristic (ROC) analysis and localization error metrics. Our results showed that methods implemented within the MEM framework were sensitive to all spatial extents of the sources, ranging from 3 cm² to 30 cm², regardless of the number and size of the parcels defining the model. To reach a similar level of accuracy within the HB framework, a model using parcels larger than the size of the sources should be considered.

  12. Bayesian Approach to Model CD137 Signaling in Human M. tuberculosis In Vitro Responses

    Science.gov (United States)

    Fernández Do Porto, Darío A.; Auzmendi, Jerónimo; Peña, Delfina; García, Verónica E.; Moffatt, Luciano

    2013-01-01

    Immune responses are qualitatively and quantitatively influenced by a complex network of receptor-ligand interactions. Among them, the CD137:CD137L pathway is known to modulate innate and adaptive human responses against Mycobacterium tuberculosis. However, the underlying mechanisms of this regulation remain unclear. In this work, we developed a Bayesian Computational Model (BCM) of in vitro CD137 signaling, devised to fit previously gathered experimental data. The BCM is fed with the data and the prior distribution of the model parameters, and it returns their posterior distribution and the model evidence, which allows comparing alternative signaling mechanisms. The BCM uses a coupled system of non-linear differential equations to describe the dynamics of Antigen Presenting Cells, Natural Killer and T Cells together with the interferon (IFN)-γ and tumor necrosis factor (TNF)-α levels in the culture medium. Fast and complete mixing of the medium is assumed. The prior distribution of the parameters that describe the dynamics of the immunological response was obtained from the literature and theoretical considerations. Our BCM applies successively the Levenberg-Marquardt algorithm to find the maximum a posteriori (MAP) estimate, the Metropolis Markov Chain Monte Carlo method to approximate the posterior distribution of the parameters, and Thermodynamic Integration to calculate the evidence of alternative hypotheses. Bayes factors provided decisive evidence favoring direct CD137 signaling on T cells. Moreover, the posterior distribution of the parameters that describe the CD137 signaling showed that the regulation of IFN-γ levels is based more on T cell survival than on direct induction. Furthermore, the mechanisms that account for the effect of CD137 signaling on TNF-α production were based on a decrease of TNF-α production by APC and, perhaps, on an increase in APC apoptosis. The BCM proved to be a useful tool to gain insight into the mechanisms of CD137 signaling.

  13. Quantile-based Bayesian maximum entropy approach for spatiotemporal modeling of ambient air quality levels.

    Science.gov (United States)

    Yu, Hwa-Lung; Wang, Chih-Hsin

    2013-02-05

    Understanding the daily changes in ambient air quality concentrations is important for assessing human exposure and environmental health. However, the fine temporal scales (e.g., hourly) involved in this assessment often lead to high variability in air quality concentrations because of the complex short-term physical and chemical mechanisms among the pollutants. Consequently, high heterogeneity is usually present not only in the averaged pollution levels, but also in the intraday variance levels of the daily observations of ambient concentration across space and time. This characteristic decreases the estimation performance of common techniques. This study proposes a novel quantile-based Bayesian maximum entropy (QBME) method to account for the nonstationary and nonhomogeneous characteristics of ambient air pollution dynamics. The QBME method characterizes the spatiotemporal dependence among the ambient air quality levels based on their location-specific quantiles and accounts for spatiotemporal variations using a local weighted smoothing technique. The epistemic framework of the QBME method allows researchers to further consider the uncertainty of space-time observations. This study presents the spatiotemporal modeling of daily CO and PM10 concentrations across Taiwan from 1998 to 2009 using the QBME method. Results show that the QBME method can effectively improve estimation accuracy in terms of lower mean absolute errors and standard deviations over space and time, especially for pollutants with strongly nonhomogeneous variances across space. In addition, the epistemic framework allows researchers to assimilate site-specific secondary information where observations are absent because of the common preferential sampling issues of environmental data. The proposed QBME method provides a practical and powerful framework for the spatiotemporal modeling of ambient pollutants.

  14. How to practise Bayesian statistics outside the Bayesian church: What philosophy for Bayesian statistical modelling?

    NARCIS (Netherlands)

    Borsboom, D.; Haig, B.D.

    2013-01-01

    Unlike most other statistical frameworks, Bayesian statistical inference is wedded to a particular approach in the philosophy of science (see Howson & Urbach, 2006); this approach is called Bayesianism. Rather than being concerned with model fitting, this position in the philosophy of science primar...

  15. Practical Bayesian Tomography

    CERN Document Server

    Granade, Christopher; Cory, D G

    2015-01-01

    In recent years, Bayesian methods have been proposed as a solution to a wide range of issues in quantum state and process tomography. State-of-the-art Bayesian tomography solutions suffer from three problems: numerical intractability, a lack of informative prior distributions, and an inability to track time-dependent processes. Here, we solve all three problems. First, we use modern statistical methods, as pioneered by Huszár and Houlsby and by Ferrie, to make Bayesian tomography numerically tractable. Our approach allows for practical computation of Bayesian point and region estimators for quantum states and channels. Second, we propose the first informative priors on quantum states and channels. Finally, we develop a method that allows online tracking of time-dependent states and estimates the drift and diffusion processes affecting a state. We provide source code and animated visual examples for our methods.

  16. CRAFT (complete reduction to amplitude frequency table)--robust and time-efficient Bayesian approach for quantitative mixture analysis by NMR.

    Science.gov (United States)

    Krishnamurthy, Krish

    2013-12-01

    The intrinsic quantitative nature of NMR is increasingly exploited in areas ranging from complex mixture analysis (as in metabolomics and reaction monitoring) to quality assurance/control. Complex NMR spectra are more common than not, and therefore, extraction of quantitative information generally involves significant prior knowledge and/or operator interaction to characterize resonances of interest. Moreover, in most NMR-based metabolomic experiments, the signals from metabolites are normally present as a mixture of overlapping resonances, making quantification difficult. Time-domain Bayesian approaches have been reported to be better than conventional frequency-domain analysis at identifying subtle changes in signal amplitude. We discuss an approach that exploits Bayesian analysis to achieve a complete reduction to amplitude frequency table (CRAFT) in an automated and time-efficient fashion - thus converting the time-domain FID to a frequency-amplitude table. CRAFT uses a two-step approach to FID analysis. First, the FID is digitally filtered and downsampled to several sub FIDs, and secondly, these sub FIDs are then modeled as sums of decaying sinusoids using the Bayesian approach. CRAFT tables can be used for further data mining of quantitative information using fingerprint chemical shifts of compounds of interest and/or statistical analysis of modulation of chemical quantity in a biological study (metabolomics) or process study (reaction monitoring) or quality assurance/control. The basic principles behind this approach as well as results to evaluate the effectiveness of this approach in mixture analysis are presented.
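The second CRAFT step, modeling a sub-FID as a sum of decaying sinusoids, can be illustrated in the simplest case: a single real sinusoid whose frequency and decay rate are already known, so that amplitude and phase follow from linear least squares. The full Bayesian analysis also infers frequency and decay; this sketch assumes them:

```python
import math

def fit_decaying_sinusoid(fid, dt, freq, decay):
    """Least-squares amplitude/phase fit of one decaying sinusoid of known
    frequency (Hz) and decay rate (1/s) to a real-valued FID sampled at
    interval dt. A toy illustration of modeling time-domain NMR data as
    sums of decaying sinusoids."""
    n = len(fid)
    # Basis: damped cosine and damped sine at the known frequency.
    c = [math.exp(-decay * k * dt) * math.cos(2 * math.pi * freq * k * dt) for k in range(n)]
    s = [math.exp(-decay * k * dt) * math.sin(2 * math.pi * freq * k * dt) for k in range(n)]
    # Solve the 2x2 normal equations for (a, b) in the model a*c + b*s.
    cc = sum(x * x for x in c)
    ss = sum(x * x for x in s)
    cs = sum(x * y for x, y in zip(c, s))
    cy = sum(x * y for x, y in zip(c, fid))
    sy = sum(x * y for x, y in zip(s, fid))
    det = cc * ss - cs * cs
    a = (ss * cy - cs * sy) / det
    b = (cc * sy - cs * cy) / det
    # a*cos(wt) + b*sin(wt) = A*cos(wt - phi) with A = hypot(a, b).
    return math.hypot(a, b), math.atan2(b, a)
```

The resulting amplitude-frequency pairs are exactly the kind of entries a CRAFT table collects for each resolved resonance.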

  17. A General Bayesian Network Approach to Analyzing Online Game Item Values and Its Influence on Consumer Satisfaction and Purchase Intention

    Science.gov (United States)

    Lee, Kun Chang; Park, Bong-Won

    Many online game users purchase game items with which to play free-to-play games. Because of a lack of research in this area, there is no specified framework for categorizing the values of game items; this study therefore proposes four types of online game item values based on an analysis of the literature on online game characteristics. It then investigates how online game users perceive satisfaction and purchase intention from the proposed four types of online game item values. Though regression analysis has been used frequently to answer this kind of research question, we propose a new approach, a General Bayesian Network (GBN), which can be performed in an understandable way without sacrificing predictive accuracy. Conventional techniques, such as regression analysis, do not provide significant explanation for this kind of problem because they are fixed to a linear structure and are limited in explaining why customers are likely to purchase game items and whether they are satisfied with their purchases. In contrast, the proposed GBN provides a flexible underlying structure based on questionnaire survey data and offers robust decision support on this kind of research question by identifying its causal relationships. To illustrate the validity of GBN in solving the research question in this study, 327 valid questionnaires were analyzed using GBN with what-if and goal-seeking approaches. The experimental results were promising and meaningful in comparison with regression analysis results.
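The kind of what-if query a Bayesian network supports can be illustrated with forward inference by enumeration in a tiny hand-specified network. The chain structure (item value → satisfaction → purchase intention) and the probabilities below are invented for illustration; a GBN would learn both structure and parameters from the survey data:

```python
def infer_purchase(p_value_high, p_sat_given_value, p_buy_given_sat):
    """P(purchase) in a chain-structured Bayesian network
    value -> satisfaction -> purchase, by summing over all states.
    p_sat_given_value[v] = P(satisfied | value_high=v);
    p_buy_given_sat[s]   = P(buy | satisfied=s)."""
    p_buy = 0.0
    for v in (True, False):
        pv = p_value_high if v else 1.0 - p_value_high
        for s in (True, False):
            ps = p_sat_given_value[v] if s else 1.0 - p_sat_given_value[v]
            p_buy += pv * ps * p_buy_given_sat[s]
    return p_buy
```

A what-if analysis then amounts to comparing the query under different evidence, e.g. clamping perceived item value high versus low and reading off the change in purchase probability.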

  18. Empirical and Bayesian approaches to fossil-only divergence times: A study across three reptile clades.

    Science.gov (United States)

    Turner, Alan H; Pritchard, Adam C; Matzke, Nicholas J

    2017-01-01

    Estimating divergence times on phylogenies is critical in paleontological and neontological studies. Chronostratigraphically-constrained fossils are the only direct evidence of absolute timing of species divergence. Strict temporal calibration of fossil-only phylogenies provides minimum divergence estimates, and various methods have been proposed to estimate divergences beyond these minimum values. We explore the utility of simultaneous estimation of tree topology and divergence times using BEAST tip-dating on datasets consisting only of fossils by using relaxed morphological clocks and birth-death tree priors that include serial sampling (BDSS) at a constant rate through time. We compare BEAST results to those from the traditional maximum parsimony (MP) and undated Bayesian inference (BI) methods. Three overlapping datasets were used that span 250 million years of archosauromorph evolution leading to crocodylians. The first dataset focuses on early Sauria (31 taxa, 240 chars.), the second on early Archosauria (76 taxa, 400 chars.) and the third on Crocodyliformes (101 taxa, 340 chars.). For each dataset three time-calibrated trees (timetrees) were calculated: a minimum-age timetree with node ages based on earliest occurrences in the fossil record; a 'smoothed' timetree using a range of time added to the root that is then averaged over zero-length internodes; and a tip-dated timetree. Comparisons within datasets show that the smoothed and tip-dated timetrees provide similar estimates. Only near the root node do BEAST estimates fall outside the smoothed timetree range. The BEAST model is not able to overcome limited sampling to correctly estimate divergences considerably older than sampled fossil occurrence dates. Conversely, the smoothed timetrees consistently provide node-ages far older than the strict dates or BEAST estimates for morphologically conservative sister-taxa when they sit on long ghost lineages. 
In this latter case, the relaxed-clock model appears to...

  20. A Bayesian network approach to predicting nest presence of the federally-threatened piping plover (Charadrius melodus) using barrier island features

    Science.gov (United States)

    Gieder, Katherina D.; Karpanty, Sarah M.; Fraser, James D.; Catlin, Daniel H.; Gutierrez, Benjamin T.; Plant, Nathaniel G.; Turecek, Aaron M.; Thieler, E. Robert

    2014-01-01

    Sea-level rise and human development pose significant threats to shorebirds, particularly for species that utilize barrier island habitat. The piping plover (Charadrius melodus) is a federally-listed shorebird that nests on barrier islands and rapidly responds to changes in its physical environment, making it an excellent species with which to model how shorebird species may respond to habitat change related to sea-level rise and human development. The uncertainty and complexity in predicting sea-level rise, the responses of barrier island habitats to sea-level rise, and the responses of species to sea-level rise and human development necessitate a modelling approach that can link species to the physical habitat features that will be altered by changes in sea level and human development. We used a Bayesian network framework to develop a model that links piping plover nest presence to the physical features of their nesting habitat on a barrier island that is impacted by sea-level rise and human development, using three years of data (1999, 2002, and 2008) from Assateague Island National Seashore in Maryland. Our model performance results showed that we were able to successfully predict nest presence given a wide range of physical conditions within the model’s dataset. We found that model predictions were more successful when the range of physical conditions included in model development was varied rather than when those physical conditions were narrow. We also found that all model predictions had fewer false negatives (nests predicted to be absent when they were actually present in the dataset) than false positives (nests predicted to be present when they were actually absent in the dataset), indicating that our model correctly predicted nest presence better than nest absence. These results indicated that our approach of using a Bayesian network to link specific physical features to nest presence will be useful for modelling impacts of sea-level rise- or human

  1. A Bayesian Hierarchical Modeling Approach to Predicting Flow in Ungauged Basins

    Science.gov (United States)

    Recent innovative approaches to identifying and applying regression-based relationships between land use patterns (such as increasing impervious surface area and decreasing vegetative cover) and rainfall-runoff model parameters represent novel and promising improvements to predic...

  2. Bayesian Visual Odometry

    Science.gov (United States)

    Center, Julian L.; Knuth, Kevin H.

    2011-03-01

    Visual odometry refers to tracking the motion of a body using an onboard vision system. Practical visual odometry systems combine the complementary accuracy characteristics of vision and inertial measurement units. The Mars Exploration Rovers, Spirit and Opportunity, used this type of visual odometry. The visual odometry algorithms in Spirit and Opportunity were based on Bayesian methods, but a number of simplifying approximations were needed to deal with onboard computer limitations. Furthermore, the allowable motion of the rover had to be severely limited so that computations could keep up. Recent advances in computer technology make it feasible to implement a fully Bayesian approach to visual odometry. This approach combines dense stereo vision, dense optical flow, and inertial measurements. As with all true Bayesian methods, it also determines error bars for all estimates. This approach also offers the possibility of using Micro-Electro Mechanical Systems (MEMS) inertial components, which are more economical, weigh less, and consume less power than conventional inertial components.

  3. Inferring Population Size History from Large Samples of Genome-Wide Molecular Data - An Approximate Bayesian Computation Approach.

    Directory of Open Access Journals (Sweden)

    Simon Boitard

    2016-03-01

    Full Text Available Inferring the ancestral dynamics of effective population size is a long-standing question in population genetics, which can now be tackled much more accurately thanks to the massive genomic data available in many species. Several promising methods that take advantage of whole-genome sequences have been recently developed in this context. However, they can only be applied to rather small samples, which limits their ability to estimate recent population size history. Besides, they can be very sensitive to sequencing or phasing errors. Here we introduce a new approximate Bayesian computation approach named PopSizeABC that allows estimating the evolution of the effective population size through time, using a large sample of complete genomes. This sample is summarized using the folded allele frequency spectrum and the average zygotic linkage disequilibrium at different bins of physical distance, two classes of statistics that are widely used in population genetics and can be easily computed from unphased and unpolarized SNP data. Our approach provides accurate estimations of past population sizes, from the very first generations before present back to the expected time to the most recent common ancestor of the sample, as shown by simulations under a wide range of demographic scenarios. When applied to samples of 15 or 25 complete genomes in four cattle breeds (Angus, Fleckvieh, Holstein and Jersey), PopSizeABC revealed a series of population declines, related to historical events such as domestication or modern breed creation. We further highlight that our approach is robust to sequencing errors, provided summary statistics are computed from SNPs with common alleles.
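    The rejection flavour of approximate Bayesian computation that underlies methods like PopSizeABC can be sketched on a toy problem: draw a parameter from its prior, simulate data, and keep the draw only when a summary statistic falls close to the observed one. The "demographic simulator" below is a stand-in (a Gaussian whose variance shrinks with population size), not PopSizeABC itself; the prior range and tolerance are arbitrary.

```python
import random
import statistics

random.seed(1)

# Toy stand-in for a demographic simulator: "population size" n_pop controls
# the variance (1 / n_pop) of a simulated summary statistic. Illustrative only.
def simulate_summary(n_pop):
    sample = [random.gauss(0.0, 1.0 / n_pop ** 0.5) for _ in range(200)]
    return statistics.pvariance(sample)

TRUE_N = 50.0
observed = simulate_summary(TRUE_N)

# ABC rejection: draw the parameter from a uniform prior, simulate, and keep
# the draw only when the simulated summary is close to the observed one.
accepted = []
while len(accepted) < 300:
    candidate = random.uniform(1.0, 200.0)
    if abs(simulate_summary(candidate) - observed) < 0.002:
        accepted.append(candidate)

posterior_mean = statistics.mean(accepted)
print(round(posterior_mean, 1))   # should land in the vicinity of 50
```

    Real ABC implementations combine several summary statistics (here, the folded AFS and LD bins) and typically adjust the accepted draws by regression rather than using plain rejection.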

  4. A Bayesian phylogenetic approach to estimating the stability of linguistic features and the genetic biasing of tone.

    Science.gov (United States)

    Dediu, Dan

    2011-02-07

    Language is a hallmark of our species and understanding linguistic diversity is an area of major interest. Genetic factors influencing the cultural transmission of language provide a powerful and elegant explanation for aspects of the present day linguistic diversity and a window into the emergence and evolution of language. In particular, it has recently been proposed that linguistic tone-the usage of voice pitch to convey lexical and grammatical meaning-is biased by two genes involved in brain growth and development, ASPM and Microcephalin. This hypothesis predicts that tone is a stable characteristic of language because of its 'genetic anchoring'. The present paper tests this prediction using a Bayesian phylogenetic framework applied to a large set of linguistic features and language families, using multiple software implementations, data codings, stability estimations, linguistic classifications and outgroup choices. The results of these different methods and datasets show a large agreement, suggesting that this approach produces reliable estimates of the stability of linguistic data. Moreover, linguistic tone is found to be stable across methods and datasets, providing suggestive support for the hypothesis of genetic influences on its distribution.

  5. A Bayesian approach for energy-based estimation of acoustic aberrations in high intensity focused ultrasound treatment

    CERN Document Server

    Hosseini, Bamdad; Pichardo, Samuel; Constanciel, Elodie; Drake, James M; Stockie, John M

    2016-01-01

    High intensity focused ultrasound is a non-invasive method for treatment of diseased tissue that uses a beam of ultrasound in order to generate heat within a small volume. A common challenge in application of this technique is that heterogeneity of the biological medium can defocus the ultrasound beam. In this study, the problem of refocusing the beam is reduced to the Bayesian inverse problem of estimating the acoustic aberration due to the biological tissue from acoustic radiative force imaging data. The solution to this problem is a posterior probability density on the aberration which is sampled using a Metropolis-within-Gibbs algorithm. The framework is tested using both a synthetic and experimental dataset. This new approach has the ability to obtain a good estimate of the aberrations from a small dataset, as little as 32 sonication tests, which can lead to significant speedup in the treatment process. Furthermore, this framework is very flexible and can work with a wide range of sonication tests and so...
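    Metropolis-within-Gibbs updates one parameter at a time with a Metropolis accept/reject step while the others are held fixed. A minimal sketch on a toy two-parameter target (a correlated bivariate normal standing in for the aberration parameters; all settings are illustrative, not the paper's model):

```python
import math
import random

random.seed(0)

RHO = 0.5   # correlation of the toy bivariate-normal target

def log_post(x, y):
    """Unnormalised log density of a correlated bivariate normal."""
    return -(x * x - 2.0 * RHO * x * y + y * y) / (2.0 * (1.0 - RHO ** 2))

def metropolis_step(value, other, log_density):
    """Metropolis update of one coordinate with the other held fixed."""
    proposal = value + random.gauss(0.0, 1.0)
    if math.log(random.random()) < log_density(proposal, other) - log_density(value, other):
        return proposal
    return value

x, y = 0.0, 0.0
xs = []
for i in range(20000):
    x = metropolis_step(x, y, log_post)   # the target is symmetric in (x, y)
    y = metropolis_step(y, x, log_post)
    if i >= 2000:                         # discard burn-in
        xs.append(x)

mean_x = sum(xs) / len(xs)
print(round(mean_x, 2))   # close to the true marginal mean of 0
```

    The appeal of the scheme is that each coordinate only needs its own conditional density, which is why it suits aberration parameters estimated one transducer element at a time.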

  6. A Bayesian Approach to Locating the Red Giant Branch Tip Magnitude (Part II); Distances to the Satellites of M31

    CERN Document Server

    Conn, Anthony R; Lewis, Geraint F; Parker, Quentin A; Zucker, Daniel B; Martin, Nicolas F; McConnachie, Alan W; Irwin, Mike J; Tanvir, Nial; Fardal, Mark A; Ferguson, Annette M N; Chapman, Scott C; Valls-Gabaud, David

    2012-01-01

    In `A Bayesian Approach to Locating the Red Giant Branch Tip Magnitude (PART I),' a new technique was introduced for obtaining distances using the TRGB standard candle. Here we describe a useful complement to the technique with the potential to further reduce the uncertainty in our distance measurements by incorporating a matched-filter weighting scheme into the model likelihood calculations. In this scheme, stars are weighted according to their probability of being true object members. We then re-test our modified algorithm using random-realization artificial data to verify the validity of the generated posterior probability distributions (PPDs) and proceed to apply the algorithm to the satellite system of M31, culminating in a 3D view of the system. Further to the distributions thus obtained, we apply a satellite-specific prior on the satellite distances to weight the resulting distance posterior distributions, based on the halo density profile. Thus in a single publication, using a single method, a compreh...

  7. Sentiment analysis. An example of application and evaluation of RID dictionary and Bayesian classification methods in qualitative data analysis approach

    Directory of Open Access Journals (Sweden)

    Krzysztof Tomanek

    2014-05-01

    Full Text Available The purpose of this article is to present the basic methods for classifying text data. These methods draw on achievements in areas such as natural language processing and the analysis of unstructured data. I introduce and compare two analytical techniques applied to text data. The first analysis makes use of a thematic vocabulary tool (sentiment analysis). The second technique uses the idea of Bayesian classification and applies the so-called naive Bayes algorithm. My comparison grades the efficiency of these two analytical techniques. I emphasize solutions to be used when building a dictionary accurate for the task of text classification. Then, I compare the effectiveness of supervised classification with automated unsupervised analysis. These results reinforce the conclusion that a dictionary which has received a good evaluation as a classification tool should be subjected to review and modification procedures if it is to be applied to new empirical material. In my proposed approach, such adaptation of the analytical dictionary becomes the basic step in the methodology of textual data analysis.
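    The naive Bayes classifier compared above can be sketched in a few lines: per-class word counts with add-one (Laplace) smoothing, and classification by the highest posterior log-score. The corpus and labels below are invented for illustration.

```python
import math
from collections import Counter, defaultdict

# Tiny illustrative training corpus; documents and labels are invented.
train = [
    ("good great wonderful", "pos"),
    ("great happy good", "pos"),
    ("bad awful terrible", "neg"),
    ("terrible sad bad", "neg"),
]

# Count word frequencies per class, class priors, and the vocabulary.
word_counts = defaultdict(Counter)
class_counts = Counter()
vocab = set()
for text, label in train:
    class_counts[label] += 1
    for word in text.split():
        word_counts[label][word] += 1
        vocab.add(word)

def predict(text):
    """Naive Bayes with add-one (Laplace) smoothing over the vocabulary."""
    best_label, best_score = None, -math.inf
    for label in class_counts:
        score = math.log(class_counts[label] / sum(class_counts.values()))
        total = sum(word_counts[label].values())
        for word in text.split():
            score += math.log((word_counts[label][word] + 1) / (total + len(vocab)))
        if score > best_score:
            best_label, best_score = label, score
    return best_label

print(predict("good happy"))   # pos
print(predict("awful sad"))    # neg
```

    The dictionary-based (sentiment-lexicon) alternative replaces the learned counts with fixed word lists, which is exactly why the article finds it needs re-validation on new material.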

  8. A fully Bayesian approach to the parcel-based detection-estimation of brain activity in fMRI

    Energy Technology Data Exchange (ETDEWEB)

    Makni, S. [Univ Oxford, John Radcliffe Hosp, Oxford Ctr Funct Magnet Resonance Imaging Brain, Oxford OX3 9DU (United Kingdom); Idier, J. [IRCCyN CNRS, Nantes (France); Vincent, T.; Ciuciu, P. [CEA, NeuroSpin, Gif Sur Yvette (France); Vincent, T.; Dehaene-Lambertz, G.; Ciuciu, P. [Inst Imagerie Neurofonctionnelle, IFR 49, Paris (France); Thirion, B. [INRIA Futurs, Orsay (France); Dehaene-Lambertz, G. [INSERM, NeuroSpin, U562, Gif Sur Yvette (France)

    2008-07-01

    Within-subject analysis in fMRI essentially addresses two problems, i.e., the detection of activated brain regions in response to an experimental task and the estimation of the underlying dynamics, also known as the characterisation of the Hemodynamic response function (HRF). So far, both issues have been treated sequentially while it is known that the HRF model has a dramatic impact on the localisation of activations and that the HRF shape may vary from one region to another. In this paper, we conciliate both issues in a region-based joint detection-estimation framework that we develop in the Bayesian formalism. Instead of considering function basis to account for spatial variability, spatially adaptive General Linear Models are built upon region-based non-parametric estimation of brain dynamics. Regions are first identified as functionally homogeneous parcels in the mask of the grey matter using a specific procedure [Thirion, B., Flandin, G., Pinel, P., Roche, A., Ciuciu, P., Poline, J.B., August 2006. Dealing with the shortcomings of spatial normalization: Multi-subject parcellation of fMRI datasets. Hum. Brain Mapp. 27 (8), 678-693.]. Then, in each parcel, prior information is embedded to constrain this estimation. Detection is achieved by modelling activating, deactivating and non-activating voxels through mixture models within each parcel. From the posterior distribution, we infer upon the model parameters using Markov Chain Monte Carlo (MCMC) techniques. Bayesian model comparison allows us to emphasize on artificial datasets first that inhomogeneous gamma-Gaussian mixture models outperform Gaussian mixtures in terms of sensitivity/specificity trade-off and second that it is worthwhile modelling serial correlation through an AR(1) noise process at low signal-to-noise ratio (SNR). Our approach is then validated on an fMRI experiment that studies habituation to auditory sentence repetition. This phenomenon is clearly recovered as well as the hierarchical temporal

  9. Multi-tissue computational modeling analyzes pathophysiology of type 2 diabetes in MKR mice.

    Directory of Open Access Journals (Sweden)

    Amit Kumar

    Full Text Available Computational models using metabolic reconstructions for in silico simulation of metabolic disorders such as type 2 diabetes mellitus (T2DM) can provide a better understanding of disease pathophysiology and avoid high experimentation costs. There is a limited amount of computational work, using metabolic reconstructions, performed in this field for the better understanding of T2DM. In this study, a new algorithm for generating tissue-specific metabolic models is presented, along with the resulting multi-confidence level (MCL) multi-tissue model. The effect of T2DM on liver, muscle, and fat in MKR mice was first studied by microarray analysis, and subsequently the changes in gene expression of frank T2DM MKR mice versus healthy mice were applied to the multi-tissue model to test the effect. Using the first multi-tissue genome-scale model of all metabolic pathways in T2DM, we found that the branched-chain amino acid degradation and fatty acid oxidation pathways are downregulated in T2DM MKR mice. Microarray data showed low expression of genes in MKR mice versus healthy mice in the branched-chain amino acid degradation and fatty-acid oxidation pathways. In addition, flux balance analysis using the MCL multi-tissue model showed that the branched-chain amino acid degradation and fatty acid oxidation pathways were significantly downregulated in MKR mice versus healthy mice. Validation of the model was performed using data derived from the literature regarding T2DM. Microarray data was used in conjunction with the model to predict fluxes of various other metabolic pathways in the T2DM mouse model, and alterations in a number of pathways were detected. The Type 2 Diabetes MCL multi-tissue model may explain the high levels of branched-chain amino acids and free fatty acids in the plasma of Type 2 Diabetic subjects from a metabolic fluxes perspective.

  10. An Alternative Bayesian Approach to Structural Breaks in Time Series Models

    NARCIS (Netherlands)

    S. van den Hauwe (Sjoerd); R. Paap (Richard); D.J.C. van Dijk (Dick)

    2011-01-01

    textabstractWe propose a new approach to deal with structural breaks in time series models. The key contribution is an alternative dynamic stochastic specification for the model parameters which describes potential breaks. After a break new parameter values are generated from a so-called baseline pr

  11. Simultaneous EEG Source and Forward Model Reconstruction (SOFOMORE) using a Hierarchical Bayesian Approach

    DEFF Research Database (Denmark)

    Stahlhut, Carsten; Mørup, Morten; Winther, Ole;

    2011-01-01

    We present an approach to handle forward model uncertainty for EEG source reconstruction. A stochastic forward model representation is motivated by the many random contributions to the path from sources to measurements including the tissue conductivity distribution, the geometry of the cortical s...

  12. Bayesian Intersubjectivity and Quantum Theory

    Science.gov (United States)

    Pérez-Suárez, Marcos; Santos, David J.

    2005-02-01

    Two of the major approaches to probability, namely, frequentism and (subjectivistic) Bayesian theory, are discussed, together with the replacement of frequentist objectivity with Bayesian intersubjectivity. This discussion is then expanded to Quantum Theory, as quantum states and operations can be seen as structural elements of a subjective nature.

  13. Bayesian biostatistics

    CERN Document Server

    Lesaffre, Emmanuel

    2012-01-01

    The growth of biostatistics has been phenomenal in recent years and has been marked by considerable technical innovation in both methodology and computational practicality. One area that has experienced significant growth is Bayesian methods. The growing use of Bayesian methodology has taken place partly due to an increasing number of practitioners valuing the Bayesian paradigm as matching that of scientific discovery. In addition, computational advances have allowed for more complex models to be fitted routinely to realistic data sets. Through examples, exercises and a combination of introd

  14. A Bayesian Approach to Discovering Truth from Conflicting Sources for Data Integration

    CERN Document Server

    Zhao, Bo; Gemmell, Jim; Han, Jiawei

    2012-01-01

    In practical data integration systems, it is common for the data sources being integrated to provide conflicting information about the same entity. Consequently, a major challenge for data integration is to derive the most complete and accurate integrated records from diverse and sometimes conflicting sources. We term this challenge the truth finding problem. We observe that some sources are generally more reliable than others, and therefore a good model of source quality is the key to solving the truth finding problem. In this work, we propose a probabilistic graphical model that can automatically infer true records and source quality without any supervision. In contrast to previous methods, our principled approach leverages a generative process of two types of errors (false positive and false negative) by modeling two different aspects of source quality. In so doing, ours is also the first approach designed to merge multi-valued attribute types. Our method is scalable, due to an efficient sampling-based inf...

  15. Detecting duplicates in a homicide registry using a Bayesian partitioning approach

    OpenAIRE

    Sadinle, Mauricio

    2014-01-01

    Finding duplicates in homicide registries is an important step in keeping an accurate account of lethal violence. This task is not trivial when unique identifiers of the individuals are not available, and it is especially challenging when records are subject to errors and missing values. Traditional approaches to duplicate detection output independent decisions on the coreference status of each pair of records, which often leads to nontransitive decisions that have to be reconciled in some ad...

  16. A Bayesian Approach for Analysis of Whole-Genome Bisulphite Sequencing Data Identifies Disease-Associated Changes in DNA Methylation.

    Science.gov (United States)

    Rackham, Owen J L; Langley, Sarah R; Oates, Thomas; Vradi, Eleni; Harmston, Nathan; Srivastava, Prashant K; Behmoaras, Jacques; Dellaportas, Petros; Bottolo, Leonardo; Petretto, Enrico

    2017-02-17

    DNA methylation is a key epigenetic modification involved in gene regulation whose contribution to disease susceptibility remains to be fully understood. Here, we present a novel Bayesian smoothing approach (called ABBA) to detect differentially methylated regions (DMRs) from whole-genome bisulphite sequencing (WGBS). We also show how this approach can be leveraged to identify disease-associated changes in DNA methylation, suggesting mechanisms through which these alterations might affect disease. From a data modeling perspective, ABBA has the distinctive feature of automatically adapting to different correlation structures in CpG methylation levels across the genome whilst taking into account the distance between CpG sites as a covariate. Our simulation study shows that ABBA has greater power to detect DMRs than existing methods, providing an accurate identification of DMRs in the large majority of simulated cases. To empirically demonstrate the method's efficacy in generating biological hypotheses, we performed WGBS of primary macrophages derived from an experimental rat system of glomerulonephritis and used ABBA to identify >1,000 disease-associated DMRs. Investigation of these DMRs revealed differential DNA methylation localized to a 600bp region in the promoter of the Ifitm3 gene. This was confirmed by ChIP-seq and RNA-seq analyses, showing differential transcription factor binding at the Ifitm3 promoter by JunD (an established determinant of glomerulonephritis) and a consistent change in Ifitm3 expression. Our ABBA analysis allowed us to propose a new role for Ifitm3 in the pathogenesis of glomerulonephritis via a mechanism involving promoter hypermethylation that is associated with Ifitm3 repression in the rat strain susceptible to glomerulonephritis.

  17. Plug & Play object oriented Bayesian networks

    DEFF Research Database (Denmark)

    Bangsø, Olav; Flores, J.; Jensen, Finn Verner

    2003-01-01

    Object oriented Bayesian networks have proven themselves useful in recent years. The idea of applying an object oriented approach to Bayesian networks has extended their scope to larger domains that can be divided into autonomous but interrelated entities. Object oriented Bayesian networks have b...

  18. Bayesian statistics

    OpenAIRE

    新家, 健精

    2013-01-01

    © 2012 Springer Science+Business Media, LLC. All rights reserved. Article Outline: Glossary Definition of the Subject and Introduction The Bayesian Statistical Paradigm Three Examples Comparison with the Frequentist Statistical Paradigm Future Directions Bibliography

  19. An urban flood risk assessment method using the Bayesian Network approach

    DEFF Research Database (Denmark)

    Åström, Helena Lisa Alexandra

    to flooding, because these areas comprise large amounts of valuable assets. Flooding in urban areas can grow into significant disruptions and national threats unless appropriate flood risk management (FRM) plans are developed and timely adaptation options are implemented. FRM is a well-established process....... Currently, FRA is mainly based on single hazard events, but with expected climate change impacts there may be a need to include several hazards into FRA to assure that risk is described correctly for identification of important adaptation. This thesis shows that IDs may serve as a good approach...

  20. Bayesian Missile System Reliability from Point Estimates

    Science.gov (United States)

    2014-10-28

    Report date: October 2014. The report uses the Maximum Entropy Principle (MEP) to convert point estimates into probability distributions to be used as priors for Bayesian reliability analysis of missile data, and illustrates this approach by applying the priors to a Bayesian reliability model of a missile system. Subject terms: priors, Bayesian, missile.

  1. Use of genomic DNA control features and predicted operon structure in microarray data analysis: ArrayLeaRNA – a Bayesian approach

    Directory of Open Access Journals (Sweden)

    Pin Carmen

    2007-11-01

    Full Text Available Abstract Background Microarrays are widely used for the study of gene expression; however, deciding on whether observed differences in expression are significant remains a challenge. Results A computing tool (ArrayLeaRNA) has been developed for gene expression analysis. It implements a Bayesian approach which is based on the Gumbel distribution and uses printed genomic DNA control features for normalization and for estimation of the parameters of the Bayesian model, and prior knowledge from predicted operon structure. The method is compared with two other approaches: the classical LOWESS normalization followed by a two-fold cut-off criterion, and the OpWise method (Price et al., 2006, BMC Bioinformatics 7:19), a published Bayesian approach also using predicted operon structure. The three methods were compared on experimental datasets with prior knowledge of gene expression. With ArrayLeaRNA, data normalization is carried out according to the genomic features, which reflect the results of equally transcribed genes; also the statistical significance of the difference in expression is based on the variability of the equally transcribed genes. The operon information helps the classification of genes with low-confidence measurements. ArrayLeaRNA is implemented in Visual Basic and freely available as an Excel add-in at http://www.ifr.ac.uk/safety/ArrayLeaRNA/ Conclusion We have introduced a novel Bayesian model and demonstrated that it is a robust method for analysing microarray expression profiles. ArrayLeaRNA showed a considerable improvement in data normalization, in the estimation of the experimental variability intrinsic to each hybridization and in the establishment of a clear boundary between non-changing and differentially expressed genes. The method is applicable to data derived from hybridizations of labelled cDNA samples as well as from hybridizations of labelled cDNA with genomic DNA and can be used for the analysis of datasets where

  2. Accounting for non-independent detection when estimating abundance of organisms with a Bayesian approach

    Science.gov (United States)

    Martin, Julien; Royle, J. Andrew; MacKenzie, Darryl I.; Edwards, Holly H.; Kery, Marc; Gardner, Beth

    2011-01-01

    Summary 1. Binomial mixture models use repeated count data to estimate abundance. They are becoming increasingly popular because they provide a simple and cost-effective way to account for imperfect detection. However, these models assume that individuals are detected independently of each other. This assumption may often be violated in the field. For instance, manatees (Trichechus manatus latirostris) may surface in turbid water (i.e. become available for detection during aerial surveys) in a correlated manner (i.e. in groups). However, correlated behaviour, affecting the non-independence of individual detections, may also be relevant in other systems (e.g. correlated patterns of singing in birds and amphibians). 2. We extend binomial mixture models to account for correlated behaviour and therefore to account for non-independent detection of individuals. We simulated correlated behaviour using beta-binomial random variables. Our approach can be used to simultaneously estimate abundance, detection probability and a correlation parameter. 3. Fitting binomial mixture models to data that followed a beta-binomial distribution resulted in an overestimation of abundance even for moderate levels of correlation. In contrast, the beta-binomial mixture model performed considerably better in our simulation scenarios. We also present a goodness-of-fit procedure to evaluate the fit of beta-binomial mixture models. 4. We illustrate our approach by fitting both binomial and beta-binomial mixture models to aerial survey data of manatees in Florida. We found that the binomial mixture model did not fit the data, whereas there was no evidence of lack of fit for the beta-binomial mixture model. This example helps illustrate the importance of using simulations and assessing goodness-of-fit when analysing ecological data with N-mixture models. Indeed, both the simulations and the goodness-of-fit procedure highlighted the limitations of the standard binomial mixture model for aerial
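    The core idea — a detection probability drawn per survey from a Beta distribution makes individual detections correlated and hence the counts overdispersed relative to a binomial — can be checked with a short simulation. The abundance, Beta parameters, and replicate count below are illustrative, not from the manatee data.

```python
import random
import statistics

random.seed(7)

N_INDIVIDUALS = 40
A, B = 2.0, 14.0 / 3.0            # Beta(A, B) has mean A / (A + B) = 0.3
P = A / (A + B)                   # marginal detection probability

def beta_binomial_draw(n, a, b):
    """One survey count: a shared detection probability is drawn from a
    Beta, which makes the n individual detections correlated."""
    p = random.betavariate(a, b)
    return sum(1 for _ in range(n) if random.random() < p)

counts = [beta_binomial_draw(N_INDIVIDUALS, A, B) for _ in range(5000)]
mean_c = statistics.mean(counts)
var_c = statistics.pvariance(counts)
binomial_var = N_INDIVIDUALS * P * (1 - P)   # variance under independence

print(round(mean_c, 1))       # close to N * p = 12
print(var_c > binomial_var)   # True: overdispersion from correlated detection
```

    Fitting a plain binomial mixture to such counts attributes the excess variance to abundance, which is the overestimation the paper reports; the beta-binomial mixture absorbs it in the correlation parameter.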

  3. A Trans-dimensional Bayesian Approach to Pulsar Timing Noise Analysis

    CERN Document Server

    Ellis, Justin

    2016-01-01

    The modeling of intrinsic noise in pulsar timing residual data is of crucial importance for Gravitational Wave (GW) detection and pulsar timing (astro)physics in general. The noise budget in pulsars is a collection of several well studied effects including radiometer noise, pulse-phase jitter noise, dispersion measure (DM) variations, and low frequency spin noise. However, as pulsar timing data continues to improve, non-stationary and non-powerlaw noise terms are beginning to manifest which are not well modeled by current noise analysis techniques. In this work we use a trans-dimensional approach to model these non-stationary and non-powerlaw effects through the use of a wavelet basis and an interpolation based adaptive spectral modeling. In both cases, the number of wavelets and the number of control points in the interpolated spectrum are free parameters that are constrained by the data and then marginalized over in the final inferences, thus fully incorporating our ignorance of the noise model. We show tha...

  4. Stochastic models with heteroskedasticity: a Bayesian approach for Ibovespa returns - doi: 10.4025/actascitechnol.v35i2.13547

    Directory of Open Access Journals (Sweden)

    Sandra Cristina de Oliveira

    2013-04-01

    Full Text Available Current research compares the Bayesian estimates obtained for the parameters of processes of ARCH family with normal and Student’s t distributions for the conditional distribution of the return series. A non-informative prior distribution was adopted and a reparameterization of models under analysis was taken into account to map parameters’ space into real space. The procedure adopts a normal prior distribution for the transformed parameters. The posterior summaries were obtained by Monte Carlo Markov Chain (MCMC simulation methods. The methodology was evaluated by a series of Bovespa Index returns and the predictive ordinate criterion was employed to select the best adjustment model to the data. Results show that, as a rule, the proposed Bayesian approach provides satisfactory estimates and that the GARCH process with Student’s t distribution adjusted better to the data.  

  5. A Bayesian Approach to Estimate the Size and Structure of the Broad-line Region in Active Galactic Nuclei Using Reverberation Mapping Data

    CERN Document Server

    Li, Yan-Rong; Ho, Luis C; Du, Pu; Bai, Jin-Ming

    2013-01-01

    This is the first paper in a series devoted to a systematic study of the size and structure of the broad-line region (BLR) in active galactic nuclei (AGNs) using reverberation mapping (RM) data. We employ a recently developed Bayesian approach that statistically describes the variability as a damped random walk process and delineates the BLR structure using a flexible disk geometry that can account for a variety of shapes, including disks, rings, shells, and spheres. We allow for the possibility that the line emission may respond non-linearly to the continuum, and we detrend the light curves when there is clear evidence for secular variation. We use a Markov Chain Monte Carlo implementation based on Bayesian statistics to recover the parameters and uncertainties for the BLR model. The corresponding transfer function is obtained self-consistently. We tentatively constrain the virial factor used to estimate black hole masses; more accurate determinations will have to await velocity-resolved RM data. Application...

  6. Exploring links between juvenile offenders and social disorganization at a large map scale: a Bayesian spatial modeling approach

    Science.gov (United States)

    Law, Jane; Quick, Matthew

    2013-01-01

    This paper adopts a Bayesian spatial modeling approach to investigate the distribution of young offender residences in York Region, Southern Ontario, Canada, at the census dissemination area level. Few geographic studies have analyzed offender (as opposed to offense) data at a large map scale (i.e., using a relatively small areal unit of analysis) to minimize aggregation effects. Providing context is social disorganization theory, which hypothesizes that areas with economic deprivation, high population turnover, and high ethnic heterogeneity exhibit social disorganization and are expected to facilitate higher instances of young offenders. Non-spatial and spatial Poisson models indicate that spatial methods are superior to non-spatial models with respect to model fit, and that the index of ethnic heterogeneity, residential mobility (1-year moving rate), and percentage of residents receiving government transfer payments are, respectively, the most significant explanatory variables related to young offender location. These findings provide overwhelming support for social disorganization theory as it applies to offender location in York Region, Ontario. Targeting areas where the prevalence of young offenders can or cannot be explained by social disorganization, by decomposing the estimated risk map, is helpful for dealing with juvenile offenders in the region. Results prompt discussion of geographically targeted police services and young offender placement pertaining to risk of recidivism. We discuss possible reasons for differences and similarities between previous findings (which analyzed offense data and/or were conducted at a smaller map scale) and our findings, the limitations of our study, and practical outcomes of this research from a law enforcement perspective.

  7. Improving estimates of N2O emissions for western and central Europe using a Bayesian inversion approach

    Science.gov (United States)

    Thompson, R. L.; Gerbig, C.; Roedenbeck, C.; Heimann, M.

    2009-04-01

    The nitrous oxide (N2O) mixing ratio has been increasing in the atmosphere since the industrial revolution, from 270 ppb in 1750 to 320 ppb in 2007, with a steady growth rate of around 0.26% since the early 1980s. The increase in N2O is worrisome for two main reasons. First, it is a greenhouse gas; its atmospheric increase translates to an enhancement in radiative forcing of 0.16 ± 0.02 W m-2, making it currently the fourth most important long-lived greenhouse gas, and it is predicted to soon overtake the CFCs to become the third most important. Second, it plays an important role in stratospheric ozone chemistry. Human activities are the primary cause of the atmospheric N2O increase. The largest anthropogenic source of N2O is the use of N-fertilizers in agriculture, but fossil fuel combustion and industrial processes, such as adipic and nitric acid production, are also important. We present a Bayesian inversion approach for estimating N2O fluxes over central and western Europe using high-frequency in-situ concentration data from the Ochsenkopf tall tower (50°01′N, 11°48′, 1022 masl). For the inversion, we employ a Lagrangian-type transport model, STILT, which provides source-receptor relationships at 10 km resolution using ECMWF meteorological data. The a priori flux estimates were from IER for anthropogenic fluxes and from GEIA for natural fluxes. N2O fluxes were retrieved monthly at 2° × 2° spatial resolution for 2007. The retrieved N2O fluxes showed significantly more spatial heterogeneity than the a priori field and considerable seasonal variability. The timing of peak emissions differed by region, but in general the months with the strongest emissions were May and August. Overall, the retrieved flux (anthropogenic and natural) was lower than in the a priori field.

  8. A Bayesian cost-benefit approach to the determination of sample size in clinical trials.

    Science.gov (United States)

    Kikuchi, Takashi; Pezeshk, Hamid; Gittins, John

    2008-01-15

    Current practice for sample size computations in clinical trials is largely based on frequentist or classical methods. These methods have the drawback of requiring a point estimate of the variance of the treatment effect and are based on arbitrary settings of type I and II errors. They also do not directly address the question of achieving the best balance between the cost of the trial and the possible benefits from using the new treatment, and they fail to consider the important fact that the number of users depends on the evidence for improvement over the current treatment. Our approach, Behavioural Bayes (or BeBay for short), assumes that the number of patients switching to the new medical treatment depends on the strength of the evidence provided by clinical trials, and takes a value between zero and the number of potential patients. The better the new treatment, the more patients want to switch to it and the greater the benefit obtained. We define the optimal sample size to be the sample size that maximizes the expected net benefit resulting from a clinical trial. Gittins and Pezeshk (Drug Inf. Control 2000; 34:355-363; The Statistician 2000; 49(2):177-187) used a simple form of benefit function, assumed paired comparisons between two medical treatments, and took the variance of the treatment effect to be known. We generalize this setting by introducing a logistic benefit function and by extending to the more usual unpaired case, without assuming the variance to be known.

  9. Implementing Bayesian Vector Autoregressions Implementing Bayesian Vector Autoregressions

    Directory of Open Access Journals (Sweden)

    Richard M. Todd

    1988-03-01

    Full Text Available This paper discusses how the Bayesian approach can be used to construct a type of multivariate forecasting model known as a Bayesian vector autoregression (BVAR). In doing so, we mainly explain Doan, Litterman, and Sims's (1984) propositions on how to estimate a BVAR based on a certain family of prior probability distributions, indexed by a fairly small set of hyperparameters. There is also a discussion of how to specify a BVAR and set up a BVAR database. A 4-variable model is used to illustrate the BVAR approach.
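
    The conjugate shrinkage behind this family of priors can be sketched as a posterior-mean computation. The random-walk prior mean and the single tightness parameter `lam` below are illustrative simplifications of the Doan-Litterman-Sims hyperparameter family, not their exact specification:

```python
import numpy as np

def bvar_posterior_mean(Y, p=1, lam=0.2):
    """Posterior mean of VAR(p) coefficients under a simplified
    Minnesota-style prior: coefficients are shrunk toward a
    random walk (own first lag = 1, all else 0), with tightness
    controlled by lam (smaller lam = tighter prior)."""
    T, k = Y.shape
    # Stack lagged regressors plus an intercept column
    X = np.hstack([Y[p - i - 1:T - i - 1] for i in range(p)])
    X = np.hstack([X, np.ones((T - p, 1))])
    Z = Y[p:]
    m = X.shape[1]
    B0 = np.zeros((m, k))
    B0[:k, :k] = np.eye(k)            # random-walk prior mean
    V0_inv = np.eye(m) / lam**2       # prior precision
    # Conjugate normal posterior mean (unit residual variance assumed)
    return np.linalg.solve(X.T @ X + V0_inv, X.T @ Z + V0_inv @ B0)
```

    As `lam` shrinks, the estimate collapses onto the random-walk prior; as it grows, it approaches equation-by-equation least squares.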

  10. Hydrological budget of Lake Chad: assessment of lake-groundwater interaction by coupling Bayesian approach and chemical budget

    Science.gov (United States)

    Bouchez, Camille; Goncalves, Julio; Deschamps, Pierre; Seidel, Jean-Luc; Doumnang, Jean-Claude; Sylvestre, Florence

    2014-05-01

    Estimation of lake-groundwater interactions is a crucial step in constraining the water balance of lacustrine and aquifer systems. Located in the Sahel, Lake Chad is at the center of an endorheic basin of 2.5 × 10^6 km^2. One of the most remarkable features of this terminal lake is that, despite the semi-arid context and high evaporation rates of the area, its waters are fresh. It has been proposed in the literature that the solutes are evacuated into the underlying Quaternary aquifer, bearing witness to the importance of surface water and groundwater exchanges for the chemical regulation of the lake. The water balance of this system is still not fully understood. The respective roles of evaporation versus infiltration into the Quaternary aquifer are particularly poorly constrained. To assess lake-groundwater flows, we used the conceptual hydrological model of Lake Chad previously proposed by Bader et al. (Hydrological Sciences Journal, 2011). This model involves six parameters, including the infiltration rate. A probabilistic inversion of the parameters, based on an exploration of the parameter space with a Metropolis algorithm (a Markov chain Monte Carlo method), allows the construction of an a posteriori probability density function for each parameter, yielding the best fits between observed and simulated lake levels. Then, a chemical budget of a conservative element, such as chloride, is introduced into the water balance model using the optimal parameters resulting from the Bayesian inverse approach. The model simulates lake level and chloride concentration variations of Lake Chad from 1956 to 2008. Simulated lake levels are in overall agreement with the observations, with a Nash-Sutcliffe efficiency coefficient above 0.94 for all sets of parameters retained. The infiltration value obtained by this probabilistic inversion approach amounts to 120±20 mm/yr, representing 5% of the total outputs of the lake. However, simulated chloride concentrations are overestimated in
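
    The Metropolis exploration of the parameter space described above can be sketched in miniature. The one-parameter toy likelihood and the 120 mm/yr "infiltration rate" below are illustrative stand-ins, not the actual six-parameter Bader et al. lake model:

```python
import numpy as np

rng = np.random.default_rng(0)

def metropolis(log_post, theta0, n_iter=5000, step=0.1):
    """Random-walk Metropolis: propose a local move, accept with
    probability min(1, posterior ratio), and collect the chain."""
    theta = theta0
    lp = log_post(theta)
    samples = []
    for _ in range(n_iter):
        prop = theta + step * rng.normal()
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        samples.append(theta)
    return np.array(samples)

# Toy observations generated with a "true" infiltration rate of 120 mm/yr
true_rate = 120.0
obs = true_rate + 5.0 * rng.normal(size=50)

def log_post(rate):
    # Flat prior on (0, 500), Gaussian likelihood (sigma = 5 mm/yr)
    if not 0.0 < rate < 500.0:
        return -np.inf
    return -0.5 * np.sum((obs - rate) ** 2) / 5.0**2

samples = metropolis(log_post, theta0=100.0, step=2.0)
posterior_mean = samples[2000:].mean()   # discard burn-in
```

    The retained chain approximates the a posteriori probability density function of the parameter, from which point estimates and uncertainty intervals follow directly.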

  11. A Bayesian approach for three-dimensional markerless tumor tracking using kV imaging during lung radiotherapy

    Science.gov (United States)

    Shieh, Chun-Chien; Caillet, Vincent; Dunbar, Michelle; Keall, Paul J.; Booth, Jeremy T.; Hardcastle, Nicholas; Haddad, Carol; Eade, Thomas; Feain, Ilana

    2017-04-01

    The ability to monitor tumor motion without implanted markers can potentially enable broad access to more accurate and precise lung radiotherapy. A major challenge is that kilovoltage (kV) imaging based methods are rarely able to continuously track the tumor due to the inferior tumor visibility on 2D kV images. Another challenge is the estimation of 3D tumor position based on only 2D imaging information. The aim of this work is to address both challenges by proposing, for the first time, a Bayesian approach for markerless tumor tracking. The proposed approach adopts the framework of the extended Kalman filter, which combines prediction and measurement steps to make the optimal tumor position update. For each imaging frame, the tumor position is first predicted by a respiratory-correlated model. The 2D tumor position on the kV image is then measured by template matching. Finally, the prediction and 2D measurement are combined based on the 3D distribution of tumor positions in the past 10 s and the estimated uncertainty of template matching. To investigate the clinical feasibility of the proposed method, a total of 13 lung cancer patient datasets were used for retrospective validation, including 11 cone-beam CT scan pairs and two stereotactic ablative body radiotherapy cases. The ground truths for tumor motion were generated from the 3D trajectories of implanted markers or beacons. The mean, standard deviation, and 95th percentile of the 3D tracking error were found to range from 1.6–2.9 mm, 0.6–1.5 mm, and 2.6–5.8 mm, respectively. Markerless tumor tracking always resulted in smaller errors compared to the standard of care. The improvement was the most pronounced in the superior–inferior (SI) direction, with up to 9.5 mm reduction in the 95th-percentile SI error for patients with >10 mm 5th-to-95th percentile SI tumor motion. The percentage of errors with 3D magnitude <5 mm was 96.5% for markerless tumor tracking and 84.1% for the
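
    The prediction/measurement fusion at the core of the Kalman framework can be illustrated with a scalar update. The numbers below are illustrative; the actual tracker applies this in 3D with a respiratory-correlated prediction and a template-matching measurement:

```python
def kalman_update(x_pred, var_pred, z_meas, var_meas):
    """Combine a model prediction with a noisy measurement.
    The Kalman gain weights the measurement by its relative
    certainty, mirroring the prediction/measurement fusion step."""
    K = var_pred / (var_pred + var_meas)          # Kalman gain
    x_post = x_pred + K * (z_meas - x_pred)       # updated estimate
    var_post = (1.0 - K) * var_pred               # reduced uncertainty
    return x_post, var_post

# Model prediction (uncertain) vs. template-matching measurement (precise):
# with gain 0.8, the posterior sits closer to the more certain measurement
x, v = kalman_update(x_pred=10.0, var_pred=4.0, z_meas=12.0, var_meas=1.0)
```

    When template matching is unreliable (large measurement variance), the gain shrinks and the update leans on the respiratory-correlated prediction instead.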

  12. Bayesian Inference on Gravitational Waves

    Directory of Open Access Journals (Sweden)

    Asad Ali

    2015-12-01

    Full Text Available The Bayesian approach is becoming increasingly popular among the astrophysics data analysis communities. However, the Pakistan statistics communities are unaware of this fertile interaction between the two disciplines. Bayesian methods have been in use to address astronomical problems since the very birth of Bayesian probability in the eighteenth century. Today the Bayesian methods for the detection and parameter estimation of gravitational waves have solid theoretical grounds with a strong promise for realistic applications. This article aims to introduce the Pakistan statistics communities to the applications of Bayesian Monte Carlo methods in the analysis of gravitational wave data, with an overview of Bayesian signal detection and estimation methods and a demonstration via a couple of simplified examples.

  13. A Measure of Systems Engineering Effectiveness in Government Acquisition of Complex Information Systems: A Bayesian Belief Network-Based Approach

    Science.gov (United States)

    Doskey, Steven Craig

    2014-01-01

    This research presents an innovative means of gauging Systems Engineering effectiveness through a Systems Engineering Relative Effectiveness Index (SE REI) model. The SE REI model uses a Bayesian Belief Network to map causal relationships in government acquisitions of Complex Information Systems (CIS), enabling practitioners to identify and…

  14. Bayesian Theory

    CERN Document Server

    Bernardo, Jose M

    2000-01-01

    This highly acclaimed text, now available in paperback, provides a thorough account of key concepts and theoretical results, with particular emphasis on viewing statistical inference as a special case of decision theory. Information-theoretic concepts play a central role in the development of the theory, which provides, in particular, a detailed discussion of the problem of specification of so-called 'prior ignorance'. The work is written from the authors' committed Bayesian perspective, but an overview of non-Bayesian theories is also provided, and each chapter contains a wide-ranging critica

  15. A genetic algorithm-Bayesian network approach for the analysis of metabolomics and spectroscopic data: application to the rapid identification of Bacillus spores and classification of Bacillus species

    Directory of Open Access Journals (Sweden)

    Goodacre Royston

    2011-01-01

    Full Text Available Abstract Background The rapid identification of Bacillus spores and bacterial identification are paramount because of their implications in food poisoning, pathogenesis and their use as potential biowarfare agents. Many automated analytical techniques such as Curie-point pyrolysis mass spectrometry (Py-MS) have been used to identify bacterial spores, giving rise to large amounts of analytical data. This high number of features makes interpretation of the data extremely difficult. We analysed Py-MS data from 36 different strains of aerobic endospore-forming bacteria encompassing seven different species. These bacteria were grown axenically on nutrient agar, and vegetative biomass and spores were analyzed by Curie-point Py-MS. Results We develop a novel genetic algorithm-Bayesian network algorithm that accurately identifies and selects a small subset of key relevant mass spectra (biomarkers) to be further analysed. Once identified, this subset of relevant biomarkers was then used to identify Bacillus spores successfully and to identify Bacillus species via a Bayesian network model specifically built for this reduced set of features. Conclusions This final compact Bayesian network classification model is parsimonious, computationally fast to run, and its graphical visualization allows easy interpretation of the probabilistic relationships among selected biomarkers. In addition, we compare the features selected by the genetic algorithm-Bayesian network approach with the features selected by partial least squares-discriminant analysis (PLS-DA). The classification accuracy results show that the set of features selected by the GA-BN is far superior to PLS-DA.

  16. Bayesian methods for measures of agreement

    CERN Document Server

    Broemeling, Lyle D

    2009-01-01

    Using WinBUGS to implement Bayesian inference for estimation and hypothesis testing, Bayesian Methods for Measures of Agreement presents useful methods for the design and analysis of agreement studies. It focuses on agreement among the various players in the diagnostic process. The author employs a Bayesian approach to provide statistical inferences based on various models of intra- and interrater agreement. He presents many examples that illustrate the Bayesian mode of reasoning and explains elements of a Bayesian application, including prior information, experimental information, the likelihood function, posterior distribution, and predictive distribution. The appendices provide the necessary theoretical foundation to understand Bayesian methods as well as introduce the fundamentals of programming and executing the WinBUGS software. Taking a Bayesian approach to inference, this hands-on book explores numerous measures of agreement, including the Kappa coefficient, the G coefficient, and intraclass correlation...

  17. Advances in Bayesian Modeling in Educational Research

    Science.gov (United States)

    Levy, Roy

    2016-01-01

    In this article, I provide a conceptually oriented overview of Bayesian approaches to statistical inference and contrast them with frequentist approaches that currently dominate conventional practice in educational research. The features and advantages of Bayesian approaches are illustrated with examples spanning several statistical modeling…

  18. A Bayesian approach for decision making on the identification of genes with different expression levels: an application to Escherichia coli bacterium data.

    Science.gov (United States)

    Saraiva, Erlandson F; Louzada, Francisco; Milan, Luís A; Meira, Silvana; Cobre, Juliana

    2012-01-01

    A common interest in gene expression data analysis is to identify, from a large pool of candidate genes, the genes that present significant changes in expression levels between a treatment and a control biological condition. Usually, this is done using a statistic value and a cutoff value that separate the differentially from the nondifferentially expressed genes. In this paper, we propose a Bayesian approach to identify differentially expressed genes by sequentially calculating credibility intervals from predictive densities, which are constructed using the sampled mean treatment effect from all genes under study, excluding the treatment effect of genes previously identified as showing statistical evidence for difference. We compare our Bayesian approach with the standard ones based on the use of the t-test and modified t-tests via a simulation study, using small sample sizes, which are common in gene expression data analysis. The results provide evidence that the proposed approach performs better than the standard ones, especially for cases with mean differences and increases in treatment variance relative to control variance. We also apply the methodologies to a well-known publicly available data set on the Escherichia coli bacterium.
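
    The sequential-exclusion idea can be sketched as follows. This is a deliberately simplified stand-in that uses a normal interval in place of the paper's predictive credibility intervals; the threshold `z` and the flagging rule are illustrative assumptions:

```python
import numpy as np

def flag_de_genes(effects, z=2.58):
    """Sequentially flag differentially expressed genes: build an
    interval from the genes not yet flagged, flag the most extreme
    gene lying outside it, and repeat until none remain outside.
    Excluding flagged genes keeps extreme effects from inflating
    the reference distribution."""
    effects = np.asarray(effects, dtype=float)
    flagged = np.zeros(len(effects), dtype=bool)
    while True:
        rest = effects[~flagged]
        mu, sd = rest.mean(), rest.std(ddof=1)
        dev = np.abs(effects - mu)
        dev[flagged] = -np.inf          # ignore already-flagged genes
        i = int(np.argmax(dev))
        if dev[i] <= z * sd:
            return flagged              # no gene left outside the interval
        flagged[i] = True
```

    Each pass tightens the reference interval, so moderately shifted genes can still be detected once the most extreme ones have been removed from the pool.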

  19. Perceptual stimulus-A Bayesian-based integration of multi-visual-cue approach and its application

    Institute of Scientific and Technical Information of China (English)

    XUE JianRu; ZHENG NanNing; ZHONG XiaoPin; PING LinJiang

    2008-01-01

    With the view that a visual cue can be taken as a kind of stimulus, the study of the mechanism of the visual perception process using visual cues in their probabilistic representation eventually leads to a class of statistical integration of multiple visual cues (IMVC) methods, which have been applied widely in perceptual grouping, video analysis, and other basic problems in computer vision. In this paper, a survey of the basic ideas and recent advances of IMVC methods is presented, with much focus on the models and algorithms of IMVC for video analysis within the framework of Bayesian estimation. Two typical problems in video analysis, robust visual tracking and the "switching problem" in multi-target tracking (MTT), are taken as test beds to verify a series of Bayesian-based IMVC methods proposed by the authors. Finally, the relations between statistical IMVC and the visual perception process, as well as potential future research work for IMVC, are discussed.

  20. A Bayesian approach to the evaluation of risk-based microbiological criteria for Campylobacter in broiler meat

    DEFF Research Database (Denmark)

    Ranta, Jukka; Lindqvist, Roland; Hansson, Ingrid

    2015-01-01

    Shifting from traditional hazard-based food safety management toward risk-based management requires statistical methods for evaluating intermediate targets in food production, such as microbiological criteria (MC), in terms of their effects on human risk of illness. A fully risk-based evaluation of MC involves several uncertainties that are related to both the underlying Quantitative Microbiological Risk Assessment (QMRA) model and the production-specific sample data on the prevalence and concentrations of microbes in production batches. We used Bayesian modeling for statistical inference and evidence synthesis of two sample data sets. Thus, parameter uncertainty was represented by a joint posterior distribution, which we then used to predict the risk and to evaluate the criteria for acceptance of production batches. We also applied the Bayesian model to compare alternative criteria, accounting...

  1. A novel Bayesian approach to quantify clinical variables and to determine their spectroscopic counterparts in 1H NMR metabonomic data

    Directory of Open Access Journals (Sweden)

    Kaski Kimmo

    2007-05-01

    Full Text Available Abstract Background A key challenge in metabonomics is to uncover quantitative associations between multidimensional spectroscopic data and biochemical measures used for disease risk assessment and diagnostics. Here we focus on clinically relevant estimation of lipoprotein lipids by 1H NMR spectroscopy of serum. Results A Bayesian methodology, with a biochemical motivation, is presented for a real 1H NMR metabonomics data set of 75 serum samples. Lipoprotein lipid concentrations were independently obtained for these samples via ultracentrifugation and specific biochemical assays. The Bayesian models were constructed by Markov chain Monte Carlo (MCMC) and showed remarkably good quantitative performance, the predictive R-values being 0.985 for the very low density lipoprotein triglycerides (VLDL-TG), and 0.787, 0.943, and 0.933 for the intermediate, low, and high density lipoprotein cholesterol (IDL-C, LDL-C and HDL-C), respectively. The modelling produced a kernel-based reformulation of the data, the parameters of which coincided with the well-known biochemical characteristics of the 1H NMR spectra; particularly for VLDL-TG and HDL-C the Bayesian methodology was able to clearly identify the most characteristic resonances within the heavily overlapping information in the spectra. For IDL-C and LDL-C the resulting model kernels were more complex than those for VLDL-TG and HDL-C, probably reflecting the severe overlap of the IDL and LDL resonances in the 1H NMR spectra. Conclusion The systematic use of Bayesian MCMC analysis is computationally demanding. Nevertheless, the combination of high-quality quantification and the biochemical rationale of the resulting models is expected to be useful in the field of metabonomics.

  2. Evaluation of a Bayesian Approach to Estimate Vancomycin Exposure in Obese Patients with Limited Pharmacokinetic Sampling: A Pilot Study.

    Science.gov (United States)

    Carreno, Joseph J; Lomaestro, Ben; Tietjan, John; Lodise, Thomas P

    2017-03-13

    This study evaluated the predictive performance of a Bayesian PK estimation method (ADAPT V) for estimating the 24-hour vancomycin area under the curve (AUC) with limited PK sampling in adult obese patients receiving vancomycin for suspected or confirmed Gram-positive infections. This was an IRB-approved prospective evaluation of 12 patients. Patients had a median (95% CI) age of 61 years (39 - 71), creatinine clearance of 86 mL/min (75 - 120), and body mass index of 45 kg/m(2) (40 - 52). For each patient, five PK concentrations were measured, and 4 different vancomycin population PK models were used as Bayesian priors to estimate the vancomycin AUC. Using each PK model as a prior, data-depleted PK subsets were used to estimate the 24-hour AUC (i.e. peak and trough data [AUCPT], midpoint and trough data [AUCMT], and trough-only data [AUCT]). The 24-hour AUC derived from the full data set (AUCFULL) was compared to the AUC derived from the data-depleted subsets (AUCPT, AUCMT, AUCT) for each model. For the 4 sets of analyses, AUCFULL estimates ranged from 437 to 489 mg-h/L. The AUCPT provided the best approximation of the AUCFULL; AUCMT and AUCT tended to overestimate AUCFULL. Further prospective studies are needed to evaluate the impact of AUC monitoring in clinical practice, but the findings from this study suggest the vancomycin AUC can be estimated with good precision and accuracy with limited PK sampling using Bayesian PK estimation software.

  3. A Bayesian Network-Based Approach to Selection of Intervention Points in the Mitogen-Activated Protein Kinase Plant Defense Response Pathway.

    Science.gov (United States)

    Venkat, Priya S; Narayanan, Krishna R; Datta, Aniruddha

    2017-04-01

    An important problem in computational biology is the identification of potential points of intervention that can lead to modified network behavior in a genetic regulatory network. We consider the problem of deducing the effect of individual genes on the behavior of the network in a statistical framework. In this article, we make use of biological information from the literature to develop a Bayesian network and introduce a method to estimate its parameters using data relevant to the biological phenomena under study. We then give a novel decision-theoretic approach to selecting significant nodes in the network. The proposed method is applied to the analysis of the mitogen-activated protein kinase pathway in the plant defense response to pathogens. Results from applying the method to experimental data show that the proposed approach is effective in selecting genes that play crucial roles in the biological phenomenon being studied.

  4. Bayesian signaling

    OpenAIRE

    Hedlund, Jonas

    2014-01-01

    This paper introduces private sender information into a sender-receiver game of Bayesian persuasion with monotonic sender preferences. I derive properties of increasing differences related to the precision of signals and use these to fully characterize the set of equilibria robust to the intuitive criterion. In particular, all such equilibria are either separating, i.e., the sender's choice of signal reveals his private information to the receiver, or fully disclosing, i.e., the outcome of th...

  5. Bayesian Monitoring.

    OpenAIRE

    Kirstein, Roland

    2005-01-01

    This paper presents a modification of the inspection game: The "Bayesian Monitoring" model rests on the assumption that judges are interested in enforcing compliant behavior and making correct decisions. They may base their judgements on an informative but imperfect signal which can be generated costlessly. In the original inspection game, monitoring is costly and generates a perfectly informative signal. While the inspection game has only one mixed strategy equilibrium, three Perfect Bayesia...

  6. Bayesian inference for generalized linear mixed models with predictors subject to detection limits: an approach that leverages information from auxiliary variables.

    Science.gov (United States)

    Yue, Yu Ryan; Wang, Xiao-Feng

    2016-05-10

    This paper is motivated by a retrospective study of the impact of vitamin D deficiency on clinical outcomes for critically ill patients in multi-center critical care units. The primary predictors of interest, vitamin D2 and D3 levels, are censored at a known detection limit. Within the context of generalized linear mixed models, we investigate statistical methods to handle multiple censored predictors in the presence of auxiliary variables. A Bayesian joint modeling approach is proposed to fit the complex heterogeneous multi-center data, in which the data information is fully used to estimate parameters of interest. Efficient Markov chain Monte Carlo algorithms are specifically developed depending on the nature of the response. Simulation studies demonstrate that the proposed Bayesian approach outperforms other existing methods. An application to the data set from the vitamin D deficiency study is presented. Possible extensions of the method regarding the absence of auxiliary variables, semiparametric models, as well as the type of censoring are also discussed.

  7. Zero-modified Poisson model: Bayesian approach, influence diagnostics, and an application to a Brazilian leptospirosis notification data.

    Science.gov (United States)

    Conceição, Katiane S; Andrade, Marinho G; Louzada, Francisco

    2013-09-01

    In this paper, a Bayesian method for inference is developed for the zero-modified Poisson (ZMP) regression model. This model is very flexible for analyzing count data without requiring any information about inflation or deflation of zeros in the sample. A general class of prior densities based on an information matrix is considered for the model parameters. A sensitivity study to detect influential cases that can change the results is performed based on the Kullback-Leibler divergence. Simulation studies are presented in order to illustrate the performance of the developed methodology. Two real datasets on leptospirosis notification in Bahia State (Brazil) are analyzed using the proposed methodology for the ZMP model.
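
    One common textbook parameterization of the zero-modified Poisson re-weights the zero probability of a plain Poisson. The symbols `lam` and `p` below follow that convention and are not taken from the paper:

```python
import math

def zmp_pmf(y, lam, p):
    """Zero-modified Poisson pmf: Poisson(lam) with its zero mass
    re-weighted by p. p = 1 recovers the plain Poisson, p < 1
    inflates zeros, and 1 < p <= 1/(1 - exp(-lam)) deflates them."""
    if y == 0:
        return 1.0 - p * (1.0 - math.exp(-lam))
    return p * lam**y * math.exp(-lam) / math.factorial(y)
```

    Because the single parameter `p` covers both inflation and deflation of zeros, the same likelihood can be fit without knowing in advance which the sample exhibits, which is the flexibility the abstract refers to.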

  8. Estimating seed and pollen movement in a monoecious plant: a hierarchical Bayesian approach integrating genetic and ecological data.

    Science.gov (United States)

    Moran, Emily V; Clark, James S

    2011-03-01

    The scale of seed and pollen movement in plants has a critical influence on population dynamics and interspecific interactions, as well as on their capacity to respond to environmental change through migration or local adaptation. However, dispersal can be challenging to quantify. Here, we present a Bayesian model that integrates genetic and ecological data to simultaneously estimate effective seed and pollen dispersal parameters and the parentage of sampled seedlings. This model is the first developed for monoecious plants that accounts for genotyping error and treats dispersal from within and beyond a plot in a fully consistent manner. The flexible Bayesian framework allows the incorporation of a variety of ecological variables, including individual variation in seed production, as well as multiple sources of uncertainty. We illustrate the method using data from a mixed population of red oak (Quercus rubra, Q. velutina, Q. falcata) in the NC piedmont. For simulated test data sets, the model successfully recovered the simulated dispersal parameters and pedigrees. Pollen dispersal in the example population was extensive, with an average father-mother distance of 178 m. Estimated seed dispersal distances at the piedmont site were substantially longer than previous estimates based on seed-trap data (average 128 m vs. 9.3 m), suggesting that, under some circumstances, oaks may be less dispersal-limited than is commonly thought, with a greater potential for range shifts in response to climate change.

  9. The pharmacokinetics of dexmedetomidine during long-term infusion in critically ill pediatric patients. A Bayesian approach with informative priors.

    Science.gov (United States)

    Wiczling, Paweł; Bartkowska-Śniatkowska, Alicja; Szerkus, Oliwia; Siluk, Danuta; Rosada-Kurasińska, Jowita; Warzybok, Justyna; Borsuk, Agnieszka; Kaliszan, Roman; Grześkowiak, Edmund; Bienert, Agnieszka

    2016-06-01

    The purpose of this study was to assess the pharmacokinetics of dexmedetomidine in the ICU setting during prolonged infusion and to compare it with existing literature data using Bayesian population modeling with literature-based informative priors. Thirty-eight patients were included in the analysis, with concentration measurements obtained on two occasions: first from 0 to 24 h after infusion initiation, and second from 0 to 8 h after infusion end. Data analysis was conducted using WinBUGS software. The prior information on dexmedetomidine pharmacokinetics was elicited from a literature study pooling results from a relatively large group of 95 children. A two-compartment PK model, with allometrically scaled parameters, maturation of clearance, and a Student's t residual distribution on a log-scale, was used to describe the data. The incorporation of time-dependent (different between the two occasions) PK parameters improved the model. It was observed that the volume of distribution was 1.5-fold higher on the second occasion. There was also evidence of increased (1.3-fold) clearance on the second occasion, with a posterior probability of 62%. This work demonstrates the usefulness of Bayesian modeling with informative priors in analyzing pharmacokinetic data and comparing it with existing literature knowledge.
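
    The role of a literature-based informative prior can be illustrated with a conjugate normal update for a single log-scale parameter. The numbers below are illustrative, not the study's estimates:

```python
import math

def normal_posterior(prior_mean, prior_sd, data_mean, data_se):
    """Conjugate normal update: combine a literature-based prior
    with the likelihood summary from the new data, weighting each
    by its precision (1/variance)."""
    w_prior = 1.0 / prior_sd**2
    w_data = 1.0 / data_se**2
    post_mean = (w_prior * prior_mean + w_data * data_mean) / (w_prior + w_data)
    post_sd = math.sqrt(1.0 / (w_prior + w_data))
    return post_mean, post_sd

# Hypothetical literature prior for a log-clearance parameter vs. a
# summary of the new ICU data: the posterior lands between the two,
# pulled toward whichever source is more precise
m, s = normal_posterior(prior_mean=0.0, prior_sd=0.5, data_mean=0.4, data_se=0.2)
```

    A sparse new data set (large standard error) leaves the posterior close to the literature prior, which is exactly why informative priors help when ICU sampling is limited.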

  10. An ecosystem service approach to support integrated pond management: a case study using Bayesian belief networks--highlighting opportunities and risks.

    Science.gov (United States)

    Landuyt, Dries; Lemmens, Pieter; D'hondt, Rob; Broekx, Steven; Liekens, Inge; De Bie, Tom; Declerck, Steven A J; De Meester, Luc; Goethals, Peter L M

    2014-12-01

    Freshwater ponds deliver a broad range of ecosystem services (ESS). Taking into account this broad range of services to attain cost-effective ESS delivery is an important challenge facing integrated pond management. To assess the strengths and weaknesses of an ESS approach for supporting decisions in integrated pond management, we applied it to a small case study in Flanders, Belgium. A Bayesian belief network model was developed to assess ESS delivery under three alternative pond management scenarios: intensive fish farming (IFF), extensive fish farming (EFF) and nature conservation management (NCM). A probabilistic cost-benefit analysis was performed that includes both the costs associated with pond management practices and the benefits associated with ESS delivery. Whether or not a particular ESS is included in the analysis affects the model's identification of the most preferable management scenario. Assessing the delivery of a more complete set of ecosystem services tends to shift the results away from intensive management toward more biodiversity-oriented management scenarios. The proposed methodology illustrates the potential of Bayesian belief networks. BBNs facilitate knowledge integration, and their modular nature encourages future model expansion to more encompassing sets of services. Yet we also illustrate the key weaknesses of such exercises, namely that the choice of whether or not to include a particular ecosystem service may determine the suggested optimal management practice.

  11. Bayesian approach in MN low dose of radiation counting; Estimacion bayesiana en el contaje de micronucleos a bajas dosis de radiacion

    Energy Technology Data Exchange (ETDEWEB)

    Serna Berna, A.; Alcaraz, M.; Acevedo, C.; Navarro, J. L.; Alcanzar, M. D.; Canteras, M.

    2006-07-01

    The micronucleus (MN) assay in lymphocytes is a well-established technique for the assessment of genetic damage induced by ionizing radiation. Due to the presence of a natural background of MN, the net MN count is obtained by subtracting this background from the gross value. When very low doses of radiation are given, the induced MN count is close to, or even lower than, the predetermined background value. Furthermore, the damage distribution induced by the radiation follows a Poisson probability distribution. These two facts make it difficult to obtain the net counting rate in exposed situations. It is possible to overcome this problem using a Bayesian approach, in which the selection of prior distributions for the background and the net counting rate plays an important role. In the present work we perform a detailed analysis using Bayesian theory to infer the net counting rate in two different situations: a) when the background is known for an individual sample, using the exact value for the background and a Jeffreys prior for the net counting rate, and b) when the background is not known, using a population background distribution as the background prior and a constant prior for the net counting rate. (Author)
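
Situation (a) above admits a small numerical sketch: gross counts are Poisson with rate b + s, the background b is known exactly, and a Jeffreys prior p(s) ∝ s^(-1/2) is placed on the net rate s. The counts and background below are hypothetical, and the posterior is evaluated on a simple grid rather than analytically.

```python
import math

def net_rate_posterior(n_gross, b_known, s_grid):
    """Normalized grid posterior p(s | n) ∝ Poisson(n | b + s) * s^(-1/2)."""
    post = []
    for s in s_grid:
        lam = b_known + s
        log_like = n_gross * math.log(lam) - lam   # Poisson log-likelihood, constant dropped
        log_prior = -0.5 * math.log(s)             # Jeffreys prior on the net rate
        post.append(math.exp(log_like + log_prior))
    z = sum(post)
    return [p / z for p in post]

# Hypothetical numbers: 12 micronuclei scored, known background rate of 8
s_grid = [0.05 * k for k in range(1, 601)]   # net rate grid from 0.05 to 30
post = net_rate_posterior(12, 8.0, s_grid)
mean_s = sum(s * p for s, p in zip(s_grid, post))
```

Even when the gross count barely exceeds the background, the posterior stays concentrated on non-negative net rates by construction, which is the practical advantage the abstract points to.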

  12. Time trends of esophageal cancer mortality in Linzhou city during the period 1988-2010 and a Bayesian approach projection for 2020.

    Science.gov (United States)

    Liu, Shu-Zheng; Zhang, Fang; Quan, Pei-Liang; Lu, Jian-Bang; Liu, Zhi-Cai; Sun, Xi-Bin

    2012-01-01

    In recent decades, decreasing trends in esophageal cancer mortality have been observed across China. We here describe esophageal cancer mortality trends in Linzhou city, a high-incidence region for esophageal cancer in China, during 1988-2010 and make an esophageal cancer mortality projection for the period 2011-2020 using a Bayesian approach. Age-standardized mortality rates were estimated by direct standardization to the world population structure of 1985. A Bayesian age-period-cohort (BAPC) analysis was carried out in order to investigate the effects of age, period and birth cohort on esophageal cancer mortality in Linzhou during 1988-2010 and to estimate future trends for the period 2011-2020. Age-adjusted rates for men and women decreased from 1988 to 2005 and changed little thereafter. Risk increased from 30 years of age into very old age. Period effects showed little variation in risk throughout 1988-2010. In contrast, cohort effects showed greatly decreased risk in later cohorts. Forecasting based on BAPC modeling indicated an increasing burden of mortality and a decreasing age-standardized mortality rate of esophageal cancer in Linzhou city. The decrease in esophageal cancer mortality risk since the 1930 cohort could be attributable to improvements in the socio-economic environment and lifestyle. The standardized mortality rate of esophageal cancer should decrease continually, while the aging of the population could explain the increase in esophageal cancer mortality projected for 2020.

  13. Bayesian programming

    CERN Document Server

    Bessiere, Pierre; Ahuactzin, Juan Manuel; Mekhnacha, Kamel

    2013-01-01

    Probability as an Alternative to Boolean Logic: While logic is the mathematical foundation of rational reasoning and the fundamental principle of computing, it is restricted to problems where information is both complete and certain. However, many real-world problems, from financial investments to email filtering, are incomplete or uncertain in nature. Probability theory and Bayesian computing together provide an alternative framework to deal with incomplete and uncertain data. Decision-Making Tools and Methods for Incomplete and Uncertain Data: Emphasizing probability as an alternative to Boolean

  14. Bayesian community detection

    DEFF Research Database (Denmark)

    Mørup, Morten; Schmidt, Mikkel N

    2012-01-01

    Many networks of scientific interest naturally decompose into clusters or communities with comparatively fewer external than internal links; however, current Bayesian models of network communities do not exert this intuitive notion of communities. We formulate a nonparametric Bayesian model for community detection consistent with an intuitive definition of communities and present a Markov chain Monte Carlo procedure for inferring the community structure. A Matlab toolbox with the proposed inference procedure is available for download. On synthetic networks, our model detects communities consistent with ground truth, and on real networks, it outperforms existing approaches in predicting missing links. This suggests that community structure is an important structural property of networks that should be explicitly modeled.

  15. A Bayesian approach for evaluation of the effect of water quality model parameter uncertainty on TMDLs: A case study of Miyun Reservoir

    Energy Technology Data Exchange (ETDEWEB)

    Liang, Shidong, E-mail: emblembl@sina.com [School of Environment, Tsinghua University, 1 Qinghuayuan, Haidian District, Beijing 100084 (China); Jia, Haifeng, E-mail: jhf@tsinghua.edu.cn [School of Environment, Tsinghua University, 1 Qinghuayuan, Haidian District, Beijing 100084 (China); Xu, Changqing, E-mail: 2008changqing@163.com [School of Environment, Tsinghua University, 1 Qinghuayuan, Haidian District, Beijing 100084 (China); Xu, Te, E-mail: xt_lichking@qq.com [School of Environment, Tsinghua University, 1 Qinghuayuan, Haidian District, Beijing 100084 (China); Melching, Charles, E-mail: steve.melching17@gmail.com [Melching Water Solutions, 4030 W. Edgerton Avenue, Greenfield, WI 53221 (United States)

    2016-08-01

    Facing increasingly serious water pollution, the Chinese government is changing its environmental management strategy from solely pollutant concentration control to a Total Maximum Daily Load (TMDL) program, and water quality models are increasingly being applied to determine the allowable pollutant load in the TMDL. Despite the frequent use of models, few studies have focused on how parameter uncertainty in water quality models affects the allowable pollutant loads in the TMDL program, particularly for complicated, high-dimensional water quality models. Uncertainty analysis for such models is limited by time-consuming simulation and by high dimensionality and nonlinearity in parameter spaces. In this study, an allowable pollutant load calculation platform was established using the Environmental Fluid Dynamics Code (EFDC), a widely applied hydrodynamic-water quality model. A Bayesian approach, the DiffeRential Evolution Adaptive Metropolis (DREAM) algorithm, which is a high-efficiency, multi-chain Markov Chain Monte Carlo (MCMC) method, was applied to assess the effects of parameter uncertainty on the water quality model simulations and its influence on the allowable pollutant load calculation in the TMDL program. Miyun Reservoir, the most important surface drinking water source for Beijing, suffers from eutrophication and was selected as a case study. The relations between pollutant loads and water quality indicators were obtained through a graphical method in the simulation platform. Ranges of allowable pollutant loads were obtained according to the results of the parameter uncertainty analysis: Total Organic Carbon (TOC): 581.5–1030.6 t·yr⁻¹; Total Phosphorus (TP): 23.3–31.0 t·yr⁻¹; and Total Nitrogen (TN): 480.0–1918.0 t·yr⁻¹. The wide ranges of allowable pollutant loads reveal the importance of parameter uncertainty analysis in a TMDL program for allowable pollutant load calculation and margin of safety (MOS).
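
The study's DREAM algorithm is a multi-chain differential-evolution MCMC sampler and EFDC is a full hydrodynamic model; as a minimal stand-in, the sketch below runs single-chain random-walk Metropolis on a toy one-parameter surrogate and then propagates the posterior parameter samples into a range of allowable loads, mirroring the overall workflow. The surrogate model, observations, target, and all numbers are invented for illustration.

```python
import math
import random

random.seed(0)

# Toy surrogate: predicted concentration as a function of load and a decay parameter k
def concentration(load, k):
    return load * math.exp(-k)

# Synthetic observations of concentration at a known load, with measurement noise
obs = [3.1, 2.8, 3.0]
known_load = 10.0
sigma = 0.3

def log_post(k):
    """Gaussian likelihood with a flat prior on k > 0."""
    if k <= 0:
        return float("-inf")
    return sum(-0.5 * ((y - concentration(known_load, k)) / sigma) ** 2 for y in obs)

# Random-walk Metropolis over the model parameter
k, lp, samples = 1.0, log_post(1.0), []
for _ in range(5000):
    prop = k + random.gauss(0, 0.1)
    lp_prop = log_post(prop)
    if random.random() < math.exp(min(0.0, lp_prop - lp)):
        k, lp = prop, lp_prop
    samples.append(k)

# Propagate posterior parameter uncertainty to the allowable load for a target concentration
burned = samples[1000:]
target = 2.0
loads = sorted(target / math.exp(-kk) for kk in burned)
lo, hi = loads[int(0.025 * len(loads))], loads[int(0.975 * len(loads))]
```

The interval `(lo, hi)` plays the role of the abstract's range of allowable pollutant loads: instead of one deterministic load, the parameter posterior yields a band whose width quantifies the margin of safety.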

  16. Maximum margin Bayesian network classifiers.

    Science.gov (United States)

    Pernkopf, Franz; Wohlmayr, Michael; Tschiatschek, Sebastian

    2012-03-01

    We present a maximum margin parameter learning algorithm for Bayesian network classifiers using a conjugate gradient (CG) method for optimization. In contrast to previous approaches, we maintain the normalization constraints on the parameters of the Bayesian network during optimization, i.e., the probabilistic interpretation of the model is not lost. This enables us to handle missing features in discriminatively optimized Bayesian networks. In experiments, we compare the classification performance of maximum margin parameter learning to conditional likelihood and maximum likelihood learning approaches. Discriminative parameter learning significantly outperforms generative maximum likelihood estimation for naive Bayes and tree augmented naive Bayes structures on all considered data sets. Furthermore, maximizing the margin dominates the conditional likelihood approach in terms of classification performance in most cases. We provide results for a recently proposed maximum margin optimization approach based on convex relaxation. While the classification results are highly similar, our CG-based optimization is computationally up to orders of magnitude faster. Margin-optimized Bayesian network classifiers achieve classification performance comparable to support vector machines (SVMs) using fewer parameters. Moreover, we show that unanticipated missing feature values during classification can be easily processed by discriminatively optimized Bayesian network classifiers, a case where discriminative classifiers usually require mechanisms to complete unknown feature values in the data first.
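
The claim that a normalized (i.e., still probabilistic) Bayesian network classifier can handle missing features follows from the model's probabilistic interpretation: a missing feature is simply summed out, which in a naive Bayes factorization means dropping its factor. A minimal sketch with hypothetical class priors and conditional tables (the parameters here are illustrative, not discriminatively trained):

```python
# Class priors and per-feature conditional tables P(feature present | class)
priors = {"spam": 0.4, "ham": 0.6}
p_feat = {
    "spam": [0.8, 0.7, 0.1],
    "ham":  [0.2, 0.3, 0.4],
}

def posterior(x):
    """x is a list of 0/1/None; None marks a missing feature, whose factor sums out."""
    scores = {}
    for c, prior in priors.items():
        s = prior
        for xi, p in zip(x, p_feat[c]):
            if xi is None:
                continue  # marginalizing a feature out of a product model drops its factor
            s *= p if xi == 1 else (1 - p)
        scores[c] = s
    z = sum(scores.values())
    return {c: v / z for c, v in scores.items()}

full = posterior([1, 1, 0])      # all features observed
partial = posterior([1, None, 0])  # second feature missing at classification time
```

No imputation mechanism is needed, which is the advantage over non-probabilistic discriminative classifiers that the abstract highlights.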

  17. Bayesian inference for Hawkes processes

    DEFF Research Database (Denmark)

    Rasmussen, Jakob Gulddahl

    The Hawkes process is a practically and theoretically important class of point processes, but parameter-estimation for such a process can pose various problems. In this paper we explore and compare two approaches to Bayesian inference. The first approach is based on the so-called conditional...

  18. Bayesian Approach to Effective Model of NiGa2S4 Triangular Lattice with Boltzmann Factor

    Science.gov (United States)

    Takenaka, Hikaru; Nagata, Kenji; Mizokawa, Takashi; Okada, Masato

    2016-12-01

    We propose a method for introducing the Boltzmann factor to extract effective classical spin Hamiltonians from mean-field-type electronic structure calculations by means of Bayesian inference. This method enables us to compare electronic structure calculations with experiments according to the classical model at a finite temperature. Application of this method to unrestricted Hartree-Fock calculations for NiGa2S4 led to the estimation that the superexchange interaction between nearest-neighbor sites is ferromagnetic at low temperature, which is consistent with magnetic experimental results. This supports the theory that competition between the antiferromagnetic third-neighbor interaction and the ferromagnetic nearest-neighbor interaction may lead to the quantum spin liquid in NiGa2S4.

  19. Linking urbanization to the Biological Condition Gradient (BCG) for stream ecosystems in the Northeastern United States using a Bayesian network approach

    Science.gov (United States)

    Kashuba, Roxolana; McMahon, Gerard; Cuffney, Thomas F.; Qian, Song; Reckhow, Kenneth; Gerritsen, Jeroen; Davies, Susan

    2012-01-01

    Urban development alters important physical, chemical, and biological processes that define urban stream ecosystems. An approach was developed for quantifying the effects of these processes on aquatic biota, and then linking those effects to endpoints that can be used for environmental management. These complex, interacting systems are challenging to model from a scientific standpoint. A desirable model clearly shows the system, simulates the interactions, and ultimately predicts results of management actions. Traditional regression techniques that calculate empirical relations between pairs of environmental factors do not capture the interconnected web of multiple stressors, but urban development effects are not yet understood at the detailed scales required to make mechanistic modeling approaches feasible. Therefore, in contrast to a fully deterministic or fully statistical modeling approach, a Bayesian network model provides a hybrid approach that can represent known general associations between variables while acknowledging uncertainty in predicted outcomes. It does so by quantifying an expert-elicited network of probabilistic relations between variables. Advantages of this modeling approach include (1) flexibility in accommodating many model specifications and information types; (2) efficiency in storing, manipulating, and parameterizing complex information; and (3) transparency in describing the relations using nodes and arrows and in describing uncertainties with discrete probability distributions for each variable.
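
An expert-elicited network of probabilistic relations can be sketched as a tiny chain of conditional probability tables in which intermediate variables are summed out. The nodes and numbers below are invented for illustration and are not taken from the study.

```python
# Illustrative CPTs for a chain: urbanization -> impervious cover -> biotic condition
p_cover = {   # P(impervious cover level | urbanization)
    "high": {"high": 0.8, "low": 0.2},
    "low":  {"high": 0.1, "low": 0.9},
}
p_biotic = {  # P(biotic condition | impervious cover)
    "high": {"good": 0.2, "poor": 0.8},
    "low":  {"good": 0.7, "poor": 0.3},
}

def p_biotic_given_urban(urban):
    """Inference by enumeration: P(b | u) = sum_c P(c | u) * P(b | c)."""
    return {
        b: sum(p_cover[urban][c] * p_biotic[c][b] for c in ("high", "low"))
        for b in ("good", "poor")
    }

high = p_biotic_given_urban("high")
low = p_biotic_given_urban("low")
```

The output is a discrete probability distribution over the management endpoint rather than a point prediction, which is exactly the "acknowledging uncertainty" property described above; real BBN tools scale the same enumeration (or faster message passing) to many nodes.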

  20. Bayesian demography 250 years after Bayes.

    Science.gov (United States)

    Bijak, Jakub; Bryant, John

    2016-01-01

    Bayesian statistics offers an alternative to classical (frequentist) statistics. It is distinguished by its use of probability distributions to describe uncertain quantities, which leads to elegant solutions to many difficult statistical problems. Although Bayesian demography, like Bayesian statistics more generally, is around 250 years old, only recently has it begun to flourish. The aim of this paper is to review the achievements of Bayesian demography, address some misconceptions, and make the case for wider use of Bayesian methods in population studies. We focus on three applications: demographic forecasts, limited data, and highly structured or complex models. The key advantages of Bayesian methods are the ability to integrate information from multiple sources and to describe uncertainty coherently. Bayesian methods also allow for including additional (prior) information next to the data sample. As such, Bayesian approaches are complementary to many traditional methods, which can be productively re-expressed in Bayesian terms.

  1. Bayesian theory and applications

    CERN Document Server

    Dellaportas, Petros; Polson, Nicholas G; Stephens, David A

    2013-01-01

    The development of hierarchical models and Markov chain Monte Carlo (MCMC) techniques forms one of the most profound advances in Bayesian analysis since the 1970s and provides the basis for advances in virtually all areas of applied and theoretical Bayesian statistics. This volume guides the reader along a statistical journey that begins with the basic structure of Bayesian theory, and then provides details on most of the past and present advances in this field. The book has a unique format. There is an explanatory chapter devoted to each conceptual advance followed by journal-style chapters that provide applications or further advances on the concept. Thus, the volume is both a textbook and a compendium of papers covering a vast range of topics. It is appropriate for a well-informed novice interested in understanding the basic approach, methods and recent applications. Because of its advanced chapters and recent work, it is also appropriate for a more mature reader interested in recent applications and devel...

  2. An Approach for R&D Partner Selection in Alliances between Large Companies, and Small and Medium Enterprises (SMEs): Application of Bayesian Network and Patent Analysis

    Directory of Open Access Journals (Sweden)

    Keeeun Lee

    2016-01-01

    The enhanced R&D cooperative efforts between large firms and small and medium-sized enterprises (SMEs) have been emphasized as a way to perform innovation projects and succeed in deploying profitable businesses. In order to promote such win-win alliances, it is necessary to consider the capabilities of large firms and SMEs, respectively. Thus, this paper proposes a new approach to partner selection when a large firm assesses SMEs as potential candidates for R&D collaboration. The first step of the suggested approach is to define the technology the firm needs by referring to a structured technology roadmap, a useful technique for partner selection from the perspective of a large firm. Second, a list of appropriate SME candidates is generated from patent information. Finally, a Bayesian network model is formulated to select an SME as an R&D collaboration partner that fits the industry and the large firm, utilizing bibliographic data from United States patents. This paper applies the proposed approach to the semiconductor industry and selects potential R&D partners for a large firm, explaining how to use the model as a systematic and analytic approach for creating effective partnerships between large firms and SMEs.

  3. Operational modal analysis of a high-rise multi-function building with dampers by a Bayesian approach

    Science.gov (United States)

    Ni, Yanchun; Lu, Xilin; Lu, Wensheng

    2017-03-01

    The field non-destructive vibration test plays an important role in the area of structural health monitoring. It assists in monitoring the health status and reducing the risk caused by the poor performance of structures. As the most economic field test among the various vibration tests, the ambient vibration test is the most popular and is widely used to assess the physical condition of a structure under operational service. Based on ambient vibration data, modal identification can provide a valuable reference for model updating and damage detection during the service life of a structure. It has been proved that modal identification works well in the investigation of the dynamic performance of different kinds of structures. In this paper, the objective structure is a high-rise multi-function office building. The whole building is composed of seven three-story structural units. Each unit comprises one complete floor and two L-shaped floors to form large spaces along the vertical direction. There are 56 viscous dampers installed in the building to improve its energy dissipation capacity. Due to the special features of the structure, field vibration tests and subsequent modal identification were performed to investigate its dynamic performance. Twenty-nine setups were designed to cover all the degrees of freedom of interest. About two years later, another field test was carried out to measure the building for 48 h to investigate the variation and distribution of the modal parameters. A Fast Bayesian FFT method was employed to perform the modal identification. This Bayesian method not only provides the most probable values of the modal parameters but also assesses the associated posterior uncertainty analytically, which is especially relevant in field vibration tests due to measurement noise, sensor alignment error, modelling error, etc. 
A shaking table test was also implemented including cases with and without dampers, which assists

  4. Bayesian Just-So Stories in Psychology and Neuroscience

    Science.gov (United States)

    Bowers, Jeffrey S.; Davis, Colin J.

    2012-01-01

    According to Bayesian theories in psychology and neuroscience, minds and brains are (near) optimal in solving a wide range of tasks. We challenge this view and argue that more traditional, non-Bayesian approaches are more promising. We make 3 main arguments. First, we show that the empirical evidence for Bayesian theories in psychology is weak.…

  5. Prior approval: the growth of Bayesian methods in psychology.

    Science.gov (United States)

    Andrews, Mark; Baguley, Thom

    2013-02-01

    Within the last few years, Bayesian methods of data analysis in psychology have proliferated. In this paper, we briefly review the history of the Bayesian approach to statistics and consider the implications that Bayesian methods have for the theory and practice of data analysis in psychology.

  6. Estimation of canine Leishmania infection prevalence in six cities of the Algerian littoral zone using a Bayesian approach.

    Directory of Open Access Journals (Sweden)

    Amel Adel

    A large-scale study on canine Leishmania infection (CanL) was conducted in six localities along a west-east transect in the Algerian littoral zone (Tlemcen, Mostaganem, Tipaza, Boumerdes, Bejaia, Jijel) covering two sampling periods. In total 2,184 dogs were tested with an indirect fluorescent antibody test (IFAT) and a direct agglutination test (DAT). Combined multiple testing and several statistical methods were compared to estimate the true CanL prevalence and the test characteristics (sensitivity and specificity). The Bayesian full model showed the best fit and yielded prevalence estimates between 11% (Mostaganem, first period) and 38% (Bejaia, second period). The sensitivity of IFAT varied between 86% and 88% depending on the locality, while its specificity varied between 65% and 87%. DAT was less sensitive than IFAT but showed a higher specificity (between 80% and 95%, depending on locality and/or season). A general increasing trend in CanL prevalence was noted from west to east. A concordance between the present results and the incidence of human cases of visceral leishmaniasis was observed, with the maximum likewise recorded for Bejaia. The results of the present study highlight the dangers of using IFAT as a gold standard.

  7. Conditions for the adoption of conservation agriculture in Central Morocco: an approach based on Bayesian network modelling

    Directory of Open Access Journals (Sweden)

    Laura Bonzanigo

    2016-03-01

    Research in Central Morocco proves that conservation agriculture increases yields, reduces labour requirements and erosion, and improves soil fertility. However, after nearly two decades of demonstration and advocacy, adoption is still limited. This paper investigates the critical constraints on, and potential opportunities for, the adoption of conservation agriculture for different typologies of farms. We measured the possible pathways of adoption via a Bayesian decision network (BDN). BDNs allow the inclusion of stakeholders' knowledge where data is scant, whilst at the same time being supported by a robust mathematical background. We first developed a conceptual map of the elements affecting the decision about tillage, which we refined in a workshop with farmers and researchers from the Settat area. We then involved experts in the elicitation of conditional probability tables, to quantify the cascade of causal links that determine (or not) the adoption. Via BDNs, we could categorise under which specific technical and socio-economic conditions no-tillage agriculture is best suited to which farmers. By identifying the main constraints and running sensitivity analyses, we were able to convey clear messages on how policy-makers may facilitate the conversion. As new evidence is collected, the BDN can be updated to obtain evidence more targeted and fine-tuned to the adoption contexts.

  8. The effect of road network patterns on pedestrian safety: A zone-based Bayesian spatial modeling approach.

    Science.gov (United States)

    Guo, Qiang; Xu, Pengpeng; Pei, Xin; Wong, S C; Yao, Danya

    2017-02-01

    Pedestrian safety is increasingly recognized as a major public health concern. Extensive safety studies have been conducted to examine the influence of multiple variables on the occurrence of pedestrian-vehicle crashes. However, the explicit relationship between pedestrian safety and road network characteristics remains unknown. This study focused particularly on the role of different road network patterns in the occurrence of crashes involving pedestrians. A global integration index via space syntax was introduced to quantify the topological structures of road networks. Bayesian Poisson-lognormal (PLN) models with a conditional autoregressive (CAR) prior were then developed via three different proximity structures: contiguity, geometry-centroid distance, and road network connectivity. The models were also compared with the PLN counterpart without spatial correlation effects. The analysis was based on a comprehensive crash dataset from 131 selected traffic analysis zones in Hong Kong. The results indicated that higher global integration was associated with more pedestrian-vehicle crashes; the irregular network pattern proved to be the safest in terms of pedestrian crash occurrence, whereas the grid pattern was the least safe; and the CAR model with a neighborhood structure based on road network connectivity outperformed the others in goodness-of-fit, implying the importance of accurately accounting for spatial correlation when modeling spatially aggregated crash data.

  9. Estimating Lifetime Costs of Social Care: A Bayesian Approach Using Linked Administrative Datasets from Three Geographical Areas.

    Science.gov (United States)

    Steventon, Adam; Roberts, Adam

    2015-12-01

    We estimated lifetime costs of publicly funded social care, covering services such as residential and nursing care homes, domiciliary care and meals. Like previous studies, we constructed microsimulation models. However, our transition probabilities were estimated from longitudinal, linked administrative health and social care datasets, rather than from survey data. Administrative data were obtained from three geographical areas of England, and we estimated transition probabilities in each of these sites flexibly using Bayesian methods. This allowed us to quantify regional variation as well as the impact of structural and parameter uncertainty regarding the transition probabilities. Expected lifetime costs at age 65 were £20,200-27,000 for men and £38,700-49,000 for women, depending on which of the three areas was used to calibrate the model. Thus, patterns of social care spending differed markedly between areas, with mean costs varying by almost £10,000 (25%) across the lifetime for people of the same age and gender. Allowing for structural and parameter uncertainty had little impact on expected lifetime costs, but slightly increased the risk of very high costs, which will have implications for insurance products for social care through increasing requirements for capital reserves.
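
The microsimulation approach the authors describe can be sketched as a Markov chain over care states with annual transition probabilities and state costs, with lifetime cost estimated by Monte Carlo. The states, transition probabilities, and costs below are hypothetical placeholders, not the study's calibrated values from the linked administrative data.

```python
import random

random.seed(1)

# Illustrative annual transition probabilities (each row sums to 1); all numbers hypothetical
P = {
    "no_care":     {"no_care": 0.92, "domiciliary": 0.02, "care_home": 0.01, "dead": 0.05},
    "domiciliary": {"no_care": 0.10, "domiciliary": 0.70, "care_home": 0.12, "dead": 0.08},
    "care_home":   {"domiciliary": 0.02, "care_home": 0.73, "dead": 0.25},
}
annual_cost = {"no_care": 0, "domiciliary": 8000, "care_home": 35000}

def simulate_lifetime_cost():
    """Follow one person from age 65 until death, accumulating annual care costs."""
    state, cost = "no_care", 0
    while state != "dead":
        cost += annual_cost[state]
        r, acc = random.random(), 0.0
        for nxt, p in P[state].items():
            acc += p
            if r < acc:
                state = nxt
                break
    return cost

costs = [simulate_lifetime_cost() for _ in range(5000)]
mean_cost = sum(costs) / len(costs)
```

Replacing the single transition matrix with posterior draws of the probabilities (one matrix per Bayesian sample, as in the paper) would propagate parameter uncertainty into the distribution of lifetime costs, including the heavy right tail relevant for insurance capital reserves.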

  10. Quantifying and reducing model-form uncertainties in Reynolds-averaged Navier-Stokes simulations: A data-driven, physics-informed Bayesian approach

    Science.gov (United States)

    Xiao, H.; Wu, J.-L.; Wang, J.-X.; Sun, R.; Roy, C. J.

    2016-11-01

    Despite their well-known limitations, Reynolds-Averaged Navier-Stokes (RANS) models are still the workhorse tools for turbulent flow simulations in today's engineering analysis, design and optimization. While the predictive capability of RANS models depends on many factors, for many practical flows the turbulence models are by far the largest source of uncertainty. As RANS models are used in the design and safety evaluation of many mission-critical systems such as airplanes and nuclear power plants, quantifying their model-form uncertainties has significant implications in enabling risk-informed decision-making. In this work we develop a data-driven, physics-informed Bayesian framework for quantifying model-form uncertainties in RANS simulations. Uncertainties are introduced directly to the Reynolds stresses and are represented with compact parameterization accounting for empirical prior knowledge and physical constraints (e.g., realizability, smoothness, and symmetry). An iterative ensemble Kalman method is used to assimilate the prior knowledge and observation data in a Bayesian framework, and to propagate them to posterior distributions of velocities and other Quantities of Interest (QoIs). We use two representative cases, the flow over periodic hills and the flow in a square duct, to evaluate the performance of the proposed framework. Both cases are challenging for standard RANS turbulence models. Simulation results suggest that, even with very sparse observations, the obtained posterior mean velocities and other QoIs have significantly better agreement with the benchmark data compared to the baseline results. At most locations the posterior distribution adequately captures the true model error within the developed model form uncertainty bounds. The framework is a major improvement over existing black-box, physics-neutral methods for model-form uncertainty quantification, where prior knowledge and details of the models are not exploited. This approach has

  11. Spatiotemporal Modeling of Ozone Levels in Quebec (Canada): A Comparison of Kriging, Land-Use Regression (LUR), and Combined Bayesian Maximum Entropy–LUR Approaches

    Science.gov (United States)

    Adam-Poupart, Ariane; Brand, Allan; Fournier, Michel; Jerrett, Michael

    2014-01-01

    Background: Ambient air ozone (O3) is a pulmonary irritant that has been associated with respiratory health effects including increased lung inflammation and permeability, airway hyperreactivity, respiratory symptoms, and decreased lung function. Estimation of O3 exposure is a complex task because the pollutant exhibits complex spatiotemporal patterns. To refine the quality of exposure estimation, various spatiotemporal methods have been developed worldwide. Objectives: We sought to compare the accuracy of three spatiotemporal models to predict summer ground-level O3 in Quebec, Canada. Methods: We developed a land-use mixed-effects regression (LUR) model based on readily available data (air quality and meteorological monitoring data, road networks information, latitude), a Bayesian maximum entropy (BME) model incorporating both O3 monitoring station data and the land-use mixed model outputs (BME-LUR), and a kriging method model based only on available O3 monitoring station data (BME kriging). We performed leave-one-station-out cross-validation and visually assessed the predictive capability of each model by examining the mean temporal and spatial distributions of the average estimated errors. Results: The BME-LUR was the best predictive model (R2 = 0.653) with the lowest root mean-square error (RMSE = 7.06 ppb), followed by the LUR model (R2 = 0.466, RMSE = 8.747 ppb) and the BME kriging model (R2 = 0.414, RMSE = 9.164 ppb). Conclusions: Our findings suggest that errors of estimation in the interpolation of O3 concentrations with BME can be greatly reduced by incorporating outputs from a LUR model developed with readily available data. Citation: Adam-Poupart A, Brand A, Fournier M, Jerrett M, Smargiassi A. 2014. Spatiotemporal modeling of ozone levels in Quebec (Canada): a comparison of kriging, land-use regression (LUR), and combined Bayesian maximum entropy–LUR approaches. Environ Health Perspect 122:970–976; http://dx.doi.org/10.1289/ehp.1306566 PMID:24879650

  12. A species-level phylogeny of all extant and late Quaternary extinct mammals using a novel heuristic-hierarchical Bayesian approach.

    Science.gov (United States)

    Faurby, Søren; Svenning, Jens-Christian

    2015-03-01

    Across large clades, two problems are generally encountered in the estimation of species-level phylogenies: (a) the number of taxa involved is generally so high that computation-intensive approaches cannot readily be utilized and (b) even for clades that have received intense study (e.g., mammals), attention has been centered on relatively few selected species, and most taxa must therefore be positioned on the basis of very limited genetic data. Here, we describe a new heuristic-hierarchical Bayesian approach and use it to construct a species-level phylogeny for all extant and late Quaternary extinct mammals. In this approach, species with large quantities of genetic data are placed nearly freely in the mammalian phylogeny according to these data, whereas the placement of species with lower quantities of data is performed with steadily stricter restrictions for decreasing data quantities. The advantages of the proposed method include (a) an improved ability to incorporate phylogenetic uncertainty in downstream analyses based on the resulting phylogeny, (b) a reduced potential for long-branch attraction or other types of errors that place low-data taxa far from their true position, while maintaining minimal restrictions for better-studied taxa, and (c) likely improved placement of low-data taxa due to the use of closer outgroups.

  13. Variational bayesian method of estimating variance components.

    Science.gov (United States)

    Arakawa, Aisaku; Taniguchi, Masaaki; Hayashi, Takeshi; Mikawa, Satoshi

    2016-07-01

    We developed a Bayesian analysis approach by using a variational inference method, a so-called variational Bayesian method, to determine the posterior distributions of variance components. This variational Bayesian method and an alternative Bayesian method using Gibbs sampling were compared in estimating genetic and residual variance components from both simulated data and publicly available real pig data. In the simulated data set, we observed strong bias toward overestimation of genetic variance for the variational Bayesian method in the case of low heritability and small population size, and less bias was detected with larger population sizes in both methods examined. No differences in the estimates of variance components between the variational Bayesian method and Gibbs sampling were found in the real pig data. However, the posterior distributions of the variance components obtained with the variational Bayesian method had shorter tails than those obtained with Gibbs sampling. Consequently, the posterior standard deviations of the genetic and residual variances from the variational Bayesian method were lower than those from Gibbs sampling. The computing time required was much shorter with the variational Bayesian method than with Gibbs sampling.
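    The Gibbs-sampling side of such a comparison can be sketched with a toy one-way random-effects model standing in for the genetic/residual variance decomposition. The simulated sizes, flat priors, and chain length below are illustrative assumptions, not the paper's animal model:

```python
import numpy as np

rng = np.random.default_rng(1)

# simulate a balanced one-way model: y_ij = mu + u_i + e_ij
q, n = 50, 10                       # 50 groups, 10 records each
s2u_true, s2e_true = 2.0, 1.0
u = rng.normal(0, np.sqrt(s2u_true), q)
y = 5.0 + u[:, None] + rng.normal(0, np.sqrt(s2e_true), (q, n))

mu, s2u, s2e = y.mean(), 1.0, 1.0
keep_u, keep_e = [], []
for it in range(2000):
    # u_i | rest: normal, shrunk toward zero
    prec = n / s2e + 1.0 / s2u
    mean = (y - mu).sum(axis=1) / s2e / prec
    u_s = rng.normal(mean, np.sqrt(1.0 / prec))
    # mu | rest
    mu = rng.normal((y - u_s[:, None]).mean(), np.sqrt(s2e / y.size))
    # variances | rest: scaled inverse chi-square under a Jeffreys-type prior
    s2u = np.sum(u_s ** 2) / rng.chisquare(q)
    e = y - mu - u_s[:, None]
    s2e = np.sum(e ** 2) / rng.chisquare(y.size)
    if it >= 500:                   # discard burn-in
        keep_u.append(s2u)
        keep_e.append(s2e)

post_s2u, post_s2e = np.mean(keep_u), np.mean(keep_e)
```

    A variational alternative would replace the sampling loop with deterministic coordinate updates on an approximating distribution, which is where the speed gain (and the shorter posterior tails) reported above comes from.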

  14. Applied Bayesian modelling

    CERN Document Server

    Congdon, Peter

    2014-01-01

    This book provides an accessible approach to Bayesian computing and data analysis, with an emphasis on the interpretation of real data sets. Following in the tradition of the successful first edition, this book aims to make a wide range of statistical modeling applications accessible using tested code that can be readily adapted to the reader's own applications. The second edition has been thoroughly reworked and updated to take account of advances in the field. A new set of worked examples is included. The novel aspect of the first edition was the coverage of statistical modeling using WinBUGS.

  15. Learning Bayesian networks for discrete data

    KAUST Repository

    Liang, Faming

    2009-02-01

    Bayesian networks have received much attention in the recent literature. In this article, we propose an approach to learning Bayesian networks using the stochastic approximation Monte Carlo (SAMC) algorithm. Our approach has two nice features. First, it possesses a self-adjusting mechanism and thus essentially avoids the local-trap problem suffered by conventional MCMC simulation-based approaches to learning Bayesian networks. Second, it falls into the class of dynamic importance sampling algorithms; the network features can be inferred by dynamically weighted averaging of the samples generated in the learning process, and the resulting estimates can have much lower variation than single model-based estimates. The numerical results indicate that our approach can mix much faster over the space of Bayesian networks than conventional MCMC simulation-based approaches. © 2008 Elsevier B.V. All rights reserved.

  16. The Importance of Isomorphism for Conclusions about Homology: A Bayesian Multilevel Structural Equation Modeling Approach with Ordinal Indicators.

    Science.gov (United States)

    Guenole, Nigel

    2016-01-01

    We describe a Monte Carlo study examining the impact of assuming item isomorphism (i.e., equivalent construct meaning across levels of analysis) on conclusions about homology (i.e., equivalent structural relations across levels of analysis) under varying degrees of non-isomorphism in the context of ordinal indicator multilevel structural equation models (MSEMs). We focus on the condition where one or more loadings are higher on the between level than on the within level to show that while much past research on homology has ignored the issue of psychometric isomorphism, psychometric isomorphism is in fact critical to valid conclusions about homology. More specifically, when a measurement model with non-isomorphic items occupies an exogenous position in a multilevel structural model and the non-isomorphism of these items is not modeled, the within level exogenous latent variance is under-estimated leading to over-estimation of the within level structural coefficient, while the between level exogenous latent variance is overestimated leading to underestimation of the between structural coefficient. When a measurement model with non-isomorphic items occupies an endogenous position in a multilevel structural model and the non-isomorphism of these items is not modeled, the endogenous within level latent variance is under-estimated leading to under-estimation of the within level structural coefficient while the endogenous between level latent variance is over-estimated leading to over-estimation of the between level structural coefficient. The innovative aspect of this article is demonstrating that even minor violations of psychometric isomorphism render claims of homology untenable. We also show that posterior predictive p-values for ordinal indicator Bayesian MSEMs are insensitive to violations of isomorphism even when they lead to severely biased within and between level structural parameters. We highlight conditions where poor estimation of even correctly specified

  17. Gaining insight into regional coastal changes on La Réunion island through a Bayesian data mining approach

    Science.gov (United States)

    Bulteau, T.; Baills, A.; Petitjean, L.; Garcin, M.; Palanisamy, H.; Le Cozannet, G.

    2015-01-01

    Recent works have highlighted the interest in coastal geographical databases - collected for coastal management purposes - for obtaining insight into current shoreline changes. On La Réunion, a tropical volcanic high island located in the Southern Indian Ocean, a dataset is available which describes shoreline changes, the coastal geomorphology and the presence of anthropic structures. This database is first supplemented with information on the exposure of each coastal segment to energetic waves and to estuarine sediment inputs. To incorporate relative sea-level changes along the coast in the database, levelling data are analysed in combination with GPS, satellite altimetry and sea-level reconstructions. Finally, a method based on Bayesian networks is used to assess the probabilistic relationships between the variables in the database. The results highlight the high degree of dependency between variables: a retrospective model is able to reproduce 81% of the observations of shoreline mobility. Importantly, we report coastal ground motions for La Réunion island of the order of 1 to 2 mm/year along the coast. However, the resulting differing rates of relative sea-level rise do not significantly impact on shoreline changes. Instead, the results suggest a major control of geological processes and local coastal geomorphic settings on shoreline evolution. While any exploration of a coastal database needs to be complemented with human reasoning to interpret the results in terms of physical processes, this study highlights the significance of revisiting other datasets to gain insight into coastal processes and factors causing shoreline changes, including sea-level changes.
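    The core Bayesian-network computation behind such an analysis is conditioning on observed variables and summing out the rest. A deliberately tiny sketch with two hypothetical drivers of shoreline erosion (the conditional probability tables are invented for illustration; the island network relates many more variables):

```python
# toy discrete Bayesian network: Waves -> Erosion <- Structures
# (hypothetical CPTs, far simpler than the multi-variable coastal database)
p_waves = {True: 0.4, False: 0.6}      # segment exposed to energetic waves
p_struct = {True: 0.3, False: 0.7}     # anthropic structure present
p_erosion = {                          # P(erosion | waves, structures)
    (True, True): 0.35, (True, False): 0.70,
    (False, True): 0.10, (False, False): 0.25,
}

def joint(w, s, e):
    pe = p_erosion[(w, s)]
    return p_waves[w] * p_struct[s] * (pe if e else 1.0 - pe)

# P(waves | erosion observed) by exact enumeration over the hidden variable
num = sum(joint(True, s, True) for s in (True, False))
den = sum(joint(w, s, True) for w in (True, False) for s in (True, False))
p_waves_given_erosion = num / den
```

    Learning the probability tables themselves from the geographical database, as done in the study, is the step that lets such a network "reproduce 81% of the observations of shoreline mobility".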

  18. The importance of isomorphism for conclusions about homology: A Bayesian multilevel structural equation modeling approach with ordinal indicators

    Directory of Open Access Journals (Sweden)

    Nigel eGuenole

    2016-03-01

    We describe a Monte Carlo study examining the impact of assuming item isomorphism (i.e., equivalent construct meaning across levels of analysis) on conclusions about homology (i.e., equivalent structural relations across levels of analysis) under varying degrees of non-isomorphism in the context of ordinal indicator multilevel structural equation models (MSEMs). We focus on the condition where one or more loadings are higher on the between level than on the within level to show that while much past research on homology has ignored the issue of psychometric isomorphism, psychometric isomorphism is in fact critical to valid conclusions about homology. More specifically, when a measurement model with non-isomorphic items occupies an exogenous position in a multilevel structural model and the non-isomorphism of these items is not modeled, the within level exogenous latent variance is under-estimated leading to over-estimation of the within level structural coefficient, while the between level exogenous latent variance is overestimated leading to underestimation of the between structural coefficient. When a measurement model with non-isomorphic items occupies an endogenous position in a multilevel structural model and the non-isomorphism of these items is not modeled, the endogenous within level latent variance is under-estimated leading to under-estimation of the within level structural coefficient while the endogenous between level latent variance is over-estimated leading to over-estimation of the between level structural coefficient. The innovative aspect of this article is demonstrating that even minor violations of psychometric isomorphism render claims of homology untenable. We also show that posterior predictive p-values for ordinal indicator Bayesian MSEMs are insensitive to violations of isomorphism even when they lead to severely biased within and between level structural parameters. We highlight conditions where poor estimation of even correctly specified

  19. A Bayesian approach to study the risk variables for tuberculosis occurrence in domestic and wild ungulates in South Central Spain

    Directory of Open Access Journals (Sweden)

    Rodríguez-Prieto Víctor

    2012-08-01

    Background: Bovine tuberculosis (bTB) is a chronic infectious disease mainly caused by Mycobacterium bovis. Although eradication is a priority for the European authorities, bTB remains active or even increasing in many countries, causing significant economic losses. The integral consideration of epidemiological factors is crucial to more cost-effectively allocate control measures. The aim of this study was to identify the nature and extent of the association between TB distribution and a list of potential risk factors regarding cattle, wild ungulates and environmental aspects in Ciudad Real, a Spanish province with one of the highest TB herd prevalences. Results: We used a Bayesian mixed effects multivariable logistic regression model to predict TB occurrence in either domestic or wild mammals per municipality in 2007 by using information from the previous year. The municipal TB distribution and endemicity was clustered in the western part of the region and clearly overlapped with the explanatory variables identified in the final model: (1) incident cattle farms, (2) number of years of veterinary inspection of big game hunting events, (3) prevalence in wild boar, (4) number of sampled cattle, (5) persistent bTB-infected cattle farms, (6) prevalence in red deer, (7) proportion of beef farms, and (8) farms devoted to bullfighting cattle. Conclusions: The combination of these eight variables in the final model highlights the importance of the persistence of the infection in the hosts, surveillance efforts and some cattle management choices in the circulation of M. bovis in the region. The spatial distribution of these variables, together with particular Mediterranean features that favour the wildlife-livestock interface, may explain the M. bovis persistence in this region. Sanitary authorities should allocate efforts towards specific areas and epidemiological situations where the wildlife-livestock interface seems to critically hamper the definitive b

  20. Bayesian image restoration, using configurations

    DEFF Research Database (Denmark)

    Thorarinsdottir, Thordis

    configurations are expressed in terms of the mean normal measure of the random set. These probabilities are used as prior probabilities in a Bayesian image restoration approach. Estimation of the remaining parameters in the model is outlined for salt and pepper noise. The inference in the model is discussed...

  1. Bayesian image restoration, using configurations

    DEFF Research Database (Denmark)

    Thorarinsdottir, Thordis Linda

    2006-01-01

    configurations are expressed in terms of the mean normal measure of the random set. These probabilities are used as prior probabilities in a Bayesian image restoration approach. Estimation of the remaining parameters in the model is outlined for the salt and pepper noise. The inference in the model is discussed...

  2. Bayesian artificial intelligence

    CERN Document Server

    Korb, Kevin B

    2003-01-01

    As the power of Bayesian techniques has become more fully realized, the field of artificial intelligence has embraced Bayesian methodology and integrated it to the point where an introduction to Bayesian techniques is now a core course in many computer science programs. Unlike other books on the subject, Bayesian Artificial Intelligence keeps mathematical detail to a minimum and covers a broad range of topics. The authors integrate Bayesian network technology and Bayesian network learning, and apply both to knowledge engineering. They emphasize understanding and intuition but also provide the algorithms and technical background needed for applications. Software, exercises, and solutions are available on the authors' website.

  3. Bayesian artificial intelligence

    CERN Document Server

    Korb, Kevin B

    2010-01-01

    Updated and expanded, Bayesian Artificial Intelligence, Second Edition provides a practical and accessible introduction to the main concepts, foundation, and applications of Bayesian networks. It focuses on both the causal discovery of networks and Bayesian inference procedures. Adopting a causal interpretation of Bayesian networks, the authors discuss the use of Bayesian networks for causal modeling. They also draw on their own applied research to illustrate various applications of the technology. New to the Second Edition: a new chapter on Bayesian network classifiers and a new section on object-oriented Bayesian networks.

  4. Application of multi-SNP approaches Bayesian LASSO and AUC-RF to detect main effects of inflammatory-gene variants associated with bladder cancer risk.

    Directory of Open Access Journals (Sweden)

    Evangelina López de Maturana

    The relationship between inflammation and cancer is well established in several tumor types, including bladder cancer. We performed an association study between 886 inflammatory-gene variants and bladder cancer risk in 1,047 cases and 988 controls from the Spanish Bladder Cancer (SBC/EPICURO) Study. A preliminary exploration with the widely used univariate logistic regression approach did not identify any significant SNP after correcting for multiple testing. We further applied two more comprehensive methods to capture the complexity of bladder cancer genetic susceptibility: Bayesian Threshold LASSO (BTL), a regularized regression method, and AUC-Random Forest, a machine-learning algorithm. Both approaches explore the joint effect of markers. BTL analysis identified a signature of 37 SNPs in 34 genes showing an association with bladder cancer. AUC-RF detected an optimal predictive subset of 56 SNPs. Thirteen SNPs were identified by both methods in the total population. Using resources from the Texas Bladder Cancer study we were able to replicate 30% of the SNPs assessed. The associations between inflammatory SNPs and bladder cancer were reexamined among non-smokers to eliminate the effect of tobacco, one of the strongest and most prevalent environmental risk factors for this tumor. A nine-SNP signature was detected by BTL. Here we report, for the first time, a set of SNPs in inflammatory genes jointly associated with bladder cancer risk. These results highlight the importance of the complex structure of genetic susceptibility associated with cancer risk.

  5. A Flexible Bayesian Approach to Monotone Missing Data in Longitudinal Studies with Nonignorable Missingness with Application to an Acute Schizophrenia Clinical Trial.

    Science.gov (United States)

    Linero, Antonio R; Daniels, Michael J

    2015-03-01

    We develop a Bayesian nonparametric model for a longitudinal response in the presence of nonignorable missing data. Our general approach is to first specify a working model that flexibly models the missingness and full outcome processes jointly. We specify a Dirichlet process mixture of missing at random (MAR) models as a prior on the joint distribution of the working model. This aspect of the model governs the fit of the observed data by modeling the observed data distribution as the marginalization over the missing data in the working model. We then separately specify the conditional distribution of the missing data given the observed data and dropout. This approach allows us to identify the distribution of the missing data using identifying restrictions as a starting point. We propose a framework for introducing sensitivity parameters, allowing us to vary the untestable assumptions about the missing data mechanism smoothly. Informative priors on the space of missing data assumptions can be specified to combine inferences under many different assumptions into a final inference and accurately characterize uncertainty. These methods are motivated by, and applied to, data from a clinical trial assessing the efficacy of a new treatment for acute schizophrenia.

  6. Analysis of transtheoretical model of health behavioral changes in a nutrition intervention study--a continuous time Markov chain model with Bayesian approach.

    Science.gov (United States)

    Ma, Junsheng; Chan, Wenyaw; Tsai, Chu-Lin; Xiong, Momiao; Tilley, Barbara C

    2015-11-30

    Continuous time Markov chain (CTMC) models are often used to study the progression of chronic diseases in medical research but rarely applied to studies of the process of behavioral change. In studies of interventions to modify behaviors, a widely used psychosocial model is based on the transtheoretical model that often has more than three states (representing stages of change) and conceptually permits all possible instantaneous transitions. Very little attention is given to the study of the relationships between a CTMC model and associated covariates under the framework of the transtheoretical model. We developed a Bayesian approach to evaluate the covariate effects on a CTMC model through a log-linear regression link. A simulation study of this approach showed that model parameters were accurately and precisely estimated. We analyzed an existing data set on stages of change in dietary intake from the Next Step Trial using the proposed method and the generalized multinomial logit model. We found that the generalized multinomial logit model was not suitable for these data because it ignores the unbalanced data structure and temporal correlation between successive measurements. Our analysis not only confirms that the nutrition intervention was effective but also provides information on how the intervention affected the transitions among the stages of change. We found that, compared with the control group, subjects in the intervention group, on average, spent substantially less time in the precontemplation stage and were more/less likely to move from an unhealthy/healthy state to a healthy/unhealthy state.
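    The CTMC machinery referred to above rests on a generator matrix Q whose matrix exponential gives the transition probabilities over any time interval. A small sketch with a hypothetical three-stage chain (the paper's transtheoretical model has more stages and places a log-linear covariate link on the rates, which is omitted here):

```python
import numpy as np
from scipy.linalg import expm

# toy 3-state stages-of-change chain: 0=precontemplation, 1=contemplation, 2=action
# generator matrix Q: non-negative off-diagonal rates, each row sums to zero
Q = np.array([
    [-0.30,  0.25,  0.05],
    [ 0.10, -0.40,  0.30],
    [ 0.02,  0.08, -0.10],
])
# a covariate x could enter via a log-linear link, e.g. q_ij(x) = q_ij * exp(beta * x)

def transition_matrix(Q, t):
    """P(t) = exp(Qt): probability of being in state j at time t given state i at time 0."""
    return expm(Q * t)

P1 = transition_matrix(Q, 1.0)

# mean sojourn time in state i is -1 / Q[i, i]
sojourn = -1.0 / np.diag(Q)
```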

  7. Understanding uncertainties in non-linear population trajectories: a Bayesian semi-parametric hierarchical approach to large-scale surveys of coral cover.

    Directory of Open Access Journals (Sweden)

    Julie Vercelloni

    Recently, attempts to improve decision making in species management have focussed on uncertainties associated with modelling temporal fluctuations in populations. Reducing model uncertainty is challenging; while larger samples improve estimation of species trajectories and reduce statistical errors, they typically amplify variability in observed trajectories. In particular, traditional modelling approaches aimed at estimating population trajectories usually do not account well for nonlinearities and uncertainties associated with multi-scale observations characteristic of large spatio-temporal surveys. We present a Bayesian semi-parametric hierarchical model for simultaneously quantifying uncertainties associated with model structure and parameters, and scale-specific variability over time. We estimate uncertainty across a four-tiered spatial hierarchy of coral cover from the Great Barrier Reef. Coral variability is well described; however, our results show that, in the absence of additional model specifications, conclusions regarding coral trajectories become highly uncertain when considering multiple reefs, suggesting that management should focus more at the scale of individual reefs. The approach presented facilitates the description and estimation of population trajectories and associated uncertainties when variability cannot be attributed to specific causes and origins. We argue that our model can unlock value contained in large-scale datasets, provide guidance for understanding sources of uncertainty, and support better informed decision making.

  8. Performance of the 'material Failure Forecast Method' in real-time situations: A Bayesian approach applied on effusive and explosive eruptions

    Science.gov (United States)

    Boué, A.; Lesage, P.; Cortés, G.; Valette, B.; Reyes-Dávila, G.; Arámbula-Mendoza, R.; Budi-Santoso, A.

    2016-11-01

    Most attempts of deterministic eruption forecasting are based on the material Failure Forecast Method (FFM). This method assumes that a precursory observable, such as the rate of seismic activity, can be described by a simple power law which presents a singularity at a time close to the eruption onset. Until now, this method has been applied only in a small number of cases, generally for forecasts in hindsight. In this paper, a rigorous Bayesian approach of the FFM designed for real-time applications is applied. Using an automatic recognition system, seismo-volcanic events are detected and classified according to their physical mechanism and time series of probability distributions of the rates of events are calculated. At each time of observation, a Bayesian inversion provides estimations of the exponent of the power law and of the time of eruption, together with their probability density functions. Two criteria are defined in order to evaluate the quality and reliability of the forecasts. Our automated procedure has allowed the analysis of long, continuous seismic time series: 13 years from Volcán de Colima, Mexico, 10 years from Piton de la Fournaise, Reunion Island, France, and several months from Merapi volcano, Java, Indonesia. The new forecasting approach has been applied to 64 pre-eruptive sequences which present various types of dominant seismic activity (volcano-tectonic or long-period events) and patterns of seismicity with different level of complexity. This has allowed us to test the FFM assumptions, to determine in which conditions the method can be applied, and to quantify the success rate of the forecasts. 62% of the precursory sequences analysed are suitable for the application of FFM and half of the total number of eruptions are successfully forecast in hindsight. In real-time, the method allows for the successful forecast of 36% of all the eruptions considered. 
Nevertheless, real-time forecasts are successful for 83% of the cases that fulfil the
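    The FFM linearization underlying these forecasts is easy to illustrate: when the precursor rate follows a power law with exponent 2, the inverse rate declines linearly and reaches zero at the eruption time. A deterministic least-squares sketch on synthetic data (the paper's method is a full Bayesian inversion that yields probability densities on the exponent and eruption time, which this does not reproduce):

```python
import numpy as np

# synthetic precursory sequence: event rate accelerating toward an eruption at t_f
t_f = 100.0
t = np.arange(0.0, 90.0, 1.0)
rate = 50.0 / (t_f - t)        # power-law acceleration with exponent alpha = 2

# for alpha = 2 the inverse rate decays linearly and hits zero at t_f,
# so a least-squares line through 1/rate forecasts the eruption time
inv = 1.0 / rate
slope, intercept = np.polyfit(t, inv, 1)
t_forecast = -intercept / slope
```

    Real seismic rate series are noisy and the exponent is unknown, which is precisely why the Bayesian treatment with explicit probability densities described above is needed in real time.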

  9. Small sample Bayesian analyses in assessment of weapon performance

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Abundant test data are required in the assessment of weapon performance. When weapon test data are insufficient, Bayesian analyses in small-sample circumstances should be considered and the test data should be provided by simulations. Several Bayesian approaches are discussed and some limitations are identified. An improvement is put forward after the limitations of the available Bayesian approaches are analyzed, and the improved approach is applied to the assessment of the performance of a new weapon.
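    In the small-sample setting described above, the simplest way to combine simulation-based prior knowledge with a handful of live tests is a conjugate Beta-Binomial update. A minimal sketch (the prior counts and test results are hypothetical, and the paper's improved approach is more elaborate):

```python
# conjugate Beta-Binomial update: prior Beta(a, b) on the success probability,
# s successes observed in n trials (e.g. a few live-fire tests supplementing simulation)
def beta_binomial_posterior(a, b, s, n):
    return a + s, b + (n - s)

# hypothetical prior Beta(8, 2) summarizing simulation results,
# then 4 successes in 5 real tests
a_post, b_post = beta_binomial_posterior(8.0, 2.0, 4, 5)
post_mean = a_post / (a_post + b_post)   # posterior mean success probability
```

    The prior effectively contributes a + b pseudo-trials, which is how simulation data can stand in for the missing physical tests.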

  10. Applied Bayesian Hierarchical Methods

    CERN Document Server

    Congdon, Peter D

    2010-01-01

    Bayesian methods facilitate the analysis of complex models and data structures. Emphasizing data applications, alternative modeling specifications, and computer implementation, this book provides a practical overview of methods for Bayesian analysis of hierarchical models.

  11. Bayesian missing data problems EM, data augmentation and noniterative computation

    CERN Document Server

    Tan, Ming T; Ng, Kai Wang

    2009-01-01

    Bayesian Missing Data Problems: EM, Data Augmentation and Noniterative Computation presents solutions to missing data problems through explicit or noniterative sampling calculation of Bayesian posteriors. The methods are based on the inverse Bayes formulae discovered by one of the authors in 1995. Applying the Bayesian approach to important real-world problems, the authors focus on exact numerical solutions, a conditional sampling approach via data augmentation, and a noniterative sampling approach via EM-type algorithms. After introducing the missing data problems, Bayesian approach, and poste

  12. Bayesian structural equation modeling in sport and exercise psychology.

    Science.gov (United States)

    Stenling, Andreas; Ivarsson, Andreas; Johnson, Urban; Lindwall, Magnus

    2015-08-01

    Bayesian statistics is on the rise in mainstream psychology, but applications in sport and exercise psychology research are scarce. In this article, the foundations of Bayesian analysis are introduced, and we will illustrate how to apply Bayesian structural equation modeling in a sport and exercise psychology setting. More specifically, we contrasted a confirmatory factor analysis on the Sport Motivation Scale II estimated with the most commonly used estimator, maximum likelihood, and a Bayesian approach with weakly informative priors for cross-loadings and correlated residuals. The results indicated that the model with Bayesian estimation and weakly informative priors provided a good fit to the data, whereas the model estimated with a maximum likelihood estimator did not produce a well-fitting model. The reasons for this discrepancy between maximum likelihood and Bayesian estimation are discussed as well as potential advantages and caveats with the Bayesian approach.

  13. Relative accuracy of spatial predictive models for lynx Lynx canadensis derived using logistic regression-AIC, multiple criteria evaluation and Bayesian approaches

    Directory of Open Access Journals (Sweden)

    Shelley M. ALEXANDER

    2009-02-01

    We compared probability surfaces derived using one set of environmental variables in three Geographic Information Systems (GIS)-based approaches: logistic regression and Akaike's Information Criterion (AIC), Multiple Criteria Evaluation (MCE), and Bayesian Analysis (specifically Dempster-Shafer theory). We used lynx Lynx canadensis as our focal species, and developed our environment relationship model using track data collected in Banff National Park, Alberta, Canada, during winters from 1997 to 2000. The accuracy of the three spatial models was compared using a contingency table method. We determined the percentage of cases in which both presence and absence points were correctly classified (overall accuracy), the failure to predict a species where it occurred (omission error) and the prediction of presence where there was absence (commission error). Our overall accuracy showed the logistic regression approach was the most accurate (74.51%). The multiple criteria evaluation was intermediate (39.22%), while the Dempster-Shafer (D-S) theory model was the poorest (29.90%). However, omission and commission error tell us a different story: logistic regression had the lowest commission error, while D-S theory produced the lowest omission error. Our results provide evidence that habitat modellers should evaluate all three error measures when ascribing confidence in their model. We suggest that for our study area at least, the logistic regression model is optimal. However, where sample size is small or the species is very rare, it may also be useful to explore and/or use a more ecologically cautious modelling approach (e.g., Dempster-Shafer) that would over-predict, protect more sites, and thereby minimize the risk of missing critical habitat in conservation plans [Current Zoology 55(1): 28–40, 2009].
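    The three contingency-table error measures used in this comparison can be written down directly. A short sketch with hypothetical presence/absence counts (not the study's actual confusion matrix):

```python
def accuracy_metrics(tp, fn, fp, tn):
    """Overall accuracy, omission error (missed presences), and
    commission error (presence predicted where the species was absent)."""
    total = tp + fn + fp + tn
    overall = (tp + tn) / total
    omission = fn / (tp + fn)        # failure to predict the species where it occurred
    commission = fp / (fp + tn)      # prediction of presence where there was absence
    return overall, omission, commission

# hypothetical counts for a presence/absence validation set of 102 points
overall, omission, commission = accuracy_metrics(tp=60, fn=15, fp=11, tn=16)
```

    As the abstract notes, a model can score well on one measure and poorly on another, which is why all three should be reported together.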

  14. A Bayesian approach to quantifying the effects of mass poultry vaccination upon the spatial and temporal dynamics of H5N1 in Northern Vietnam.

    Directory of Open Access Journals (Sweden)

    Patrick G T Walker

    2010-02-01

    Outbreaks of H5N1 in poultry in Vietnam continue to threaten the livelihoods of those reliant on poultry production whilst simultaneously posing a severe public health risk given the high mortality associated with human infection. Authorities have invested significant resources in order to control these outbreaks. Of particular interest is the decision, following a second wave of outbreaks, to move from a "stamping out" approach to the implementation of a nationwide mass vaccination campaign. Outbreaks which occurred around this shift in policy provide a unique opportunity to evaluate the relative effectiveness of these approaches and to help other countries make informed judgements when developing control strategies. Here we use Bayesian Markov chain Monte Carlo (MCMC) data augmentation techniques to derive the first quantitative estimates of the impact of the vaccination campaign on the spread of outbreaks of H5N1 in northern Vietnam. We find a substantial decrease in the transmissibility of infection between communes following vaccination. This was coupled with a significant increase in the time from infection to detection of the outbreak. Using a cladistic approach we estimated that, according to the posterior mean effect of pruning the reconstructed epidemic tree, two thirds of the outbreaks in 2007 could be attributed to this decrease in the rate of reporting. The net impact of these two effects was a less intense but longer-lasting wave and, whilst not sufficient to prevent the sustained spread of outbreaks, an overall reduction in the likelihood of the transmission of infection between communes. These findings highlight the need for more effectively targeted surveillance in order to help ensure that the effective coverage achieved by mass vaccination is converted into a reduction in the likelihood of outbreaks occurring which is sufficient to control the spread of H5N1 in Vietnam.

  15. Relative accuracy of spatial predictive models for lynx Lynx canadensis derived using logistic regression-AIC, multiple criteria evaluation and Bayesian approaches

    Institute of Scientific and Technical Information of China (English)

    Hejun KANG; Shelley M.ALEXANDER

    2009-01-01

    We compared probability surfaces derived using one set of environmental variables in three Geographic Information Systems (GIS)-based approaches: logistic regression and Akaike's Information Criterion (AIC), Multiple Criteria Evaluation (MCE), and Bayesian Analysis (specifically Dempster-Shafer theory). We used lynx Lynx canadensis as our focal species, and developed our environment relationship model using track data collected in Banff National Park, Alberta, Canada, during winters from 1997 to 2000. The accuracy of the three spatial models was compared using a contingency table method. We determined the percentage of cases in which both presence and absence points were correctly classified (overall accuracy), the failure to predict a species where it occurred (omission error) and the prediction of presence where there was absence (commission error). Our overall accuracy showed the logistic regression approach was the most accurate (74.51%). The multiple criteria evaluation was intermediate (39.22%), while the Dempster-Shafer (D-S) theory model was the poorest (29.90%). However, omission and commission error tell us a different story: logistic regression had the lowest commission error, while D-S theory produced the lowest omission error. Our results provide evidence that habitat modellers should evaluate all three error measures when ascribing confidence in their model. We suggest that for our study area at least, the logistic regression model is optimal. However, where sample size is small or the species is very rare, it may also be useful to explore and/or use a more ecologically cautious modelling approach (e.g., Dempster-Shafer) that would over-predict, protect more sites, and thereby minimize the risk of missing critical habitat in conservation plans.

  16. The prevalences of Salmonella Genomic Island 1 variants in human and animal Salmonella Typhimurium DT104 are distinguishable using a Bayesian approach.

    Directory of Open Access Journals (Sweden)

    Alison E Mather

    Full Text Available Throughout the 1990s, there was an epidemic of multidrug resistant Salmonella Typhimurium DT104 in both animals and humans in Scotland. The use of antimicrobials in agriculture is often cited as a major source of antimicrobial resistance in pathogenic bacteria of humans, suggesting that DT104 in animals and humans should demonstrate similar prevalences of resistance determinants. Until very recently, only the application of molecular methods would allow such a comparison, and our understanding has been hindered by the fact that surveillance data are primarily phenotypic in nature. Here, using large scale surveillance datasets and a novel Bayesian approach, we infer and compare the prevalence of Salmonella Genomic Island 1 (SGI1, SGI1 variants, and resistance determinants independent of SGI1 in animal and human DT104 isolates from such phenotypic data. We demonstrate differences in the prevalences of SGI1, SGI1-B, SGI1-C, absence of SGI1, and tetracycline resistance determinants independent of SGI1 between these human and animal populations, a finding that challenges established tenets that DT104 in domestic animals and humans are from the same well-mixed microbial population.

  17. A Systems Science Approach to Understanding Polytrauma and Blast-Related Injury: Bayesian Network Model of Data From a Survey of the Florida National Guard.

    Science.gov (United States)

    Toyinbo, Peter A; Vanderploeg, Rodney D; Belanger, Heather G; Spehar, Andrea M; Lapcevic, William A; Scott, Steven G

    2017-01-15

    We sought to further define the epidemiology of the complex, multiple injuries collectively known as polytrauma/blast-related injury (PT/BRI). Using a systems science approach, we performed Bayesian network modeling to find the most accurate representation of the complex system of PT/BRI and identify key variables for understanding the subsequent effects of blast exposure in a sample of Florida National Guard members (1,443 deployed to Operation Enduring Freedom/Operation Iraqi Freedom and 1,655 not deployed) who completed an online survey during the period from 2009 to 2010. We found that postdeployment symptoms reported as present at the time of the survey were largely independent of deployment per se. Blast exposure, not mild traumatic brain injury (TBI), acted as the primary military deployment-related driver of PT/BRI symptoms. Blast exposure was indirectly linked to mild TBI via other deployment-related traumas and was a significant risk for a high level of posttraumatic stress disorder (PTSD) arousal symptoms. PTSD arousal symptoms and tinnitus were directly dependent upon blast exposure, with both acting as bridge symptoms to other postdeployment mental health and physical symptoms, respectively. Neurobehavioral or postconcussion-like symptoms had no significant dependence relationship with mild TBI, but they were synergistic with blast exposure in influencing PTSD arousal symptoms. A replication of this analysis using a larger PT/BRI database is warranted.

  18. High precision dating of mass extinction events: a combined zircon geochronology, apatite tephrochronology, and Bayesian age modelling approach of the Permian-Triassic boundary extinction

    Science.gov (United States)

    Baresel, Björn; Bucher, Hugo; Brosse, Morgane; Bagherpour, Borhan; Schaltegger, Urs

    2016-04-01

    Chemical abrasion isotope dilution thermal ionization mass spectrometry (CA-ID-TIMS) U-Pb dating of single-zircon crystals is preferably applied to tephra beds intercalated in sedimentary sequences. By assuming that the zircon crystallization age closely approximates that of the volcanic eruption and ash deposition, U-Pb zircon geochronology is the preferred approach for dating mass extinction events (such as the Permian-Triassic boundary mass extinction) in the sedimentary record. As tephra from large volcanic eruptions is often transported over long distances, it additionally provides an invaluable tool for stratigraphic correlation across distant geologic sections. Therefore, the combination of high-precision zircon geochronology with apatite chemistry of the same tephra bed (so-called apatite tephrochronology) provides a robust fingerprint of one particular volcanic eruption. In addition, we provide coherent Bayesian model ages for the Permian-Triassic boundary (PTB) mass extinction, and compare them with PTB model ages at Meishan after Burgess et al. (2014). We will present new high-precision U-Pb zircon dates for a series of volcanic ash beds in deep- and shallow-marine Permian-Triassic sections in the Nanpanjiang Basin, South China. In addition, apatite crystals out of the same ash beds were analysed focusing on their halogen (F, Cl) and trace-element (e.g. Fe, Mg, REE) chemistry. We also show that Bayesian age models produce reproducible results from different geologic sections. On the basis of these data, including litho- and biostratigraphic correlations, we can precisely and accurately constrain the Permian-Triassic boundary in an equatorial marine setting, and correlate tephra beds over different sections and facies in the Nanpanjiang Basin independently of litho-, bio- or chemostratigraphic criteria. The results demonstrate that data produced in laboratories associated with the global EARTHTIME consortium can provide age information at the 0.05% level of 206

  19. Rasch Model Parameter Estimation in the Presence of a Nonnormal Latent Trait Using a Nonparametric Bayesian Approach

    Science.gov (United States)

    Finch, Holmes; Edwards, Julianne M.

    2016-01-01

    Standard approaches for estimating item response theory (IRT) model parameters generally work under the assumption that the latent trait being measured by a set of items follows the normal distribution. Estimation of IRT parameters in the presence of nonnormal latent traits has been shown to generate biased person and item parameter estimates. A…
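    For orientation, the Rasch model at issue depends only on the difference between person ability theta and item difficulty b. The sketch below (illustrative, not the authors' estimation code) evaluates that item response function under a deliberately nonnormal, right-skewed latent trait of the kind the article studies; all parameter values are invented:

```python
import math, random

def rasch_prob(theta, b):
    """Rasch model: P(correct response | ability theta, item difficulty b)."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# A right-skewed (shifted-exponential) latent trait, illustrating the
# nonnormal setting; under the usual normality assumption this skew would
# bias standard parameter estimates.
random.seed(1)
thetas = [random.expovariate(1.0) - 1.0 for _ in range(5000)]
p_item = sum(rasch_prob(t, b=0.0) for t in thetas) / len(thetas)
print(round(p_item, 2))
```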

  20. Calibration of environmental radionuclide transfer models using a Bayesian approach with Markov chain Monte Carlo simulations and model comparisons

    Energy Technology Data Exchange (ETDEWEB)

    Nicoulaud-Gouin, V.; Giacalone, M.; Gonze, M.A. [Institut de Radioprotection et de Surete Nucleaire-PRP-ENV/SERIS/LM2E (France); Martin-Garin, A.; Garcia-Sanchez, L. [IRSN-PRP-ENV/SERIS/L2BT (France)

    2014-07-01

    Calibration of transfer models against observation data is a challenge, especially if parameter uncertainty is required and competing models must be decided between. Two main calibration methods are generally used. In the frequentist approach, the unknown parameter of interest is assumed fixed and its estimation is based on the data only; in this category, the least-squares method has many restrictions for nonlinear models, and competing models need to be nested in order to be compared. In Bayesian inference, the unknown parameter of interest is treated as random and its estimation is based on the data and on prior information. Compared to the frequentist method, it provides probability density functions and therefore pointwise estimation with credible intervals. However, in practical cases, Bayesian inference is a complex problem of numerical integration, which explains its low use in operational modeling, including radioecology. This study aims to illustrate the interest and feasibility of the Bayesian approach in radioecology, particularly for models based on ordinary differential equations with non-constant coefficients, which cover most radiological risk assessment models, notably those implemented in the Symbiose platform (Gonze et al., 2010). The Markov Chain Monte Carlo (MCMC) method (Metropolis et al., 1953) was used because the posterior expectations are intractable integrals. The invariant distribution of the parameters was obtained with the Metropolis-Hastings algorithm (Hastings, 1970). The GNU-MCSim software (Bois and Maszle, 2011), a Bayesian hierarchical framework, was used to deal with nonlinear differential models. Two case studies including this type of model were investigated: an Equilibrium-Kinetic sorption model (EK) (e.g. van Genuchten et al., 1974), with experimental data concerning {sup 137}Cs and {sup 85}Sr sorption and desorption in different soils studied in stirred flow-through reactors. This model, generalizing the K{sub d} approach
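    The random-walk Metropolis-Hastings scheme named in this record can be illustrated with a self-contained toy sketch; the synthetic data, flat prior and known measurement error below are invented and are not from the study:

```python
import math, random

random.seed(0)
# Synthetic observations of a single transfer parameter (values invented).
data = [random.gauss(2.0, 0.5) for _ in range(50)]

def log_posterior(mu):
    # Flat prior; Gaussian likelihood with known sigma = 0.5.
    return -sum((x - mu) ** 2 for x in data) / (2 * 0.5 ** 2)

# Random-walk Metropolis-Hastings: propose a move, accept with probability
# min(1, posterior ratio); the symmetric proposal cancels in the ratio.
mu, chain = 0.0, []
for step in range(20000):
    proposal = mu + random.gauss(0.0, 0.2)
    if math.log(random.random()) < log_posterior(proposal) - log_posterior(mu):
        mu = proposal
    if step >= 5000:                 # discard burn-in
        chain.append(mu)

posterior_mean = sum(chain) / len(chain)
sample_mean = sum(data) / len(data)
print(round(posterior_mean, 2), round(sample_mean, 2))
```

    With a flat prior the posterior mean should track the sample mean, which gives a quick sanity check on the chain.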

  1. Interacting agricultural pests and their effect on crop yield: application of a Bayesian decision theory approach to the joint management of Bromus tectorum and Cephus cinctus.

    Directory of Open Access Journals (Sweden)

    Ilai N Keren

    Full Text Available Worldwide, the landscape homogeneity of extensive monocultures that characterizes conventional agriculture has resulted in the development of specialized and interacting multitrophic pest complexes. While integrated pest management emphasizes the need to consider the ecological context where multiple species coexist, management recommendations are often based on single-species tactics. This approach may not provide satisfactory solutions when confronted with the complex interactions occurring between organisms at the same or different trophic levels. Replacement of the single-species management model with more sophisticated, multi-species programs requires an understanding of the direct and indirect interactions occurring between the crop and all categories of pests. We evaluated a modeling framework to make multi-pest management decisions taking into account direct and indirect interactions among species belonging to different trophic levels. We adopted a Bayesian decision theory approach in combination with path analysis to evaluate interactions between Bromus tectorum (downy brome, cheatgrass) and Cephus cinctus (wheat stem sawfly) in wheat (Triticum aestivum) systems. We assessed their joint responses to weed management tactics, seeding rates, and cultivar tolerance to insect stem boring or competition. Our results indicated that C. cinctus oviposition behavior varied as a function of B. tectorum pressure. Crop responses were more readily explained by the joint effects of management tactics on both categories of pests and their interactions than just by the direct impact of any particular management scheme on yield. Accordingly, a C. cinctus tolerant variety should be planted at a low seeding rate under high insect pressure. However as B. tectorum levels increase, the C. cinctus tolerant variety should be replaced by a competitive and drought tolerant cultivar at high seeding rates despite C. cinctus infestation. This study exemplifies the
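    The core Bayesian decision-theory step, choosing the tactic that minimizes posterior expected loss, can be sketched as follows; the posterior draws and loss functions are invented for illustration and are not the paper's model:

```python
import random

random.seed(42)
# Hypothetical posterior draws of the proportional yield loss under current
# pest pressure (Beta(2, 8), mean 0.2).
posterior_loss = [random.betavariate(2, 8) for _ in range(10000)]

# Invented per-hectare loss functions for two tactics.
def cost_treat(l):
    return 30 + 400 * l * 0.3      # fixed treatment cost, 70% of loss averted

def cost_do_nothing(l):
    return 400 * l                 # full loss realised

tactics = {"treat": cost_treat, "do nothing": cost_do_nothing}
expected = {name: sum(f(l) for l in posterior_loss) / len(posterior_loss)
            for name, f in tactics.items()}
best = min(expected, key=expected.get)
print(best)
```

    The Bayes-optimal tactic is simply the argmin of the Monte Carlo expected losses; swapping in posterior draws that couple weed and insect pressure is what makes the multi-pest version of this calculation interesting.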

  2. Evaluation of the Performance of Five Diagnostic Tests for Fasciola hepatica Infection in Naturally Infected Cattle Using a Bayesian No Gold Standard Approach

    Science.gov (United States)

    Sargison, Neil; Kelly, Robert F.; Bronsvoort, Barend M. deC.; Handel, Ian

    2016-01-01

    The clinical and economic importance of fasciolosis has been recognised for centuries, yet diagnostic tests available for cattle are far from perfect. Test evaluation has mainly been carried out using gold standard approaches or under experimental settings, the limitations of which are well known. In this study, a Bayesian no gold standard approach was used to estimate the diagnostic sensitivity and specificity of five tests for fasciolosis in cattle. These included detailed liver necropsy including gall bladder egg count, faecal egg counting, a commercially available copro-antigen ELISA, an in-house serum excretory/secretory antibody ELISA and routine abattoir liver inspection. In total 619 cattle slaughtered at one of Scotland’s biggest abattoirs were sampled, during three sampling periods spanning summer 2013, winter 2014 and autumn 2014. Test sensitivities and specificities were estimated using an extension of the Hui Walter no gold standard model, where estimates were allowed to vary between seasons if tests were a priori believed to perform differently for any reason. The results of this analysis provide novel information on the performance of these tests in a naturally infected cattle population and at different times of the year where different levels of acute or chronic infection are expected. Accurate estimates of sensitivity and specificity will allow for routine abattoir liver inspection to be used as a tool for monitoring the epidemiology of F. hepatica as well as evaluating herd health planning. Furthermore, the results provide evidence to suggest that the copro-antigen ELISA does not cross-react with Calicophoron daubneyi rumen fluke parasites, while the serum antibody ELISA does. PMID:27564546
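    The flavour of no-gold-standard estimation can be conveyed with a deliberately reduced sketch: a single imperfect test with informative Beta priors on sensitivity and specificity, marginalised by Monte Carlo over a prevalence grid. All numbers are invented; the study itself jointly models five tests with a Hui Walter extension, which is what makes the parameters identifiable without strong priors:

```python
import random

random.seed(7)
# Hypothetical data: 230 of 619 animals positive on one imperfect test.
n, y = 619, 230

# Informative Beta priors on sensitivity and specificity (invented values).
draws = [(random.betavariate(20, 5), random.betavariate(30, 3))
         for _ in range(2000)]

grid = [i / 200 for i in range(201)]          # candidate prevalences
posterior = []
for prev in grid:
    like = 0.0
    for se, sp in draws:                      # marginalise over Se and Sp
        p = min(max(prev * se + (1 - prev) * (1 - sp), 1e-12), 1 - 1e-12)
        like += p ** y * (1 - p) ** (n - y)   # apparent-prevalence likelihood
    posterior.append(like)

z = sum(posterior)
posterior = [w / z for w in posterior]
mean_prev = sum(p * w for p, w in zip(grid, posterior))
print(round(mean_prev, 2))
```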

  3. Sparse Event Modeling with Hierarchical Bayesian Kernel Methods

    Science.gov (United States)

    2016-01-05

    The research objective of this proposal was to develop a predictive Bayesian kernel approach to model count data based on several predictive variables. Such an approach, which we refer to as the Poisson Bayesian kernel model, is able to model the rate of occurrence of events. The kernel methods made use of: (i) the Bayesian property of improving predictive accuracy as data are dynamically obtained, and (ii) the kernel function

  4. Bayesian non- and semi-parametric methods and applications

    CERN Document Server

    Rossi, Peter

    2014-01-01

    This book reviews and develops Bayesian non-parametric and semi-parametric methods for applications in microeconometrics and quantitative marketing. Most econometric models used in microeconomics and marketing applications involve arbitrary distributional assumptions. As more data becomes available, a natural desire to provide methods that relax these assumptions arises. Peter Rossi advocates a Bayesian approach in which specific distributional assumptions are replaced with more flexible distributions based on mixtures of normals. The Bayesian approach can use either a large but fixed number
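    The "mixtures of normals" idea can be made concrete with a minimal two-component Gibbs sampler, using known unit variances, fixed equal weights and synthetic data (a far simpler setting than the book's, and not code from it):

```python
import math, random

random.seed(3)
# Synthetic data from two well-separated unit-variance normals.
data = ([random.gauss(-2.0, 1.0) for _ in range(150)]
        + [random.gauss(2.0, 1.0) for _ in range(150)])

def normpdf(x, m):
    # Unit variance; the shared normalising constant cancels in the ratio.
    return math.exp(-(x - m) ** 2 / 2.0)

# Gibbs sampler: alternate between sampling component labels and sampling
# component means, with N(0, 10^2) priors on the means.
mu = [-1.0, 1.0]
for sweep in range(200):
    groups = ([], [])
    for x in data:
        p0, p1 = normpdf(x, mu[0]), normpdf(x, mu[1])
        groups[0 if random.random() < p0 / (p0 + p1) else 1].append(x)
    for k in (0, 1):
        prec = len(groups[k]) + 0.01          # data precision + prior precision
        mu[k] = random.gauss(sum(groups[k]) / prec, 1 / math.sqrt(prec))

print(round(min(mu), 1), round(max(mu), 1))
```

    Relaxing the fixed weights and variances, and letting the number of components grow, leads toward the flexible (and eventually nonparametric) mixtures the book develops.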

  5. Bayesian models a statistical primer for ecologists

    CERN Document Server

    Hobbs, N Thompson

    2015-01-01

    Bayesian modeling has become an indispensable tool for ecological research because it is uniquely suited to deal with complexity in a statistically coherent way. This textbook provides a comprehensive and accessible introduction to the latest Bayesian methods-in language ecologists can understand. Unlike other books on the subject, this one emphasizes the principles behind the computations, giving ecologists a big-picture understanding of how to implement this powerful statistical approach. Bayesian Models is an essential primer for non-statisticians. It begins with a definition of probabili

  6. A Bayesian approach for inferring the dynamics of partially observed endemic infectious diseases from space-time-genetic data

    Science.gov (United States)

    Mollentze, Nardus; Nel, Louis H.; Townsend, Sunny; le Roux, Kevin; Hampson, Katie; Haydon, Daniel T.; Soubeyrand, Samuel

    2014-01-01

    We describe a statistical framework for reconstructing the sequence of transmission events between observed cases of an endemic infectious disease using genetic, temporal and spatial information. Previous approaches to reconstructing transmission trees have assumed all infections in the study area originated from a single introduction and that a large fraction of cases were observed. There are as yet no approaches appropriate for endemic situations in which a disease is already well established in a host population and in which there may be multiple origins of infection, or that can enumerate unobserved infections missing from the sample. Our proposed framework addresses these shortcomings, enabling reconstruction of partially observed transmission trees and estimating the number of cases missing from the sample. Analyses of simulated datasets show the method to be accurate in identifying direct transmissions, while introductions and transmissions via one or more unsampled intermediate cases could be identified at high to moderate levels of case detection. When applied to partial genome sequences of rabies virus sampled from an endemic region of South Africa, our method reveals several distinct transmission cycles with little contact between them, and direct transmission over long distances suggesting significant anthropogenic influence in the movement of infected dogs. PMID:24619442

  7. Bayesian inference for Hawkes processes

    DEFF Research Database (Denmark)

    Rasmussen, Jakob Gulddahl

    2013-01-01

    The Hawkes process is a practically and theoretically important class of point processes, but parameter-estimation for such a process can pose various problems. In this paper we explore and compare two approaches to Bayesian inference. The first approach is based on the so-called conditional intensity function, while the second approach is based on an underlying clustering and branching structure in the Hawkes process. For practical use, MCMC (Markov chain Monte Carlo) methods are employed. The two approaches are compared numerically using three examples of the Hawkes process.
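    For readers unfamiliar with the first approach: the conditional intensity of a Hawkes process with exponential excitation, together with a simulation by Ogata-style thinning, looks like the following. This is standard textbook material, not the paper's code, and the parameter values are arbitrary:

```python
import math, random

random.seed(5)

def intensity(t, events, mu=0.5, alpha=0.8, beta=1.2):
    """Conditional intensity: background rate mu plus exponentially decaying
    excitation alpha*exp(-beta*(t - s)) from each past event s."""
    return mu + alpha * sum(math.exp(-beta * (t - s)) for s in events if s < t)

def simulate(horizon, mu=0.5, alpha=0.8, beta=1.2):
    """Ogata thinning: propose candidate points from a dominating constant
    rate, keep each with probability intensity/bound."""
    events, t = [], 0.0
    while True:
        # The +alpha covers the upward jump if an event occurred exactly at t;
        # the intensity only decays until the next accepted event.
        bound = intensity(t, events, mu, alpha, beta) + alpha
        t += random.expovariate(bound)
        if t >= horizon:
            return events
        if random.random() * bound <= intensity(t, events, mu, alpha, beta):
            events.append(t)

events = simulate(200.0)
# Branching ratio alpha/beta = 2/3 < 1, so the process is stationary with
# long-run mean rate mu / (1 - alpha/beta) = 1.5 events per unit time.
print(len(events))
```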

  9. Bayesian Persuasion

    OpenAIRE

    Emir Kamenica; Matthew Gentzkow

    2009-01-01

    When is it possible for one person to persuade another to change her action? We take a mechanism design approach to this question. Taking preferences and initial beliefs as given, we introduce the notion of a persuasion mechanism: a game between Sender and Receiver defined by an information structure and a message technology. We derive necessary and sufficient conditions for the existence of a persuasion mechanism that strictly benefits Sender. We characterize the optimal mechanism. Finally, ...

  10. Bayesian Causal Induction

    CERN Document Server

    Ortega, Pedro A

    2011-01-01

    Discovering causal relationships is a hard task, often hindered by the need for intervention, and often requiring large amounts of data to resolve statistical uncertainty. However, humans quickly arrive at useful causal relationships. One possible reason is that humans use strong prior knowledge; and rather than encoding hard causal relationships, they encode beliefs over causal structures, allowing for sound generalization from the observations they obtain from directly acting in the world. In this work we propose a Bayesian approach to causal induction which allows modeling beliefs over multiple causal hypotheses and predicting the behavior of the world under causal interventions. We then illustrate how this method extracts causal information from data containing interventions and observations.
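    A toy version of "beliefs over causal structures": comparing the marginal likelihood of "X influences Y" against "no influence" from interventional data, with uniform Beta priors on the rates. All numbers are invented and this is a two-hypothesis caricature of the paper's framework:

```python
import math

def log_marginal(k, n):
    """Log marginal likelihood of a Bernoulli sequence with k successes in
    n trials under a uniform Beta(1, 1) prior: log B(k+1, n-k+1)."""
    return math.lgamma(k + 1) + math.lgamma(n - k + 1) - math.lgamma(n + 2)

# Hypothetical interventional data: Y observed under do(X=0) and do(X=1).
n0, k0 = 50, 10      # do(X=0): 10 of 50 show Y
n1, k1 = 50, 40      # do(X=1): 40 of 50 show Y

# H1: X causally influences Y (a separate Y-rate per intervention).
logm_causal = log_marginal(k0, n0) + log_marginal(k1, n1)
# H0: no influence (one shared Y-rate across both interventions).
logm_null = log_marginal(k0 + k1, n0 + n1)

# Posterior over the two causal hypotheses, uniform 0.5/0.5 prior.
p_causal = 1.0 / (1.0 + math.exp(logm_null - logm_causal))
print(round(p_causal, 3))   # → 1.0 (the data strongly favour a causal link)
```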

  11. Perturbation Detection Through Modeling of Gene Expression on a Latent Biological Pathway Network: A Bayesian hierarchical approach.

    Science.gov (United States)

    Pham, Lisa M; Carvalho, Luis; Schaus, Scott; Kolaczyk, Eric D

    Cellular response to a perturbation is the result of a dynamic system of biological variables linked in a complex network. A major challenge in drug and disease studies is identifying the key factors of a biological network that are essential in determining the cell's fate. Here our goal is the identification of perturbed pathways from high-throughput gene expression data. We develop a three-level hierarchical model, where (i) the first level captures the relationship between gene expression and biological pathways using confirmatory factor analysis, (ii) the second level models the behavior within an underlying network of pathways induced by an unknown perturbation using a conditional autoregressive model, and (iii) the third level is a spike-and-slab prior on the perturbations. We then identify perturbations through posterior-based variable selection. We illustrate our approach using gene transcription drug perturbation profiles from the DREAM7 drug sensitivity prediction challenge data set. Our proposed method identified regulatory pathways that are known to play a causative role and that were not readily resolved using gene set enrichment analysis or exploratory factor models. Simulation results are presented assessing the performance of this model relative to a network-free variant and its robustness to inaccuracies in biological databases.
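    The spike-and-slab prior in level (iii) reduces, in the simplest conjugate case, to a closed-form posterior inclusion probability. The sketch below handles one scalar effect with invented numbers, far simpler than the paper's network model, but it shows the mechanism behind posterior-based variable selection:

```python
import math

# Observed summary (invented): mean perturbation score over n replicates,
# with known unit sampling variance.
n, xbar, sigma2 = 20, 0.74, 1.0

# Spike-and-slab prior: the effect is exactly 0 with probability 0.5 (spike),
# otherwise drawn from the N(0, tau2) slab.
tau2 = 1.0
v0 = sigma2 / n        # variance of xbar if the true effect is zero
v1 = tau2 + v0         # marginal variance of xbar under the slab

# Bayes factor slab/spike: ratio of two zero-mean normal densities at xbar.
log_bf = 0.5 * math.log(v0 / v1) + 0.5 * xbar ** 2 * (1 / v0 - 1 / v1)
p_include = 1.0 / (1.0 + math.exp(-log_bf))   # posterior inclusion probability
print(round(p_include, 2))   # → 0.98
```

    In the full model, Gibbs sampling over many such indicators yields one inclusion probability per candidate perturbation, and thresholding them performs the selection.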

  12. Optimal Skin-to-Stone Distance Is a Positive Predictor for Successful Outcomes in Upper Ureter Calculi following Extracorporeal Shock Wave Lithotripsy: A Bayesian Model Averaging Approach.

    Directory of Open Access Journals (Sweden)

    Kang Su Cho

    Full Text Available To investigate whether skin-to-stone distance (SSD), which remains controversial in patients with ureter stones, can be a predicting factor for one-session success following extracorporeal shock wave lithotripsy (ESWL) in patients with upper ureter stones. We retrospectively reviewed the medical records of 1,519 patients who underwent their first ESWL between January 2005 and December 2013. Among these patients, 492 had upper ureter stones that measured 4-20 mm and were eligible for our analyses. Maximal stone length, mean stone density (HU), and SSD were determined on pretreatment non-contrast computed tomography (NCCT). For subgroup analyses, patients were divided into four groups. Group 1 consisted of patients with SSD<25th percentile, group 2 consisted of patients with SSD in the 25th to 50th percentile, group 3 patients had SSD in the 50th to 75th percentile, and group 4 patients had SSD≥75th percentile. In analyses of group 2 patients versus others, there were no statistical differences in mean age, stone length and density. However, the one-session success rate in group 2 was higher than in the other groups (77.9% vs. 67.0%; P = 0.032). The multivariate logistic regression model revealed that shorter stone length, lower stone density, and the group 2 SSD were positive predictors for successful outcomes in ESWL. Using the Bayesian model-averaging approach, longer stone length, lower stone density, and group 2 SSD can also be positive predictors for successful outcomes following ESWL. Our data indicate that a group 2 SSD of approximately 10 cm is a positive predictor for success following ESWL.
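    One common way to operationalise Bayesian model averaging is through BIC-approximated posterior model probabilities. The candidate models, log-likelihoods and parameter counts below are invented for illustration and are not the study's fits:

```python
import math

# Hypothetical results for three nested logistic models of ESWL success:
# model name -> (maximised log-likelihood, number of parameters).
models = {
    "length":                (-310.2, 2),
    "length+density":        (-301.5, 3),
    "length+density+SSDgrp": (-296.8, 4),
}
n = 492  # patients

# BIC per model, then posterior model probabilities under equal prior odds
# via the standard approximation p(M|data) ~ exp(-BIC/2).
bic = {m: -2 * ll + k * math.log(n) for m, (ll, k) in models.items()}
best = min(bic.values())
raw = {m: math.exp(-(b - best) / 2) for m, b in bic.items()}
z = sum(raw.values())
weights = {m: v / z for m, v in raw.items()}

for m, p in sorted(weights.items(), key=lambda kv: -kv[1]):
    print(m, round(p, 3))
```

    Model-averaged predictions are then weighted sums over the candidate models, so a predictor like the SSD group matters to the extent that models containing it carry posterior weight.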

  13. A Bayesian approach for evaluation of the effect of water quality model parameter uncertainty on TMDLs: A case study of Miyun Reservoir.

    Science.gov (United States)

    Liang, Shidong; Jia, Haifeng; Xu, Changqing; Xu, Te; Melching, Charles

    2016-08-01

    Facing increasingly serious water pollution, the Chinese government is changing the environmental management strategy from solely pollutant concentration control to a Total Maximum Daily Load (TMDL) program, and water quality models are increasingly being applied to determine the allowable pollutant load in the TMDL. Despite the frequent use of models, few studies have focused on how parameter uncertainty in water quality models affect the allowable pollutant loads in the TMDL program, particularly for complicated and high-dimension water quality models. Uncertainty analysis for such models is limited by time-consuming simulation and high-dimensionality and nonlinearity in parameter spaces. In this study, an allowable pollutant load calculation platform was established using the Environmental Fluid Dynamics Code (EFDC), which is a widely applied hydrodynamic-water quality model. A Bayesian approach, i.e. the DiffeRential Evolution Adaptive Metropolis (DREAM) algorithm, which is a high-efficiency, multi-chain Markov Chain Monte Carlo (MCMC) method, was applied to assess the effects of parameter uncertainty on the water quality model simulations and its influence on the allowable pollutant load calculation in the TMDL program. Miyun Reservoir, which is the most important surface drinking water source for Beijing, suffers from eutrophication and was selected as a case study. The relations between pollutant loads and water quality indicators are obtained through a graphical method in the simulation platform. Ranges of allowable pollutant loads were obtained according to the results of parameter uncertainty analysis, i.e. Total Organic Carbon (TOC): 581.5-1030.6 t·yr(-1); Total Phosphorus (TP): 23.3-31.0 t·yr(-1); and Total Nitrogen (TN): 480-1918.0 t·yr(-1). The wide ranges of allowable pollutant loads reveal the importance of parameter uncertainty analysis in a TMDL program for allowable pollutant load calculation and margin of safety (MOS) determination. The sources

  14. A Hybrid Discriminative Approach with Bayesian Prior Constraint

    Institute of Scientific and Technical Information of China (English)

    姚婷婷; 谢昭; 张骏; 高隽

    2015-01-01

    Discriminative models are sensitive to limited training samples, often generalize poorly, and are prone to over-fitting. To address this, a hybrid discriminative approach with Bayesian prior constraints is proposed. By introducing generative prior analysis into the discriminative computation, a complementary framework is built that fuses generative and discriminative models at the decision level. Different types of classifiers are trained separately, and an effective fusion mechanism is defined to select the most confident testing samples along with their estimated labels. By enlarging the training set automatically, the model is updated to compensate for the incomplete distribution information of the training samples. Experimental results on scene categorization with limited samples show that, compared with classical methods, the proposed approach can mine highly discriminative samples for effective model updating, correct misclassifications caused by the uneven distribution of limited samples, and improve scene categorization accuracy.

  15. Elements of Bayesian experimental design

    Energy Technology Data Exchange (ETDEWEB)

    Sivia, D.S. [Rutherford Appleton Lab., Oxon (United Kingdom)

    1997-09-01

    We consider some elements of the Bayesian approach that are important for optimal experimental design. While the underlying principles used are very general, and are explained in detail in a recent tutorial text, they are applied here to the specific case of characterising the inferential value of different resolution peakshapes. This particular issue was considered earlier by Silver, Sivia and Pynn (1989, 1990a, 1990b), and the following presentation confirms and extends the conclusions of their analysis.

  16. Bayesian Sampling using Condition Indicators

    DEFF Research Database (Denmark)

    Faber, Michael H.; Sørensen, John Dalsgaard

    2002-01-01

    This allows for a Bayesian formulation of the indicators, whereby the experience and expertise of the inspection personnel may be fully utilized and consistently updated as frequentistic information is collected. The approach is illustrated on an example considering a concrete structure subject to corrosion. It is shown how half-cell potential measurements may be utilized to update the probability of excessive repair after 50 years.

  17. Bayesian Games with Intentions

    Directory of Open Access Journals (Sweden)

    Adam Bjorndahl

    2016-06-01

    Full Text Available We show that standard Bayesian games cannot represent the full spectrum of belief-dependent preferences. However, by introducing a fundamental distinction between intended and actual strategies, we remove this limitation. We define Bayesian games with intentions, generalizing both Bayesian games and psychological games, and prove that Nash equilibria in psychological games correspond to a special class of equilibria as defined in our setting.

  18. Bayesian analysis of rare events

    Energy Technology Data Exchange (ETDEWEB)

    Straub, Daniel, E-mail: straub@tum.de; Papaioannou, Iason; Betz, Wolfgang

    2016-06-01

    In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
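    The rejection-sampling reinterpretation at the heart of BUS can be sketched in its simplest form. The conjugate toy problem below uses invented numbers so the exact posterior is known; the real value of BUS is that the same accept/reject structure lets rare-event estimators such as FORM, IS and SuS do the sampling:

```python
import math, random

random.seed(11)

# Prior: capacity R ~ N(5, 1). Data: one noisy measurement d = 4.2 with
# N(0, 0.5^2) error. Rare event of interest: {R < 3}. All numbers invented.
def likelihood(r, d=4.2, sd=0.5):
    return math.exp(-(r - d) ** 2 / (2 * sd ** 2))   # bounded above by c = 1

# BUS-style updating: accept a prior sample with probability L(r)/c.
posterior = []
while len(posterior) < 5000:
    r = random.gauss(5.0, 1.0)
    if random.random() < likelihood(r):
        posterior.append(r)

post_mean = sum(posterior) / len(posterior)
p_rare = sum(r < 3.0 for r in posterior) / len(posterior)
print(round(post_mean, 2), p_rare)
# Conjugate check: the exact posterior here is N(4.36, 0.447^2), so the
# updated rare-event probability P(R < 3 | d) is about 1e-3.
```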

  19. Bayesian Event Tree (BET) approach to Near Real Time monitoring on active volcanoes within ASI-SRV project: Mt. Etna test case

    Science.gov (United States)

    Silvestri, Malvina; Musacchio, Massimo; Taroni, Matteo; Fabrizia Buongiorno, Maria; Dini, Luigi

    2010-05-01

    The ASI-Sistema Rischio Vulcanico (SRV) project is devoted to the development of a pre-operative integrated system managing different Earth Observation (EO) and non-EO data to respond to specific needs of the Italian Civil Protection Department (DPC) and improve the monitoring of Italian active volcanoes. The project provides the capability to maintain a repository where the acquired data are stored, and generates products offering support to risk managers during the different volcanic activity phases. All the products are obtained considering technical choices and developments of ASI-SRV based on flexible and scalable modules, which also take into account newly launched space sensors and new processing algorithms. An important step of the project development concerns the technical and scientific feasibility of the provided products, which depends on data availability, the accuracy of the algorithms and models used in the processing, and of course the possibility to validate the results by comparison with independent non-EO measurements. Multivariate analysis allows multiple comparisons to be performed, giving a first idea of which variables are preferentially or only rarely distributed, also considering their geographic localization. The "Volcanic Parameter" cross-correlation will define the weight of each product to be used as input to the BET-EF model (Bayesian Event Tree model for eruption forecasting), an algorithm already developed for eruption forecasting that will be adapted, as it is, to the ASI-SRV needs. The BET model represents a flexible tool to provide probabilities of any specific event in which we are interested, by merging any kind of available and relevant information, such as theoretical models, a priori beliefs, monitoring measures, and past data. It is mainly based on a Bayesian procedure and relies on a fuzzy approach to manage monitoring data. The method deals with short- and long-term forecasting

  20. A limited sampling method to estimate methotrexate pharmacokinetics in patients with rheumatoid arthritis using a Bayesian approach and the population data modeling program P-PHARM.

    Science.gov (United States)

    Bressolle, F; Bologna, C; Edno, L; Bernard, J C; Gomeni, R; Sany, J; Combe, B

    1996-01-01

    This paper describes a methodology to calculate methotrexate (MTX) pharmacokinetic parameters after intramuscular administration using two samples and the population parameters. Total and free MTX were measured over a 36-h period in 56 rheumatoid arthritis patients; 14 patients were studied after a two-dose scheme at 15-day intervals. The Hill equation was used to relate the free MTX to the total MTX changes in plasma concentrations, and a two-compartment open model was used to fit the total MTX plasma concentrations. A non-linear mixed effect procedure was used to estimate the population parameters and to explore the interindividual variability in relation to the following covariables: age, weight, height, haemoglobin, erythrocyte sedimentation rate, platelet count, creatinine clearance, rheumatoid factor, C-reactive protein, swollen joint count, and Ritchie's articular index. Population parameters were evaluated for 40 patients using a three-step approach. The population average parameters and the interindividual variabilities expressed as coefficients of variation (CV%) were: CL, 6.94 l·h⁻¹ (20.5%); V, 34.8 l (32.2%); k12, 0.0838 h⁻¹ (47.7%); k21, 0.0769 h⁻¹ (61.6%); ka, 4.31 h⁻¹ (58%); Emax, 1.12 μmol·l⁻¹ (19.7%); gamma, 0.932 (12.3%); and EC50, 2.14 μmol·l⁻¹ (27.3%). Thirty additional data sets (16 new patients and 14 patients of the previous population but treated on a separate occasion) were used to evaluate the predictive performance of the population parameters. Twelve blood samples were collected from each individual in order to calculate individual parameters using standard fitting procedures. These values were compared with those estimated using a Bayesian approach, with the population parameters as a priori information together with two samples selected from the individual observations. The results show that the bias was not statistically different from zero and that the precision of these parameters was excellent.
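The limited-sampling idea (population parameters serve as a Bayesian prior that is combined with just two individual samples) can be sketched as follows. This is a deliberately simplified one-compartment model with a prior on clearance only; the dose, sampling times, and error variance are hypothetical, and it is not the paper's two-compartment implementation.

```python
import math

# Simplified one-compartment IM model: C(t) = (D*ka)/(V*(ka-ke)) * (exp(-ke*t) - exp(-ka*t))
# Population values loosely taken from the abstract (CL = 6.94 l/h, V = 34.8 l, ka = 4.31 /h);
# the dose D, sampling times, and residual sd are illustrative assumptions.
D, V, KA = 15.0, 34.8, 4.31          # dose (mg), volume (l), absorption rate (1/h)
POP_CL, CV_CL = 6.94, 0.205          # population clearance and its CV

def conc(cl, t):
    ke = cl / V
    return (D * KA) / (V * (KA - ke)) * (math.exp(-ke * t) - math.exp(-KA * t))

def neg_log_post(cl, samples, sigma=0.05):
    # Gaussian residuals on log-concentrations plus a lognormal prior on CL
    ll = sum(((math.log(c_obs) - math.log(conc(cl, t))) / sigma) ** 2 for t, c_obs in samples)
    prior = ((math.log(cl) - math.log(POP_CL)) / CV_CL) ** 2
    return ll + prior

def map_clearance(samples):
    # Crude grid search for the MAP estimate; two samples suffice to pin down CL
    grid = [POP_CL * (0.5 + 0.01 * i) for i in range(101)]   # 0.5x .. 1.5x population value
    return min(grid, key=lambda cl: neg_log_post(cl, samples))

# Simulate a patient whose true clearance is 20% below the population mean,
# sampled at 2 h and 12 h post-dose (a two-sample limited design).
true_cl = POP_CL * 0.8
samples = [(2.0, conc(true_cl, 2.0)), (12.0, conc(true_cl, 12.0))]
print(round(map_clearance(samples), 2))
```

With informative samples the posterior mode recovers the individual clearance rather than reverting to the population mean, which is the property the limited-sampling method exploits.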

  1. Bayesian versus 'plain-vanilla Bayesian' multitarget statistics

    Science.gov (United States)

    Mahler, Ronald P. S.

    2004-08-01

    Finite-set statistics (FISST) is a direct generalization of single-sensor, single-target Bayes statistics to the multisensor-multitarget realm, based on random set theory. Various aspects of FISST are being investigated by several research teams around the world. In recent years, however, a few partisans have claimed that a "plain-vanilla Bayesian approach" suffices as down-to-earth, "straightforward," and general "first principles" for multitarget problems, and that FISST is therefore mere mathematical "obfuscation." In this and a companion paper I demonstrate the speciousness of these claims. In this paper I summarize general Bayes statistics, what is required to use it in multisensor-multitarget problems, and why FISST is necessary to make it practical. I then demonstrate that the "plain-vanilla Bayesian approach" is so heedlessly formulated that it is erroneous and not even Bayesian; that it denigrates FISST concepts while unwittingly assuming them; and that it has resulted in a succession of algorithms afflicted by inherent -- but less than candidly acknowledged -- computational "logjams."

  2. Bayesian Uncertainty Analyses Via Deterministic Model

    Science.gov (United States)

    Krzysztofowicz, R.

    2001-05-01

    Rational decision-making requires that the total uncertainty about a variate of interest (a predictand) be quantified in terms of a probability distribution, conditional on all available information and knowledge. Suppose the state-of-knowledge is embodied in a deterministic model, which is imperfect and outputs only an estimate of the predictand. Fundamentals are presented of three Bayesian approaches to producing a probability distribution of the predictand via any deterministic model. The Bayesian Processor of Output (BPO) quantifies the total uncertainty in terms of a posterior distribution, conditional on model output. The Bayesian Processor of Ensemble (BPE) quantifies the total uncertainty in terms of a posterior distribution, conditional on an ensemble of model output. The Bayesian Forecasting System (BFS) decomposes the total uncertainty into input uncertainty and model uncertainty, which are characterized independently and then integrated into a predictive distribution.
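Under joint-Gaussian assumptions the Bayesian Processor of Output reduces to a conjugate normal-normal update: the posterior of the predictand given the deterministic model output is again Gaussian. The sketch below uses hypothetical numbers and a linear output relation; it illustrates the principle, not Krzysztofowicz's operational formulation.

```python
# Bayesian Processor of Output (BPO) under joint-Gaussian assumptions:
#   prior:  W ~ N(m, s2)              (climatic uncertainty about the predictand)
#   model:  X | W=w ~ N(a*w + b, v2)  (deterministic model output, with bias and error)
# The posterior W | X=x is Gaussian (normal-normal conjugate update).
def bpo_posterior(x, m, s2, a, b, v2):
    prec = 1.0 / s2 + a * a / v2                # posterior precision
    mean = (m / s2 + a * (x - b) / v2) / prec   # precision-weighted mean
    return mean, 1.0 / prec                     # (posterior mean, posterior variance)

# Hypothetical numbers: prior N(10, 4); model output x=13, unit slope, bias +1, error var 1.
mean, var = bpo_posterior(x=13.0, m=10.0, s2=4.0, a=1.0, b=1.0, v2=1.0)
print(round(mean, 2), round(var, 2))
```

The posterior mean sits between the prior mean and the bias-corrected model output, weighted by their precisions; the posterior variance quantifies the total remaining uncertainty.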

  3. Picturing classical and quantum Bayesian inference

    CERN Document Server

    Coecke, Bob

    2011-01-01

    We introduce a graphical framework for Bayesian inference that is sufficiently general to accommodate not just the standard case but also recent proposals for a theory of quantum Bayesian inference wherein one considers density operators rather than probability distributions as representative of degrees of belief. The diagrammatic framework is stated in the graphical language of symmetric monoidal categories and of compact structures and Frobenius structures therein, in which Bayesian inversion boils down to transposition with respect to an appropriate compact structure. We characterize classical Bayesian inference in terms of a graphical property and demonstrate that our approach eliminates some purely conventional elements that appear in common representations thereof, such as whether degrees of belief are represented by probabilities or entropic quantities. We also introduce a quantum-like calculus wherein the Frobenius structure is noncommutative and show that it can accommodate Leifer's calculus of `cond...

  4. Comparison of linear mixed model analysis and genealogy-based haplotype clustering with a Bayesian approach for association mapping in a pedigreed population

    DEFF Research Database (Denmark)

    Dashab, Golam Reza; Kadri, Naveen Kumar; Mahdi Shariati, Mohammad;

    2012-01-01

    ) Mixed model analysis (MMA), 2) Random haplotype model (RHM), 3) Genealogy-based mixed model (GENMIX), and 4) Bayesian variable selection (BVS). The data consisted of phenotypes of 2000 animals from 20 sire families and were genotyped with 9990 SNPs on five chromosomes. Results: Out of the eight...

  5. A Bayesian Belief Network Approach to Explore Alternative Decisions for Sediment Control and Water Storage Capacity at Lago Lucchetti, Puerto Rico

    Science.gov (United States)

    A Bayesian belief network (BBN) was developed to characterize the effects of sediment accumulation on the water storage capacity of Lago Lucchetti (located in southwest Puerto Rico) and to forecast the life expectancy (usefulness) of the reservoir under different management scena...

  6. The Bayesian bridge between simple and universal kriging

    Energy Technology Data Exchange (ETDEWEB)

    Omre, H.; Halvorsen, K.B. (Norwegian Computing Center, Oslo (Norway))

    1989-10-01

    Kriging techniques are well suited for the evaluation of continuous, spatial phenomena. Bayesian statistics is characterized by the use of prior qualified guesses about the model parameters. By merging kriging techniques and Bayesian theory, prior guesses may be used in a spatial setting. Partial knowledge of the model parameters defines a continuum of models between what are termed simple and universal kriging in geostatistical terminology. The Bayesian approach to kriging is developed and discussed, and a case study concerning the depth conversion of seismic reflection times is presented.
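The "bridge" can be made concrete with a constant-mean model in which the mean itself carries a Gaussian prior N(m0, tau2): the prior uncertainty about the mean simply adds tau2 to every covariance entry. As tau2 goes to 0 the predictor reduces to simple kriging (mean known exactly); as tau2 grows it approaches ordinary/universal kriging (mean fully unknown). This is a minimal 1-D sketch with a hypothetical Gaussian covariance, not the paper's formulation.

```python
import numpy as np

def bayes_krige(x_obs, z_obs, x0, m0, tau2, sill=1.0, rng=1.0):
    """Predict Z(x0) under a Gaussian covariance and a N(m0, tau2) prior on the
    constant mean. tau2 -> 0 recovers simple kriging; tau2 -> inf approaches
    ordinary kriging (the mean is estimated from the data)."""
    cov = lambda h: sill * np.exp(-(h / rng) ** 2)
    H = np.abs(x_obs[:, None] - x_obs[None, :])
    K = cov(H) + tau2                      # prior mean-variance inflates all covariances
    k0 = cov(np.abs(x_obs - x0)) + tau2
    w = np.linalg.solve(K, k0)             # kriging weights
    return m0 + w @ (z_obs - m0)

x = np.array([0.0, 1.0, 2.5])
z = np.array([1.2, 0.7, 1.9])
p_simple = bayes_krige(x, z, 1.5, m0=1.0, tau2=0.0)    # simple kriging
p_vague = bayes_krige(x, z, 1.5, m0=1.0, tau2=1e6)     # ~ ordinary kriging
print(round(float(p_simple), 3), round(float(p_vague), 3))
```

Intermediate tau2 values give the continuum of models between the two classical endpoints that the abstract describes.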

  7. Efficient Algorithms for Bayesian Network Parameter Learning from Incomplete Data

    Science.gov (United States)

    2015-07-01

    Guy Van den Broeck, Karthika Mohan, Arthur Choi, and Adnan... We propose a family of efficient algorithms for learning the parameters of a Bayesian network from incomplete data. Our approach is based on recent... algorithms like EM (which require inference). When learning the parameters of a Bayesian network from data with missing values, the

  8. Bayesian Source Separation and Localization

    CERN Document Server

    Knuth, K H

    1998-01-01

    The problem of mixed signals occurs in many different contexts; one of the most familiar being acoustics. The forward problem in acoustics consists of finding the sound pressure levels at various detectors resulting from sound signals emanating from the active acoustic sources. The inverse problem consists of using the sound recorded by the detectors to separate the signals and recover the original source waveforms. In general, the inverse problem is unsolvable without additional information. This general problem is called source separation, and several techniques have been developed that utilize maximum entropy, minimum mutual information, and maximum likelihood. In previous work, it has been demonstrated that these techniques can be recast in a Bayesian framework. This paper demonstrates the power of the Bayesian approach, which provides a natural means for incorporating prior information into a source model. An algorithm is developed that utilizes information regarding both the statistics of the amplitudes...

  9. Bayesian Probability Theory

    Science.gov (United States)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  10. Using a Bayesian approach to estimate and compare new Keynesian DSGE models for the Brazilian economy: the role for endogenous persistence

    Directory of Open Access Journals (Sweden)

    Marcos Antonio C. da Silveira

    2008-09-01

    Full Text Available New Keynesian dynamic stochastic general equilibrium (DSGE) models have been developed for monetary policy analysis in open economies. For this purpose, the basic model must be enriched with the sources of nominal and real rigidities that are capable of explaining the observed output and inflation persistence. From this perspective, we use the Bayesian approach to estimate and compare alternative model specifications for the Brazilian economy with respect to two endogenous persistence mechanisms widely supported by the international empirical literature: habit formation and price indexation. Using data for the inflation-targeting period, we conclude that both mechanisms are relevant, although the evidence is unexpectedly less robust for price indexation. Furthermore, impulse-response functions are built to describe the dynamic effects of domestic and foreign real and monetary shocks.

  11. Bayesian Meta-Analysis of Coefficient Alpha

    Science.gov (United States)

    Brannick, Michael T.; Zhang, Nanhua

    2013-01-01

    The current paper describes and illustrates a Bayesian approach to the meta-analysis of coefficient alpha. Alpha is the most commonly used estimate of the reliability or consistency (freedom from measurement error) for educational and psychological measures. The conventional approach to meta-analysis uses inverse variance weights to combine…

  12. Diagnosis of Subtraction Bugs Using Bayesian Networks

    Science.gov (United States)

    Lee, Jihyun; Corter, James E.

    2011-01-01

    Diagnosis of misconceptions or "bugs" in procedural skills is difficult because of their unstable nature. This study addresses this problem by proposing and evaluating a probability-based approach to the diagnosis of bugs in children's multicolumn subtraction performance using Bayesian networks. This approach assumes a causal network relating…

  13. Dynamic Bayesian Combination of Multiple Imperfect Classifiers

    CERN Document Server

    Simpson, Edwin; Psorakis, Ioannis; Smith, Arfon

    2012-01-01

    Classifier combination methods need to make best use of the outputs of multiple, imperfect classifiers to enable higher accuracy classifications. In many situations, such as when human decisions need to be combined, the base decisions can vary enormously in reliability. A Bayesian approach to such uncertain combination allows us to infer the differences in performance between individuals and to incorporate any available prior knowledge about their abilities when training data is sparse. In this paper we explore Bayesian classifier combination, using the computationally efficient framework of variational Bayesian inference. We apply the approach to real data from a large citizen science project, Galaxy Zoo Supernovae, and show that our method far outperforms other established approaches to imperfect decision combination. We go on to analyse the putative community structure of the decision makers, based on their inferred decision making strategies, and show that natural groupings are formed. Finally we present ...
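The core of Bayesian classifier combination is that each base classifier's reliability is modeled explicitly (typically via a confusion matrix) and the posterior over the true label weighs each vote accordingly. The sketch below uses fixed, hypothetical confusion matrices; the paper instead infers these reliabilities with variational Bayesian inference.

```python
import numpy as np

# Independent Bayesian classifier combination, with the per-classifier
# confusion matrices assumed known (hypothetical values for illustration).
prior = np.array([0.5, 0.5])                 # P(true class)
confusions = [
    np.array([[0.9, 0.1], [0.2, 0.8]]),      # a reliable base classifier
    np.array([[0.6, 0.4], [0.5, 0.5]]),      # a near-random base classifier
]                                            # rows = true class, cols = reported class

def combine(labels):
    """Posterior over the true class given each base classifier's reported label."""
    post = prior.copy()
    for cm, lab in zip(confusions, labels):
        post = post * cm[:, lab]             # likelihood of this report per true class
    return post / post.sum()

# Both report class 0: the reliable classifier dominates the noisy one.
print(np.round(combine([0, 0]), 3))
```

Because the near-random classifier's columns are almost uniform, its vote barely moves the posterior, which is exactly the behavior that makes the Bayesian combination robust to unreliable decision makers.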

  14. Konstruksi Bayesian Network Dengan Algoritma Bayesian Association Rule Mining Network

    OpenAIRE

    Octavian

    2015-01-01

    In recent years, Bayesian Networks have become a popular concept used in many areas, such as making decisions and determining the probability that an event may occur. Unfortunately, constructing the structure of a Bayesian Network is itself not a simple matter. This study therefore introduces the Bayesian Association Rule Mining Network algorithm to make it easier to construct a Bayesian Network from data ...

  15. Doing bayesian data analysis a tutorial with R and BUGS

    CERN Document Server

    Kruschke, John K

    2011-01-01

    There is an explosion of interest in Bayesian statistics, primarily because recently created computational methods have finally brought Bayesian analysis within reach of a wide audience. Doing Bayesian Data Analysis, A Tutorial Introduction with R and BUGS provides an accessible approach to Bayesian data analysis, as material is explained clearly with concrete examples. The book begins with the basics, including essential concepts of probability and random sampling, and gradually progresses to advanced hierarchical modeling methods for realistic data. The text delivers comprehensive coverage of all

  16. PAC-Bayesian Policy Evaluation for Reinforcement Learning

    CERN Document Server

    Fard, Mahdi MIlani; Szepesvari, Csaba

    2012-01-01

    Bayesian priors offer a compact yet general means of incorporating domain knowledge into many learning tasks. The correctness of the Bayesian analysis and inference, however, largely depends on accuracy and correctness of these priors. PAC-Bayesian methods overcome this problem by providing bounds that hold regardless of the correctness of the prior distribution. This paper introduces the first PAC-Bayesian bound for the batch reinforcement learning problem with function approximation. We show how this bound can be used to perform model-selection in a transfer learning scenario. Our empirical results confirm that PAC-Bayesian policy evaluation is able to leverage prior distributions when they are informative and, unlike standard Bayesian RL approaches, ignore them when they are misleading.

  17. Quantifying and Reducing Model-Form Uncertainties in Reynolds-Averaged Navier-Stokes Equations: An Open-Box, Physics-Based, Bayesian Approach

    CERN Document Server

    Xiao, H; Wang, J -X; Sun, R; Roy, C J

    2015-01-01

    Despite their well-known limitations, Reynolds-Averaged Navier-Stokes (RANS) models are still the workhorse tools for turbulent flow simulations in today's engineering applications. For many practical flows, the turbulence models are by far the most important source of uncertainty. In this work we develop an open-box, physics-informed Bayesian framework for quantifying model-form uncertainties in RANS simulations. Uncertainties are introduced directly to the Reynolds stresses and are represented with compact parameterization accounting for empirical prior knowledge and physical constraints (e.g., realizability, smoothness, and symmetry). An iterative ensemble Kalman method is used to assimilate the prior knowledge and observation data in a Bayesian framework, and to propagate them to posterior distributions of velocities and other Quantities of Interest (QoIs). We use two representative cases, the flow over periodic hills and the flow in a square duct, to evaluate the performance of the proposed framework. Si...

  18. Model Diagnostics for Bayesian Networks

    Science.gov (United States)

    Sinharay, Sandip

    2006-01-01

    Bayesian networks are frequently used in educational assessments, primarily for learning about students' knowledge and skills. There is a lack of work on assessing the fit of Bayesian networks. This article employs the posterior predictive model checking method, a popular Bayesian model-checking tool, to assess the fit of simple Bayesian networks. A…

  19. A Bayesian Analysis of Spectral ARMA Model

    Directory of Open Access Journals (Sweden)

    Manoel I. Silvestre Bezerra

    2012-01-01

    Full Text Available Bezerra et al. (2008) proposed a new method, based on Yule-Walker equations, to estimate the ARMA spectral model. In this paper, a Bayesian approach is developed for this model using the noninformative prior proposed by Jeffreys (1967). The Bayesian computations are carried out by simulation via Markov chain Monte Carlo (MCMC), and characteristics of the marginal posterior distributions, such as the Bayes estimator and confidence intervals for the parameters of the ARMA model, are derived. Both methods are also compared with the traditional least squares and maximum likelihood approaches, and a numerical illustration with two examples of the ARMA model is presented to evaluate the performance of the procedures.

  20. Bayesian estimation and tracking a practical guide

    CERN Document Server

    Haug, Anton J

    2012-01-01

    A practical approach to estimating and tracking dynamic systems in real-world applications. Much of the literature on performing estimation for non-Gaussian systems is short on practical methodology, while Gaussian methods often lack a cohesive derivation. Bayesian Estimation and Tracking addresses the gap in the field on both accounts, providing readers with a comprehensive overview of methods for estimating both linear and nonlinear dynamic systems driven by Gaussian and non-Gaussian noises. Featuring a unified approach to Bayesian estimation and tracking, the book emphasizes the derivation

  1. Bayesian Data-Model Fit Assessment for Structural Equation Modeling

    Science.gov (United States)

    Levy, Roy

    2011-01-01

    Bayesian approaches to modeling are receiving an increasing amount of attention in the areas of model construction and estimation in factor analysis, structural equation modeling (SEM), and related latent variable models. However, model diagnostics and model criticism remain relatively understudied aspects of Bayesian SEM. This article describes…

  2. Non-homogeneous dynamic Bayesian networks for continuous data

    NARCIS (Netherlands)

    Grzegorczyk, Marco; Husmeier, Dirk

    2011-01-01

    Classical dynamic Bayesian networks (DBNs) are based on the homogeneous Markov assumption and cannot deal with non-homogeneous temporal processes. Various approaches to relax the homogeneity assumption have recently been proposed. The present paper presents a combination of a Bayesian network with c

  3. Bayesian Student Modeling and the Problem of Parameter Specification.

    Science.gov (United States)

    Millan, Eva; Agosta, John Mark; Perez de la Cruz, Jose Luis

    2001-01-01

    Discusses intelligent tutoring systems and the application of Bayesian networks to student modeling. Considers reasons for not using Bayesian networks, including the computational complexity of the algorithms and the difficulty of knowledge acquisition, and proposes an approach to simplify knowledge acquisition that applies causal independence to…

  4. Hopes and Cautions in Implementing Bayesian Structural Equation Modeling

    Science.gov (United States)

    MacCallum, Robert C.; Edwards, Michael C.; Cai, Li

    2012-01-01

    Muthen and Asparouhov (2012) have proposed and demonstrated an approach to model specification and estimation in structural equation modeling (SEM) using Bayesian methods. Their contribution builds on previous work in this area by (a) focusing on the translation of conventional SEM models into a Bayesian framework wherein parameters fixed at zero…

  5. Mechanistic curiosity will not kill the Bayesian cat

    NARCIS (Netherlands)

    Borsboom, Denny; Wagenmakers, Eric-Jan; Romeijn, Jan-Willem

    2011-01-01

    Jones & Love (J&L) suggest that Bayesian approaches to the explanation of human behavior should be constrained by mechanistic theories. We argue that their proposal misconstrues the relation between process models, such as the Bayesian model, and mechanisms. While mechanistic theories can answer spe

  6. Bayesian modeling of unknown diseases for biosurveillance.

    Science.gov (United States)

    Shen, Yanna; Cooper, Gregory F

    2009-11-14

    This paper investigates Bayesian modeling of unknown causes of events in the context of disease-outbreak detection. We introduce a Bayesian approach that models and detects both (1) known diseases (e.g., influenza and anthrax) by using informative prior probabilities and (2) unknown diseases (e.g., a new, highly contagious respiratory virus that has never been seen before) by using relatively non-informative prior probabilities. We report the results of simulation experiments which support that this modeling method can improve the detection of new disease outbreaks in a population. A key contribution of this paper is that it introduces a Bayesian approach for jointly modeling both known and unknown causes of events. Such modeling has broad applicability in medical informatics, where the space of known causes of outcomes of interest is seldom complete.
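The joint known/unknown modeling idea can be shown with plain Bayes' rule over discrete hypotheses: known diseases carry informative priors and likelihoods, while the "unknown disease" hypothesis carries a small prior and a vague, uniform likelihood. All numbers below are hypothetical toy values, not the paper's models.

```python
# Toy posterior over outbreak causes: known diseases get informative models,
# the "unknown" cause gets a small prior and a uniform (non-informative)
# likelihood over the 4 hypothetical symptom patterns.
PRIORS = {"influenza": 0.90, "anthrax": 0.02, "unknown": 0.08}
LIKELIHOODS = {  # P(symptom pattern | cause)
    "influenza": {"respiratory": 0.70, "gi": 0.10, "rash": 0.05, "neuro": 0.15},
    "anthrax":   {"respiratory": 0.50, "gi": 0.30, "rash": 0.15, "neuro": 0.05},
    "unknown":   {"respiratory": 0.25, "gi": 0.25, "rash": 0.25, "neuro": 0.25},
}

def posterior(symptom):
    joint = {c: PRIORS[c] * LIKELIHOODS[c][symptom] for c in PRIORS}
    z = sum(joint.values())
    return {c: p / z for c, p in joint.items()}

# A surge of rash cases fits the known diseases poorly, so the unknown-cause
# hypothesis gains posterior mass relative to its prior.
post = posterior("rash")
print({c: round(p, 3) for c, p in post.items()})
```

This captures the detection mechanism in the abstract: evidence that no known disease explains well is automatically reallocated to the unknown-disease hypothesis.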

  7. Bayesian Lensing Shear Measurement

    CERN Document Server

    Bernstein, Gary M

    2013-01-01

    We derive an estimator of weak gravitational lensing shear from background galaxy images that avoids noise-induced biases through a rigorous Bayesian treatment of the measurement. The Bayesian formalism requires a prior describing the (noiseless) distribution of the target galaxy population over some parameter space; this prior can be constructed from low-noise images of a subsample of the target population, attainable from long integrations of a fraction of the survey field. We find two ways to combine this exact treatment of noise with rigorous treatment of the effects of the instrumental point-spread function and sampling. The Bayesian model fitting (BMF) method assigns a likelihood of the pixel data to galaxy models (e.g. Sersic ellipses), and requires the unlensed distribution of galaxies over the model parameters as a prior. The Bayesian Fourier domain (BFD) method compresses galaxies to a small set of weighted moments calculated after PSF correction in Fourier space. It requires the unlensed distributi...

  8. Bayesian psychometric scaling

    NARCIS (Netherlands)

    Fox, G.J.A.; Berg, van den S.M.; Veldkamp, B.P.; Irwing, P.; Booth, T.; Hughes, D.

    2015-01-01

    In educational and psychological studies, psychometric methods are involved in the measurement of constructs, and in constructing and validating measurement instruments. Assessment results are typically used to measure student proficiency levels and test characteristics. Recently, Bayesian item resp

  9. Noncausal Bayesian Vector Autoregression

    DEFF Research Database (Denmark)

    Lanne, Markku; Luoto, Jani

    We propose a Bayesian inferential procedure for the noncausal vector autoregressive (VAR) model that is capable of capturing nonlinearities and incorporating effects of missing variables. In particular, we devise a fast and reliable posterior simulator that yields the predictive distribution...

  10. Multi-Fraction Bayesian Sediment Transport Model

    Directory of Open Access Journals (Sweden)

    Mark L. Schmelter

    2015-09-01

    Full Text Available A Bayesian approach to sediment transport modeling can provide a strong basis for evaluating and propagating model uncertainty, which can be useful in transport applications. Previous work in developing and applying Bayesian sediment transport models used a single grain size fraction or characterized the transport of mixed-size sediment with a single characteristic grain size. Although this approach is common in sediment transport modeling, it precludes the possibility of capturing processes that cause mixed-size sediments to sort and, thereby, alter the grain size available for transport and the transport rates themselves. This paper extends development of a Bayesian transport model from one to k fractional dimensions. The model uses an existing transport function as its deterministic core and is applied to the dataset used to originally develop the function. The Bayesian multi-fraction model is able to infer the posterior distributions for essential model parameters and replicates predictive distributions of both bulk and fractional transport. Further, the inferred posterior distributions are used to evaluate parametric and other sources of variability in relations representing mixed-size interactions in the original model. Successful development of the model demonstrates that Bayesian methods can be used to provide a robust and rigorous basis for quantifying uncertainty in mixed-size sediment transport. Such a method has heretofore been unavailable and allows for the propagation of uncertainty in sediment transport applications.

  11. Computationally efficient Bayesian inference for inverse problems.

    Energy Technology Data Exchange (ETDEWEB)

    Marzouk, Youssef M.; Najm, Habib N.; Rahn, Larry A.

    2007-10-01

    Bayesian statistics provides a foundation for inference from noisy and incomplete data, a natural mechanism for regularization in the form of prior information, and a quantitative assessment of uncertainty in the inferred results. Inverse problems - representing indirect estimation of model parameters, inputs, or structural components - can be fruitfully cast in this framework. Complex and computationally intensive forward models arising in physical applications, however, can render a Bayesian approach prohibitive. This difficulty is compounded by high-dimensional model spaces, as when the unknown is a spatiotemporal field. We present new algorithmic developments for Bayesian inference in this context, showing strong connections with the forward propagation of uncertainty. In particular, we introduce a stochastic spectral formulation that dramatically accelerates the Bayesian solution of inverse problems via rapid evaluation of a surrogate posterior. We also explore dimensionality reduction for the inference of spatiotemporal fields, using truncated spectral representations of Gaussian process priors. These new approaches are demonstrated on scalar transport problems arising in contaminant source inversion and in the inference of inhomogeneous material or transport properties. We also present a Bayesian framework for parameter estimation in stochastic models, where intrinsic stochasticity may be intermingled with observational noise. Evaluation of a likelihood function may not be analytically tractable in these cases, and thus several alternative Markov chain Monte Carlo (MCMC) schemes, operating on the product space of the observations and the parameters, are introduced.
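The surrogate-acceleration idea can be sketched end to end: fit a cheap surrogate to a handful of "expensive" forward-model runs, then run MCMC against the surrogate posterior. The abstract's stochastic spectral (polynomial chaos) surrogates are replaced here by a simple polynomial interpolant, and all model details are hypothetical.

```python
import math, random

random.seed(0)

def forward(theta):                  # stand-in for an expensive forward simulation
    return math.exp(0.5 * theta)

# Build a surrogate from three "expensive" runs via Lagrange interpolation.
nodes = [-2.0, 0.0, 2.0]
vals = [forward(t) for t in nodes]

def surrogate(theta):
    s = 0.0
    for i, ti in enumerate(nodes):
        w = vals[i]
        for j, tj in enumerate(nodes):
            if i != j:
                w *= (theta - tj) / (ti - tj)
        s += w
    return s

DATA, SIGMA = 1.5, 0.2               # noisy observation of forward(theta), noise sd

def log_post(theta):                 # N(0,1) prior + Gaussian likelihood via the surrogate
    return -0.5 * theta ** 2 - 0.5 * ((DATA - surrogate(theta)) / SIGMA) ** 2

theta, chain = 0.0, []
for _ in range(5000):                # plain Metropolis on the cheap surrogate posterior
    prop = theta + random.gauss(0.0, 0.3)
    if math.log(random.random()) < log_post(prop) - log_post(theta):
        theta = prop
    chain.append(theta)

mean = sum(chain[1000:]) / len(chain[1000:])
print(round(mean, 2))
```

Every MCMC step evaluates only the surrogate, so the expensive model is called just three times up front; this is the source of the acceleration the abstract describes.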

  12. Semisupervised learning using Bayesian interpretation: application to LS-SVM.

    Science.gov (United States)

    Adankon, Mathias M; Cheriet, Mohamed; Biem, Alain

    2011-04-01

    Bayesian reasoning provides an ideal basis for representing and manipulating uncertain knowledge, with the result that many interesting algorithms in machine learning are based on Bayesian inference. In this paper, we use the Bayesian approach with one and two levels of inference to model the semisupervised learning problem and give its application to the successful kernel classifier support vector machine (SVM) and its variant least-squares SVM (LS-SVM). Taking advantage of Bayesian interpretation of LS-SVM, we develop a semisupervised learning algorithm for Bayesian LS-SVM using our approach based on two levels of inference. Experimental results on both artificial and real pattern recognition problems show the utility of our method.

  13. Mercury and histopathology of the vulnerable goliath grouper, Epinephelus itajara, in U.S. waters: a multi-tissue approach.

    Science.gov (United States)

    Adams, Douglas H; Sonne, Christian

    2013-10-01

    Goliath grouper have undergone significant global population declines with potential biological extinction for some subpopulations. Although overfishing and habitat loss are important drivers of these declines, the negative effects of contaminants may also play a role. The life history patterns of goliath grouper may make this species especially prone to exposure to contaminants and may exacerbate bioaccumulation of toxic substances, including mercury, which has documented detrimental health effects. Therefore, we analyzed mercury (in muscle, liver, kidney, gonad, and brain tissue) and the histology of key organs (liver, kidney and gill tissue) in 56 goliath groupers from U.S. waters. Total mercury concentration was greatest in liver tissue, followed by kidney, muscle, gonad, and brain. Maximum mercury concentration ranged from 22.68 μg/g in liver tissue to 0.89 μg/g in brain tissue. Mean mercury concentration ranged from 2.87 μg/g in liver tissue to 0.37 μg/g in brain tissue with a mean of 0.63 μg/g in muscle. Mean mercury concentrations observed in goliath grouper from U.S. waters were within the range known to cause direct health effects in fish after long-term exposure. The lesions and histological changes observed in the liver, kidney, and gills of goliath groupers were similar to those found in other fish following laboratory mercury-exposure trials and to those found in mercury-contaminated fish in wild populations carrying similar or even lower concentrations. We suggest that exposure to mercury and other environmental influences such as pathogens and reduced temperatures could be co-factors in the histological effects or anomalies observed in the present study, and resulting stresses may be involved in the observed population declines in the species.

  14. Bayesian Vector Autoregressions with Stochastic Volatility

    NARCIS (Netherlands)

    Uhlig, H.F.H.V.S.

    1996-01-01

    This paper proposes a Bayesian approach to a vector autoregression with stochastic volatility, where the multiplicative evolution of the precision matrix is driven by a multivariate beta variate.Exact updating formulas are given to the nonlinear filtering of the precision matrix.Estimation of the au

  15. Communication cost in Distributed Bayesian Belief Networks

    NARCIS (Netherlands)

    Gosliga, S.P. van; Maris, M.G.

    2005-01-01

    In this paper, two different methods for information fusion are compared with respect to communication cost: the lambda-pi and the junction-tree approaches to probability computation in Bayesian networks. The analysis is done within the scope of large distributed networks of computi

  16. Von Neumann Was Not a Quantum Bayesian

    CERN Document Server

    Stacey, Blake C

    2014-01-01

    Wikipedia has claimed for over two years now that John von Neumann was the "first quantum Bayesian." In context, this reads as stating that von Neumann inaugurated QBism, the approach to quantum theory promoted by Fuchs, Mermin and Schack. This essay explores how such a claim is, historically speaking, unsupported.

  17. A hierarchical Bayesian approach to the modified Bartlett-Lewis rectangular pulse model for a joint estimation of model