bibtype K - Conference Paper (Czech conference)
ARLID 0493355
utime 20240111141006.0
mtime 20180917235959.9
title (primary) (eng) Representations of Bayesian Networks by Low-Rank Models
specification
page_count 12 pp.
media_type C
serial
ARLID cav_un_epca*0493354
ISSN 1938-7228
title Proceedings of Machine Learning Research
part_num 72
page_num 463-472
publisher
place Praha
name UTIA
year 2018
editor
name1 Kratochvíl
name2 Václav
editor
name1 Studený
name2 Milan
keyword canonical polyadic tensor decomposition
keyword conditional probability tables
keyword marginal probability tables
author (primary)
ARLID cav_un_auth*0101212
name1 Tichavský
name2 Petr
full_dept (cz) Stochastická informatika
full_dept (eng) Department of Stochastic Informatics
department (cz) SI
department (eng) SI
institution UTIA-B
fullinstit Ústav teorie informace a automatizace AV ČR, v. v. i.
author
ARLID cav_un_auth*0101228
name1 Vomlel
name2 Jiří
full_dept (cz) Matematická teorie rozhodování
full_dept (eng) Department of Decision Making Theory
department (cz) MTR
department MTR
institution UTIA-B
fullinstit Ústav teorie informace a automatizace AV ČR, v. v. i.
source
url http://library.utia.cas.cz/separaty/2018/SI/tichavsky-0493355.pdf
source_size 326 kB
cas_special
project
project_id GA17-00902S
agency GA ČR
ARLID cav_un_auth*0345929
abstract (eng) Conditional probability tables (CPTs) of discrete-valued random variables may have high dimensions, and Bayesian networks defined as the product of these CPTs may become intractable for conventional methods of BN inference because of their dimensionality. In many cases, however, these probability tables constitute tensors of relatively low rank. Such tensors can be written in the so-called Kruskal form as a sum of rank-one components. This representation is equivalent to adding one artificial parent to all random variables and deleting all edges between the variables. The most difficult task is to find such a representation given a set of marginals or CPTs of the random variables under consideration. In the former case, it is a problem of joint canonical polyadic (CP) decomposition of a set of tensors. The latter fitting problem can be solved in a similar manner. We apply a recently proposed alternating direction method of multipliers (ADMM), which ensures that the model has a probabilistic interpretation, i.e., that all elements of all factor matrices are nonnegative. We perform experiments with several well-known Bayesian networks.
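note (illustrative sketch, not the authors' algorithm) The paper fits the Kruskal form jointly over several tables with an ADMM algorithm; the short Python sketch below instead uses plain nonnegative multiplicative updates on a single 3-way probability table, only to illustrate the "one artificial parent" reading of the decomposition. The function name nonneg_cp3 and all parameter choices are hypothetical.

import numpy as np

def nonneg_cp3(T, rank, n_iter=500, eps=1e-12, seed=0):
    """Fit T[i,j,k] ~ sum_r A[i,r]*B[j,r]*C[k,r] with nonnegative factors."""
    rng = np.random.default_rng(seed)
    I, J, K = T.shape
    A = rng.random((I, rank))
    B = rng.random((J, rank))
    C = rng.random((K, rank))
    for _ in range(n_iter):
        # Multiplicative updates minimizing the Frobenius fitting error,
        # keeping all factor entries nonnegative (not the paper's ADMM).
        A *= np.einsum('ijk,jr,kr->ir', T, B, C) / (A @ ((B.T @ B) * (C.T @ C)) + eps)
        B *= np.einsum('ijk,ir,kr->jr', T, A, C) / (B @ ((A.T @ A) * (C.T @ C)) + eps)
        C *= np.einsum('ijk,ir,jr->kr', T, A, B) / (C @ ((A.T @ A) * (B.T @ B)) + eps)
    # Rescale so each factor column is a conditional distribution P(x_n | r)
    # and the column weights act as P(r): the latent "artificial parent".
    weights = np.ones(rank)
    for F in (A, B, C):
        s = F.sum(axis=0) + eps
        F /= s
        weights *= s
    weights /= weights.sum()
    return weights, A, B, C

if __name__ == "__main__":
    # Toy joint table P(X1, X2, X3) of exact nonnegative rank 2.
    rng = np.random.default_rng(1)
    A0, B0, C0 = (rng.random((n, 2)) for n in (2, 3, 4))
    T = np.einsum('ir,jr,kr->ijk', A0, B0, C0)
    T /= T.sum()
    w, A, B, C = nonneg_cp3(T, rank=2)
    approx = np.einsum('r,ir,jr,kr->ijk', w, A, B, C)
    print("max abs error:", np.abs(T - approx).max())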
action
ARLID cav_un_auth*0363930
name International Conference on Probabilistic Graphical Models
dates 20180911
mrcbC20-s 20180914
place Praha
country CZ
RIV BA
FORD0 10000
FORD1 10100
FORD2 10103
reportyear 2019
num_of_auth 2
presentation_type PR
inst_support RVO:67985556
permalink http://hdl.handle.net/11104/0286997
confidential S
arlyear 2018
mrcbU56 326 kB
mrcbU63 cav_un_epca*0493354 Proceedings of Machine Learning Research 72 UTIA 2018 Praha 463 472 1938-7228
mrcbU67 340 Kratochvíl Václav
mrcbU67 340 Studený Milan