bibtype J - Journal Article
ARLID 0519831
utime 20241106135817.6
mtime 20200115235959.9
SCOPUS 85075330445
WOS 000497538000001
DOI 10.1080/03081079.2019.1692004
title (primary) (eng) Learning bipartite Bayesian networks under monotonicity restrictions
specification
page_count 24 pp.
media_type P
serial
ARLID cav_un_epca*0256794
ISSN 0308-1079
title International Journal of General Systems
volume_id 49
volume 1 (2020)
page_num 88-111
publisher
name Taylor & Francis
keyword Bayesian networks
keyword Computerized adaptive testing
keyword Parameter learning
author (primary)
ARLID cav_un_auth*0329423
name1 Plajner
name2 Martin
institution UTIA-B
full_dept (cz) Matematická teorie rozhodování
full_dept (eng) Department of Decision Making Theory
department (cz) MTR
department (eng) MTR
country CZ
fullinstit Ústav teorie informace a automatizace AV ČR, v. v. i.
author
ARLID cav_un_auth*0101228
name1 Vomlel
name2 Jiří
institution UTIA-B
full_dept (cz) Matematická teorie rozhodování
full_dept (eng) Department of Decision Making Theory
department (cz) MTR
department (eng) MTR
country CZ
fullinstit Ústav teorie informace a automatizace AV ČR, v. v. i.
source
source_type PDF
source_size 2.8 MB
url http://library.utia.cas.cz/separaty/2020/MTR/plajner-0519831.pdf
source
url https://www.tandfonline.com/doi/full/10.1080/03081079.2019.1692004
cas_special
project
project_id GA16-12010S
agency GA ČR
country CZ
ARLID cav_un_auth*0332303
project
project_id GA19-04579S
agency GA ČR
country CZ
ARLID cav_un_auth*0380558
project
project_id SGS17/198/OHK4/3T/14
agency ČVUT
country CZ
ARLID cav_un_auth*0361640
abstract (eng) Learning the parameters of a probabilistic model is a necessary step in machine learning tasks. We present a method to improve learning from small datasets by using monotonicity conditions. Monotonicity simplifies learning, and it is often required by users. We present an algorithm for Bayesian network parameter learning. The algorithm and the monotonicity conditions are described, and we show that with the monotonicity conditions we can better fit the underlying data. Our algorithm is tested on artificial and empirical datasets. We use different methods satisfying the monotonicity conditions: the proposed gradient descent, isotonic regression EM, and non-linear optimization. We also provide results of unrestricted EM and gradient descent methods. Learned models are compared with respect to their ability to fit data in terms of log-likelihood and their fit to the parameters of the generating model. Our proposed method outperforms the other methods on small datasets and provides better or comparable results on larger ones.
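note (eng) The abstract combines gradient-based learning with isotonic regression. As a rough illustration only, and not the authors' implementation, the Python sketch below fits a single conditional probability table P(X=1|S=i) for a question node X with an ordered skill parent S, enforcing the monotonicity restriction that P(X=1|S=i) is nondecreasing in i via projected gradient ascent with a pool-adjacent-violators (PAVA) projection. The function names (pava, learn_monotone_cpt) and the toy counts are hypothetical.

    # Minimal sketch (hypothetical, not the authors' code): monotone CPT learning
    import numpy as np

    def pava(y, w):
        """Project y onto nondecreasing sequences (weighted isotonic
        regression via the Pool Adjacent Violators Algorithm)."""
        vals, wts, sizes = [], [], []
        for v, wt in zip(map(float, y), map(float, w)):
            vals.append(v); wts.append(wt); sizes.append(1)
            # merge adjacent blocks while the nondecreasing order is violated
            while len(vals) > 1 and vals[-2] > vals[-1]:
                v2, w2, s2 = vals.pop(), wts.pop(), sizes.pop()
                vals[-1] = (vals[-1] * wts[-1] + v2 * w2) / (wts[-1] + w2)
                wts[-1] += w2
                sizes[-1] += s2
        return np.array([v for v, s in zip(vals, sizes) for _ in range(s)])

    def learn_monotone_cpt(successes, trials, steps=500, lr=0.01, eps=1e-6):
        """Maximize the Bernoulli log-likelihood of per-parent-state counts,
        projecting onto the monotone parameter cone after each step."""
        k = np.asarray(successes, dtype=float)
        n = np.asarray(trials, dtype=float)
        p = np.clip(k / n, eps, 1 - eps)      # ML start; may violate order
        for _ in range(steps):
            grad = k / p - (n - k) / (1 - p)  # d log-likelihood / d p_i
            p = np.clip(p + lr * grad / n.sum(), eps, 1 - eps)
            p = pava(p, n)                    # restore monotonicity
        return p

    # Toy data whose raw success rates violate monotonicity (0.5 > 0.4);
    # the result pools the violating states, e.g. [0.2, 0.45, 0.45, 0.9]
    print(learn_monotone_cpt(successes=[2, 10, 8, 27], trials=[10, 20, 20, 30]))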
result_subspec WOS
RIV IN
FORD0 10000
FORD1 10200
FORD2 10201
reportyear 2021
num_of_auth 2
mrcbC52 4 A sml 4as 20241106135817.6
inst_support RVO:67985556
permalink http://hdl.handle.net/11104/0304816
confidential S
contract
name Publishing Agreement
date 20191111
mrcbC91 C
mrcbT16-e COMPUTER SCIENCE, THEORY & METHODS
mrcbT16-i 0.20849
mrcbT16-j 0.43
mrcbT16-s 0.482
mrcbT16-B 44.296
mrcbT16-D Q3
mrcbT16-E Q3
arlyear 2020
mrcbTft Files in the repository: plajner-0519831 Accepted Author Publishing Agreement.pdf
mrcbU14 85075330445 SCOPUS
mrcbU24 PUBMED
mrcbU34 000497538000001 WOS
mrcbU56 PDF 2.8 MB
mrcbU63 cav_un_epca*0256794 International Journal of General Systems 0308-1079 1563-5104 Vol. 49 No. 1 2020 88 111 Taylor & Francis