bibtype J - Journal Article
ARLID 0315684
utime 20240111140711.0
mtime 20081202235959.9
DOI 10.1002/acs.1080
title (primary) (eng) Use of Kullback–Leibler divergence for forgetting
specification
page_count 15 pp.
media_type www
serial
ARLID cav_un_epca*0256772
ISSN 0890-6327
title International Journal of Adaptive Control and Signal Processing
volume_id 23
volume 1 (2009)
page_num 1-15
publisher
name Wiley
title (cze) Použití Kullback–Leibler divergence pro zapomínání
keyword Bayesian estimation
keyword Kullback–Leibler divergence
keyword functional approximation of estimation
keyword parameter tracking by stabilized forgetting
keyword ARX model
author (primary)
ARLID cav_un_auth*0101124
name1 Kárný
name2 Miroslav
institution UTIA-B
full_dept Department of Adaptive Systems
fullinstit Ústav teorie informace a automatizace AV ČR, v. v. i.
author
ARLID cav_un_auth*0101061
name1 Andrýsek
name2 Josef
institution UTIA-B
fullinstit Ústav teorie informace a automatizace AV ČR, v. v. i.
source
source_type pdf
url http://library.utia.cas.cz/separaty/2008/AS/karny-use%20of%20kullback-leibler%20divergence%20for%20forgetting.pdf
cas_special
project
project_id 2C06001
agency GA MŠk
ARLID cav_un_auth*0217685
project
project_id 1M0572
agency GA MŠk
ARLID cav_un_auth*0001814
project
project_id GA102/08/0567
agency GA ČR
ARLID cav_un_auth*0239566
research CEZ:AV0Z10750506
abstract (eng) The non-symmetric Kullback–Leibler divergence (KLD) measures the proximity of probability density functions (pdfs). Bernardo (Ann. Stat. 1979; 7(3):686–690) showed its unique role in the approximation of pdfs, and his methodological result also implies the proper order of the KLD arguments. Functional approximation of estimation and stabilized forgetting, which serve for tracking slowly varying parameters, use the reversed order. This choice has a pragmatic motivation: a recursive estimator often approximates the parametric model by a member of the exponential family (EF), since such a model maps prior pdfs from the set of conjugate pdfs (CEF) back to the CEF, and approximations based on the KLD with the reversed order of arguments preserve this property. The paper advocates approximation performed within the CEF but with the proper order of the KLD arguments. Applied to parameter tracking, this approach demonstrates performance improvements.
abstract (cze) The non-symmetric Kullback–Leibler divergence (KLD) measures the proximity of probability densities. One of its versions can be shown to be theoretically superior. The article describes the use of this fact to improve the forgetting technique.
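A minimal numerical sketch of the contrast described in the abstract, assuming scalar Gaussian pdfs for both the posterior f and the alternative g; the function names, the forgetting factor, and the numeric example are illustrative assumptions, not taken from the paper. The reversed KLD order yields the usual geometric (exponential) forgetting, while the proper order, projected back onto the Gaussian (conjugate) family, reduces to moment matching of the arithmetic mixture lam*f + (1-lam)*g.

def geometric_forgetting(m1, P1, m2, P2, lam):
    # Reversed KLD order: argmin_q lam*KL(q||f) + (1-lam)*KL(q||g)
    # is the normalized geometric mean f^lam * g^(1-lam);
    # for Gaussians, the precisions combine linearly.
    prec = lam / P1 + (1.0 - lam) / P2
    P = 1.0 / prec
    m = P * (lam * m1 / P1 + (1.0 - lam) * m2 / P2)
    return m, P

def moment_matched_forgetting(m1, P1, m2, P2, lam):
    # Proper (Bernardo) KLD order: argmin_q lam*KL(f||q) + (1-lam)*KL(g||q)
    # over Gaussian q matches the first two moments of the
    # arithmetic mixture lam*f + (1-lam)*g.
    m = lam * m1 + (1.0 - lam) * m2
    P = lam * (P1 + m1 ** 2) + (1.0 - lam) * (P2 + m2 ** 2) - m ** 2
    return m, P

# Flatten a sharp posterior N(1.0, 0.1) toward a flat alternative N(0.0, 10.0).
lam = 0.95
print(geometric_forgetting(1.0, 0.1, 0.0, 10.0, lam))
print(moment_matched_forgetting(1.0, 0.1, 0.0, 10.0, lam))

Both variants stay inside the Gaussian family, so recursive conjugate estimation remains feasible; they differ in how strongly the alternative inflates the posterior variance, which is the mechanism behind the tracking behaviour the abstract contrasts.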
reportyear 2009
RIV BB
permalink http://hdl.handle.net/11104/0165815
mrcbT16-f 1.478
mrcbT16-g 0.091
mrcbT16-h 6.5
mrcbT16-i 0.00231
mrcbT16-j 0.542
mrcbT16-k 671
mrcbT16-l 55
arlyear 2009
mrcbU56 pdf
mrcbU63 cav_un_epca*0256772 International Journal of Adaptive Control and Signal Processing 0890-6327 1099-1115 Roč. 23 č. 1 2009 1 15 Wiley