bibtype J - Journal Article
ARLID 0635531
utime 20250603104624.4
mtime 20250526235959.9
SCOPUS 105003665803
WOS 001485002800001
DOI 10.1016/j.ijar.2025.109454
title (primary) (eng) About twenty-five naughty entropies in belief function theory: Do they measure informativeness?
specification
page_count 15 pp.
media_type P
serial
ARLID cav_un_epca*0256774
ISSN 0888-613X
title International Journal of Approximate Reasoning
volume_id 184
publisher
name Elsevier
keyword Belief functions
keyword Entropy
keyword Mutual information
keyword Divergence
author (primary)
ARLID cav_un_auth*0101118
name1 Jiroušek
name2 Radim
institution UTIA-B
full_dept (cz) Matematická teorie rozhodování
full_dept (eng) Department of Decision Making Theory
department (cz) MTR
department (eng) MTR
full_dept Department of Decision Making Theory
fullinstit Ústav teorie informace a automatizace AV ČR, v. v. i.
author
ARLID cav_un_auth*0216188
name1 Kratochvíl
name2 Václav
institution UTIA-B
full_dept (cz) Matematická teorie rozhodování
full_dept Department of Decision Making Theory
department (cz) MTR
department MTR
full_dept Department of Decision Making Theory
country CZ
garant K
fullinstit Ústav teorie informace a automatizace AV ČR, v. v. i.
source
url https://library.utia.cas.cz/separaty/2025/MTR/kratochvil-0635531-preprint.pdf
source
url https://www.sciencedirect.com/science/article/pii/S0888613X25000957?via%3Dihub
cas_special
abstract (eng) This paper addresses the long-standing challenge of identifying belief function entropies that can effectively guide model learning within the Dempster-Shafer theory of evidence. Building on the analogy with classical probabilistic approaches, we examine 25 entropy functions documented in the literature and evaluate their potential to define mutual information in the belief function framework. As conceptualized in probability theory, mutual information requires strictly subadditive entropies, which are inversely related to the informativeness of belief functions. After extensive analysis, we have found that none of the studied entropy functions fully satisfy these criteria. Nevertheless, certain entropy functions exhibit properties that may make them useful for heuristic model learning algorithms. This paper provides a detailed comparative study of these functions, explores alternative approaches using divergence-based measures, and offers insights into the design of information-theoretic tools for belief function models.
result_subspec WOS
RIV BA
FORD0 10000
FORD1 10100
FORD2 10103
reportyear 2026
num_of_auth 2
inst_support RVO:67985556
permalink https://hdl.handle.net/11104/0366809
confidential S
article_num 109454
mrcbC91 C
mrcbT16-e COMPUTERSCIENCEARTIFICIALINTELLIGENCE
mrcbT16-j 0.75
mrcbT16-s 0.877
mrcbT16-D Q3
mrcbT16-E Q2
arlyear 2025
mrcbU14 105003665803 SCOPUS
mrcbU24 PUBMED
mrcbU34 001485002800001 WOS
mrcbU63 cav_un_epca*0256774 International Journal of Approximate Reasoning 184 1 2025 0888-613X 1873-4731 Elsevier