bibtype |
C - Conference Paper (international conference) |
ARLID |
0350163 |
utime |
20240103194203.0 |
mtime |
20101129235959.9 |
WOS |
000290245400004 |
DOI |
10.1007/978-3-642-15825-4_4 |
title (primary) (eng) |
Computational Properties of Probabilistic Neural Networks |
specification |
|
serial |
ARLID |
cav_un_epca*0350162 |
ISBN |
978-3-642-15818-6 |
title |
Artificial Neural Networks – ICANN 2010 |
part_title |
Part III |
page_num |
31-40 |
publisher |
place |
Berlin Heidelberg |
name |
Springer Verlag |
year |
2010 |
|
editor |
name1 |
Diamantaras |
name2 |
K. |
|
editor |
name1 |
Duch |
name2 |
Wlodzislaw |
|
editor |
name1 |
Iliadis |
name2 |
L.S. |
|
keyword |
Probabilistic neural networks |
keyword |
Statistical pattern recognition |
keyword |
Subspace approach |
keyword |
Overfitting reduction |
author (primary) |
ARLID |
cav_un_auth*0101091 |
name1 |
Grim |
name2 |
Jiří |
full_dept (cz) |
Rozpoznávání obrazu |
full_dept (eng) |
Department of Pattern Recognition |
department (cz) |
RO |
department (eng) |
RO |
institution |
UTIA-B |
full_dept |
Department of Pattern Recognition |
fullinstit |
Ústav teorie informace a automatizace AV ČR, v. v. i. |
|
author |
ARLID |
cav_un_auth*0230019 |
name1 |
Hora |
name2 |
Jan |
full_dept (cz) |
Rozpoznávání obrazu |
full_dept |
Department of Pattern Recognition |
department (cz) |
RO |
department |
RO |
institution |
UTIA-B |
fullinstit |
Ústav teorie informace a automatizace AV ČR, v. v. i. |
|
source |
cas_special |
project |
project_id |
GA102/07/1594 |
agency |
GA ČR |
ARLID |
cav_un_auth*0228611 |
|
project |
project_id |
2C06019 |
agency |
GA MŠk |
country |
CZ |
ARLID |
cav_un_auth*0216518 |
|
project |
project_id |
1M0572 |
agency |
GA MŠk |
ARLID |
cav_un_auth*0001814 |
|
research |
CEZ:AV0Z10750506 |
abstract (eng) |
We discuss the problem of overfitting of probabilistic neural networks in the framework of statistical pattern recognition. The probabilistic approach to neural networks provides a statistically justified subspace method of classification. The underlying structural mixture model includes binary structural parameters and can be optimized by the EM algorithm in full generality. Formally, the structural model reduces the number of parameters involved, and therefore the structural mixtures become less complex and less prone to overfitting. We illustrate how recognition accuracy and the effect of overfitting are influenced by mixture complexity and by the size of the training data set. |
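For orientation, a minimal sketch of the kind of structural mixture model the abstract refers to, assuming the usual product form with a background density; the symbols w_m, F_0, f_n and the binary structural parameters \phi_{mn} are illustrative notation, not quoted from the paper:

P(x) = \sum_{m=1}^{M} w_m F(x \mid m), \qquad
F(x \mid m) = F_0(x) \prod_{n=1}^{N} \left[ \frac{f_n(x_n \mid m)}{f_{0n}(x_n)} \right]^{\phi_{mn}}, \qquad
\phi_{mn} \in \{0, 1\}.

Under this assumption, setting \phi_{mn} = 0 excludes variable x_n from component m, which is the sense in which the structural model reduces the number of estimated parameters; the weights w_m, the component densities f_n and the binary parameters \phi_{mn} can all be re-estimated within the EM iterations.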
action |
ARLID |
cav_un_auth*0262947 |
name |
ICANN 2010. International Conference on Artificial Neural Networks /20./ |
place |
Thessaloniki |
dates |
15.09.2010-18.09.2010 |
country |
GR |
|
reportyear |
2011 |
RIV |
IN |
permalink |
http://hdl.handle.net/11104/0190237 |
arlyear |
2010 |
mrcbU34 |
000290245400004 WOS |
mrcbU63 |
cav_un_epca*0350162 Artificial Neural Networks – ICANN 2010 Part III 978-3-642-15818-6 31 40 Berlin Heidelberg Springer Verlag 2010 Lecture Notes in Computer Science Volume 6354 LNCS |
mrcbU67 |
Diamantaras K. 340 |
mrcbU67 |
Duch Wlodzislaw 340 |
mrcbU67 |
Iliadis L.S. 340 |
|