bibtype J - Journal Article
ARLID 0642799
utime 20251209081835.5
mtime 20251209235959.9
SCOPUS 85190739311
WOS 001202833900001
DOI 10.14736/kyb-2024-1-0038
title (primary) (eng) Highly Robust Training of Regularized Radial Basis Function Networks
specification
page_count 22 pp.
media_type P
serial
ARLID cav_un_epca*0297163
ISSN 0023-5954
title Kybernetika
volume_id 60
volume 1 (2024)
page_num 38-59
publisher
name Ústav teorie informace a automatizace AV ČR, v. v. i.
keyword effective regularization
keyword quantile regression
keyword regression neural networks
keyword robust training
keyword robustness
author (primary)
ARLID cav_un_auth*0345793
name1 Kalina
name2 Jan
institution UTIA-B
full_dept (cz) Stochastická informatika
full_dept (eng) Department of Stochastic Informatics
department (cz) SI
department (eng) SI
full_dept Department of Stochastic Informatics
fullinstit Ústav teorie informace a automatizace AV ČR, v. v. i.
author
ARLID cav_un_auth*0416837
name1 Vidnerová
name2 P.
country CZ
author
ARLID cav_un_auth*0435779
name1 Janáček
name2 P.
country CZ
source
url https://library.utia.cas.cz/separaty/2025/SI/kalina-0642799.pdf
cas_special
project
project_id GA24-10078S
agency GA ČR
country CZ
ARLID cav_un_auth*0472835
project
project_id GA22-02067S
agency GA ČR
country CZ
ARLID cav_un_auth*0435776
abstract (eng) Radial basis function (RBF) networks are established tools for nonlinear regression modeling with numerous applications in various fields. Because their standard training is vulnerable to outliers in the data, several robust methods for training RBF networks have been proposed recently. This paper focuses on robust regularized RBF networks. A robust inter-quantile version of RBF networks based on trimmed least squares is proposed here. A systematic comparison of robust regularized RBF networks then follows, evaluated over a set of 405 networks trained using various combinations of robustness and regularization types. The experiments focus in particular on how variable selection, performed by means of a backward procedure, affects the optimal number of RBF units. The regularized inter-quantile RBF networks based on trimmed least squares outperform the competing approaches in the experiments when a highly robust prediction error measure is considered.
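note (illustrative) The abstract's core idea, robust training of an RBF network's output weights via trimmed least squares, can be sketched in a few lines. The snippet below is NOT the authors' method (their inter-quantile construction, regularization, and backward variable selection are not reproduced); it is a minimal sketch of a trimmed-least-squares fit of Gaussian RBF output weights using concentration steps, with all names (`rbf_features`, `fit_lts_rbf`, `h_frac`, `gamma`) and the toy data chosen here for illustration.

```python
import numpy as np

def rbf_features(X, centers, gamma):
    """Gaussian RBF design matrix: phi[i, j] = exp(-gamma * ||x_i - c_j||^2)."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-gamma * d2)

def fit_lts_rbf(X, y, centers, gamma=1.0, h_frac=0.75, n_iter=50, seed=0):
    """Trimmed least squares fit of RBF output weights (illustrative sketch).

    Concentration steps: repeatedly refit ordinary least squares on the h
    observations with the smallest squared residuals, so gross outliers are
    excluded from the fit.
    """
    rng = np.random.default_rng(seed)
    Phi = rbf_features(X, centers, gamma)
    n = len(y)
    h = int(h_frac * n)                          # size of the retained subset
    idx = rng.choice(n, size=h, replace=False)   # random initial subset
    for _ in range(n_iter):
        w, *_ = np.linalg.lstsq(Phi[idx], y[idx], rcond=None)
        resid2 = (y - Phi @ w) ** 2
        new_idx = np.argsort(resid2)[:h]         # keep h smallest residuals
        if set(new_idx) == set(idx):             # subset stabilized
            break
        idx = new_idx
    return w

# Toy data: smooth signal with 5% gross outliers in the response.
rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(200)
y[:10] += 8.0                                    # contaminated observations
centers = np.linspace(-3, 3, 7).reshape(-1, 1)   # fixed RBF centers (assumed)
w = fit_lts_rbf(X, y, centers, gamma=1.0)
pred = rbf_features(X, centers, 1.0) @ w
clean_mse = np.mean((pred[10:] - np.sin(X[10:, 0])) ** 2)
```

Because the trimmed fit discards the observations with the largest residuals at each step, the contaminated responses do not drag the weight estimate away from the underlying signal, which is the vulnerability of standard (non-robust) RBF training that the paper addresses.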
result_subspec WOS
RIV IN
FORD0 10000
FORD1 10200
FORD2 10201
reportyear 2026
inst_support RVO:67985556
permalink https://hdl.handle.net/11104/0372659
confidential S
mrcbC91 A
mrcbT16-e COMPUTERSCIENCE.CYBERNETICS
mrcbT16-f 1.1
mrcbT16-g 0.1
mrcbT16-h 14.7
mrcbT16-i 0.00058
mrcbT16-j 0.288
mrcbT16-k 978
mrcbT16-q 43
mrcbT16-s 0.378
mrcbT16-y 34
mrcbT16-x 2.47
mrcbT16-3 255
mrcbT16-4 Q3
mrcbT16-5 2.000
mrcbT16-6 43
mrcbT16-7 Q3
mrcbT16-C 40.3
mrcbT16-M 0.26
mrcbT16-N Q4
mrcbT16-P 40.3
arlyear 2024
mrcbU14 85190739311 SCOPUS
mrcbU24 PUBMED
mrcbU34 001202833900001 WOS
mrcbU63 cav_un_epca*0297163 Kybernetika Roč. 60 č. 1 2024 38 59 0023-5954 Ústav teorie informace a automatizace AV ČR, v. v. i.