bibtype C - Conference Paper (international conference)
ARLID 0507118
utime 20240103222343.1
mtime 20190731235959.9
title (primary) (eng) Density-Approximating Neural Network Models for Anomaly Detection
specification
page_count 8 pp.
media_type E
serial
ARLID cav_un_epca*0507117
ISBN 978-1-4503-5552-0
title ACM SIGKDD 2018 Workshop
page_num 1-8
publisher
place New York
name ACM
year 2018
keyword neural network
keyword anomaly detection
author (primary)
ARLID cav_un_auth*0377825
name1 Flusser
name2 M.
country CZ
author
ARLID cav_un_auth*0307300
name1 Pevný
name2 T.
country CZ
author
ARLID cav_un_auth*0101197
name1 Somol
name2 Petr
institution UTIA-B
full_dept (cz) Rozpoznávání obrazu
full_dept Department of Pattern Recognition
department (cz) RO
department RO
fullinstit Ústav teorie informace a automatizace AV ČR, v. v. i.
source
url http://library.utia.cas.cz/separaty/2019/RO/somol-0507118.pdf
cas_special
abstract (eng) We propose an alternative use of neural models in anomaly detection. Traditionally, in the anomaly detection context, neural models are most commonly used in the form of auto-encoders, where the true anomalousness is proxied by the reconstruction error. Auto-encoders often perform well but are not guaranteed to perform as expected in all cases. A popular, more direct way of modeling the anomaly distribution is through k-Nearest Neighbor (kNN) models. Although kNN can outperform auto-encoders in some cases, its applicability can be seriously impaired by its space and time complexity, especially with high-dimensional, large-scale data. The alternative we propose is to model the distribution imposed by kNN using neural networks. We show that such neural models are capable of achieving accuracy comparable to kNN while reducing computational complexity by orders of magnitude. The de-noising effect of a neural model with a limited number of neurons and layers is shown to lead to accuracy improvements in some cases. We evaluate the proposed idea against standard kNN and auto-encoders on a large set of benchmark data and show that in the majority of cases it is possible to improve on accuracy or computational cost.
action
ARLID cav_un_auth*0377827
name ACM SIGKDD 2018 Workshop
dates 20180820
place London
country GB
RIV BC
FORD0 20000
FORD1 20200
FORD2 20204
reportyear 2020
num_of_auth 3
presentation_type PO
inst_support RVO:67985556
permalink http://hdl.handle.net/11104/0298560
mrcbC61 1
confidential S
arlyear 2018
mrcbU14 SCOPUS
mrcbU24 PUBMED
mrcbU34 WOS
mrcbU63 cav_un_epca*0507117 ACM SIGKDD 2018 Workshop 978-1-4503-5552-0 1 8 New York ACM 2018