bibtype J - Journal Article
ARLID 0518308
utime 20241106135803.0
mtime 20191219235959.9
SCOPUS 85093097685
WOS 000587699700017
DOI 10.1109/TNNLS.2019.2956926
title (primary) (eng) Tensor Networks for Latent Variable Analysis: Novel Algorithms for Tensor Train Approximation
specification
page_count 17 pp.
media_type P
serial
ARLID cav_un_epca*0382474
ISSN 2162-237X
title IEEE Transactions on Neural Networks and Learning Systems
volume_id 31
volume 11 (2020)
page_num 4622-4636
keyword blind source separation
keyword tensor network (TN)
keyword image denoising
keyword nested Tucker
keyword tensor train (TT) decomposition
keyword Tucker-2 (TK2) decomposition
keyword truncated singular value decomposition (SVD)
author (primary)
ARLID cav_un_auth*0382249
name1 Phan
name2 A. H.
country RU
author
ARLID cav_un_auth*0382250
name1 Cichocki
name2 A.
country RU
author
ARLID cav_un_auth*0385705
name1 Uschmajew
name2 A.
country DE
author
ARLID cav_un_auth*0101212
name1 Tichavský
name2 Petr
institution UTIA-B
full_dept (cz) Stochastická informatika
full_dept Department of Stochastic Informatics
department (cz) SI
department SI
fullinstit Ústav teorie informace a automatizace AV ČR, v. v. i.
author
ARLID cav_un_auth*0291813
name1 Luta
name2 G.
country US
source
url http://library.utia.cas.cz/separaty/2020/SI/tichavsky-0518308.pdf
source
url https://ieeexplore.ieee.org/document/8984730
cas_special
project
project_id GA17-00902S
agency GA ČR
ARLID cav_un_auth*0345929
abstract (eng) Decompositions of tensors into factor matrices, which interact through a core tensor, have found numerous applications in signal processing and machine learning. A more general tensor model that represents data as an ordered network of subtensors of order-2 or order-3 has so far not been widely considered in these fields, although this so-called tensor network (TN) decomposition has long been studied in quantum physics and scientific computing. In this article, we present novel algorithms and applications of TN decompositions, with a particular focus on the tensor train (TT) decomposition and its variants. The novel algorithms developed for the TT decomposition update, in an alternating way, one or several core tensors at each iteration, and exhibit enhanced mathematical tractability and scalability for large-scale data tensors. For rigor, the cases of a given rank, a given approximation error, and a given error bound are all considered. The proposed algorithms provide well-balanced TT decompositions and are tested in the classic paradigms of blind source separation from a single mixture, denoising, and feature extraction, achieving superior performance over the widely used truncated algorithms for TT decomposition.
result_subspec WOS
RIV BB
FORD0 20000
FORD1 20200
FORD2 20201
reportyear 2021
num_of_auth 6
mrcbC52 4 A sml 4as 20241106135803.0
inst_support RVO:67985556
permalink http://hdl.handle.net/11104/0303994
confidential S
contract
name Copyright Receipt
date 20191127
mrcbC86 1 Article Computer Science Artificial Intelligence|Computer Science Hardware Architecture|Computer Science Theory Methods|Engineering Electrical Electronic
mrcbC91 C
mrcbT16-e COMPUTERSCIENCEARTIFICIALINTELLIGENCE|COMPUTERSCIENCEHARDWAREARCHITECTURE|COMPUTERSCIENCETHEORYMETHODS|ENGINEERINGELECTRICALELECTRONIC
mrcbT16-i 10.20001
mrcbT16-j 2.949
mrcbT16-s 2.882
mrcbT16-B 95.913
mrcbT16-D Q1*
mrcbT16-E Q1*
arlyear 2020
mrcbTft Files in the repository: tichavsky-0518308-CopyrightReceipt.pdf
mrcbU14 85093097685 SCOPUS
mrcbU24 PUBMED
mrcbU34 000587699700017 WOS
mrcbU63 cav_un_epca*0382474 IEEE Transactions on Neural Networks and Learning Systems 2162-237X 2162-2388 Roč. 31 č. 11 2020 4622 4636