TY - JOUR
T1 - Robust Low-Tubal-Rank Tensor Recovery From Binary Measurements
AU - Hou, Jingyao
AU - Zhang, Feng
AU - Qiu, Haiquan
AU - Wang, Jianjun
AU - Wang, Yao
AU - Meng, Deyu
N1 - Publisher Copyright:
© 1979-2012 IEEE.
PY - 2022/8/1
Y1 - 2022/8/1
N2 - Low-rank tensor recovery (LRTR) is a natural extension of low-rank matrix recovery (LRMR) to high-dimensional arrays, which aims to reconstruct an underlying tensor X from incomplete linear measurements M(X). However, LRTR ignores the error caused by quantization, limiting its application when the quantization is low-level. In this work, we take into account the impact of extreme quantization and suppose the quantizer degrades into a comparator that acquires only the signs of M(X). We still hope to recover X from these binary measurements. Under the tensor Singular Value Decomposition (t-SVD) framework, two recovery methods are proposed: the first is a tensor hard singular tube thresholding method; the second is a constrained tensor nuclear norm minimization method. These methods can recover a real n1×n2×n3 tensor X with tubal rank r from m random Gaussian binary measurements, with errors decaying at a polynomial rate in the oversampling factor λ=m/((n1+n2)n3r). To improve the convergence rate, we develop a new quantization scheme under which the convergence rate can be accelerated to an exponential function of λ. Numerical experiments verify our results, and applications to real-world data demonstrate the promising performance of the proposed methods.
KW - Adaptivity
KW - Low-tubal-rank tensor
KW - One-bit tensor recovery
KW - Tensor hard singular tube thresholding
KW - Tensor nuclear norm minimization
UR - https://www.scopus.com/pages/publications/85102302814
U2 - 10.1109/TPAMI.2021.3063527
DO - 10.1109/TPAMI.2021.3063527
M3 - Article
C2 - 33656988
AN - SCOPUS:85102302814
SN - 0162-8828
VL - 44
SP - 4355
EP - 4373
JO - IEEE Transactions on Pattern Analysis and Machine Intelligence
JF - IEEE Transactions on Pattern Analysis and Machine Intelligence
IS - 8
ER -