TY - GEN
T1 - Spiking Locality-Sensitive Hash
T2 - 2018 International Joint Conference on Neural Networks, IJCNN 2018
AU - Wang, Ziru
AU - Ma, Yongqiang
AU - Dong, Zhiwei
AU - Zheng, Nanning
AU - Ren, Pengju
N1 - Publisher Copyright:
© 2018 IEEE.
PY - 2018/10/10
Y1 - 2018/10/10
N2 - A novel similarity search method named spiking locality-sensitive hash (SLSH), a feed-forward spiking neural network (SNN), is proposed in this paper. The SLSH architecture is composed of an encoding layer and a fully connected layer connected in succession. We optimize phase encoding to maximize the difference between corresponding pixels of any two different images, and then test the performance of the encoding method and the SLSH model on graphic datasets. Experimental results show that the improved phase encoding method based on this difference achieves accuracies of 100%, 100%, and 92% at noise levels of 5%, 20%, and 40% respectively, outperforming the previous phase encoding, whose accuracies are 93%, 78%, and 55%. Furthermore, experiments demonstrate that the SLSH method is more capable in similarity search than the traditional Locality-Sensitive Hash (LSH) and the FLY algorithm published in Science. The mean average precision of SLSH is twice that of the FLY algorithm when the hash length is 5. In addition, SLSH achieves good recognition performance even under the influence of noise on the MNIST, SVHN, and SIFT datasets.
AB - A novel similarity search method named spiking locality-sensitive hash (SLSH), a feed-forward spiking neural network (SNN), is proposed in this paper. The SLSH architecture is composed of an encoding layer and a fully connected layer connected in succession. We optimize phase encoding to maximize the difference between corresponding pixels of any two different images, and then test the performance of the encoding method and the SLSH model on graphic datasets. Experimental results show that the improved phase encoding method based on this difference achieves accuracies of 100%, 100%, and 92% at noise levels of 5%, 20%, and 40% respectively, outperforming the previous phase encoding, whose accuracies are 93%, 78%, and 55%. Furthermore, experiments demonstrate that the SLSH method is more capable in similarity search than the traditional Locality-Sensitive Hash (LSH) and the FLY algorithm published in Science. The mean average precision of SLSH is twice that of the FLY algorithm when the hash length is 5. In addition, SLSH achieves good recognition performance even under the influence of noise on the MNIST, SVHN, and SIFT datasets.
KW - Locality-Sensitive Hash
KW - Similarity search
KW - phase encoding
KW - spiking neuron network
UR - https://www.scopus.com/pages/publications/85056498746
U2 - 10.1109/IJCNN.2018.8489085
DO - 10.1109/IJCNN.2018.8489085
M3 - Conference contribution
AN - SCOPUS:85056498746
T3 - Proceedings of the International Joint Conference on Neural Networks
BT - 2018 International Joint Conference on Neural Networks, IJCNN 2018 - Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 8 July 2018 through 13 July 2018
ER -