Laser & Optoelectronics Progress, Volume 56, Issue 19, 191505 (2019)
Novel Shoe Type Recognition Method Based on Convolutional Neural Network
Criminal investigation often relies on surveillance video footage and crime-scene shoeprint identification. The basic principle of this approach is to infer the type of shoe worn by a suspect from the shoeprints recovered at the crime scene and then to search for that shoe in the surveillance footage. To address the low degree of automation associated with this investigative method, a new shoe type recognition method based on a convolutional neural network is proposed in this study. In accordance with the characteristics of the shoe type recognition task, a convolutional neural network framework is designed on the basis of DeepID, and a shoe database containing 50 pairs of shoes and 160231 images is constructed. Experiments are conducted with different network models using the Caffe framework. The initial network structure comprises two convolution layers, two pooling layers, and two fully connected layers. Further experiments compare the effect of the number of output units in the first of the two fully connected layers on network performance and training efficiency, and the results for different network depths are also compared while the size of the output feature map is kept unchanged. Based on the optimized model, the final network model is obtained by introducing overlapping pooling. The experimental results show that the proposed method achieves excellent performance, with an accuracy of 96.06%. Therefore, the proposed method can be considered a promising new approach to shoe type recognition.
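To make the described architecture concrete, the sketch below assembles the reported structure (two convolution layers, two overlapping pooling layers, and two fully connected layers, with a 50-way output matching the 50 shoe classes in the database) as a PyTorch module. This is a minimal illustration under stated assumptions, not the authors' implementation: the paper trains its models in Caffe, and the channel widths, kernel sizes, input resolution, and the 160-unit first fully connected layer (borrowed from the DeepID default) are assumptions for illustration only.

```python
import torch
import torch.nn as nn


class ShoeNet(nn.Module):
    """Hypothetical sketch of the DeepID-style network described in the
    abstract: two convolution layers, two overlapping pooling layers
    (pooling stride smaller than the pooling window), and two fully
    connected layers. All layer sizes are assumptions; the paper's
    actual models are defined and trained in Caffe."""

    def __init__(self, num_classes: int = 50):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=5, padding=2),   # conv layer 1 (assumed width)
            nn.ReLU(inplace=True),
            nn.MaxPool2d(kernel_size=3, stride=2),         # overlapping pooling: 3x3 window, stride 2
            nn.Conv2d(32, 64, kernel_size=5, padding=2),   # conv layer 2 (assumed width)
            nn.ReLU(inplace=True),
            nn.MaxPool2d(kernel_size=3, stride=2),         # overlapping pooling
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.LazyLinear(160),            # first fully connected layer; 160 units is the DeepID default, assumed here
            nn.ReLU(inplace=True),
            nn.Linear(160, num_classes),   # second fully connected layer: 50-way shoe-class output
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))


if __name__ == "__main__":
    net = ShoeNet()
    logits = net(torch.randn(1, 3, 64, 64))  # dummy 64x64 RGB input (resolution assumed)
    print(logits.shape)                      # torch.Size([1, 50])
```

The abstract's two main ablations map directly onto this sketch: varying the 160 in the first fully connected layer corresponds to the experiment on the number of output units in that layer, and stacking additional convolution blocks while preserving the output feature-map size corresponds to the network-depth comparison.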
Mengjing Yang, Yunqi Tang, Xiaojia Jiang. Novel Shoe Type Recognition Method Based on Convolutional Neural Network[J]. Laser & Optoelectronics Progress, 2019, 56(19): 191505
Category: Machine Vision
Received: Mar. 1, 2019
Accepted: Mar. 27, 2019
Published Online: Oct. 12, 2019
Author Email: Yunqi Tang (yunqit@163.com)