Journal of Northeastern University(Natural Science) ›› 2021, Vol. 42 ›› Issue (9): 1261-1267.DOI: 10.12068/j.issn.1005-3026.2021.09.007

• Information & Control •

Pedestrian Detection Based on Semantic Segmentation Attention and Visible Region Prediction

WANG Lu1, WANG Shuai1, ZHANG Guo-feng1, XU Li-sheng2,3   

  1. School of Computer Science & Engineering, Northeastern University, Shenyang 110169, China; 2. School of Medicine and Biological Information Engineering, Northeastern University, Shenyang 110169, China; 3. Neusoft Research of Intelligent Healthcare Technology, Co., Ltd., Shenyang 110167, China.
  • Revised: 2021-01-04  Accepted: 2021-01-04  Published: 2021-09-16
  • Contact: XU Li-sheng

Abstract: To improve detection performance on occluded and small pedestrians in images, a pedestrian detection method based on semantic segmentation attention and visible region prediction was proposed. Specifically, building on the single shot multibox detector (SSD) object detection network, the hyperparameter settings of the SSD were first optimized to make it better suited to pedestrian detection. Then, a semantic segmentation attention branch was introduced into the network to enhance the pedestrian detection features learned by the network. Finally, a detection prediction module was developed that simultaneously detects the full bodies and the visible regions of pedestrians. This module leverages the features learned from visible regions to guide the learning of the full-body detection features, thereby improving overall detection accuracy. Experiments on the Caltech pedestrian detection benchmark show that the log-average miss rate of the proposed method is 5.5%, which is competitive with existing pedestrian detection approaches.
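The 5.5% figure above is a log-average miss rate, the standard metric on the Caltech benchmark: the miss rate is sampled at nine reference FPPI (false positives per image) values evenly spaced in log space over [1e-2, 1e0], and the geometric mean of the samples is reported. As a point of reference only (this code is not part of the paper, and the function name is illustrative), the metric can be sketched as:

```python
import math

def log_average_miss_rate(fppi, miss_rate):
    """Log-average miss rate over the FPPI range [1e-2, 1e0].

    `fppi` and `miss_rate` describe a detector's miss-rate-vs-FPPI
    curve, sorted by increasing FPPI. The miss rate is sampled at nine
    reference FPPI values evenly spaced in log space, and the geometric
    mean of the samples is returned.
    """
    # Prepend a sentinel so reference points below the first FPPI value
    # fall back to a miss rate of 1.0 (no detections yet).
    fppi = [-1.0] + list(fppi)
    miss_rate = [1.0] + list(miss_rate)

    # Nine reference points: 10^-2, 10^-1.75, ..., 10^0.
    refs = [10.0 ** (-2.0 + 2.0 * i / 8.0) for i in range(9)]
    samples = []
    for ref in refs:
        # Last curve point whose FPPI does not exceed the reference.
        j = max(i for i, f in enumerate(fppi) if f <= ref)
        samples.append(max(miss_rate[j], 1e-10))  # guard against log(0)

    return math.exp(sum(math.log(s) for s in samples) / len(samples))
```

A curve with a constant miss rate of 0.3 across the whole FPPI range yields a log-average miss rate of exactly 0.3, which is a quick sanity check on any implementation of the metric.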

Key words: pedestrian detection; convolutional neural network; semantic segmentation attention (SSA); pedestrian visible region detection; multi-task network
