Journal of Northeastern University(Natural Science) ›› 2025, Vol. 46 ›› Issue (5): 29-36.DOI: 10.12068/j.issn.1005-3026.2025.20230297

• Information & Control •

Perception Technology and Application of Complex Urban Traffic Environment Based on Target Detection

Aisan XIERAILI1, De-fu CHE1(), Duo WANG2, Tian YU3   

  1. School of Resources & Civil Engineering, Northeastern University, Shenyang 110819, China
  2. Shen Kan Engineering & Technology Corporation, MCC, Shenyang 110167, China
  3. Feny Corporation Limited, Changsha 410600, China
  • Received: 2023-10-27  Online: 2025-05-15  Published: 2025-08-07
  • Contact: De-fu CHE

Abstract:

Machine vision-based environmental perception is one of the key tasks in the field of intelligent transportation. Traditional deep learning algorithms can typically meet the detection needs of individual targets in simple scenes, but they cannot address the intelligent perception requirements of complex traffic environments. To improve the intelligent perception capability of vehicles in such environments, this paper proposes an improved YOLOv8 object detection network model that integrates attention mechanisms, optimizers, and deformable convolutional layers to achieve multi-target detection in complex urban traffic environments. To verify the effectiveness of the algorithm, comparative experiments were conducted with YOLOv4, YOLOv8, and the improved YOLOv8 algorithm on sample images from complex traffic environments. The results show that, compared with YOLOv4 and YOLOv8, the improved YOLOv8 algorithm increased the average precision by 40.76% and 16.92%, respectively. The detection accuracy and real-time performance of the improved YOLOv8 algorithm meet practical application requirements, and, combined with multi-sensor information fusion, it can realize intelligent perception in complex urban traffic environments.
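The abstract states that attention mechanisms are integrated into the YOLOv8 backbone, without specifying which variant. A common choice in YOLO modifications is squeeze-and-excitation (SE) style channel attention, which reweights feature-map channels by a learned per-channel gate. The following NumPy sketch is a hypothetical illustration of that general idea, not the authors' exact module (the weight shapes, reduction ratio, and function names are assumptions):

```python
import numpy as np

def se_channel_attention(x, w1, w2):
    """Squeeze-and-excitation channel attention on a (C, H, W) feature map.

    x  : (C, H, W) feature map
    w1 : (C//r, C) squeeze weights (r = reduction ratio, assumed here)
    w2 : (C, C//r) excitation weights
    Returns the reweighted feature map, same shape as x.
    """
    # Squeeze: global average pooling over the spatial dimensions -> (C,)
    z = x.mean(axis=(1, 2))
    # Excitation: bottleneck MLP, ReLU then sigmoid gate in (0, 1)
    h = np.maximum(w1 @ z, 0.0)
    s = 1.0 / (1.0 + np.exp(-(w2 @ h)))
    # Scale each channel of x by its attention weight
    return x * s[:, None, None]

# Toy usage with random weights (illustrative only)
rng = np.random.default_rng(0)
x = rng.standard_normal((8, 4, 4))   # 8 channels, 4x4 spatial
w1 = rng.standard_normal((2, 8))     # reduction ratio r = 4
w2 = rng.standard_normal((8, 2))
y = se_channel_attention(x, w1, w2)
```

In a detector, such a block would typically be inserted after convolutional stages so that channels carrying more discriminative features for small or occluded traffic targets are amplified before the detection head.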

Key words: YOLOv8, target detection, complex urban traffic, environment perception, intelligent transportation

CLC Number: