Journal of Northeastern University (Natural Science) ›› 2025, Vol. 46 ›› Issue (4): 124-133. DOI: 10.12068/j.issn.1005-3026.2025.20230292

• Resources & Civil Engineering •

Visual-Inertial-GNSS Tightly Coupled Navigation and Positioning Method with Fusion of Point and Line Features

Li-ming HE, Quan-you YUE, Zheng-lin QU, Yu ZHANG   

  1. School of Resources & Civil Engineering, Northeastern University, Shenyang 110819, China. Corresponding author: HE Li-ming, E-mail: heliming@mail.neu.edu.cn
  • Received: 2023-10-16 Online: 2025-04-15 Published: 2025-07-01

Abstract:

A multi-sensor fusion positioning method was proposed to address the limitations of single-sensor localization in complex environments. On the visual side, line features were added to point features to mitigate the interference caused by repetitive textures in images. For the global navigation satellite system (GNSS), carrier-phase observations, which have higher precision, were introduced to smooth the pseudorange observations, improving the accuracy of single point positioning. The accuracy and stability of the algorithm were validated using both public datasets and measured data. Compared with GVINS (a tightly coupled GNSS-visual-inertial algorithm) in the geocentric coordinate system, the accuracy of the proposed method is improved by 32.2%, 23.3%, and 24.5% in the X, Y, and Z directions on the public datasets, and by 25.7%, 25.8%, and 14.1% on the measured data, respectively. In addition, in environments where satellite signals are severely obstructed, the proposed method still maintains good positioning performance for a certain period of time, with a positioning accuracy of 0.74 m in the horizontal plane and 0.91 m in elevation. The research results provide new insights into multi-sensor fusion positioning in complex environments.
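The abstract does not reproduce the smoothing recursion itself; a standard Hatch-filter form of carrier-phase smoothed pseudorange, with which the description above is consistent, is sketched below. The notation and the smoothing window length N are illustrative assumptions, not taken from the paper.

% Illustrative Hatch-filter recursion (assumed form, not necessarily the paper's exact formulation).
% P_k   : raw pseudorange at epoch k
% \Phi_k: carrier-phase range (in metres) at epoch k
% N     : assumed smoothing window length
\hat{P}_k = w_k P_k + (1 - w_k)\left[\hat{P}_{k-1} + (\Phi_k - \Phi_{k-1})\right], \qquad w_k = \frac{1}{\min(k, N)}

Here the carrier-phase difference propagates the previous smoothed value forward with millimetre-level precision, while the small weight on the noisy raw pseudorange suppresses code noise; capping the weight at a window length N is a common way to limit the divergence caused by the opposite signs of the ionospheric delay on code and carrier.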

Key words: visual-inertial odometry, line feature, carrier phase smoothed pseudorange, graph optimization, tightly coupled
