[1] Mur-Artal R, Montiel J M M, Tardós J D. ORB-SLAM: a versatile and accurate monocular SLAM system[J]. IEEE Transactions on Robotics, 2015, 31(5): 1147-1163.
[2] Shan T X, Englot B. LeGO-LOAM: lightweight and ground-optimized lidar odometry and mapping on variable terrain[C]//2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). Madrid: IEEE, 2018: 4758-4765.
[3] Mourikis A I, Roumeliotis S I. A multi-state constraint Kalman filter for vision-aided inertial navigation[C]//Proceedings 2007 IEEE International Conference on Robotics and Automation. Rome: IEEE, 2007: 3565-3572.
[4] Qin T, Li P L, Shen S J. VINS-Mono: a robust and versatile monocular visual-inertial state estimator[J]. IEEE Transactions on Robotics, 2018, 34(4): 1004-1020.
[5] Dang T, Mascarich F, Khattak S, et al. Autonomous search for underground mine rescue using aerial robots[C]//2020 IEEE Aerospace Conference. Big Sky: IEEE, 2020: 1-8.
[6] Azpurua H, Campos M F M, Macharet D G. Three-dimensional terrain aware autonomous exploration for subterranean and confined spaces[C]//2021 IEEE International Conference on Robotics and Automation (ICRA). Xi'an: IEEE, 2021: 2443-2449.
[7] Campos C, Elvira R, Rodríguez J J G, et al. ORB-SLAM3: an accurate open-source library for visual, visual-inertial, and multimap SLAM[J]. IEEE Transactions on Robotics, 2021, 37(6): 1874-1890.
[8] Cao S Z, Lu X Y, Shen S J. GVINS: tightly coupled GNSS-visual-inertial fusion for smooth and consistent state estimation[J]. IEEE Transactions on Robotics, 2022, 38(4): 2004-2021.
[9] Tang H L, Niu X J, Zhang T S, et al. LE-VINS: a robust solid-state-LiDAR-enhanced visual-inertial navigation system for low-speed robots[J]. IEEE Transactions on Instrumentation and Measurement, 2023, 72: 8502113.
[10] Song B Y, Yuan X F, Ying Z M, et al. DGM-VINS: visual-inertial SLAM for complex dynamic environments with joint geometry feature extraction and multiple object tracking[J]. IEEE Transactions on Instrumentation and Measurement, 2023, 72: 8503711.
[11] Li M L, Zhang H, Shen T A, et al. SM-VINS: a fast and decoupled monocular visual-inertial sensors SLAM system with stepwise marginalization[J]. IEEE Sensors Journal, 2024, 24(20): 33240-33251.
[12] Zhou H Z, Zou D P, Pei L, et al. StructSLAM: visual SLAM with building structure lines[J]. IEEE Transactions on Vehicular Technology, 2015, 64(4): 1364-1375.
[13] Forster C, Pizzoli M, Scaramuzza D. SVO: fast semi-direct monocular visual odometry[C]//2014 IEEE International Conference on Robotics and Automation (ICRA). Hong Kong: IEEE, 2014: 15-22.
[14] Gomez-Ojeda R, Briales J, Gonzalez-Jimenez J. PL-SVO: semi-direct monocular visual odometry by combining points and line segments[C]//2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). Daejeon: IEEE, 2016: 4211-4216.
[15] Fu Q, Wang J, Yu H, et al. PL-VINS: real-time monocular visual-inertial SLAM with point and line features[J]. arXiv preprint arXiv:2009.07462, 2020.
[16] Lim H, Kim Y, Jung K, et al. Avoiding degeneracy for monocular visual SLAM with point and line features[C]//2021 IEEE International Conference on Robotics and Automation (ICRA). Xi'an: IEEE, 2021: 11675-11681.
[17] Zhu Y Q, Jin R, Lou T S, et al. PLD-VINS: RGBD visual-inertial SLAM with point and line features[J]. Aerospace Science and Technology, 2021, 119: 107185.
[18] Lee J, Park S Y. PLF-VINS: real-time monocular visual-inertial SLAM with point-line fusion and parallel-line fusion[J]. IEEE Robotics and Automation Letters, 2021, 6(4): 7033-7040.
[19] Rublee E, Rabaud V, Konolige K, et al. ORB: an efficient alternative to SIFT or SURF[C]//2011 IEEE International Conference on Computer Vision. Barcelona: IEEE, 2011: 2564-2571.
[20] Von Gioi R G, Jakubowicz J, Morel J M, et al. LSD: a line segment detector[J]. Image Processing on Line, 2012, 2: 35-55.
[21] Hamid N, Khan N. LSM: perceptually accurate line segment merging[J]. Journal of Electronic Imaging, 2016, 25(6): 061620.