REFERENCES

[1] M. Kaess, H. Johannsson, R. Roberts, V. Ila, J. J. Leonard, and F. Dellaert, ``iSAM2: Incremental smoothing and mapping using the Bayes tree,'' The International Journal of Robotics Research, vol. 31, no. 2, pp. 216–235, 2012.
[2] T. Qin, P. Li, and S. Shen, ``VINS-Mono: A robust and versatile monocular visual-inertial state estimator,'' IEEE Transactions on Robotics, vol. 34, no. 4, pp. 1004–1019, August 2018.
[3] D. Solodar and I. Klein, ``VIO-DualProNet: Visual-inertial odometry with learning based process noise covariance,'' arXiv preprint arXiv:2308.11228, 2023.
[4] C. Le Gentil, T. Vidal-Calleja, and S. Huang, ``IN2LAMA: Inertial LiDAR localisation and mapping,'' Proc. of the IEEE International Conference on Robotics and Automation (ICRA), pp. 6388–6394, May 2019.
[5] P. Geneva, K. Eckenhoff, W. Lee, Y. Yang, and G. Huang, ``OpenVINS: A research platform for visual-inertial estimation,'' Proc. of the IEEE International Conference on Robotics and Automation (ICRA), Paris, France, pp. 4666–4672, 2020.
[6] J. Lv, K. Hu, J. Xu, Y. Liu, X. Ma, and X. Zuo, ``CLINS: Continuous-time trajectory estimation for LiDAR-inertial system,'' Proc. of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 6657–6663, 2021.
[7] H. Ye, Y. Chen, and M. Liu, ``Tightly coupled 3D LiDAR inertial odometry and mapping,'' Proc. of the IEEE International Conference on Robotics and Automation (ICRA), Montreal, QC, Canada, pp. 3144–3150, May 2019.
[8] C. Chen, P. Geneva, Y. Peng, W. Lee, and G. Huang, ``Monocular visual-inertial odometry with planar regularities,'' Proc. of the IEEE International Conference on Robotics and Automation (ICRA), 2023.
[9] P. Geneva, K. Eckenhoff, Y. Yang, and G. Huang, ``LIPS: LiDAR-inertial 3D plane SLAM,'' Proc. of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, October 2018.
[10] W. Xu and F. Zhang, ``FAST-LIO: A fast, robust LiDAR-inertial odometry package by tightly-coupled iterated Kalman filter,'' IEEE Robotics and Automation Letters, vol. 6, no. 2, pp. 3317–3324, April 2021.
[11] T. Shan, B. Englot, D. Meyers, W. Wang, C. Ratti, and D. Rus, ``LIO-SAM: Tightly-coupled LiDAR-inertial odometry via smoothing and mapping,'' Proc. of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), October 2020.
[12] T. M. Nguyen, M. Cao, S. Yuan, Y. Lyu, T. H. Nguyen, and L. Xie, ``LIRO: Tightly coupled LiDAR-inertia-ranging odometry,'' Proc. of the IEEE International Conference on Robotics and Automation (ICRA), Xi'an, China, pp. 14484–14490, May–June 2021.
[13] J. Zhang and S. Singh, ``Low-drift and real-time LiDAR odometry and mapping,'' Autonomous Robots, vol. 41, no. 2, pp. 401–416, February 2017.
[14] H. Tang, X. Niu, T. Zhang, L. Wang, and J. Liu, ``LE-VINS: A robust solid-state-LiDAR-enhanced visual-inertial navigation system for low-speed robots,'' IEEE Transactions on Instrumentation and Measurement, vol. 72, Art. no. 8502113, 2023.
[15] J. Lin, C. Zheng, W. Xu, and F. Zhang, ``R$^2$LIVE: A robust, real-time, LiDAR-inertial-visual tightly-coupled state estimator and mapping,'' IEEE Robotics and Automation Letters, vol. 6, no. 4, pp. 7469–7476, 2021.
[16] C. Zheng, Q. Zhu, W. Xu, X. Liu, Q. Guo, and F. Zhang, ``FAST-LIVO: Fast and tightly-coupled sparse-direct LiDAR-inertial-visual odometry,'' Proc. of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 4003–4009, October 2022.
[17] T. Shan, B. Englot, C. Ratti, and D. Rus, ``LVI-SAM: Tightly-coupled LiDAR-visual-inertial odometry via smoothing and mapping,'' Proc. of the IEEE International Conference on Robotics and Automation (ICRA), Xi'an, China, pp. 5692–5698, May–June 2021.
[18] Y. Jia, H. Luo, F. Zhao, G. Jiang, Y. Li, J. Yan, Z. Jiang, and Z. Wang, ``Lvio-fusion: A self-adaptive multi-sensor fusion SLAM framework using actor-critic method,'' Proc. of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 286–293, September 2021.
[19] X. Zuo, P. Geneva, W. Lee, Y. Liu, and G. Huang, ``LIC-fusion: LiDAR-inertial-camera odometry,'' Proc. of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Macau, China, November 2019.
[20] Q. Fu, J. Wang, H. Yu, I. Ali, F. Guo, Y. He, and H. Zhang, ``PL-VINS: Real-time monocular visual-inertial SLAM with point and line features,'' arXiv preprint arXiv:2009.07462, 2020.
[21] J. He, M. Li, Y. Wang, and H. Wang, ``PLE-SLAM: A visual-inertial SLAM based on point-line features and efficient IMU initialization,'' arXiv preprint arXiv:2401.01081, 2024.
[22] X. Xu, L. Zhang, J. Yang, C. Cao, W. Wang, Y. Ran, Z. Tan, and M. Luo, ``A review of multi-sensor fusion SLAM systems based on 3D LiDAR,'' Remote Sensing, vol. 14, no. 12, Art. no. 2835, 2022.
[23] C. Debeunne and D. Vivet, ``A review of visual-LiDAR fusion based simultaneous localization and mapping,'' Sensors, vol. 20, no. 7, Art. no. 2068, April 2020.
[24] M. Servières, V. Renaudin, A. Dupuis, and N. Antigny, ``Visual and visual-inertial SLAM: State of the art, classification, and experimental benchmarking,'' Journal of Sensors, vol. 2021, Art. no. 2054828, pp. 1–26, 2021.
[25] I. A. Abaspur Kazerouni, L. Fitzgerald, G. Dooly, and D. Toal, ``A survey of state-of-the-art on visual SLAM,'' Expert Systems with Applications, vol. 205, Art. no. 117734, November 2022.