The Transactions of the Korean Institute of Electrical Engineers

References

1. Joohyuk Lee, Ho Jun Lee, Sasanka Kuruppu Arachchige, Namyoung Kim, Hyeonjeong Heo, and Kyuman Lee, “Search Operations With Geolocation Estimation of Missing Persons Based on Real-time Drone Images,” Journal of Institute of Control, Robotics and Systems, vol. 30, no. 8, pp. 890-896, 2024. DOI: 10.5302/J.ICROS.2024.24.0087
2. D. H. Titterton, “Strapdown Inertial Navigation Technology,” The Institution of Engineering and Technology, 2004. DOI: 10.1049/PBRA017E
3. So Jin Park, Seung Taek Kim, Yeoung Min Kim, Joo Han Lee, Jin Woo Song, and Eung Ju Kim, “GNSS-DR/INS Integrated Navigation Algorithm using GNSS Speed Information Fusion for Overcoming GNSS Denial Environment,” Journal of Institute of Control, Robotics and Systems, vol. 29, no. 1, pp. 72-79, 2023. DOI: 10.5302/J.ICROS.2023.22.0180
4. Chan Gook Park, Jaehyuck Cha, Yeongkwon Choe, Jaehong Lee, Hanyeol Lee, Kwangjin Kim, Seong Yun Cho, and Jin Woo Song, “Survey on Integrated Navigation System Based on Inertial Technology,” Journal of Institute of Control, Robotics and Systems, vol. 30, no. 4, pp. 448-463, 2024. DOI: 10.5302/J.ICROS.2024.24.0026
5. Anastasios I. Mourikis and Stergios I. Roumeliotis, “A multi-state constraint Kalman filter for vision-aided inertial navigation,” Proceedings of the 2007 IEEE International Conference on Robotics and Automation, pp. 3565-3572, 2007. DOI: 10.1109/ROBOT.2007.364024
6. Tong Qin, Peiliang Li, and Shaojie Shen, “VINS-Mono: A robust and versatile monocular visual-inertial state estimator,” IEEE Transactions on Robotics, vol. 34, no. 4, pp. 1004-1020, 2018. DOI: 10.1109/TRO.2018.2853729
7. C. Campos, R. Elvira, J. J. G. Rodríguez, J. M. M. Montiel, and J. D. Tardós, “ORB-SLAM3: An Accurate Open-Source Library for Visual, Visual–Inertial, and Multimap SLAM,” IEEE Transactions on Robotics, vol. 37, no. 6, pp. 1874-1890, 2021. DOI: 10.1109/TRO.2021.3075644
8. M.-M. Gurgu, J. P. Queralta, and T. Westerlund, “Vision-Based GNSS-Free Localization for UAVs in the Wild,” 2022 7th International Conference on Mechanical Engineering and Robotics Research (ICMERR), Krakow, Poland, pp. 7-12, 2022. DOI: 10.1109/ICMERR56497.2022.10097798
9. J. Kinnari, F. Verdoja, and V. Kyrki, “GNSS-denied geolocalization of UAVs by visual matching of onboard camera images with orthophotos,” 2021 20th International Conference on Advanced Robotics (ICAR), Ljubljana, Slovenia, pp. 555-562, 2021. DOI: 10.1109/ICAR53236.2021.9659333
10. A. Yol, B. Delabarre, A. Dame, J.-É. Dartois, and E. Marchand, “Vision-based absolute localization for unmanned aerial vehicles,” 2014 IEEE/RSJ International Conference on Intelligent Robots and Systems, Chicago, IL, USA, pp. 3429-3434, 2014. DOI: 10.1109/IROS.2014.6943040
11. B. R. Bowring, “Transformation from spatial to geographical coordinates,” Survey Review, vol. 23, no. 181, pp. 323-327, 1976. DOI: 10.1179/sre.1976.23.181.323
12. E. M. Mikhail, “Introduction to Modern Photogrammetry,” John Wiley & Sons, 2001.
13. Elder M. Hemerly, “Automatic georeferencing of images acquired by UAV’s,” International Journal of Automation and Computing, vol. 11, no. 4, pp. 347-352, 2014. DOI: 10.1007/s11633-014-0799-0
14. Zhengyou Zhang, “A flexible new technique for camera calibration,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 11, pp. 1330-1334, 2000. DOI: 10.1109/34.888718
15. Pablo F. Alcantarilla, Jesús Nuevo, and Adrien Bartoli, “Fast explicit diffusion for accelerated features in nonlinear scale spaces,” Proceedings of the British Machine Vision Conference (BMVC), 2013. DOI: 10.5244/C.27.13
16. Pablo Fernández Alcantarilla, Adrien Bartoli, and Andrew J. Davison, “KAZE features,” Computer Vision – ECCV 2012: 12th European Conference on Computer Vision, pp. 214-227, 2012. DOI: 10.1007/978-3-642-33783-3_16
17. Jia-Wang Bian, et al., “GMS: Grid-Based Motion Statistics for Fast, Ultra-robust Feature Correspondence,” International Journal of Computer Vision, vol. 128, pp. 1580-1593, 2020. DOI: 10.1007/s11263-019-01280-3
18. D. Baráth, J. Noskova, M. Ivashechkin, and J. Matas, “MAGSAC++, a Fast, Reliable and Accurate Robust Estimator,” 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), pp. 1301-1309, 2020. DOI: 10.1109/CVPR42600.2020.00138