This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
Aguilar, W. G., Casaliglla, V. P., and Pólit, J. L., "Obstacle avoidance based-visual navigation for micro aerial vehicles", Electronics, vol. 6, no. 1, article 10, pp. 1–23, 2017.
Chao, H., et al., "A comparative study of optical flow and traditional sensors in UAV navigation", in Proc. American Control Conference, IEEE, 2013, pp. 3858–3863.
Mur-Artal, R., Montiel, J. M. M., and Tardós, J. D., "ORB-SLAM: A versatile and accurate monocular SLAM system", IEEE Transactions on Robotics, vol. 31, no. 5, pp. 1147–1163, 2015.
Mostafa, M. M., et al., "A smart hybrid vision aided inertial navigation system approach for UAVs in a GNSS denied environment", Navigation: Journal of The Institute of Navigation, vol. 65, no. 4, pp. 533–547, 2018.
Shen, C., et al., "Optical flow sensor/INS/magnetometer integrated navigation system for MAV in GPS-denied environment", Journal of Sensors, vol. 2016, pp. 1–10, 2016.
Pastor-Moreno, D., Shin, H. S., and Waldock, A., "Optical flow localisation and appearance mapping (OFLAAM) for long-term navigation", in Proc. 2015 International Conference on Unmanned Aircraft Systems (ICUAS), IEEE, 2015, pp. 980–988.
Wei, W., et al., "A survey of UAV visual navigation based on monocular SLAM", in Proc. 2018 IEEE 4th Information Technology and Mechatronics Engineering Conference (ITOEC), IEEE, 2018, pp. 1849–1853.
Jeon, J., et al., "Run your visual-inertial odometry on NVIDIA Jetson: Benchmark tests on a micro aerial vehicle", IEEE Robotics and Automation Letters, vol. 6, no. 3, pp. 5332–5339, 2021.
Lowe, D. G., "Distinctive image features from scale-invariant keypoints", International Journal of Computer Vision, vol. 60, pp. 91–110, 2004.
Bay, H., Tuytelaars, T., and Van Gool, L., "SURF: Speeded up robust features", Lecture Notes in Computer Science, vol. 3951, pp. 404–417, 2006.
Rublee, E., et al., "ORB: An efficient alternative to SIFT or SURF", in Proc. 2011 International Conference on Computer Vision, IEEE, 2011, pp. 2564–2571.
Chen, L., et al., "Design of a multi-sensor cooperation travel environment perception system for autonomous vehicle", Sensors, vol. 12, no. 9, pp. 12386–12404, 2012.
Wenxuan, Z., Xiao, J., and Xin, T., "Integrated navigation system with monocular vision and LIDAR for indoor UAVs", in Proc. 12th IEEE Conference on Industrial Electronics and Applications (ICIEA), IEEE, 2017, pp. 924–929.
Srinivasan, M., et al., "Honeybee navigation en route to the goal: Visual flight control and odometry", The Journal of Experimental Biology, vol. 199, no. 1, pp. 237–244, 1996.
Farnebäck, G., "Two-frame motion estimation based on polynomial expansion", in Proc. Image Analysis: 13th Scandinavian Conference (SCIA 2003), Halmstad, Sweden, Springer, 2003, pp. 363–370.
Lucas, B. D., and Kanade, T., "An iterative image registration technique with an application to stereo vision", in Proc. IJCAI'81: 7th International Joint Conference on Artificial Intelligence, vol. 2, 1981, pp. 674–679.
Horn, B. K. P., and Schunck, B. G., "Determining optical flow", Artificial Intelligence, vol. 17, no. 1–3, pp. 185–203, 1981.
Srinivasan, M. V., "An image-interpolation technique for the computation of optic flow and egomotion", Biological Cybernetics, vol. 71, no. 5, pp. 401–415, 1994.
Farid, K., Fantoni, I., and Nonami, K., "Optic flow-based vision system for autonomous 3D localization and control of small aerial vehicles", Robotics and Autonomous Systems, vol. 57, no. 6–7, pp. 591–602, 2009.
Zhang, L., Xiong, Z., Lai, J., and Liu, J., "Research of optical flow aided MEMS navigation based on convex optimization and ROF denoising", Optik, vol. 158, pp. 1575–1583, 2018.
Boretti, C., et al., "Visual navigation using sparse optical flow and time-to-transit", in Proc. 2022 International Conference on Robotics and Automation (ICRA), IEEE, 2022, pp. 9397–9403.
Li, L., Liang, S., and Zhang, Y., "Application research of moving target detection based on optical flow algorithms", Journal of Physics: Conference Series, vol. 1237, no. 2, IOP Publishing, 2019, p. 022073.
Zhu, A. Z., et al., "EV-FlowNet: Self-supervised optical flow estimation for event-based cameras", arXiv preprint arXiv:1802.06898, 2018.
Dosovitskiy, A., et al., "FlowNet: Learning optical flow with convolutional networks", in Proc. IEEE International Conference on Computer Vision, 2015, pp. 2758–2766.
Ilg, E., et al., "FlowNet 2.0: Evolution of optical flow estimation with deep networks", in Proc. IEEE Conference on Computer Vision and Pattern Recognition, 2017, pp. 2462–2470.
Sun, D., et al., "PWC-Net: CNNs for optical flow using pyramid, warping, and cost volume", in Proc. IEEE Conference on Computer Vision and Pattern Recognition, 2018.
Yin, Z., and Shi, J., "GeoNet: Unsupervised learning of dense depth, optical flow and camera pose", in Proc. IEEE Conference on Computer Vision and Pattern Recognition, 2018, pp. 1983–1992.
Scharstein, D., et al., "High-resolution stereo datasets with subpixel-accurate ground truth", in Proc. Pattern Recognition: 36th German Conference (GCPR 2014), Münster, Germany, Springer, 2014, pp. 31–42.
Geiger, A., et al., "Vision meets robotics: The KITTI dataset", The International Journal of Robotics Research, vol. 32, no. 11, pp. 1231–1237, 2013.
Butler, D. J., et al., "A naturalistic open source movie for optical flow evaluation", in Proc. Computer Vision – ECCV 2012: 12th European Conference on Computer Vision, Florence, Italy, Part VI, Springer, 2012, pp. 611–625.
Zhu, A. Z., et al., "The multivehicle stereo event camera dataset: An event camera dataset for 3D perception", IEEE Robotics and Automation Letters, vol. 3, no. 3, pp. 2032–2039, 2018.
Cordts, M., et al., "The Cityscapes dataset for semantic urban scene understanding", in Proc. IEEE Conference on Computer Vision and Pattern Recognition, 2016, pp. 3213–3223.
Mumuni, F., Mumuni, A., and Amuzuvi, C. K., "Deep learning of monocular depth, optical flow and ego-motion with geometric guidance for UAV navigation in dynamic environments", Machine Learning with Applications, vol. 10, p. 100416, 2022.
Zhang, J., et al., "Deep online correction for monocular visual odometry", in Proc. 2021 IEEE International Conference on Robotics and Automation (ICRA), IEEE, 2021, pp. 14396–14402.
Taegyun, K., et al., "Improved optical sensor fusion in UAV navigation using feature point threshold filter", International Journal of Aeronautical and Space Sciences, pp. 1–12, 2022.
Yu, T., et al., "Accurate and robust stereo direct visual odometry for agricultural environment", in Proc. 2021 IEEE International Conference on Robotics and Automation (ICRA), IEEE, 2021, pp. 2480–2486.
Pinggera, P., et al., "Know your limits: Accuracy of long range stereoscopic object measurements in practice", in Proc. Computer Vision – ECCV 2014: 13th European Conference, Zurich, Switzerland, Part II, Springer, 2014, pp. 96–111.
Guizilini, V., and Ramos, F., "Visual odometry learning for unmanned aerial vehicles", in Proc. 2011 IEEE International Conference on Robotics and Automation, IEEE, 2011, pp. 6213–6220.
Ciarfuglia, T. A., et al., "Evaluation of non-geometric methods for visual odometry", Robotics and Autonomous Systems, vol. 62, no. 12, pp. 1717–1730, 2014.
Xu, Q., et al., "An optical flow based multi-object tracking approach using sequential convex programming", in Proc. 16th International Conference on Control, Automation, Robotics and Vision (ICARCV), IEEE, 2020, pp. 1216–1221.
Schenk, F., and Fraundorfer, F., "Robust edge-based visual odometry using machine-learned edges", in Proc. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), IEEE, 2017, pp. 1297–1304.
He, Y., et al., "PicoVO: A lightweight RGB-D visual odometry targeting resource-constrained IoT devices", in Proc. IEEE International Conference on Robotics and Automation (ICRA), IEEE, 2021, pp. 5567–5573.
Santamaria-Navarro, A., et al., "Autonomous navigation of micro aerial vehicles using high-rate and low-cost sensors", Autonomous Robots, vol. 42, pp. 1263–1280, 2018.
Gálvez-López, D., and Tardós, J. D., "Bags of binary words for fast place recognition in image sequences", IEEE Transactions on Robotics, vol. 28, no. 5, pp. 1188–1197, 2012.
Zhuoning, D., Li, W., and Zhou, Y., "An autonomous navigation scheme for UAV in approach phase", in Proc. IEEE Chinese Guidance, Navigation and Control Conference (CGNCC), IEEE, 2016, pp. 982–987.
Ho, H. W., de Croon, G., and Chu, Q. P., "Distance and velocity estimation using optical flow from a monocular camera", International Journal of Micro Air Vehicles, vol. 9, no. 3, pp. 198–208, 2017.
Liu, L., et al., "Learning by analogy: Reliable supervision from transformations for unsupervised optical flow estimation", in Proc. IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2020, pp. 6489–6498.
Jonschkowski, R., et al., "What matters in unsupervised optical flow", in Proc. Computer Vision – ECCV 2020: 16th European Conference, Glasgow, UK, Part II, Springer, 2020, pp. 557–572.
Mumuni, F., and Mumuni, A., "Bayesian cue integration of structure from motion and CNN-based monocular depth estimation for autonomous robot navigation", International Journal of Intelligent Robotics and Applications, vol. 6, no. 2, pp. 191–206, 2022.
Lee, D.-J., et al., "See and avoidance behaviors for autonomous navigation", in Proc. Mobile Robots XVII, vol. 5609, SPIE, 2004, pp. 23–34.
McGuire, K., et al., "Efficient optical flow and stereo vision for velocity estimation and obstacle avoidance on an autonomous pocket drone", IEEE Robotics and Automation Letters, vol. 2, no. 2, pp. 1070–1076, 2017.
Farid, K., "Survey of advances in guidance, navigation, and control of unmanned rotorcraft systems", Journal of Field Robotics, vol. 29, no. 2, pp. 315–378, 2012.
Meneses, M. C., Matos, L. N., and Prado, B. O., "Low-cost autonomous navigation system based on optical flow classification", arXiv preprint arXiv:1803.03966, 2018.
Zhang, J., et al., "Monocular visual navigation of an autonomous vehicle in natural scene corridor-like environments", in Proc. 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems, IEEE, 2012, pp. 3659–3666.
Rashed, H., et al., "Motion and depth augmented semantic segmentation for autonomous navigation", in Proc. IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2019, pp. 364–370.
Huang, Y., et al., "Learning optical flow with R-CNN for visual odometry", in Proc. IEEE International Conference on Robotics and Automation (ICRA), 2021, pp. 14410–1441.