1. MUR-ARTAL R, TARDÓS J D. ORB-SLAM2: An open-source SLAM system for monocular, stereo, and RGB-D cameras. IEEE Transactions on Robotics, 2017, 33(5): 1255-1262.
2. ENGEL J, KOLTUN V, CREMERS D. Direct sparse odometry. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2018, 40(3): 611-625.
3. FORSTER C, PIZZOLI M, SCARAMUZZA D. SVO: Fast semi-direct monocular visual odometry. Proceedings of the 2014 IEEE International Conference on Robotics and Automation (ICRA’14), 2014, May 31-Jun 7, Hong Kong, China. Piscataway, NJ, USA: IEEE, 2014: 15-22.
4. CAMPOS C, ELVIRA R, RODRÍGUEZ J J G, et al. ORB-SLAM3: An accurate open-source library for visual, visual-inertial, and multimap SLAM. IEEE Transactions on Robotics, 2021, 37(6): 1874-1890.
5. BESCOS B, FÁCIL J M, CIVERA J, et al. DynaSLAM: Tracking, mapping, and inpainting in dynamic scenes. IEEE Robotics and Automation Letters, 2018, 3(4): 4076-4083.
6. YU C, LIU Z X, LIU X J, et al. DS-SLAM: A semantic visual SLAM towards dynamic environments. Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS’18), 2018, Oct 1-5, Madrid, Spain. Piscataway, NJ, USA: IEEE, 2018: 1168-1174.
7. ZHANG T W, ZHANG H Y, LI Y, et al. FlowFusion: Dynamic dense RGB-D SLAM based on optical flow. Proceedings of the 2020 IEEE International Conference on Robotics and Automation (ICRA’20), 2020, May 31-Aug 31, Paris, France. Piscataway, NJ, USA: IEEE, 2020: 7322-7328.
8. YUAN X, CHEN S. SaD-SLAM: A visual SLAM based on semantic and depth information. Proceedings of the 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS’20), 2020, Oct 25-29, Las Vegas, NV, USA. Piscataway, NJ, USA: IEEE, 2020: 4930-4935.
9. LI P L, QIN T, SHEN S J. Stereo vision-based semantic 3D object and ego-motion tracking for autonomous driving. Computer Vision: Proceedings of the 15th European Conference on Computer Vision (ECCV’18): Part II, 2018, Sept 8-14, Munich, Germany. LNCS 11206. Berlin, Germany: Springer, 2018: 664-679.
10. YANG S C, SCHERER S. CubeSLAM: Monocular 3-D object SLAM. IEEE Transactions on Robotics, 2019, 35(4): 925-938.
11. HUANG J H, YANG S, MU T J, et al. ClusterVO: Clustering moving instances and estimating visual odometry for self and surroundings. Proceedings of the 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR'20), 2020, Jun 14-19, Seattle, WA, USA. Piscataway, NJ, USA: IEEE, 2020: 2168-2177.
12. ZHANG J, HENEIN M, MAHONY R, et al. VDO-SLAM: A visual dynamic object-aware SLAM system. arXiv Preprint, arXiv: 2005.11052, 2020.
13. HENEIN M, ZHANG J, MAHONY R, et al. Dynamic SLAM: The need for speed. Proceedings of the 2020 IEEE International Conference on Robotics and Automation (ICRA’20), 2020, May 31-Aug 31, Paris, France. Piscataway, NJ, USA: IEEE, 2020: 2123-2129.
14. BESCOS B, CAMPOS C, TARDÓS J D, et al. DynaSLAM II: Tightly-coupled multi-object tracking and SLAM. IEEE Robotics and Automation Letters, 2021, 6(3): 5191-5198.
15. WANG C C, THORPE C, THRUN S. Online simultaneous localization and mapping with detection and tracking of moving objects: Theory and results from a ground vehicle in crowded urban areas. Proceedings of the 2003 IEEE International Conference on Robotics and Automation (ICRA’03): Vol 1, 2003, Sept 14-19, Taipei, China. Piscataway, NJ, USA: IEEE, 2003: 842-849.
16. WANGSIRIPITAK S, MURRAY D W. Avoiding moving outliers in visual SLAM by tracking moving objects. Proceedings of the 2009 IEEE International Conference on Robotics and Automation (ICRA’09), 2009, May 12-17, Kobe, Japan. Piscataway, NJ, USA: IEEE, 2009: 375-380.
17. ROGERS J G, TREVOR A J B, NIETO-GRANDA C, et al. SLAM with expectation maximization for moveable object tracking. Proceedings of the 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS’10), 2010, Oct 18-22, Taipei, China. Piscataway, NJ, USA: IEEE, 2010: 2077-2082.
18. BÂRSAN I A, LIU P D, POLLEFEYS M, et al. Robust dense mapping for large-scale dynamic environments. Proceedings of the 2018 IEEE International Conference on Robotics and Automation (ICRA’18), 2018, May 21-25, Brisbane, Australia. Piscataway, NJ, USA: IEEE, 2018: 7510-7517.
19. ROSINOL A, GUPTA A, ABATE M, et al. 3D dynamic scene graphs: Actionable spatial perception with places, objects, and humans. arXiv Preprint, arXiv: 2002.06289, 2020.
20. GEIGER A, LENZ P, STILLER C, et al. Vision meets robotics: The KITTI dataset. International Journal of Robotics Research, 2013, 32(11): 1231-1237.
21. WANG C C, THORPE C, THRUN S, et al. Simultaneous localization, mapping and moving object tracking. International Journal of Robotics Research, 2007, 26(9): 889-916.
22. HENEIN M, KENNEDY G, MAHONY R, et al. Exploiting rigid body motion for SLAM in dynamic environments. http://www.gerard-kennedy.com/2018_icra.
23. EPPENBERGER T, CESARI G, DYMCZYK M, et al. Leveraging stereo-camera data for real-time dynamic obstacle detection and tracking. Proceedings of the 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS’20), 2020, Oct 25-29, Las Vegas, NV, USA. Piscataway, NJ, USA: IEEE, 2020: 10528-10535.
24. HIRSCHMÜLLER H. Stereo processing by semiglobal matching and mutual information. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2008, 30(2): 328-341.
25. GEIGER A, ROSER M, URTASUN R. Efficient large-scale stereo matching. Computer Vision: Proceedings of the 10th Asian Conference on Computer Vision (ACCV’10): Part I, 2010, Nov 8-12, Queenstown, New Zealand. LNCS 6492. Berlin, Germany: Springer, 2010: 25-38.
26. REDMON J, FARHADI A. YOLOv3: An incremental improvement. arXiv Preprint, arXiv: 1804.02767, 2018.
27. MEINHARDT-LLOPIS E, SÁNCHEZ J. Horn-Schunck optical flow with a multi-scale strategy. Image Processing On Line, 2013, 3: 151-172.