The Journal of China Universities of Posts and Telecommunications ›› 2023, Vol. 30 ›› Issue (4): 67-74. doi: 10.19682/j.cnki.1005-8885.2023.2017

• Wireless •

Design and implementation of unmanned aerial vehicle localization and light tracking system

Xu Ming, Dong Chen, Li Hanlu, Luo Qiming, Wang Lizi, Zhang Wanru, Liu Huixin   

  1. School of Digital Media and Design Arts, Beijing University of Posts and Telecommunications, Beijing 100876, China
  2. School of Information and Communication Engineering, Beijing University of Posts and Telecommunications, Beijing 100876, China
  3. State Key Laboratory of Networking and Switching Technology, Beijing University of Posts and Telecommunications, Beijing 100876, China
  4. School of Computer Science (National Pilot Software Engineering School), Beijing University of Posts and Telecommunications, Beijing 100876, China
  • Received:2023-02-17 Revised:2023-06-23 Accepted:2023-08-31 Online:2023-08-31 Published:2023-08-31
  • Contact: Dong Chen, E-mail: dongchen@bupt.edu.cn

Abstract: Unmanned aerial vehicles (UAVs) are widely used in a broad range of fields, and high-precision positioning and tracking in multiple scenarios are core requirements for UAV operation. To ensure stable UAV communication in denied environments with substantial electromagnetic interference, a systematic solution is proposed that combines a deep learning algorithm for target detection with visible light for UAV tracking. Considering the cost and computational power limitations of the hardware, the you only look once (YOLO) v4-Tiny model is used for static target detection of the UAV model. For UAV tracking, a light tracker that can adjust the angle of the emitted light and focus it on the target is used for dynamic tracking. This establishes the primary conditions for UAV optical communication, which offers good secrecy and is also suitable for dynamic situations. The UAV tracker localizes the UAV model from the returned coordinates and the calculated time delay, and then steers the spotlight onto the UAV. To facilitate deployment of deep learning models on hardware devices, the lighter and more efficient model is selected after comparison. The trained model achieves 99.25% accuracy on the test set. Dynamic target detection reaches 20 frames per second (FPS) on a computer with an MX520 graphics processing unit (GPU) and 6 GB of random access memory (RAM), and 5.4 FPS on a Jetson Nano.
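The detect-then-steer loop summarized above can be illustrated with a minimal sketch (not the authors' code): a YOLOv4-Tiny network loaded through OpenCV's DNN module detects the UAV model in each camera frame, and the bounding-box centre is mapped to pan/tilt angles for the light tracker. The weight and config file names, the camera field of view, and the spotlight command interface are assumptions for illustration only.

# Minimal sketch of the detect-and-steer loop described in the abstract.
# Assumptions: yolov4-tiny-uav.cfg/.weights exist, camera FOV is 60 x 45 deg,
# and send_to_spotlight() stands in for the real pan/tilt command interface.
import cv2

# Load a YOLOv4-Tiny network via OpenCV's DNN module (file names are placeholders).
net = cv2.dnn.readNetFromDarknet("yolov4-tiny-uav.cfg", "yolov4-tiny-uav.weights")
model = cv2.dnn_DetectionModel(net)
model.setInputParams(size=(416, 416), scale=1 / 255.0, swapRB=True)

H_FOV_DEG, V_FOV_DEG = 60.0, 45.0  # assumed horizontal/vertical field of view

def pixel_to_angles(cx, cy, frame_w, frame_h):
    """Map the detection centre to pan/tilt offsets relative to the optical axis."""
    pan = (cx / frame_w - 0.5) * H_FOV_DEG
    tilt = (0.5 - cy / frame_h) * V_FOV_DEG
    return pan, tilt

cap = cv2.VideoCapture(0)  # camera observing the UAV model
while True:
    ok, frame = cap.read()
    if not ok:
        break
    class_ids, scores, boxes = model.detect(frame, confThreshold=0.5, nmsThreshold=0.4)
    for (x, y, w, h) in boxes:
        pan, tilt = pixel_to_angles(x + w / 2, y + h / 2, frame.shape[1], frame.shape[0])
        # send_to_spotlight(pan, tilt)  # hypothetical serial/PWM command to the light tracker
        print(f"UAV detected: pan={pan:.1f} deg, tilt={tilt:.1f} deg")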

Key words: unmanned aerial vehicles, deep learning, light tracking, target detection, unmanned aerial vehicle localization, unmanned aerial vehicle tracking