Proposal of a Visual Positioning Architecture for Master-Slave Autonomous UAV Applications (Conference Paper)

abstract

  • Autonomous UAVs offer advantages in industrial, agricultural, environmental inspection, and logistics applications. In some of these applications, cooperative UAVs are needed to meet specific demands or to achieve productivity gains. An important technical challenge is the precise relative positioning between two or more UAVs during a cooperative flight. Several techniques address this problem, such as GNSS positioning, visual and LiDAR SLAM, and intelligent computer vision algorithms, but all of them have limitations that must be overcome before they work properly in specific environments. Proposing new cooperative positioning methods is therefore important to face these challenges. The present work evaluates a visual relative positioning architecture between two small multi-rotor UAVs operating in a master-slave configuration, based on an Augmented Reality tag tool. In simulation, the absolute error measurements showed a mean lower than 0.2 cm and a standard deviation of 0.01 in the X, Y and Z directions. Yaw measurements presented an absolute error lower than 0.5° with a 0.02–5° standard deviation. The real-world experiments, in which the slave UAV executed autonomous flight commanded by the master UAV, succeeded in 8 of 10 experiment rounds, demonstrating that the proposed architecture is a suitable approach for building cooperative master-slave UAV applications. (A minimal sketch of AR-tag relative pose estimation is included after this abstract.)
  • The authors are grateful to the FCT - Foundation for Science and Technology, Portugal, for financial support through national funds FCT/MCTES (PIDDAC) to CeDRI - Research Centre in Digitalization and Intelligent Robotics (UIDB/05757/2020 and UIDP/05757/2020) and SusTEC (LA/P/0007/2021).
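
The abstract does not specify which AR-tag library or camera parameters the architecture uses, so the sketch below is only an illustrative stand-in: it estimates the relative position and yaw of an AR tag (assumed to be mounted on the master UAV) from the slave UAV's camera image, using OpenCV's ArUco module (API of OpenCV >= 4.7). The tag dictionary, marker size, and camera intrinsics are placeholder assumptions, not values from the paper.

```python
# Illustrative sketch (not the paper's implementation): relative pose of an
# AR tag seen by the slave UAV's camera, via OpenCV ArUco + solvePnP.
import cv2
import numpy as np

MARKER_LENGTH_M = 0.15                      # assumed tag side length (m)
CAMERA_MATRIX = np.array([[800.0, 0.0, 320.0],
                          [0.0, 800.0, 240.0],
                          [0.0, 0.0, 1.0]])  # placeholder intrinsics
DIST_COEFFS = np.zeros(5)                   # assume a calibrated image

# Assumed tag dictionary; the paper does not state which one is used.
ARUCO_DICT = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
DETECTOR = cv2.aruco.ArucoDetector(ARUCO_DICT, cv2.aruco.DetectorParameters())

# 3D tag corners in the tag frame (z = 0), in the order expected by
# cv2.SOLVEPNP_IPPE_SQUARE: top-left, top-right, bottom-right, bottom-left.
OBJ_POINTS = np.array([[-1, 1, 0], [1, 1, 0], [1, -1, 0], [-1, -1, 0]],
                      dtype=np.float32) * (MARKER_LENGTH_M / 2.0)


def relative_pose_from_frame(frame_bgr):
    """Return (tvec_xyz_m, yaw_deg) of the tag w.r.t. the camera, or None."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = DETECTOR.detectMarkers(gray)
    if ids is None or len(corners) == 0:
        return None
    # Use the first detected marker; a real system would filter by tag id.
    img_points = corners[0].reshape(-1, 2).astype(np.float32)
    ok, rvec, tvec = cv2.solvePnP(OBJ_POINTS, img_points,
                                  CAMERA_MATRIX, DIST_COEFFS,
                                  flags=cv2.SOLVEPNP_IPPE_SQUARE)
    if not ok:
        return None
    rot, _ = cv2.Rodrigues(rvec)
    yaw_deg = np.degrees(np.arctan2(rot[1, 0], rot[0, 0]))
    return tvec.ravel(), yaw_deg
```

In a master-slave setup of the kind the paper describes, the translation and yaw returned by such a routine would feed the slave UAV's position controller so it can track the master; the error statistics reported in the abstract (X, Y, Z and yaw) refer to measurements of this relative pose.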

publication date

  • January 1, 2023