PURA UAV Team | Publication Date: May 2025
Drones are evolving — no longer just flying cameras, they’re becoming intelligent systems that can react to dynamic environments in real time. One of the most exciting developments in this area is human detection and tracking, and in our current UAV project, we’ve built just that — a drone that autonomously follows a person, powered by MAVSDK.
In this article, I’ll walk you through how we use MAVSDK to interface with the drone’s autopilot and build real-time, AI-driven behavior from the ground up.
MAVSDK is an easy-to-use set of APIs (Python, C++, Java, Swift) for communicating with drones that run the PX4 autopilot over the MAVLink protocol. Instead of exposing MAVLink's low-level message handling, MAVSDK wraps commands in clean, modern functions, which makes it well suited to rapid UAV software development.
We use MAVSDK-Python to control our drone programmatically: issuing takeoff, landing, and velocity commands, and monitoring telemetry.
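To give a concrete sense of the API, here is a minimal sketch of the basic connect / arm / take off / telemetry flow. The UDP address is PX4 SITL's default endpoint; on real hardware you would swap in your own serial or UDP connection string.

import asyncio
from mavsdk import System


async def run():
    drone = System()
    # PX4 SITL publishes MAVLink on UDP port 14540 by default; swap in your
    # own serial or UDP endpoint for real hardware.
    await drone.connect(system_address="udp://:14540")

    # Wait until the autopilot reports a connection before sending commands.
    async for state in drone.core.connection_state():
        if state.is_connected:
            break

    # High-level flight commands are single awaitable calls.
    await drone.action.arm()
    await drone.action.takeoff()
    await asyncio.sleep(10)

    # Telemetry arrives as async streams, e.g. relative altitude in metres.
    async for position in drone.telemetry.position():
        print(f"Altitude: {position.relative_altitude_m:.1f} m")
        break

    await drone.action.land()


asyncio.run(run())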
The mission: build a drone that autonomously follows a specific person, using real-time video input and onboard processing. No joysticks. No ground control station. Just smart code, a camera, and MAVSDK.
from mavsdk.offboard import VelocityBodyYawspeed

# Send a body-frame velocity setpoint (m/s, m/s, m/s, deg/s); PX4 requires a
# setpoint to be streaming before offboard mode can be started.
await drone.offboard.set_velocity_body(
    VelocityBodyYawspeed(forward, right, 0.0, yaw_rate)
)
await drone.offboard.start()
We calculate forward, right, and yaw_rate based on the person’s position in the frame.
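The mapping itself amounts to a simple proportional controller on the detection. The sketch below shows one way to do it; the gains, frame width, and the use of bounding-box height as a stand-in for distance are illustrative assumptions rather than our exact tuning.

def velocity_from_bbox(cx, box_height, frame_width=640,
                       target_height=200, yaw_gain=0.1,
                       forward_gain=0.005, right_gain=0.0):
    """Map a person detection to body-frame velocity commands.

    cx          -- bounding-box centre x in pixels
    box_height  -- bounding-box height in pixels (rough distance proxy)
    Returns (forward, right, yaw_rate) for VelocityBodyYawspeed.
    """
    # Horizontal offset from the image centre drives yaw, keeping the person centred.
    x_error = cx - frame_width / 2
    yaw_rate = yaw_gain * x_error                           # deg/s

    # A box smaller than the target height means the person is far away: move forward.
    forward = forward_gain * (target_height - box_height)   # m/s

    # Lateral correction is optional; yaw alone often keeps the target centred.
    right = right_gain * x_error                            # m/s
    return forward, right, yaw_rate

Each new camera frame, the output of a function like this feeds straight into set_velocity_body(), so the control loop runs at the frame rate of the vision pipeline.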
We first validated the behavior in PX4 SITL with Gazebo, then moved to real-world tests with nearly identical results, since MAVSDK exposes the same API whether it is talking to a simulated or a physical autopilot.
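In practice, the only change between simulation and flight hardware is the connection string passed to connect(); the serial device path and baud rate below are placeholders for whatever your companion computer uses.

from mavsdk import System

drone = System()

# PX4 SITL (Gazebo) exposes a MAVLink endpoint on UDP port 14540 by default.
await drone.connect(system_address="udp://:14540")

# On the real aircraft, the companion computer typically talks to the flight
# controller over a serial link; adjust the device path and baud rate to your wiring.
# await drone.connect(system_address="serial:///dev/ttyAMA0:921600")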
MAVSDK let us build a real-time, human-tracking drone quickly and reliably. Its high-level control, clean APIs, and native support for PX4 gave us exactly the tools we needed, without unnecessary complexity.
If you’re building vision-based drone applications or want full software control over flight behavior, MAVSDK is the framework to start with.