
Eyes in the Sky: Using MAVSDK to Build a Drone That Tracks Humans

PURA UAV Team | Publication Date: May 2025

Drones are evolving — no longer just flying cameras, they’re becoming intelligent systems that can react to dynamic environments in real time. One of the most exciting developments in this area is human detection and tracking, and in our current UAV project, we’ve built just that — a drone that autonomously follows a person, powered by MAVSDK.

In this article, we’ll walk you through how we use MAVSDK to interface with the drone’s autopilot and build real-time, AI-driven behavior from the ground up.

What is MAVSDK?

MAVSDK is an easy-to-use set of APIs (Python, C++, Java, Swift) for communicating with PX4-based drones over the MAVLink protocol. Where raw MAVLink is low-level and verbose, MAVSDK abstracts commands into clean, modern functions, which makes it ideal for rapid UAV software development.

We use MAVSDK-Python to control our drone programmatically: issuing takeoff, landing, and velocity commands, and monitoring telemetry.
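
To give a feel for the API, here is a minimal MAVSDK-Python sketch that connects, takes off, samples battery telemetry, and lands (udp://:14540 is the PX4 SITL default; substitute your own link address):

import asyncio
from mavsdk import System

async def main():
    drone = System()
    await drone.connect(system_address="udp://:14540")

    # wait until the autopilot has been discovered
    async for state in drone.core.connection_state():
        if state.is_connected:
            break

    await drone.action.arm()
    await drone.action.takeoff()
    await asyncio.sleep(10)  # hover for a moment

    # grab one battery telemetry sample
    async for battery in drone.telemetry.battery():
        print(f"Battery: {battery.remaining_percent:.0%}")
        break

    await drone.action.land()

asyncio.run(main())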

Project Goal: A Drone That Recognizes and Tracks a Human

The mission: build a drone that autonomously follows a specific person, using real-time video input and onboard processing. No joysticks. No ground control station. Just smart code, a camera, and MAVSDK.

System Architecture

Hardware

  • Drone Platform: PX4-compatible quadrotor
  • Flight Controller: Pixhawk 4
  • Companion Computer: Jetson Nano or Raspberry Pi 4
  • Camera: USB or CSI module
  • Telemetry Link: UDP/Wi-Fi

Software Stack

  • Computer Vision: YOLOv8
  • Control Interface: MAVSDK-Python
  • Command Type: Offboard velocity control
  • Failsafes: RTL on timeout or low battery (see the sketch below)
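
The last bullet deserves a sketch of its own. Assuming a hypothetical last_seen() callback that reports when the tracker last saw the person, a monitor along these lines can trigger RTL (both thresholds are illustrative, not our tuned values):

import time

LOW_BATTERY = 0.20   # illustrative threshold (fraction of charge)
LOST_TIMEOUT = 5.0   # illustrative seconds without a detection

async def failsafe_monitor(drone, last_seen):
    # last_seen() returns the monotonic time of the latest detection
    async for battery in drone.telemetry.battery():
        target_lost = time.monotonic() - last_seen() > LOST_TIMEOUT
        if battery.remaining_percent < LOW_BATTERY or target_lost:
            await drone.offboard.stop()            # exit offboard control
            await drone.action.return_to_launch()  # RTL
            break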

Step-by-Step: How It Works

  1. Frame Capture & Inference: The companion computer captures frames and runs YOLOv8 to detect people (see the sketch after this list).
  2. Target Tracking: The bounding-box offset from the frame center guides drone motion to keep the target centered.
  3. MAVSDK Commands: Offboard velocity setpoints are sent in real time via MAVSDK.
  4. Failsafe Handling: If the person is lost or the battery runs low, the drone lands safely or returns home.
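
Step 1, for example, fits in a few lines of ultralytics YOLOv8 code (the model file and camera index here are placeholders, not necessarily what we run on the Jetson):

import cv2
from ultralytics import YOLO

model = YOLO("yolov8n.pt")   # nano weights; any YOLOv8 model file works
cap = cv2.VideoCapture(0)    # USB camera; CSI pipelines differ

ret, frame = cap.read()
if ret:
    results = model(frame, classes=[0])  # COCO class 0 is "person"
    for box in results[0].boxes:
        x1, y1, x2, y2 = box.xyxy[0].tolist()
        cx = (x1 + x2) / 2   # horizontal center of the person
        box_h = y2 - y1      # apparent size, a rough distance cue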

Why We Chose MAVSDK

In short: it supports PX4 natively, hides raw MAVLink plumbing behind clean async functions, and its offboard API maps directly onto the velocity commands our tracker produces.

Code Snapshot

from mavsdk.offboard import VelocityBodyYawspeed

# PX4 requires a setpoint to be in place before offboard mode starts
await drone.offboard.set_velocity_body(
    VelocityBodyYawspeed(forward, right, 0.0, yaw_rate)
)
await drone.offboard.start()

We calculate forward, right, and yaw_rate based on the person’s position in the frame.
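
A minimal version of that calculation is a proportional controller like the sketch below. The gains, frame width, and the use of bounding-box height as a distance proxy are illustrative assumptions, not our tuned values:

FRAME_W = 640        # assumed frame width in pixels
K_YAW = 0.05         # deg/s per pixel of horizontal error (illustrative)
K_FWD = 0.8          # m/s at 100% size error (illustrative)
TARGET_BOX_H = 200   # desired bounding-box height in pixels (illustrative)

def compute_velocities(cx, box_h):
    # yaw toward the person based on horizontal offset from frame center
    yaw_rate = K_YAW * (cx - FRAME_W / 2)
    # move forward/back to keep a constant apparent size (distance proxy)
    forward = K_FWD * (TARGET_BOX_H - box_h) / TARGET_BOX_H
    right = 0.0  # yaw handles lateral correction in this simple version
    return forward, right, yaw_rate

The outputs plug straight into the VelocityBodyYawspeed call above; clamping them to safe limits before sending is a good idea.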

Challenges We Faced

Testing and Simulation

We first validated behavior using PX4 SITL and Gazebo, then transitioned to real-world tests with nearly identical results — thanks to MAVSDK’s consistency.
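
Part of what made the transition painless is that switching MAVSDK from SITL to hardware is mostly a one-line change to the connection string (the serial device and baud rate below are examples, not our exact setup):

# simulation: PX4 SITL's default UDP endpoint
await drone.connect(system_address="udp://:14540")

# real vehicle: e.g. a telemetry radio on a serial port
await drone.connect(system_address="serial:///dev/ttyUSB0:57600")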

What’s Next?

Final Thoughts

MAVSDK let us build a real-time, human-tracking drone quickly and reliably. Its high-level control, clean APIs, and native PX4 support gave us exactly the tools we needed, without unnecessary complexity.

If you’re building vision-based drone applications or want full software control over flight behavior, MAVSDK is the framework to start with.