
Drone Follow Me Using Computer Vision: Simple Guide

To make a drone follow you with computer vision, use its camera to detect your position with object detection or pose estimation, then run a control loop that sends movement commands to keep you centered in the frame at a safe distance.

Syntax

The basic steps to implement drone follow-me with computer vision are:

  • Capture video frames from the drone's camera.
  • Detect the target (person) using a model like YOLO or OpenPose.
  • Calculate the target's position in the frame (its center coordinates).
  • Compute movement commands to keep the target centered and at a safe distance.
  • Send commands to the drone's flight controller.

Each step uses specific functions or libraries depending on your drone SDK and vision tools.

python
import cv2
from djitellopy import Tello

# Connect to the drone and start the video stream
drone = Tello()
drone.connect()
drone.streamon()
drone.takeoff()  # rc commands only take effect once the drone is airborne

while True:
    frame = drone.get_frame_read().frame
    # Detect the target (placeholder)
    # x, y = detect_person(frame)
    # Compute and send movement commands
    # drone.send_rc_control(left_right_velocity, forward_backward_velocity,
    #                       up_down_velocity, yaw_velocity)
    cv2.imshow('Drone Camera', frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

drone.land()
drone.streamoff()
cv2.destroyAllWindows()

Example

This example uses OpenCV color detection to track a red object (such as a red shirt) and commands the drone to move left or right to keep the object centered in the frame.

python
import cv2
from djitellopy import Tello

# Initialize drone
drone = Tello()
drone.connect()
drone.streamon()

def find_red_object(frame):
    """Return the (x, y) center of the largest red blob, or None."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    # Red wraps around the hue axis, so combine two HSV ranges
    mask1 = cv2.inRange(hsv, (0, 120, 70), (10, 255, 255))
    mask2 = cv2.inRange(hsv, (170, 120, 70), (180, 255, 255))
    mask = mask1 | mask2
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        largest = max(contours, key=cv2.contourArea)
        if cv2.contourArea(largest) > 500:  # ignore small noise blobs
            x, y, w, h = cv2.boundingRect(largest)
            return x + w // 2, y + h // 2
    return None

try:
    drone.takeoff()  # rc commands only take effect in flight
    while True:
        frame = drone.get_frame_read().frame
        center = find_red_object(frame)
        if center:
            frame_center = frame.shape[1] // 2
            error = center[0] - frame_center  # pixels right (+) or left (-)
            if abs(error) > 20:  # deadband so the drone doesn't twitch
                velocity = max(min(int(error / 10), 20), -20)
                drone.send_rc_control(velocity, 0, 0, 0)  # strafe left/right
            else:
                drone.send_rc_control(0, 0, 0, 0)
            cv2.circle(frame, center, 10, (0, 255, 0), 2)
        else:
            drone.send_rc_control(0, 0, 0, 0)  # hover if the target is lost
        cv2.imshow('Follow Me', frame)
        if cv2.waitKey(1) & 0xFF == ord('q'):
            break
finally:
    drone.send_rc_control(0, 0, 0, 0)
    drone.land()
    drone.streamoff()
    cv2.destroyAllWindows()
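The example only steers left and right. To also hold a safe following distance, a common trick is to treat the size of the detected bounding box as a rough range estimate: a shrinking box means the target is moving away. A minimal sketch, where `target_area` and the gain are assumed tuning values you would calibrate for your own camera and subject:

```python
def forward_velocity(area, target_area=15000, deadband=3000, max_speed=20):
    """Map bounding-box area (px^2) to a forward/backward speed.

    Positive = fly forward (target looks too small/far away),
    negative = back up (target looks too large/close).
    """
    error = target_area - area
    if abs(error) < deadband:
        return 0  # close enough: hold position
    return max(min(error // 500, max_speed), -max_speed)
```

In the main loop you would pass `w * h` from `cv2.boundingRect` as `area` and feed the result into the second argument of `send_rc_control`.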

Common Pitfalls

  • Poor lighting or background clutter: can cause the vision model to miss the target.
  • Incorrect thresholds: wrong color or detection thresholds lead to missed or false detections.
  • Latency: slow processing delays the drone's response, making the follow behavior unstable.
  • Unsafe distances: failing to control the follow distance can cause collisions.
  • Ignoring drone limits: commands beyond the drone's speed or range limits cause errors.

Always test in a safe, open area and tune detection parameters carefully.

python
# Wrong way: send a fixed velocity with no feedback from detection
# drone.send_rc_control(30, 0, 0, 0)  # too fast, no control

# Right way: scale velocity by the pixel error and clamp the speed
error = target_x - frame_center
velocity = max(min(int(error / 10), 20), -20)
drone.send_rc_control(velocity, 0, 0, 0)
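The latency pitfall also shows up as jitter: a raw proportional command reacts to every noisy detection. One way to damp this is to combine a deadband with an exponential moving average on the commanded velocity. A minimal sketch (the `gain`, `alpha`, and speed limits are example values, not tuned constants):

```python
class SmoothedController:
    """Clamped P-controller with exponential smoothing to damp jitter."""

    def __init__(self, gain=0.1, alpha=0.3, max_speed=20, deadband=20):
        self.gain = gain            # velocity per pixel of error
        self.alpha = alpha          # smoothing: 0 = frozen, 1 = no smoothing
        self.max_speed = max_speed  # clamp for send_rc_control
        self.deadband = deadband    # pixels of error to ignore
        self.velocity = 0.0

    def update(self, error):
        raw = 0.0 if abs(error) < self.deadband else error * self.gain
        # Exponential moving average toward the raw command
        self.velocity += self.alpha * (raw - self.velocity)
        return int(max(min(self.velocity, self.max_speed), -self.max_speed))
```

Call `update(error)` once per frame and pass the result to `send_rc_control`; the drone then ramps toward the target speed instead of lurching on every frame.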

Quick Reference

  • Capture frames: Use drone SDK video stream.
  • Detect target: Use color detection, object detection, or pose estimation.
  • Calculate error: Find difference between target center and frame center.
  • Control drone: Send velocity commands proportional to error.
  • Safety: Keep safe distance and test in open space.
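One design choice worth noting for the "Control drone" step: instead of strafing left and right as in the example, many follow-me setups yaw toward the target, which keeps the camera pointed at the subject while the drone rotates in place. A hypothetical helper (the gain and limit are illustrative, not calibrated):

```python
def yaw_velocity(target_x, frame_width, gain=0.2, max_yaw=30):
    """Map horizontal pixel error to a yaw rate (deg-scale rc units).

    Positive output turns the drone right (toward a target on the
    right side of the frame), negative turns it left.
    """
    error = target_x - frame_width // 2
    return int(max(min(error * gain, max_yaw), -max_yaw))
```

The result would go into the fourth argument of `send_rc_control` instead of the first; combining yaw with the forward/backward channel gives a classic follow-me behavior.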

Key Takeaways

  • Use computer vision to detect and track the target's position in the camera frame.
  • Calculate the difference between the target's position and the frame center to guide drone movement.
  • Send proportional control commands to the drone to keep it following smoothly.
  • Test in safe environments and tune detection parameters for reliability.
  • Avoid fixed-speed commands; always adjust velocity based on real-time detection.