Autonomous Tomato Farm Inspection Robot

Complete ROS2-based autonomous navigation system for precision agriculture, featuring GPS localization, hybrid topological navigation, real-time video recording, and crop row inspection in Gazebo simulation.

A fully autonomous mobile robot inspects tomato crop rows using GPS-based topological navigation, records video during each row pass, and relies on robust GPS + IMU sensor fusion for accurate localization in challenging agricultural environments.

Project Overview

This project implements a complete autonomous navigation framework for agricultural robotics using a Robotnik RB-Summit robot in Gazebo simulation. Unlike traditional metric SLAM approaches that struggle in repetitive crop row environments, this system utilizes a Hybrid Topological Navigation strategy with GPS localization and sensor fusion.

The robot autonomously navigates a tomato farm with 9 crop rows: it selects a target row, approaches the row entrance using GPS-guided navigation, performs a precise 90° turn, and records a video inspection while driving to the center of the row.

Watch Demo Video →

Key Features

  • GPS-Based Navigation System: Utilizes GPS + IMU sensor fusion via Extended Kalman Filter (EKF) for precise outdoor localization
  • Hybrid Navigation Engine: Fuses GPS-filtered Odometry for stability and IMU orientation for precision
  • JSON-Based Topological Map: Scalable map structure defining Entrance, Center, and Exit nodes for each of the 9 crop rows (a loading sketch follows this list)
  • NavSat Transform: Converts GPS coordinates (lat/lon) to local Cartesian coordinates for seamless navigation
  • Smart State Machine: Handles complex logic flow: GPS Init → Navigate to Row → 90° Turn → Row Inspection
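
The exact schema of the topological map is project-specific; the sketch below assumes a hypothetical tomato_map.json with entrance, center, and exit nodes per row, and shows how such a map could be loaded and queried in Python.

import json
import math
import random

def load_row_map(path="tomato_map.json"):
    """Load the topological map. File name and schema are assumptions."""
    with open(path) as f:
        return json.load(f)["rows"]

def pick_target_row(rows):
    """Randomly select a target row, mirroring the task-selection step."""
    row = random.choice(rows)
    return row["id"], row["entrance"], row["center"], row["exit"]

def node_distance(a, b):
    """Euclidean distance between two nodes of the form {"x": ..., "y": ...}."""
    return math.hypot(b["x"] - a["x"], b["y"] - a["y"])

# Example usage (assuming the hypothetical schema above):
# rows = load_row_map()
# row_id, entrance, center, exit_node = pick_target_row(rows)
# print(f"Row {row_id} entrance: ({entrance['x']:.2f}, {entrance['y']:.2f})")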

Video Recording & Data Collection

  • Real-Time MP4 Video Recording: Captures high-quality video in MP4 format using OpenCV and cv_bridge (a recording sketch follows this list)
  • Automated Recording Trigger: Video recording starts automatically before the 90° turn and continues during row entry
  • Optimized Inspection Path: Robot moves half the row length (5.56m) to position itself in the middle of crop rows
  • Named Video Output: Videos saved as row_X_inspection.mp4 for easy identification and processing
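
The minimal recorder sketch below shows the general approach. The camera topic name and image size are placeholders, not taken from the project; only standard cv_bridge and OpenCV calls are used.

import cv2
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image
from cv_bridge import CvBridge

class RowVideoRecorder(Node):
    """Sketch: record the camera stream to row_X_inspection.mp4 at 30 FPS."""

    def __init__(self, row_id: int):
        super().__init__("row_video_recorder")
        self.bridge = CvBridge()
        self.writer = None
        self.filename = f"row_{row_id}_inspection.mp4"
        # Placeholder topic name; the real camera topic may differ.
        self.sub = self.create_subscription(
            Image, "/front_rgbd_camera/rgb/image_raw", self.on_image, 10)

    def on_image(self, msg: Image):
        frame = self.bridge.imgmsg_to_cv2(msg, desired_encoding="bgr8")
        if self.writer is None:
            h, w = frame.shape[:2]
            fourcc = cv2.VideoWriter_fourcc(*"mp4v")
            self.writer = cv2.VideoWriter(self.filename, fourcc, 30.0, (w, h))
        self.writer.write(frame)

    def stop(self):
        if self.writer is not None:
            self.writer.release()  # finalizes the MP4 file on disk
            self.get_logger().info(f"Saved {self.filename}")

def main():
    rclpy.init()
    node = RowVideoRecorder(row_id=1)
    try:
        rclpy.spin(node)
    finally:
        node.stop()
        node.destroy_node()
        rclpy.shutdown()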

Control & Precision

  • Adaptive Turn Control: Implements variable-speed 90° pivot turns with sub-3° accuracy (a controller sketch follows this list)
  • Yaw Correction: Real-time heading correction during forward motion using proportional control
  • Sensor Health Monitoring: Continuous validation of IMU, GPS, and odometry data streams
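
The sketch below illustrates both the adaptive pivot turn and the proportional yaw correction; the gains, rate limits, and function names are illustrative assumptions rather than the project's tuned values.

import math

def angle_diff(target: float, current: float) -> float:
    """Shortest signed angular difference in radians, wrapped to [-pi, pi]."""
    d = target - current
    return math.atan2(math.sin(d), math.cos(d))

def pivot_turn_cmd(target_yaw, current_yaw,
                   max_rate=0.6, min_rate=0.1, kp=1.2, tol_deg=3.0):
    """Adaptive-speed in-place turn: fast far from the goal, slow near it.
    Returns (angular_velocity, done)."""
    err = angle_diff(target_yaw, current_yaw)
    if abs(err) < math.radians(tol_deg):
        return 0.0, True                       # within ±3°, stop turning
    rate = max(min_rate, min(max_rate, kp * abs(err)))
    return math.copysign(rate, err), False

def yaw_corrected_forward_cmd(target_yaw, current_yaw,
                              linear=0.5, kp=0.8, max_ang=0.3):
    """Forward motion with proportional heading correction.
    Returns (linear_velocity, angular_velocity) for a Twist command."""
    ang = kp * angle_diff(target_yaw, current_yaw)
    return linear, max(-max_ang, min(max_ang, ang))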

Visualization & Monitoring

  • Real-Time Navigation Visualizer: Live matplotlib window showing the topological map with robot position tracking (a minimal sketch follows this list)
  • Interactive Information Display: Shows GPS coordinates, target row, distance, and navigation state
  • Path Visualization: Visual representation of robot path from current position to target entrance
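
A bare-bones version of such a visualizer could look like the following; the node layout, labels, and update rate are assumptions for illustration only.

import matplotlib.pyplot as plt

class NavigationVisualizer:
    """Sketch of a live matplotlib view: row entrance nodes plus the robot's
    position trace and current navigation state."""

    def __init__(self, row_nodes):
        plt.ion()                              # interactive (non-blocking) mode
        self.fig, self.ax = plt.subplots()
        xs = [n["x"] for n in row_nodes]
        ys = [n["y"] for n in row_nodes]
        self.ax.scatter(xs, ys, marker="s", label="row entrances")
        self.track, = self.ax.plot([], [], "r-", label="robot path")
        self.xs, self.ys = [], []
        self.ax.legend()

    def update(self, x, y, state="NAVIGATE"):
        self.xs.append(x)
        self.ys.append(y)
        self.track.set_data(self.xs, self.ys)
        self.ax.set_title(f"state: {state}  pos: ({x:.1f}, {y:.1f})")
        self.fig.canvas.draw_idle()
        plt.pause(0.01)                        # let the GUI process events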

System Architecture

The system consists of three main ROS2 packages:

  1. aoc_tomato_farm: Gazebo simulation world with robot URDF, sensor plugins (IMU, GPS, LiDAR, RGBD Camera)
  2. tomato_navigation: Control unit containing navigation logic, configuration files, and master_navigator node
  3. rb_summit_tools: GPS localization system with EKF fusion and complete navigation pipeline launch files

Data flow:

GPS Sensor → NavSat Transform → Odometry (GPS)
                                      ↓
IMU Sensor  ────────────────→ Robot Localization (EKF) → Filtered Odometry
                                      ↓
                              Master Navigator Node → Velocity Commands
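
A launch-file sketch for this pipeline is shown below. The ekf_node and navsat_transform_node executables come from the robot_localization package; the parameter file paths and remappings are placeholders, not the project's actual configuration.

from launch import LaunchDescription
from launch_ros.actions import Node

def generate_launch_description():
    return LaunchDescription([
        # Fuses IMU and odometry sources into filtered odometry
        Node(
            package="robot_localization",
            executable="ekf_node",
            name="ekf_filter_node",
            parameters=["config/ekf.yaml"],        # placeholder path
        ),
        # Converts GPS lat/lon fixes into a local Cartesian odometry topic
        Node(
            package="robot_localization",
            executable="navsat_transform_node",
            name="navsat_transform",
            parameters=["config/navsat.yaml"],     # placeholder path
            remappings=[
                ("imu", "imu/data"),
                ("gps/fix", "gps/fix"),
                ("odometry/filtered", "odometry/filtered"),
            ],
        ),
    ])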

Technologies Used

  • ROS2 Humble/Jazzy - Robot Operating System 2 framework
  • Gazebo Sim (Ignition Fortress / Gazebo Harmonic) - High-fidelity simulation environment
  • Python - Core programming language for navigation logic
  • robot_localization - EKF-based GPS + IMU sensor fusion
  • OpenCV - Real-time video recording and image processing
  • NavSat Transform - GPS coordinate transformation
  • JSON - Topological map configuration
  • Matplotlib - Real-time visualization interface

Performance Metrics

  • Localization Accuracy: ±0.1m (GPS + EKF fusion)
  • Turn Precision: ±3° (90° pivot turn)
  • Navigation Speed: 0.4-0.8 m/s
  • Video Frame Rate: 30 FPS (real-time MP4 recording)
  • Mission Duration: ~30-60 seconds per row inspection

Implementation Highlights

The master navigator operates as a finite state machine (sketched after the steps below):

  1. Initialization: Validates all sensor streams (IMU, GPS, Filtered Odometry) with timeout protection
  2. Task Selection: Randomly selects target row from topological map and calculates distance
  3. Approach: Navigates to row entrance using GPS-fused odometry with real-time yaw correction
  4. Video Recording Start: Initializes MP4 video writer at 30 FPS and begins capturing
  5. 90° Pivot Turn: Performs precise in-place rotation with adaptive speed control
  6. Row Inspection: Advances half row length to center position with continuous video recording
  7. Data Finalization: Stops recording, saves MP4 file, and completes mission
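
A skeletal version of this state machine is sketched below; the state and event names are illustrative placeholders mirroring the seven steps, not the actual identifiers used in the master_navigator node.

from enum import Enum, auto

class State(Enum):
    INIT = auto()
    SELECT_ROW = auto()
    APPROACH = auto()
    START_RECORDING = auto()
    PIVOT_TURN = auto()
    INSPECT_ROW = auto()
    FINALIZE = auto()
    DONE = auto()

def next_state(state: State, event: str) -> State:
    """Transition table mirroring the seven steps above; event strings stand
    in for the real sensor and controller feedback."""
    transitions = {
        (State.INIT, "sensors_ok"): State.SELECT_ROW,
        (State.SELECT_ROW, "row_chosen"): State.APPROACH,
        (State.APPROACH, "entrance_reached"): State.START_RECORDING,
        (State.START_RECORDING, "recorder_ready"): State.PIVOT_TURN,
        (State.PIVOT_TURN, "turn_complete"): State.INSPECT_ROW,
        (State.INSPECT_ROW, "center_reached"): State.FINALIZE,
        (State.FINALIZE, "video_saved"): State.DONE,
    }
    return transitions.get((state, event), state)  # stay put on unknown events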

Contributors

Developed in collaboration with kamranilv0 and ulvixz