PiLot: Autonomous Rover for Smarter Agriculture

Autonomous navigation. Real-time insight. Smarter, scalable farming starts here.

Project Overview

PiLot is a cost-effective autonomous Uncrewed Ground Vehicle (UGV) built to help farms tackle repetitive and time-consuming field tasks. It runs GPS-based missions, avoids obstacles using real-time depth sensing, and streams telemetry and video straight to a cloud dashboard. The system is modular, scalable, and designed for the real-world challenges of modern agriculture.

Why This Matters

Farming’s getting harder. Labour is expensive and often unavailable, especially for small and mid-sized farms. Full automation is either too costly or too complex. PiLot offers a practical solution — autonomous mobility, smart sensing, and remote visibility — without needing an enterprise budget.

Key Features

My Role – Systems Architect

System Architecture

Everything runs through a balance of onboard intelligence and cloud connectivity:

System Architecture Diagram

Project Implementation

PiLot is built around a plug-and-play concept, one that merges computer vision, GPS logic, and cloud data tools into a field-ready unit. From sensors to streaming, every part had to talk to the others and work in real time.

System Integration: A stereo depth camera and GNSS module feed data into an onboard computer, which calculates motion paths and uploads telemetry to the cloud. GPS corrections via an NTRIP service keep positioning accurate and responsive.
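As a rough illustration, here is a minimal sketch of what that onboard integration loop could look like. The sensor, planner, motor, and uploader callables, and the shape of the GNSS fix object, are hypothetical stand-ins for the real drivers and cloud client, not PiLot's actual code.

```python
# Minimal sketch of the onboard integration loop. The callables passed in are
# placeholders for the real sensor drivers, planner, motor interface, and REST client.
import time
from typing import Any, Callable


def control_loop(
    read_depth: Callable[[], Any],             # stereo camera driver: returns a depth frame
    read_gnss: Callable[[], Any],              # GNSS driver: returns an RTK-corrected fix (via NTRIP)
    plan_motion: Callable[[Any, Any], Any],    # combines fix + depth frame into a drive command
    send_command: Callable[[Any], None],       # motor controller interface
    upload_telemetry: Callable[[dict], None],  # REST client for the cloud dashboard
    rate_hz: float = 10.0,
) -> None:
    """Read sensors, plan motion, drive, and push telemetry at a fixed rate."""
    period = 1.0 / rate_hz
    while True:
        depth_frame = read_depth()
        fix = read_gnss()
        send_command(plan_motion(fix, depth_frame))
        # The fix object's lat/lon/heading fields are an assumed shape.
        upload_telemetry({"lat": fix.lat, "lon": fix.lon, "heading": fix.heading})
        time.sleep(period)
```

Keeping planning on the rover and pushing only telemetry upward is what lets the system keep driving even when the cellular link drops.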

Obstacle Detection and Avoidance: The rover checks for obstacles on every frame using stereo depth. When something shows up in its path, it decides on the fly how to reroute safely, without pre-planned maps or fences.
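To make that concrete, below is a minimal sketch of a per-frame check on a stereo depth map. The region of interest, the 1.5 m stop distance, and the three-way split are illustrative choices, not the tuned values used on the rover.

```python
# Sketch of a per-frame obstacle check on a stereo depth map (values in meters).
import numpy as np

STOP_DISTANCE_M = 1.5  # anything closer than this in the path triggers a reroute


def choose_maneuver(depth_m: np.ndarray) -> str:
    """Return 'forward', 'veer_left', or 'veer_right' from a single depth frame."""
    h, w = depth_m.shape
    roi = depth_m[h // 3: 2 * h // 3, :]          # horizontal band at obstacle height
    left, centre, right = np.array_split(roi, 3, axis=1)

    if np.nanmin(centre) > STOP_DISTANCE_M:
        return "forward"                           # path ahead is clear
    # Obstacle ahead: veer toward the side with more free depth.
    return "veer_left" if np.nanmean(left) > np.nanmean(right) else "veer_right"
```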

Autonomous Navigation: A mission starts by loading a list of GPS waypoints. The rover drives itself through the list, adapting to changes along the way using sensor input and heading correction.
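One simple way to implement that heading correction is to compute the bearing to the next waypoint and steer proportionally to the heading error. The sketch below uses a flat-earth distance approximation and a placeholder gain; it is not the project's exact controller.

```python
# Sketch of waypoint-following helpers; the gain and arrival radius are placeholders.
import math

ARRIVAL_RADIUS_M = 1.0  # waypoint counts as reached inside this radius


def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial bearing from point 1 to point 2, in degrees clockwise from north."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360


def distance_m(lat1, lon1, lat2, lon2):
    """Equirectangular approximation, adequate over field-scale distances."""
    x = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
    y = math.radians(lat2 - lat1)
    return math.hypot(x, y) * 6371000


def steering_correction(current_heading, target_bearing, gain=0.5):
    """Proportional steering command from the heading error, wrapped to [-180, 180)."""
    error = (target_bearing - current_heading + 180) % 360 - 180
    return gain * error
```

The mission loop simply advances to the next waypoint whenever distance_m falls below ARRIVAL_RADIUS_M.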

Real-Time Telemetry: As the rover moves, it logs GPS coordinates, heading, and object proximity. This data is pushed to the cloud over REST APIs and appears on the dashboard in near real time.
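A minimal version of that push might look like the following. The endpoint URL and payload fields are placeholders rather than the project's actual API.

```python
# Sketch of the telemetry push; URL and field names are illustrative placeholders.
import time

import requests

TELEMETRY_URL = "https://dashboard.example.com/api/telemetry"


def push_telemetry(lat: float, lon: float, heading: float, nearest_obstacle_m: float) -> None:
    payload = {
        "timestamp": time.time(),
        "lat": lat,
        "lon": lon,
        "heading": heading,
        "nearest_obstacle_m": nearest_obstacle_m,
    }
    try:
        requests.post(TELEMETRY_URL, json=payload, timeout=2)
    except requests.RequestException:
        pass  # drop the sample rather than stall the control loop on a bad link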

Live Video Streaming: The onboard camera sends a compressed video feed to the cloud, letting remote viewers monitor the rover in action; the same feed lays the groundwork for future AI-based analysis.
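The write-up doesn't pin down the exact streaming stack, but one lightweight approach is to JPEG-compress frames with OpenCV and POST them to the backend, as sketched below with a placeholder URL and quality setting.

```python
# Sketch of a simple frame-push stream; the URL and JPEG quality are placeholders.
import cv2
import requests

STREAM_URL = "https://dashboard.example.com/api/frame"


def stream_frames(camera_index: int = 0, jpeg_quality: int = 60) -> None:
    cap = cv2.VideoCapture(camera_index)
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # JPEG compression keeps each frame small enough for a rural uplink.
        ok, buf = cv2.imencode(".jpg", frame, [cv2.IMWRITE_JPEG_QUALITY, jpeg_quality])
        if ok:
            try:
                requests.post(STREAM_URL, data=buf.tobytes(),
                              headers={"Content-Type": "image/jpeg"}, timeout=2)
            except requests.RequestException:
                continue  # skip the frame if the link drops
    cap.release()
```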

Dashboard Interface: The dashboard is a clean, responsive UI built with HTML, CSS, and JavaScript. It shows location, status, and system vitals, and embeds the live video feed. It's mobile-friendly and ideal for remote use.

Project Outcomes

Lessons Learned

This project pushed me to wear many hats — developer, hardware wrangler, field tester, and UI builder. I learned how to bring together complex systems under real-world conditions and how to design for failure, not just success. And most of all, I saw the value of simplifying tech for actual field use.

Gallery

UGV Field Test

Contact

For questions or collaboration, reach me at info@yogeashnehra.com or connect on LinkedIn.