GNOME

Year
2024
Skills
CAD / CAM
Programming
Controls
Intro
GNOME, or Garden Nurturing Organism with Mechanical Efficiency, is an autonomous gardening robot developed at the 2024 Stanford hackathon, "Treehacks".

Process

Inspiration:

Our inspiration for creating an autonomous agricultural robot stemmed from the need for more efficient and precise agricultural practices in a resource-constrained industry. Plant monitoring in particular is important for maximizing crop yield and preventing plant viruses. Automating the plant monitoring process allows farmers to dedicate their time to other tasks while increasing the efficiency of monitoring operations. Companies like John Deere have already invested in automated machinery to increase the efficiency of their products, highlighting the importance of autonomous robotics solutions in the farming industry. We aimed to design a robot capable of real-time plant monitoring that uses computer vision to identify and cut off diseased plants.

Hardware:

Our robot arm is equipped with a claw at its end to simulate scissors. A green disk stands in for a target plant: a laptop camera feeds images to the arm, which localizes the disk and cuts once the disk is centered in the frame. The robot is also equipped with a moisture sensor and lights that change color when the sensor detects a change in moisture.
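The disk-centering behavior boils down to a simple loop: mask the green pixels in each camera frame, find their centroid, and trigger the cut when the centroid falls near the image center. The sketch below is a minimal, numpy-only illustration of that idea; the function name, thresholds, and tolerance are all assumptions for the example, not the project's actual code.

```python
import numpy as np

def disk_centered(frame, tol=0.1):
    """Return True if the green disk's centroid lies within `tol`
    (as a fraction of frame size) of the image center.
    frame: HxWx3 uint8 RGB image. Thresholds are illustrative."""
    r = frame[..., 0].astype(int)
    g = frame[..., 1].astype(int)
    b = frame[..., 2].astype(int)
    # Crude green mask: green channel clearly dominates red and blue.
    mask = (g > r + 40) & (g > b + 40)
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return False  # no disk visible in this frame
    h, w = mask.shape
    cy, cx = ys.mean(), xs.mean()
    return abs(cx - w / 2) < tol * w and abs(cy - h / 2) < tol * h

# Synthetic 100x100 test frame with a green disk at the center.
frame = np.zeros((100, 100, 3), dtype=np.uint8)
yy, xx = np.ogrid[:100, :100]
frame[(yy - 50) ** 2 + (xx - 50) ** 2 < 100] = (20, 200, 20)
print(disk_centered(frame))  # centered disk -> True
```

In practice the real system read frames from the laptop camera and drove the arm's servos; this sketch only shows the "is the disk centered?" decision.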

Software:

Our robot employs state-of-the-art computer vision techniques for real-time plant monitoring through image classification and semantic segmentation, building on the baseline EfficientNetV2 and DeepLabV3 models. It can classify various plant species encountered in its environment and detect plant diseases by identifying marks on leaves. We adapted the baseline deep learning models using fine-tuning and transfer learning, and optimized them on an Intel CPU using the Intel Developer Cloud. The hardware consisted of a robot arm built with CAD, laser cutting, 3D printing, and soldering.
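The transfer-learning idea is that the pretrained backbone's weights stay frozen while only a small new classification head is trained on the target data. The numpy-only toy below illustrates that pattern; the "frozen backbone" here is a stand-in for EfficientNetV2 (which the project actually used), and the dataset, dimensions, and learning rate are all made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a frozen pretrained backbone (the real project used
# EfficientNetV2; this hypothetical extractor just squashes the first
# 16 input channels). Its "weights" never change during training.
def frozen_backbone(x):
    return np.tanh(x[:, :16])

# Synthetic "leaf images": 64-dim vectors, label 1 = diseased.
X = rng.normal(size=(200, 64))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

feats = frozen_backbone(X)  # no gradient ever flows into the backbone

# Transfer learning: train only a new logistic-regression head.
w, b = np.zeros(16), 0.0
lr = 0.5
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(feats @ w + b)))
    w -= lr * feats.T @ (p - y) / len(y)
    b -= lr * (p - y).mean()

pred = (1.0 / (1.0 + np.exp(-(feats @ w + b))) > 0.5).astype(float)
print("head-only training accuracy:", (pred == y).mean())
```

Fine-tuning goes one step further by also updating (some of) the backbone's weights at a small learning rate; the frozen-head setup shown here is the cheaper first stage.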

My Contribution:

As the mechanical engineer on the project, I designed and built all articulated mechanisms the software used to interact with the real world. This included CAD design, primarily in SolidWorks, and manufacturing with 3D printers and laser cutters. I also aided in the wiring and programming of the robot controls (though not the deep learning and vision systems).

The devpost can be found here.


Result

While we unfortunately did not win any prizes at the hackathon, the team did an incredible job of putting together an impressively versatile and functional robot in 36 hours. It was a blur of action: near non-stop work for almost the entire time, with little rest. Each member of the team learned a little about every other field, and the entire team learned a great deal about prototyping and quickly adapting to challenges by squeezing as much functionality out of the product as possible.
