The Automatic Garbage Collector is a robotic system built around an ESP32 microcontroller for trash detection and collection. It features AI-based object detection, a robotic arm, environmental sensors, and remote control via the Blynk app. Developed by Misbah, Rakib, Mahathir, Noman, and Kazi Neyamul Hasan over four months.
The Automatic Garbage Collector is a smart robotic system designed to detect and collect trash autonomously. It integrates AI-based object detection, environmental monitoring, and remote control capabilities. The system uses an ESP32 microcontroller for control, a laptop webcam for AI-based object detection, and various sensors for navigation and monitoring.
Table of Contents
- Introduction
- Features
- Components
- Circuit Diagram
- Installation
- Usage
- Future Work
- Contributing
- License
The Automatic Garbage Collector project is designed to help with waste management by detecting trash, collecting it, and disposing of it autonomously. The system combines a laptop running object detection algorithms with an ESP32 microcontroller that controls the movement and operation of the robotic arm and other components.
This project was developed over four months by a team of four: Misbah, Rakib, Abir, and Noman. Most of that time went into building the hardware.
- AI-based Object Detection: Uses a laptop's webcam and YOLO algorithm to detect trash.
- Autonomous Movement: Uses an ESP32 microcontroller to navigate towards detected objects.
- Trash Collection: Equipped with a robotic arm to pick up trash and drop it in a designated bin.
- Environmental Monitoring: Monitors temperature, humidity, and smoke levels using sensors.
- Remote Control via Blynk: Control and monitor the system through the Blynk app.
- Failsafe Mechanism: Stops the robot if communication is lost or errors are detected.
The system can detect trash such as bottles, tissues, and paper. Note that the trash image dataset is not fully included in this repository; if you want to use this code for your own project, you can build your own dataset around it. Feel free to update and reuse this code.
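Once a detection is available, the laptop has to turn the bounding box into a motion command for the ESP32. The exact mapping used by the project is not shown here; the sketch below is one plausible approach, comparing the box centre against the frame centre with a small dead zone (function and command names are illustrative):

```python
def steering_command(box_center_x: float, frame_width: int,
                     dead_zone: float = 0.1) -> str:
    """Map a detected object's horizontal position to a drive command.

    Returns "LEFT", "RIGHT", or "FORWARD" depending on where the
    bounding-box centre sits relative to the frame centre. The dead
    zone (as a fraction of frame width) keeps the robot from
    oscillating when the object is roughly straight ahead.
    """
    offset = (box_center_x - frame_width / 2) / frame_width
    if offset < -dead_zone:
        return "LEFT"
    if offset > dead_zone:
        return "RIGHT"
    return "FORWARD"
```

For a 640-pixel-wide frame, an object centred at x = 100 yields "LEFT", while one near x = 320 yields "FORWARD".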
- ESP32 Microcontroller: Controls the motors, sensors, and communication with Blynk.
- Laptop: Used for AI-based object detection using a webcam.
- Webcam: Captures live video feed for object detection.
- DC Motors: Used for the movement of the robot.
- Motor Driver (L298N): Controls the DC motors.
- Robotic Arm with Servo Motors: Picks up detected objects.
- DHT11 Sensor: Measures temperature and humidity.
- Smoke Sensor: Detects the presence of smoke.
- Ultrasonic Sensors: Detect obstacles and measure distance.
- Power Supply: Provides power to the motors and sensors.
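For reference, an HC-SR04-style ultrasonic sensor reports distance via the width of its echo pulse: the pulse covers the round trip, so one-way distance is half the pulse time multiplied by the speed of sound. A minimal helper showing the arithmetic (names are illustrative, not from the project firmware):

```python
SPEED_OF_SOUND_CM_PER_US = 0.0343  # speed of sound at roughly 20 degrees C

def echo_to_distance_cm(pulse_us: float) -> float:
    """Convert an ultrasonic echo pulse width (microseconds) to
    distance in centimetres. The echo pulse times the round trip,
    so the one-way distance is half of pulse * speed of sound."""
    return pulse_us * SPEED_OF_SOUND_CM_PER_US / 2
```

A pulse of about 583 microseconds corresponds to roughly 10 cm; the same formula is what the ESP32 firmware would apply to the `pulseIn()` reading.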
(Include a circuit diagram showing connections between the ESP32, motor driver, sensors, laptop, and other components.)
- Install Arduino IDE or PlatformIO for programming the ESP32.
- Set up Python on your laptop.
- Install YOLOv8 or a similar object detection framework.
- Set up the Blynk app on your mobile device.
ESP32 Setup:
- Connect the ESP32 to your computer and upload the code using the Arduino IDE.
- Configure the ESP32 for Wi-Fi and Blynk communication.
Laptop Setup:
- Set up the Python environment with the required libraries (e.g., `opencv-python`, `ultralytics`).
- Install YOLOv8 or your preferred object detection framework.
Connecting Components:
- Follow the circuit diagram to connect the motors, sensors, and other components to the ESP32.
- Connect the laptop to the ESP32 via serial communication.
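The serial link between the laptop and the ESP32 needs an agreed frame format. The project's actual byte format is not documented here; a simple newline-terminated text protocol like the sketch below is a common choice (the command set and helper names are assumptions). On the laptop side, pyserial's `Serial.write()` would carry these frames; the ESP32 performs the matching parse in its Arduino code.

```python
VALID_COMMANDS = {"FORWARD", "LEFT", "RIGHT", "STOP", "PICK"}

def encode_frame(command: str) -> bytes:
    """Frame a command as '<CMD>\n' for the serial link."""
    if command not in VALID_COMMANDS:
        raise ValueError(f"unknown command: {command!r}")
    return (command + "\n").encode("ascii")

def decode_frame(frame: bytes) -> str:
    """Inverse of encode_frame: strip the terminator and validate."""
    command = frame.decode("ascii").strip()
    if command not in VALID_COMMANDS:
        raise ValueError(f"unknown command: {command!r}")
    return command
```

Validating on both ends doubles as a crude failsafe: an unknown or corrupted frame raises instead of driving the motors.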
Blynk Setup:
- Set up a project on the Blynk app with buttons to start/stop the robot and display sensor readings.
- Update the Blynk authentication token in the ESP32 code.
- Power on the system and ensure that the ESP32 is connected to Wi-Fi.
- Use the Blynk app to start the garbage collection process.
- The robot will autonomously detect objects using the webcam and move towards them.
- The robotic arm will pick up detected trash and place it in the bin.
- Monitor environmental data (temperature, humidity, smoke) through the Blynk app.
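The usage steps above amount to a small state machine running on the laptop: search for trash, approach it, trigger the arm, repeat. A hedged sketch of one transition step (the state names and the 15 cm pick-up threshold are illustrative, not taken from the project code):

```python
def next_state(state: str, object_seen: bool, distance_cm: float,
               pick_threshold_cm: float = 15.0) -> str:
    """One transition of a simple trash-collection state machine.

    SEARCH   -> APPROACH once the webcam sees trash
    APPROACH -> PICK     once the ultrasonic range is close enough
    PICK     -> SEARCH   after the arm drops the item in the bin
    """
    if state == "SEARCH":
        return "APPROACH" if object_seen else "SEARCH"
    if state == "APPROACH":
        if not object_seen:
            return "SEARCH"          # lost the object, resume scanning
        return "PICK" if distance_cm <= pick_threshold_cm else "APPROACH"
    if state == "PICK":
        return "SEARCH"              # arm cycle done, look for the next item
    raise ValueError(f"unknown state: {state!r}")
```

Keeping the logic as an explicit state machine makes the Blynk start/stop button easy to wire in: stopping simply forces the state back to SEARCH and sends a STOP frame to the ESP32.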
- Enhanced AI Model: Improve object detection accuracy by training a custom YOLO model.
- Advanced Path Planning: Integrate A* or Dijkstra's algorithm for better navigation.
- Metal Detection: Add a metal detection feature to separate metallic waste.
- Voice Commands: Control the robot using voice recognition.
- GPS Tracking: Incorporate GPS for outdoor navigation and geo-fencing.
- Battery Management System: Implement battery status monitoring and low-power warnings.
- Additional Sensors: Add more sensors for better obstacle avoidance and functionality.
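As a starting point for the path-planning item above, here is a compact grid-based A* with a Manhattan heuristic. The 0/1 occupancy-grid representation is an assumption for illustration; the robot's real map format would need to be adapted to it:

```python
import heapq

def astar(grid, start, goal):
    """A* search on a 2D grid of 0 (free) / 1 (blocked) cells.

    Returns the list of (row, col) cells from start to goal, or None
    if the goal is unreachable. Uses the Manhattan distance heuristic,
    which is admissible for 4-connected movement.
    """
    rows, cols = len(grid), len(grid[0])

    def h(cell):
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    open_heap = [(h(start), 0, start)]   # (f-score, g-score, cell)
    came_from = {}
    g = {start: 0}
    while open_heap:
        _, cost, cur = heapq.heappop(open_heap)
        if cur == goal:                  # reconstruct path back to start
            path = [cur]
            while cur in came_from:
                cur = came_from[cur]
                path.append(cur)
            return path[::-1]
        if cost > g.get(cur, float("inf")):
            continue                     # stale heap entry, skip it
        r, c = cur
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = cost + 1
                if ng < g.get((nr, nc), float("inf")):
                    g[(nr, nc)] = ng
                    came_from[(nr, nc)] = cur
                    heapq.heappush(open_heap,
                                   (ng + h((nr, nc)), ng, (nr, nc)))
    return None
```

On a 3x3 grid with the middle row mostly blocked, the planner routes around the obstacle; it returns None when no route exists, which the robot should treat as "stay put and rescan".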
Contributions are welcome! Please fork this repository and submit a pull request with your changes.
- Fork the repository.
- Create a new branch: `git checkout -b feature-name`
- Make your changes and commit them: `git commit -m "Add new feature"`
- Push to the branch: `git push origin feature-name`
- Open a pull request.
This project is licensed under the GNU General Public License. See the LICENSE file for details.