# Detects7

**Team:** Code Enforcers



## Summary

Detects7 is a web + API demo and research prototype for detecting seven safety-related object classes. It was trained on the Falcon synthetic dataset and deployed with a FastAPI backend and a Vite + React frontend. The project is designed to be simple to run locally and to serve as a foundation for further experiments.

## Team

| Name | GitHub | Role |
| --- | --- | --- |
| Akash Kumar | XynaxDev | Project lead — directs model development, tunes experiments and hyperparameters, and manages API integration for deployment. |
| Lavnish | lavn1sh | Frontend & integration engineer — builds the web interface and connects the ML outputs to the UI and backend. |
| Nishtha | niishthaaaaaa | Model engineer — runs training experiments, prepares and curates the dataset, and iterates on model performance. |
| Himanshi | Himanshi1531 | Research lead — surveys literature, recommends improvements, and helps shape experiment design. |

*Table: Code Enforcers — team roles and responsibilities.*

## Key results

Final metrics taken from `ml/exp12/results.csv` (final epoch = 150).

| Metric | Value |
| --- | --- |
| Precision | 0.91082 |
| Recall | 0.67942 |
| mAP@50 | 0.75223 |
| mAP@50:95 | 0.60106 |
| Training epochs | 150 |
| Best checkpoint | `ml/exp12/weights/best.pt` (also exported in `models/`) |

Run `evaluate.py` to generate `evaluation_summary.json` and a confusion matrix as the official evaluation summary.
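The final-epoch numbers above come straight from the last row of Ultralytics' `results.csv`. A minimal sketch of extracting them with the standard library — the column names follow the usual YOLOv8 header convention and are an assumption (check your own file's header row), and the epoch-149 values below are purely illustrative:

```python
import csv
import io

# Sample text in the Ultralytics results.csv style. The final row matches the
# metrics reported above; the epoch-149 row is illustrative filler.
SAMPLE = """epoch,metrics/precision(B),metrics/recall(B),metrics/mAP50(B),metrics/mAP50-95(B)
149,0.90871,0.68010,0.75100,0.59987
150,0.91082,0.67942,0.75223,0.60106
"""

def final_epoch_metrics(csv_text: str) -> dict:
    """Return the metrics row for the last epoch as a name -> float mapping."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    last = rows[-1]  # rows are written in epoch order
    # Ultralytics pads column names with spaces, so strip the keys.
    return {k.strip(): float(v) for k, v in last.items()}

metrics = final_epoch_metrics(SAMPLE)
print(metrics["metrics/mAP50(B)"])  # 0.75223
```

Pointing the same function at the real `ml/exp12/results.csv` (read with `open(...).read()`) reproduces the table above without hand-copying values.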

## Repository layout (important files & folders)

| Path | What it contains |
| --- | --- |
| `backend/` | FastAPI backend (see `backend/app/` for `main.py`, `model_loader.py`, `utils.py`, `config.py`) |
| `frontend/` | Vite + React UI (`src/` contains `App.jsx`, `main.jsx`, components, styles) |
| `ml/` | Training and evaluation code, dataset config and experiments (`yolo_params.yaml`, `train_yolo.py`, `evaluate.py`, `predict_user.py`, `exp12/`) |
| `models/` | Deployment artifacts (`best_model_backup.pt`, `best_model_backup.onnx`) |
| `local_run.py` | Convenience runner for local development |
| `space/` | Python virtual environment (optional) |
| `README.md` | This file |

## How to reproduce / common commands

Use the project virtual environment in `space/` or create a fresh one.

1. Activate the environment (PowerShell example):

   ```powershell
   & D:\detects7\space\Scripts\Activate.ps1
   pip install -r requirements.txt
   cd ml
   ```

2. Train (example using `ml/exp12/args.yaml`):

   ```powershell
   python train_yolo.py --data yolo_params.yaml --epochs 150 --imgsz 768 --batch 8
   ```

3. Evaluate (produces `evaluation_summary.json`):

   ```powershell
   python evaluate.py
   ```

4. Interactive prediction (image / video):

   ```powershell
   python predict_user.py
   ```

5. Backend (FastAPI) — example:

   ```powershell
   cd backend
   & ..\space\Scripts\Activate.ps1
   pip install -r requirements.txt
   uvicorn app.main:app --reload --host 0.0.0.0 --port 8000
   ```

6. Frontend:

   ```powershell
   cd frontend
   npm install
   npm run dev
   # then open http://localhost:5173
   ```
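With the backend running, a client typically posts an image and then filters the returned detections by confidence before rendering them. A minimal post-processing sketch — the response shape (`label`, `confidence`, `box`) and the class names shown are assumptions for illustration, not the actual Detects7 API contract:

```python
# Filter a detection response by confidence. The JSON shape below is an
# assumed example, not the actual Detects7 API contract.
def filter_detections(detections: list[dict], min_conf: float = 0.5) -> list[dict]:
    """Keep only detections at or above the confidence threshold."""
    return [d for d in detections if d["confidence"] >= min_conf]

# Hypothetical response body from the backend's prediction endpoint.
response = {
    "detections": [
        {"label": "FireExtinguisher", "confidence": 0.92, "box": [12, 40, 180, 300]},
        {"label": "ToolBox", "confidence": 0.31, "box": [210, 55, 330, 190]},
    ]
}

kept = filter_detections(response["detections"])
print([d["label"] for d in kept])  # ['FireExtinguisher']
```

Thresholding client-side keeps the API simple: the backend can return everything the model emits, and the UI decides what is worth drawing.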

## Models & artifacts

| Artifact | Location |
| --- | --- |
| Best checkpoint | `ml/exp12/weights/best.pt` |
| Last checkpoint | `ml/exp12/weights/last.pt` |
| Exported deployment copies | `models/best_model_backup.pt`, `models/best_model_backup.onnx` |

> **Tip:** keep large model files out of Git history — use Git LFS or release assets for big binaries.

## Design notes & decisions

- We initially explored RT-DETR but switched to Ultralytics YOLOv8 because it gave better detection performance on this dataset.
- Training specifics (see `ml/exp12/args.yaml`): AdamW, `lr0=0.001`, `imgsz=768`, `batch=8`, `epochs=150`. Augmentations and other settings were tuned in the experiment.
- `evaluate.py` runs the Ultralytics evaluation suite and produces a confusion matrix and a JSON summary.
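The training specifics above correspond to an Ultralytics arguments file along these lines. This is a sketch, not the actual `ml/exp12/args.yaml`: only the fields named in the notes above come from the repo, the field names follow standard Ultralytics YOLOv8 conventions, and everything else is omitted.

```yaml
# Sketch of the key fields in ml/exp12/args.yaml (Ultralytics YOLOv8
# training arguments). Values not listed in the design notes are omitted.
data: yolo_params.yaml
optimizer: AdamW
lr0: 0.001
imgsz: 768
batch: 8
epochs: 150
```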

## 🗄️ Dataset & acknowledgements

The model was trained on the Falcon synthetic dataset.

Thank you 💌 — Detects7 team
