A proactive suggestion engine that learns from your real ride and food order history, then surfaces recommendations before you ask. Connects to Uber (rides) and Swiggy (food) via Playwright browser automation to fetch actual user data, live pricing, and ETAs.
| Layer | Technology |
|---|---|
| Frontend | React, Framer Motion, Tailwind CSS |
| Backend | Next.js 14, Node.js 22+, SQLite (node:sqlite), SSE |
| AI/ML | Python, FastAPI, NumPy, Claude Haiku |
| Automation | Playwright (Chrome channel) |
| Traffic | OpenRouteService API + fallback |
- Node.js >= 22 (required for built-in `node:sqlite`)
- Python >= 3.11
- Google Chrome installed (Playwright uses it via `channel: "chrome"` to avoid bot detection)
```bash
# Node
npm install
npx playwright install chromium

# Python
cd python && pip install -r requirements.txt && cd ..
```

Create `.env.local` in the project root:
```bash
# Enables live traffic data (free, 2000 req/day)
OPENROUTESERVICE_API_KEY=your_key_here

# Enables LLM-generated reasoning in suggestion cards
ANTHROPIC_API_KEY=your_key_here
```

Both keys are optional — the system falls back gracefully if either is missing.
```bash
# Terminal 1 — Python ML service (port 8000)
cd python && uvicorn main:app --port 8000 --reload

# Terminal 2 — Next.js app (port 3000)
npm run dev
```

Open http://localhost:3000.
The app uses real user data — no seeded/fake data. You must connect at least one platform.
Connect Uber:

```bash
# 1. Launch login — opens a Chrome window
curl -X POST http://localhost:3000/api/uber/auth
# Log in with your Uber credentials. You have 10 minutes.

# 2. Import your ride history into the database
curl -X POST http://localhost:3000/api/uber/import-trips

# 3. Check status
curl http://localhost:3000/api/uber/status
```

Connect Swiggy:

```bash
# 1. Launch login — opens a Chrome window
curl -X POST http://localhost:3000/api/swiggy/auth
# Enter phone number + OTP. You have 10 minutes.

# 2. Import your order history into the database
curl -X POST http://localhost:3000/api/swiggy/import-orders

# 3. Check status
curl http://localhost:3000/api/swiggy/status
```

Sessions persist across restarts — you only need to log in once (until cookies expire).
Real ride/food history is scraped from your Uber and Swiggy accounts via Playwright. The pattern engine uses recency-weighted scoring:

```
age_days   = (now - event.timestamp_ms) / 86_400_000
recency_w  = exp(-0.05 * age_days)        # half-life ≈ 14 days
time_w     = exp(-0.5 * (Δmin / 20)^2)    # Gaussian, σ = 20 min
score      = recency_w * time_w
confidence = min(Σ scores / 1.5, 1.0)
```

The system learns frequent destinations, ordering times, preferred platforms, and cuisine preferences — all from your actual behavior.
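The scoring above can be sketched as a standalone function. This is an illustrative sketch of the formula, not the project's actual `pattern_engine.py` — the function name and argument shapes are assumptions:

```python
import math
import time

HALF_LIFE_DECAY = 0.05   # per day; half-life = ln(2)/0.05 ≈ 14 days
SIGMA_MIN = 20.0         # Gaussian width around the candidate time, in minutes

def pattern_confidence(event_timestamps_ms, event_minutes_of_day,
                       candidate_minute, now_ms=None):
    """Score how strongly past events support a suggestion at candidate_minute."""
    now_ms = now_ms if now_ms is not None else time.time() * 1000
    total = 0.0
    for ts_ms, minute in zip(event_timestamps_ms, event_minutes_of_day):
        age_days = (now_ms - ts_ms) / 86_400_000
        recency_w = math.exp(-HALF_LIFE_DECAY * age_days)
        delta_min = abs(minute - candidate_minute)
        delta_min = min(delta_min, 1440 - delta_min)  # wrap around midnight
        time_w = math.exp(-0.5 * (delta_min / SIGMA_MIN) ** 2)
        total += recency_w * time_w
    return min(total / 1.5, 1.0)
```

Note the `/ 1.5` normalization: a single fresh, perfectly timed event yields confidence ≈ 0.67, so one data point alone can clear the ride threshold but not the food threshold.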
Every 30 seconds, the trigger loop evaluates:
- Ride triggers: time-of-day patterns, calendar schedules, traffic deviations
- Food triggers: meal window detection, delivery delay alerts
- Confidence thresholds: rides >= 0.55, food >= 0.65
- Calendar signal present: bypasses threshold
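The threshold gate described above might be sketched like this (hypothetical function name; the real checks live in `rideTrigger.ts` / `foodTrigger.ts`):

```python
RIDE_THRESHOLD = 0.55
FOOD_THRESHOLD = 0.65

def should_suggest(kind: str, confidence: float,
                   has_calendar_signal: bool = False) -> bool:
    """Gate a suggestion on its confidence threshold."""
    if has_calendar_signal:
        return True  # a calendar signal bypasses the threshold entirely
    threshold = RIDE_THRESHOLD if kind == "ride" else FOOD_THRESHOLD
    return confidence >= threshold
```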
Once connected, the trigger loop fetches real data:
- Uber: live surge pricing, ETAs from Uber's web app
- Swiggy: live delivery ETAs, menu prices, restaurant availability
- 3-5 minute caching to avoid rate limits
- Graceful fallback to estimates if scraping fails
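The 3–5 minute caching could look like this minimal TTL cache (an illustrative sketch, not the project's actual `pricingCache.ts`):

```python
import time

class TTLCache:
    """Tiny time-to-live cache, akin to the 3–5 minute pricing caches."""

    def __init__(self, ttl_seconds: float):
        self.ttl = ttl_seconds
        self._store = {}  # key -> (value, inserted_at)

    def get(self, key, now=None):
        now = now if now is not None else time.monotonic()
        entry = self._store.get(key)
        if entry is None or now - entry[1] > self.ttl:
            return None  # missing or expired — caller re-scrapes
        return entry[0]

    def set(self, key, value, now=None):
        now = now if now is not None else time.monotonic()
        self._store[key] = (value, now)
```

On a cache miss the trigger loop would scrape fresh pricing and re-`set` it, keeping Playwright traffic well under any rate limit.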
Each suggestion includes a natural-language explanation:
"You usually order biryani on Friday evenings — delivery is 15 min slower than usual, ordering now keeps you on schedule"
Falls back to template strings if `ANTHROPIC_API_KEY` is not set.
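The template fallback can be as simple as string interpolation. A hypothetical example in the spirit of the card above (not the project's actual templates):

```python
def template_reasoning(item: str, day: str, delay_min: int) -> str:
    """Fallback explanation when no ANTHROPIC_API_KEY is configured."""
    base = f"You usually order {item} on {day} evenings"
    if delay_min > 0:
        return (f"{base} — delivery is {delay_min} min slower than usual, "
                f"ordering now keeps you on schedule")
    return base
```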
| Last action | Cooldown |
|---|---|
| Confirmed | 4 hours |
| Dismissed | 30 minutes |
| Ignored | 15 minutes |
Plus: one active suggestion per type, "already done today" guard, dismissal feedback loop (3+ dismissals in 7 days reduces confidence by 0.15).
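The cooldown table and the dismissal feedback loop can be sketched together (hypothetical helpers; the real logic lives in `cooldownManager.ts`):

```python
# Cooldown durations in seconds, per the table above.
COOLDOWNS_S = {
    "confirmed": 4 * 3600,   # 4 hours
    "dismissed": 30 * 60,    # 30 minutes
    "ignored": 15 * 60,      # 15 minutes
}

def in_cooldown(last_action: str, seconds_since: float) -> bool:
    """Suppress new suggestions of a type while its cooldown is active."""
    return seconds_since < COOLDOWNS_S.get(last_action, 0)

def adjusted_confidence(confidence: float, dismissals_last_7d: int) -> float:
    """3+ dismissals in 7 days reduce confidence by 0.15."""
    return confidence - 0.15 if dismissals_last_7d >= 3 else confidence
```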
```
User connects Uber/Swiggy
  → Playwright (Chrome) scrapes real history
  → Stored in SQLite (tagged data_source: "scraped")
  → Pattern engine learns from real behavior

Every 30s: trigger loop
  → getRideEvents() / getFoodEvents()  ← SQLite
  → POST /analyze                      ← Python ML service (NumPy)
  → If confidence > threshold:
      → Fetch live pricing/ETA         ← Playwright scrape + cache
      → POST /reasoning                ← Python (Claude Haiku)
      → Insert suggestion to DB
      → SSE broadcast to all clients

Browser
  → SSE /api/suggestions/stream
  → SuggestionFeed (React + Framer Motion)
  → User confirms/edits/dismisses
  → Feedback stored → improves future suggestions
```
| Method | Path | Description |
|---|---|---|
| GET | `/api/uber/auth` | Check Uber session status |
| POST | `/api/uber/auth` | Launch Chrome for Uber login |
| POST | `/api/uber/import-trips` | Scrape and import ride history |
| GET | `/api/uber/status` | Uber integration overview |
| GET | `/api/swiggy/auth` | Check Swiggy session status |
| POST | `/api/swiggy/auth` | Launch Chrome for Swiggy login |
| POST | `/api/swiggy/import-orders` | Scrape and import order history |
| GET | `/api/swiggy/status` | Swiggy integration overview |
| Method | Path | Description |
|---|---|---|
| GET | `/api/suggestions/stream` | SSE stream — emits suggestion events |
| POST | `/api/confirm` | Confirm suggestion (4h cooldown) |
| POST | `/api/dismiss` | Dismiss suggestion (30min cooldown) |
| POST | `/api/edit` | Edit suggestion payload |
| GET | `/api/history/rides` | Recent ride history |
| GET | `/api/history/food` | Recent food order history |
| POST | `/api/debug/scenario` | Switch demo scenario |
| Method | Path | Description |
|---|---|---|
| GET | `/health` | Health check |
| POST | `/analyze` | Pattern scoring (confidence, match_count) |
| POST | `/reasoning` | LLM reasoning generation |
```
├── python/                        ← AI/ML service
│   ├── main.py                    # FastAPI app
│   ├── pattern_engine.py          # NumPy recency-weighted scorer
│   ├── reasoning.py               # Claude Haiku reasoning
│   ├── models.py                  # Pydantic schemas
│   └── requirements.txt
├── app/
│   ├── page.tsx                   # Dashboard
│   └── api/
│       ├── uber/                  # Auth, import-trips, status
│       ├── swiggy/                # Auth, import-orders, status
│       ├── suggestions/stream/    # SSE endpoint
│       ├── confirm/ dismiss/ edit/
│       ├── history/rides/ food/
│       └── debug/scenario/        # Demo time override
├── components/                    # React UI components
│   ├── SuggestionFeed.tsx         # SSE subscriber
│   ├── RideSuggestionCard.tsx     # Blue ride card
│   ├── FoodSuggestionCard.tsx     # Orange food card
│   ├── PlatformComparison.tsx     # Price/ETA comparison grid
│   ├── SuggestionReasoning.tsx    # "Why" explanation block
│   ├── RideEditDrawer.tsx         # Edit ride suggestion
│   ├── FoodEditDrawer.tsx         # Edit food suggestion
│   ├── ConfirmedState.tsx         # Animated checkmark
│   ├── DemoScenarioSwitcher.tsx   # Sidebar — platform connection + status
│   └── EmptyState.tsx             # Idle state (prompts to connect)
├── lib/
│   ├── scraper/                   # Playwright automation
│   │   ├── browserManager.ts      # Uber browser + session persistence
│   │   ├── swiggyAuth.ts          # Swiggy browser + session persistence
│   │   ├── uberTripScraper.ts     # Scrape ride history
│   │   ├── uberTripMapper.ts      # Map to DB schema
│   │   ├── uberPricingScraper.ts  # Live surge/ETA
│   │   ├── swiggyOrderScraper.ts  # Scrape order history
│   │   ├── swiggyOrderMapper.ts   # Map to DB schema
│   │   └── swiggyMenuScraper.ts   # Live menu/ETA/prices
│   ├── live/
│   │   ├── pricing.ts             # Unified ride pricing (live Uber + mock others)
│   │   ├── foodPricing.ts         # Unified food pricing (live Swiggy + mock Zomato)
│   │   ├── pricingMock.ts         # Fallback mock pricing
│   │   ├── pricingCache.ts        # Ride pricing cache (3 min TTL)
│   │   ├── foodPricingCache.ts    # Food pricing cache (5 min TTL)
│   │   └── trafficClient.ts       # ORS API + cache + mock fallback
│   ├── db/
│   │   ├── client.ts              # SQLite singleton (WAL mode)
│   │   ├── schema.ts              # Migrations (includes scraping columns)
│   │   └── queries/               # rides, food, suggestions, feedback
│   ├── ai/
│   │   └── mlServiceClient.ts     # HTTP client to Python service
│   ├── analysis/                  # TypeScript fallback analyzers
│   ├── triggers/
│   │   ├── triggerLoop.ts         # 30s interval
│   │   ├── rideTrigger.ts         # Ride evaluation (uses live pricing)
│   │   ├── foodTrigger.ts         # Food evaluation (uses live pricing)
│   │   ├── cooldownManager.ts
│   │   └── suggestionQueue.ts     # SSE pub/sub
│   ├── seed/                      # Persona patterns (used by analyzers)
│   └── utils/
├── server/startup.ts              # DB init → platform check → trigger loop
├── data/                          # SQLite DB + session files (gitignored)
└── instrumentation.ts             # Next.js hook → startup
```
- Real data only. No seeded/fake data. The app starts with an empty database — connect Uber/Swiggy to populate.
- Single process. SSE pub/sub and the trigger loop use in-memory state. Works with `npm run dev` / `next start`, not serverless.
- Chrome required. Playwright uses your installed Chrome (`channel: "chrome"`) to bypass bot detection on Swiggy/Uber.
- Sessions persist. Login cookies are stored in `data/` as JSON. Re-login is only needed when cookies expire.
- Scraping is fragile. Uber/Swiggy DOM changes can break scrapers. Selectors are isolated for easy updates, and live pricing falls back to estimates.
- No actual bookings. Confirm/edit actions are stored in the feedback table only.
- LLM is optional. Without `ANTHROPIC_API_KEY`, reasoning falls back to template strings.
- Traffic is mock by default. Real data requires an ORS key.