AI code reviews that understand your entire codebase, not just the diff.
Verix is an open-source GitHub bot that reviews pull requests using an agentic AI system. Instead of blindly reviewing a diff, it explores your dependency graph to understand how changes affect the rest of your code — then posts inline suggestions directly on the PR.
- You open a pull request
- Verix reads the diff and uses the dependency graph to find related files
- An AI agent explores your codebase through tool calls — fetching imports, checking dependents, reading source files
- It posts inline review comments with actionable suggestions
```
PR changes auth.ts
  → Agent checks: what does auth.ts import? (helpers.ts)
  → Agent checks: what depends on auth.ts? (middleware.ts)
  → Agent reads helpers.ts, sees sanitizeInput() only strips HTML, not SQL
  → Posts: "Critical: SQL injection — sanitizeInput doesn't handle SQL escaping"
```
- Agentic review — AI decides what context it needs, not a fixed crawl
- Dependency-aware — understands how files connect via imports (TypeScript, JavaScript, Python, Go, Rust, Java, Ruby)
- Inline suggestions — posts directly on PR lines with apply-ready fixes
- Pluggable models — Gemini, Claude, OpenAI, or local via Ollama
- Re-review on demand — comment `/verix review` on any PR to trigger a fresh review
- Configurable — drop a `VERIX.md` in your repo to set team review rules
- BYOK — bring your own API key, encrypted at rest
- Self-hostable — Docker image, bring your own database and model
There are two ways to use Verix:
- Go to verix.in and sign in with GitHub
- Install the Verix GitHub App on your repos
- Go to Settings — paste your API key and set your model provider
- Optionally configure review rules
- Open a PR — Verix reviews it automatically
No setup, no servers, no Docker. Your API key is encrypted at rest.
Run Verix on your own infrastructure. Full control over your data.
Each self-hosted instance needs its own GitHub App — this is how GitHub authenticates webhooks and API access for your repos.
Go to github.com/settings/apps/new and create an app with:
Permissions:
- Repository → Contents: Read
- Repository → Pull requests: Read & Write
- Repository → Issues: Read
- Repository → Metadata: Read (mandatory)
Subscribe to events:
- Pull request
- Push
- Issue comment
- Installation
Download the private key (.pem file).
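GitHub signs every webhook delivery with the secret you set here. As an illustration of what that check involves (standard GitHub webhook HMAC verification, not necessarily Verix's exact code — `verifySignature` is a hypothetical name):

```typescript
import { createHmac, timingSafeEqual } from "node:crypto";

// GitHub sends an X-Hub-Signature-256 header: "sha256=" followed by the
// hex HMAC-SHA256 of the raw request body, keyed with the webhook secret.
function verifySignature(secret: string, body: string, signature: string): boolean {
  const expected = "sha256=" + createHmac("sha256", secret).update(body).digest("hex");
  const a = Buffer.from(expected);
  const b = Buffer.from(signature);
  // timingSafeEqual throws on length mismatch, so compare lengths first.
  return a.length === b.length && timingSafeEqual(a, b);
}
```

A server should reject the request (e.g. with 401) when this returns false, before touching the payload.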
```
git clone https://github.com/lgsurith/Verix.git
cd Verix
cp .env.example .env
```

Edit `.env` with your GitHub App credentials and model provider:

```
GITHUB_APP_ID=your-app-id
GITHUB_WEBHOOK_SECRET=your-webhook-secret
GITHUB_PRIVATE_KEY_PATH=./private-key.pem
DATABASE_URL=postgresql://user:pass@host/dbname?sslmode=require
MODEL_PROVIDER=gemini
GEMINI_API_KEY=your-key
```

```
docker compose up -d
```

Or for local development:

```
pnpm install
pnpm dev
```

Install your GitHub App on your repos. Verix will automatically review new PRs.
Drop a `VERIX.md` in your repo root to set custom review rules:
```markdown
# Review Guidelines

We use NestJS with TypeORM.
Always check for N+1 query patterns.
Never use `any` type.
All endpoints must have auth guards.
Don't flag console.log — we use a custom logger that wraps it.
```

Verix reads this on every PR and follows your team's conventions. It falls back to `CLAUDE.md`, `AGENTS.md`, or `.cursorrules` if no `VERIX.md` is found.
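The fallback behaves like a first-match lookup over the repo root. A minimal sketch, assuming only the priority order stated above (`loadReviewRules` is an illustrative name, not Verix's actual API):

```typescript
import { existsSync, readFileSync } from "node:fs";
import { join } from "node:path";

// Review-rules files in priority order: the first one found wins.
const RULE_FILES = ["VERIX.md", "CLAUDE.md", "AGENTS.md", ".cursorrules"];

function loadReviewRules(repoRoot: string): string | null {
  for (const name of RULE_FILES) {
    const path = join(repoRoot, name);
    if (existsSync(path)) return readFileSync(path, "utf8");
  }
  return null; // no rules file: default review behavior
}
```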
You can also add a `.verix.yml` for structured options:

```yaml
# Model provider override
model: gemini

# Files to ignore during review
ignore:
  - "*.test.ts"
  - "*.spec.ts"
  - "dist/**"

# Minimum severity to report (critical, high, medium, low)
min_severity: medium

# Primary language hint
language: typescript
```

| Provider | Config | Notes |
|---|---|---|
| Gemini | `MODEL_PROVIDER=gemini` + `GEMINI_API_KEY` | Free tier available |
| Claude | `MODEL_PROVIDER=claude` + `ANTHROPIC_API_KEY` | |
| OpenAI | `MODEL_PROVIDER=openai` + `OPENAI_API_KEY` | |
| Ollama | `MODEL_PROVIDER=ollama` + `OLLAMA_URL` | Free, local, no API key |
All providers support agentic mode with tool calling.
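The `min_severity` option from `.verix.yml` above amounts to a rank comparison over the four levels. An illustrative sketch (the `Finding` type and function name are hypothetical, not Verix's internals):

```typescript
type Severity = "critical" | "high" | "medium" | "low";

interface Finding {
  severity: Severity;
  message: string;
}

// Lower rank = higher priority, matching the order in the config comment.
const RANK: Record<Severity, number> = { critical: 0, high: 1, medium: 2, low: 3 };

// Keep only findings at or above the configured minimum severity.
function filterBySeverity(findings: Finding[], min: Severity): Finding[] {
  return findings.filter((f) => RANK[f.severity] <= RANK[min]);
}
```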
- Create a free database at neon.tech
- Set `DATABASE_URL` in `.env`
- Run `docker compose up -d`
Uncomment the Postgres and Ollama services in `docker-compose.yml`:

```yaml
services:
  verix:
    build: .
    # ...
  postgres:
    image: postgres:17-alpine
    # ...
  ollama:
    image: ollama/ollama
    # ...
```

Set in `.env`:

```
DATABASE_URL=postgresql://verix:verix@postgres:5432/verix
MODEL_PROVIDER=ollama
OLLAMA_URL=http://ollama:11434
```

Zero external dependencies. Everything runs on your infra.
```
GitHub webhook → Verix server → Agent loop
                                    ↓
                            ┌───────┴───────┐
                            ↓               ↓
                    Dep graph (Neon)    AI model
                            ↓               ↓
                     Related files     Review JSON
                            ↓               ↓
                            └───────┬───────┘
                                    ↓
                          PR inline comments
```
The agent loop:

- Receives the diff
- Calls `get_imports` / `get_dependents` to query the dependency graph
- Calls `get_file_content` to read related files via the GitHub API
- Calls `submit_review` when it has enough context
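The loop above can be sketched as a tool-dispatch cycle. The tool names come from this README; the model interface and plumbing are simplified stand-ins, not Verix's actual code:

```typescript
type ToolCall = { name: string; args: Record<string, string> };
type Tools = Record<string, (args: Record<string, string>) => string>;

interface Model {
  // Given the transcript so far, return the next tool call to make.
  next(transcript: string[]): ToolCall;
}

function runAgentLoop(model: Model, tools: Tools, diff: string, maxSteps = 10): string {
  const transcript = [`diff:\n${diff}`];
  for (let step = 0; step < maxSteps; step++) {
    const call = model.next(transcript);
    if (call.name === "submit_review") return call.args.review; // enough context gathered
    const result = tools[call.name](call.args); // e.g. get_imports, get_file_content
    transcript.push(`${call.name} -> ${result}`);
  }
  return "review aborted: step budget exhausted"; // guardrail against runaway loops
}
```

The step budget is one example of the guardrails mentioned for `agent/loop.ts`: without it, a model that never calls `submit_review` would loop forever.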
```
src/
├── index.ts          Server + webhook handlers
├── github.ts         GitHub API helpers
├── review.ts         One-shot review (fallback)
├── config.ts         VERIX.md + .verix.yml loader
├── types.ts          Shared types
├── adapters/
│   ├── base.ts       Adapter interface + factory
│   ├── gemini.ts     Gemini (function calling)
│   ├── claude.ts     Claude (tool use)
│   ├── openai.ts     OpenAI (tool calling)
│   └── ollama.ts     Ollama (local models)
├── agent/
│   ├── tools.ts      Tool definitions + executor
│   └── loop.ts       Agent loop with guardrails
├── crypto.ts         AES-256-GCM encryption for BYOK keys
├── db/
│   └── index.ts      Neon/Postgres schema + queries
└── indexer/
    ├── depgraph.ts   Import parser + graph builder
    └── crawler.ts    BFS context crawler
```
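For reference, `crypto.ts` uses AES-256-GCM for BYOK keys. A generic sketch of that scheme, under the assumption of a random 12-byte IV and a base64 `iv + tag + ciphertext` layout (the actual key derivation and storage format in Verix may differ):

```typescript
import { createCipheriv, createDecipheriv, randomBytes } from "node:crypto";

// Encrypt with AES-256-GCM; pack iv, auth tag, and ciphertext into one blob.
function encrypt(key: Buffer, plaintext: string): string {
  const iv = randomBytes(12);
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const ct = Buffer.concat([cipher.update(plaintext, "utf8"), cipher.final()]);
  return Buffer.concat([iv, cipher.getAuthTag(), ct]).toString("base64");
}

function decrypt(key: Buffer, blob: string): string {
  const buf = Buffer.from(blob, "base64");
  const iv = buf.subarray(0, 12);
  const tag = buf.subarray(12, 28); // GCM auth tag is 16 bytes
  const ct = buf.subarray(28);
  const decipher = createDecipheriv("aes-256-gcm", key, iv);
  decipher.setAuthTag(tag); // decryption fails if the blob was tampered with
  return Buffer.concat([decipher.update(ct), decipher.final()]).toString("utf8");
}
```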
MIT
