NullClaw Railway Template

🚀 Deploy AI agents with multiple LLM providers and communication channels in minutes

A Docker-based deployment template for NullClaw - the AI agent framework that works with Claude, GPT, Llama, and more, accessible via Telegram, Discord, Slack, CLI, and other channels.

✨ Features

  • Multi-Provider Support: OpenRouter, Anthropic, OpenAI, Groq, xAI, DeepSeek, Mistral, Ollama
  • Multiple Channels: Telegram, Discord, Slack, IRC, CLI, and more
  • One-Command Deploy: Deploy to Railway.app or run locally with Docker
  • Environment-Driven: Configure everything via environment variables
  • Production-Ready: Health checks, auto-restart, and audit logging
  • Secure: Built-in sandboxing and workspace isolation

🚀 Quick Start

Prerequisites

  • Docker and Docker Compose
  • At least one LLM provider API key (OpenRouter recommended)

1. Clone and Configure

git clone https://github.com/your-username/nullclaw-railway-template.git
cd nullclaw-railway-template

# Copy environment template
cp .env.example .env

# Edit with your API keys
nano .env

2. Set API Key

# In .env file - at least one provider required
OPENROUTER_API_KEY=sk-or-v1-xxxxx
# or
ANTHROPIC_API_KEY=sk-ant-xxxxx
# or
OPENAI_API_KEY=sk-xxxxx

3. Run

# Build and start gateway service
docker compose up -d nullclaw

# Check logs
docker compose logs -f nullclaw

# Test health endpoint
curl http://localhost:3000/health

Your AI agent is now running at http://localhost:3000! 🎉

📖 Usage

Gateway Mode (HTTP API)

# Start gateway on port 3000
docker compose up -d nullclaw

# Access the API
curl http://localhost:3000/health

Agent Mode (CLI)

# Start interactive CLI agent
docker compose --profile agent up -d nullclaw-agent

# View agent logs
docker compose logs -f nullclaw-agent

With Telegram Bot

# Add to .env
TELEGRAM_BOT_TOKEN=123456:ABCDEF
TELEGRAM_ALLOW_FROM=123456789  # Your Telegram user ID

# Restart service
docker compose restart nullclaw

⚙️ Configuration

Required Environment Variables

# At least one provider API key
OPENROUTER_API_KEY=     # Recommended - supports 100+ models
ANTHROPIC_API_KEY=      # Claude models
OPENAI_API_KEY=         # GPT models
GROQ_API_KEY=           # Fast inference

Optional Configuration

# Model Selection
DEFAULT_MODEL=openrouter/anthropic/claude-sonnet-4

# Port (Railway convention)
PORT=3000

# Host binding (optional, defaults to 0.0.0.0)
# GATEWAY_HOST=0.0.0.0

# Autonomy & Security
AUTONOMY_LEVEL=supervised        # supervised, semi_autonomous, full
WORKSPACE_ONLY=true
MAX_ACTIONS_PER_HOUR=20
SANDBOX_BACKEND=auto             # auto, landlock, firejail, bubblewrap, docker, none
AUDIT_ENABLED=true

# Memory Backend
MEMORY_BACKEND=sqlite            # sqlite, markdown

Communication Channels

Telegram

TELEGRAM_BOT_TOKEN=123456:ABCDEF
TELEGRAM_ALLOW_FROM=123456789    # Comma-separated user IDs or "*"

Discord

DISCORD_TOKEN=your-bot-token
DISCORD_GUILD_ID=123456789
DISCORD_ALLOW_FROM=123456789

Slack

SLACK_BOT_TOKEN=xoxb-xxxxx
SLACK_APP_TOKEN=xapp-xxxxx
SLACK_ALLOW_FROM=U123456

See GUIDE_ADD_NEW_CHANNELS.md for detailed instructions.

🏗️ Supported Providers

| Provider | API Key Variable | Popular Models |
|-----------|--------------------|----------------|
| OpenRouter | OPENROUTER_API_KEY | claude-sonnet-4, gpt-4o, llama-3.3-70b |
| Anthropic | ANTHROPIC_API_KEY | claude-3-5-sonnet, claude-3-opus |
| OpenAI | OPENAI_API_KEY | gpt-4o, gpt-4-turbo, o1-preview |
| Groq | GROQ_API_KEY | llama-3.3-70b-versatile, mixtral-8x7b |
| xAI | XAI_API_KEY | grok-beta |
| DeepSeek | DEEPSEEK_API_KEY | deepseek-chat, deepseek-coder |
| Mistral | MISTRAL_API_KEY | mistral-large-latest, codestral-latest |
| Ollama | OLLAMA_API_KEY | (base URL: http://localhost:11434) |
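For local models via Ollama, a minimal `.env` fragment might look like the following. This is an illustrative sketch only: the exact model naming scheme depends on NullClaw's provider configuration and on which models you have pulled locally.

```shell
# Hypothetical local-Ollama setup (values illustrative)
OLLAMA_API_KEY=ollama            # placeholder; a local Ollama needs no real key
DEFAULT_MODEL=ollama/llama3.3    # model name depends on what you've pulled
# Ollama is reached at its default base URL, http://localhost:11434
```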

🚢 Deployment

Railway.app (Recommended)

  1. Push this template to GitHub
  2. Connect repository to Railway
  3. Set environment variables in Railway dashboard
  4. Deploy! Railway will use railway.json configuration

Configuration:

  • Builder: Dockerfile
  • Replicas: 1
  • Restart: ON_FAILURE (max 10 retries)
  • Health check: GET /health every 30s
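Under those settings, the `railway.json` in this template might look roughly like the sketch below. Field names follow Railway's config-as-code schema, but treat the exact values here as illustrative rather than a copy of the template's actual file:

```json
{
  "$schema": "https://railway.app/railway.schema.json",
  "build": {
    "builder": "DOCKERFILE"
  },
  "deploy": {
    "numReplicas": 1,
    "restartPolicyType": "ON_FAILURE",
    "restartPolicyMaxRetries": 10,
    "healthcheckPath": "/health"
  }
}
```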

Manual Docker

# Build image
docker compose build nullclaw

# Run with environment file
docker compose up -d nullclaw

# Or with inline environment variables
docker run -d \
  -p 3000:3000 \
  -e OPENROUTER_API_KEY=sk-or-v1-xxxxx \
  -e DEFAULT_MODEL=openrouter/anthropic/claude-sonnet-4 \
  nullclaw-railway:latest

🔧 Development

Project Structure

nullclaw-railway-template/
├── Dockerfile                 # Multi-stage build (Zig compilation)
├── docker-compose.yml         # Service orchestration
├── entrypoint.sh              # Config generation script
├── config.template.json       # Configuration template
├── .env.example               # Environment variable reference
├── railway.json               # Railway.app deployment config
├── GUIDE_ADD_NEW_CHANNELS.md  # Channel extension guide
├── AGENTS.md                  # AI agent guidelines
└── README.md                  # This file

How It Works

  1. Build Time: Dockerfile clones NullClaw from GitHub and compiles with Zig
  2. Runtime: entrypoint.sh generates config.json from environment variables
  3. Startup: NullClaw binary starts with generated configuration
  4. Health Check: HTTP endpoint at /health monitors service status
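The runtime step above (environment variables in, `config.json` out) can be sketched as a few lines of shell. This is an illustrative sketch, not the template's actual `entrypoint.sh`; the config keys and output path are assumptions:

```shell
#!/bin/sh
# Illustrative sketch only -- not the template's real entrypoint.sh.
# Render a minimal config.json from environment variables, with defaults
# matching the values documented in .env.example.
DEFAULT_MODEL="${DEFAULT_MODEL:-openrouter/anthropic/claude-sonnet-4}"
PORT="${PORT:-3000}"
GATEWAY_HOST="${GATEWAY_HOST:-0.0.0.0}"

cat > /tmp/config.json <<EOF
{
  "model": "${DEFAULT_MODEL}",
  "gateway": { "host": "${GATEWAY_HOST}", "port": ${PORT} }
}
EOF

cat /tmp/config.json
```

The real script does more (channels, sandboxing, memory backend), but the pattern is the same: every setting has an environment variable, and unset variables fall back to sane defaults.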

Adding New Channels

  1. Check NullClaw's config.example.json
  2. Add environment variables to .env.example
  3. Update build_channels() in entrypoint.sh
  4. Test with docker compose up -d nullclaw

See GUIDE_ADD_NEW_CHANNELS.md for detailed examples.
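To illustrate step 3 above, a `build_channels()`-style helper could assemble a JSON array from whichever channel tokens are set. The function body and JSON shape here are hypothetical (the real structure is defined by NullClaw's `config.example.json` and the template's `entrypoint.sh`):

```shell
#!/bin/sh
# Hypothetical sketch of a build_channels()-style helper; names and JSON
# shape are illustrative, not copied from the template.
TELEGRAM_BOT_TOKEN="123456:ABCDEF"    # example value for demonstration

build_channels() {
  channels=""
  if [ -n "${TELEGRAM_BOT_TOKEN:-}" ]; then
    channels="{\"type\":\"telegram\",\"token\":\"${TELEGRAM_BOT_TOKEN}\"}"
  fi
  if [ -n "${DISCORD_TOKEN:-}" ]; then
    # Separate entries with a comma when another channel is already present.
    if [ -n "$channels" ]; then channels="${channels},"; fi
    channels="${channels}{\"type\":\"discord\",\"token\":\"${DISCORD_TOKEN}\"}"
  fi
  printf '[%s]\n' "$channels"
}

build_channels > /tmp/channels.json
cat /tmp/channels.json
```

With only the Telegram token set, this emits a one-element array; adding `DISCORD_TOKEN` to the environment would append a second entry.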

🐛 Troubleshooting

Container won't start

# Check logs
docker compose logs nullclaw

# Verify environment variables
docker compose exec nullclaw env | grep API_KEY

# Check generated config
docker compose exec nullclaw cat /nullclaw-data/.nullclaw/config.json

Health check failing

# Test endpoint manually
curl -v http://localhost:3000/health

# Check if port is in use
lsof -i :3000

Configuration not applied

# Rebuild container
docker compose down
docker compose build nullclaw
docker compose up -d nullclaw

📚 Resources

🤝 Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

  1. Fork the repository
  2. Create your feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

📝 License

This project is open source and available under the MIT License.

🙏 Acknowledgments

  • NullClaw - The AI agent framework
  • Railway.app - Simplified deployment platform
  • All the amazing LLM providers making AI accessible

Made with ❤️ for the AI community

⬆ Back to Top