A Nix flake for AUTOMATIC1111/stable-diffusion-webui providing reproducible deployment with automatic GPU detection and Docker support.
- Reproducible builds via Nix flake
- Automatic GPU detection (NVIDIA CUDA, Apple MPS, CPU fallback)
- Persistent data in `~/sd-webui/`
- Docker images (CPU and CUDA variants)
- Python 3.10 (official upstream requirement)
```bash
# Run Stable Diffusion WebUI
nix run github:utensils/stable-diffusion-webui-nix

# Run with network access (listen on 0.0.0.0)
nix run github:utensils/stable-diffusion-webui-nix -- --listen

# Run with API enabled
nix run github:utensils/stable-diffusion-webui-nix -- --api
```

```bash
# Clone the repository
git clone https://github.com/utensils/stable-diffusion-webui-nix.git
cd stable-diffusion-webui-nix

# Run directly
nix run

# Or build and run
nix build
./result/bin/sd-webui

# Enter a development shell
nix develop
```

```bash
# Basic usage (localhost only)
nix run
```
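Once the server is started with the API enabled, it exposes AUTOMATIC1111's REST API on the WebUI port (7860 by default). A minimal sketch of a text-to-image request against a local instance; the prompt and step values are illustrative:

```bash
# Query a locally running WebUI started with the API enabled (illustrative payload).
payload='{"prompt": "a lighthouse at dusk", "steps": 20}'
if curl -sf http://127.0.0.1:7860/sdapi/v1/options >/dev/null 2>&1; then
  # /sdapi/v1/txt2img returns base64-encoded images in a JSON body
  curl -s -X POST http://127.0.0.1:7860/sdapi/v1/txt2img \
       -H 'Content-Type: application/json' -d "$payload"
else
  echo "WebUI API not reachable on port 7860"
fi
```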
```bash
# Listen on all interfaces (for LAN access)
nix run .#listen

# Remote access with authentication (recommended for network access)
nix run .#remote                          # Default: admin:admin
SD_WEBUI_AUTH=user:pass nix run .#remote  # Custom credentials

# Remote access via Gradio public tunnel
nix run .#share

# Enable API
nix run .#api

# Custom port
nix run -- --port 7861

# Open browser automatically
nix run -- --open

# Debug mode
nix run -- --debug

# Combine options
nix run -- --listen --api --port 7861
```

```bash
# Build CPU image
nix run .#buildDocker

# Build CUDA image
nix run .#buildDockerCuda

# Run CPU container
docker run -p 7860:7860 -v "$PWD/data":/data sd-webui:latest

# Run CUDA container
docker run --gpus all -p 7860:7860 -v "$PWD/data":/data sd-webui:cuda
```

User data is stored in `~/sd-webui/`:
```
~/sd-webui/
├── app/                  # SD WebUI source code
├── venv/                 # Python virtual environment
├── models/               # Model storage
│   ├── Stable-diffusion/ # Main checkpoints
│   ├── Lora/
│   ├── VAE/
│   ├── hypernetworks/
│   ├── embeddings/
│   └── ...
├── outputs/              # Generated images
├── extensions/           # Installed extensions
└── configs/              # User configurations
```
Note: the data path must NOT contain any directory whose name starts with `.` (a dotfile). Gradio refuses to serve files from such paths, which surfaces as 403 errors. Set the `SD_WEBUI_USER_DIR` environment variable to relocate the data directory if needed.
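For example, to stage a downloaded checkpoint where the WebUI looks for it (a sketch; the `SD_DIR` variable and the file name in the comment are illustrative):

```bash
# Resolve the data root, honoring the SD_WEBUI_USER_DIR override.
SD_DIR="${SD_WEBUI_USER_DIR:-$HOME/sd-webui}"
mkdir -p "$SD_DIR/models/Stable-diffusion"
# Checkpoints dropped here are picked up by the WebUI's model picker, e.g.:
# cp ~/Downloads/some-model.safetensors "$SD_DIR/models/Stable-diffusion/"
ls "$SD_DIR/models"
```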
The flake automatically detects your GPU and installs the appropriate PyTorch version:

- NVIDIA GPU: Installs CUDA-enabled PyTorch (configurable via the `CUDA_VERSION` env var)
- Apple Silicon: Uses MPS acceleration
- No GPU: Falls back to CPU-only PyTorch

Override the CUDA version with:

```bash
CUDA_VERSION=cu121 nix run -- --listen
```

Supported values: `cu118`, `cu121`, `cu124`, `cpu`.
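The selection logic can be pictured roughly as follows; this is a sketch of the fallback order, not the flake's actual implementation, and the function name is illustrative:

```bash
# Rough sketch of backend selection: CUDA if an NVIDIA driver is visible,
# MPS on Apple Silicon, otherwise CPU-only PyTorch.
detect_backend() {
  if command -v nvidia-smi >/dev/null 2>&1; then
    echo cuda
  elif [ "$(uname -s)" = "Darwin" ] && [ "$(uname -m)" = "arm64" ]; then
    echo mps
  else
    echo cpu
  fi
}
detect_backend
```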
```bash
# Enter development shell
nix develop

# Run checks
nix flake check

# Format Nix files
nix fmt

# Lint Python code (ruff)
nix run .#lint

# Check for updates
nix run .#update
```

```bash
# Update SD WebUI source
nix flake update sd-webui-src

# Update all inputs
nix flake update
```

This repository uses GitHub Actions for continuous integration:
- Docker builds: Automatically builds and publishes CPU and CUDA Docker images to `ghcr.io/utensils/stable-diffusion-webui-nix`
- Claude Code Review: Automated PR reviews using Claude
- Claude Code: Responds to `@claude` mentions in issues and PRs

Docker images are tagged with both the version (e.g., `1.10.1`) and `latest`.
This Nix flake is licensed under the MIT License.
Note: Stable Diffusion WebUI itself is licensed under AGPL-3.0.
- AUTOMATIC1111/stable-diffusion-webui - The upstream project
- utensils - Nix packaging