# Stable Diffusion WebUI Nix

A Nix flake for AUTOMATIC1111/stable-diffusion-webui providing reproducible deployment with automatic GPU detection and Docker support.

## Features

- Reproducible builds via Nix flake
- Automatic GPU detection (NVIDIA CUDA, Apple MPS, CPU fallback)
- Persistent user data in `~/sd-webui/`
- Docker images (CPU and CUDA variants)
- Python 3.10 (the official upstream requirement)

## Quick Start

```bash
# Run Stable Diffusion WebUI
nix run github:utensils/stable-diffusion-webui-nix

# Run with network access (listen on 0.0.0.0)
nix run github:utensils/stable-diffusion-webui-nix -- --listen

# Run with API enabled
nix run github:utensils/stable-diffusion-webui-nix -- --api
```

## Installation

### Using Flakes (Recommended)

```bash
# Clone the repository
git clone https://github.com/utensils/stable-diffusion-webui-nix.git
cd stable-diffusion-webui-nix

# Run directly
nix run

# Or build and run
nix build
./result/bin/sd-webui
```

### Development Shell

```bash
nix develop
```

## Usage

### Command Line Options

```bash
# Basic usage (localhost only)
nix run

# Listen on all interfaces (for LAN access)
nix run .#listen

# Remote access with authentication (recommended for network access)
nix run .#remote                           # Default: admin:admin
SD_WEBUI_AUTH=user:pass nix run .#remote   # Custom credentials

# Remote access via Gradio public tunnel
nix run .#share

# Enable API
nix run .#api

# Custom port
nix run -- --port 7861

# Open browser automatically
nix run -- --open

# Debug mode
nix run -- --debug

# Combine options
nix run -- --listen --api --port 7861
```

### Docker

```bash
# Build CPU image
nix run .#buildDocker

# Build CUDA image
nix run .#buildDockerCuda

# Run CPU container
docker run -p 7860:7860 -v "$PWD/data:/data" sd-webui:latest

# Run CUDA container
docker run --gpus all -p 7860:7860 -v "$PWD/data:/data" sd-webui:cuda
```

## Directory Structure

User data is stored in `~/sd-webui/`:

```
~/sd-webui/
├── app/                    # SD WebUI source code
├── venv/                   # Python virtual environment
├── models/                 # Model storage
│   ├── Stable-diffusion/   # Main checkpoints
│   ├── Lora/
│   ├── VAE/
│   ├── hypernetworks/
│   ├── embeddings/
│   └── ...
├── outputs/                # Generated images
├── extensions/             # Installed extensions
└── configs/                # User configurations
```

**Note:** The path must NOT contain any component starting with `.` (a dotfile directory). Gradio refuses to serve files from such paths, which results in 403 errors. Override the location with the `SD_WEBUI_USER_DIR` environment variable if needed.
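To make the dotfile restriction concrete, a candidate path can be screened for dot-prefixed components before being used. This is only an illustrative sketch; `check_user_dir` is a hypothetical helper, not a check the flake is known to perform:

```bash
# Reject any path containing a component that starts with ".",
# since Gradio refuses to serve files from such paths (HTTP 403).
check_user_dir() {
  case "/$1/" in
    */.*) echo "rejected: $1" ;;
    *)    echo "ok: $1" ;;
  esac
}

check_user_dir /srv/sd-webui       # prints "ok: /srv/sd-webui"
check_user_dir /home/me/.cache/sd  # prints "rejected: /home/me/.cache/sd"
```

A path that passes can then be exported as `SD_WEBUI_USER_DIR` before running `nix run`.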

## GPU Support

The flake automatically detects your GPU and installs the appropriate PyTorch version:

- **NVIDIA GPU:** Installs CUDA-enabled PyTorch (configurable via the `CUDA_VERSION` environment variable)
- **Apple Silicon:** Uses MPS acceleration
- **No GPU:** Falls back to CPU-only PyTorch
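The fallback chain above can be sketched as a small shell function. This is a hedged approximation of the launcher's logic (the flake may probe hardware differently); it only illustrates the NVIDIA → Apple Silicon → CPU ordering:

```bash
# Pick a PyTorch backend following the documented detection order.
detect_backend() {
  if command -v nvidia-smi >/dev/null 2>&1 && nvidia-smi >/dev/null 2>&1; then
    echo cuda   # an NVIDIA GPU is visible to the driver
  elif [ "$(uname -s)" = Darwin ] && [ "$(uname -m)" = arm64 ]; then
    echo mps    # Apple Silicon
  else
    echo cpu    # no supported GPU found
  fi
}

detect_backend
```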

### CUDA Version

Override the CUDA version with:

```bash
CUDA_VERSION=cu121 nix run -- --listen
```

Supported values: `cu118`, `cu121`, `cu124`, `cpu`
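These values match the suffixes of PyTorch's standard wheel index URLs. As a hedged sketch (the flake's exact mapping is not shown here), `cuda_to_index_url` is a hypothetical helper translating a `CUDA_VERSION` value into the corresponding pip `--index-url`:

```bash
# Map a CUDA_VERSION value to PyTorch's wheel index URL.
cuda_to_index_url() {
  case "$1" in
    cpu|cu118|cu121|cu124) echo "https://download.pytorch.org/whl/$1" ;;
    *) echo "unsupported CUDA_VERSION: $1" >&2; return 1 ;;
  esac
}

cuda_to_index_url cu121  # prints "https://download.pytorch.org/whl/cu121"
```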

## Development

```bash
# Enter development shell
nix develop

# Run checks
nix flake check

# Format Nix files
nix fmt

# Lint Python code (ruff)
nix run .#lint

# Check for updates
nix run .#update
```

## Updating

```bash
# Update SD WebUI source
nix flake update sd-webui-src

# Update all inputs
nix flake update
```

## CI/CD

This repository uses GitHub Actions for continuous integration:

- **Docker builds:** Automatically builds and publishes CPU and CUDA Docker images to `ghcr.io/utensils/stable-diffusion-webui-nix`
- **Claude Code Review:** Automated PR reviews using Claude
- **Claude Code:** Responds to `@claude` mentions in issues and PRs

Docker images are tagged with both the release version (e.g., `1.10.1`) and `latest`.
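A full image reference combines the registry path above with one of those tags. `image_ref` is a hypothetical helper shown only to make the tag scheme concrete:

```bash
# Build the fully qualified image reference for a given tag.
image_ref() { echo "ghcr.io/utensils/stable-diffusion-webui-nix:$1"; }

image_ref latest   # prints "ghcr.io/utensils/stable-diffusion-webui-nix:latest"
image_ref 1.10.1   # prints "ghcr.io/utensils/stable-diffusion-webui-nix:1.10.1"

# Then, for example: docker pull "$(image_ref latest)"
```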

## License

This Nix flake is licensed under the MIT License.

**Note:** Stable Diffusion WebUI itself is licensed under AGPL-3.0.
