File: content/manuals/ai/compose/models-and-compose.md
Issue
The "Platform portability" section mentions "cloud providers that support Compose models" and "compatible cloud providers" but doesn't specify which cloud providers actually support this feature or link to any documentation about cloud provider support.
From the file:
Prerequisites
- Docker Compose v2.38 or later
- A platform that supports Compose models such as Docker Model Runner (DMR) or compatible cloud providers.
And later:
Cloud providers
The same Compose file can run on cloud providers that support Compose models:
```yaml
services:
  chat-app:
    image: my-chat-app
    models:
      - llm

models:
  llm:
    model: ai/smollm2
    # Cloud-specific configurations
    x-cloud-options:
      - "cloud.instance-type=gpu-small"
      - "cloud.region=us-west-2"
```
Cloud providers might:
- Use managed AI services instead of running models locally
- Apply cloud-specific optimizations and scaling
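For contrast, the local case the page's prerequisites point to needs none of the `x-cloud-options` extension fields. A minimal sketch, assuming Docker Model Runner is enabled as the local provider:

```yaml
# Hypothetical local sketch: the same service and model definition,
# with the cloud-specific x-cloud-options extension omitted.
# Assumes Docker Model Runner (DMR) is enabled in Docker Desktop.
services:
  chat-app:
    image: my-chat-app
    models:
      - llm

models:
  llm:
    model: ai/smollm2  # pulled and served locally by DMR
```

This makes the documentation gap concrete: the local path is fully specified, while the cloud path names no provider that accepts the same file.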
Suggested fix
Either:
- Specify which cloud providers support Compose models (e.g., "AWS ECS", "Azure Container Instances")
- Link to documentation listing compatible cloud providers
- If no cloud providers currently support this, clarify that this is a future capability or remove the cloud provider examples
Readers trying to use this feature on cloud platforms need to know which platforms actually support it.
Found by nightly documentation freshness scanner