Hi, I searched but couldn't find an existing issue describing exactly this problem.
When running Ollama with Docker Compose -> llmfit detects Ollama.
When running llama.cpp with Docker Compose -> llmfit does not detect llama.cpp.
Both servers are on their default ports:
- Ollama -> 11434
- llama.cpp -> 8080
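
For reference, the llama.cpp service is defined roughly like this (a minimal sketch, not my exact file; the image tag, model path, and flags are illustrative):

```yaml
services:
  llama-cpp:
    # Hypothetical image tag; I use the upstream llama.cpp server image
    image: ghcr.io/ggerganov/llama.cpp:server
    ports:
      - "8080:8080"          # default llama.cpp server port
    volumes:
      - ./models:/models
    # Model filename is a placeholder; --host 0.0.0.0 so the port is reachable from outside the container
    command: -m /models/model.gguf --host 0.0.0.0 --port 8080
```

The server itself is reachable (e.g. its `/health` endpoint responds on port 8080), so this looks like a detection issue rather than a connectivity one.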
If you need anything else, just let me know.
Thanks!