I've got 4 GPUs, but the output only shows 1 of them. With llm-checker I'm getting this:
llm-checker hw-detect
██╗ ██╗ █████╗ ██████╗ ██████╗ ██╗ ██╗ █████╗ ██████╗ ███████╗
██║ ██║██╔══██╗██╔══██╗██╔══██╗██║ ██║██╔══██╗██╔══██╗██╔════╝
███████║███████║██████╔╝██║ ██║██║ █╗ ██║███████║██████╔╝█████╗
██╔══██║██╔══██║██╔══██╗██║ ██║██║███╗██║██╔══██║██╔══██╗██╔══╝
██║ ██║██║ ██║██║ ██║██████╔╝╚███╔███╔╝██║ ██║██║ ██║███████╗
╚═╝ ╚═╝╚═╝ ╚═╝╚═╝ ╚═╝╚═════╝ ╚══╝╚══╝ ╚═╝ ╚═╝╚═╝ ╚═╝╚══════╝
DETECTION
✔ Hardware detected!
=== Hardware Detection ===
Summary:
4x NVIDIA GeForce RTX 4060 Ti (64GB VRAM) + Intel(R) Core(TM) i9-10980XE CPU @ 3.00GHz
Tier: HIGH
Max model size: 62GB
Best backend: cuda
CPU:
Intel(R) Core(TM) i9-10980XE CPU @ 3.00GHz
Cores: 36 (18 physical)
SIMD: AVX512
[OK] AVX-512
[OK] AVX2
CUDA:
Driver: 550.163.01
CUDA: 12.4
Total VRAM: 64GB
NVIDIA GeForce RTX 4060 Ti: 16GB
NVIDIA GeForce RTX 4060 Ti: 16GB
NVIDIA GeForce RTX 4060 Ti: 16GB
NVIDIA GeForce RTX 4060 Ti: 16GB
Fingerprint: cuda--rtx-4060-ti-64gb-x4
So llm-checker sees all four cards, but when I run the benchmark, rigrank only reports one:

./rigrank run --model llama3.2 --debug
RigRank Benchmark
⣻ Gathering Telemetry...
✓ System Telemetry Clean
• CPU: Intel(R) Core(TM) i9-10980XE CPU @ 3.00GHz
• RAM: 257416 MB
• GPU: AD106 [GeForce RTX 4060 Ti]
✓ Ollama Connected

Why does rigrank only show one GPU when llm-checker detects all four? Is there a way to make it use them all?