Replies: 1 comment
Check this: "However, for a model that does not support FIM completion at all (which is the case for most chat models, say, GPT-OSS), it is not possible to use the …"
I'm only starting to learn all this LLM stuff, so a lot of it is not obvious to me and quite confusing; please forgive me!

I just discovered the qwen3-coder:30b model on Ollama, and it seems to work quite well for me with CodeCompanion.nvim for chat, much better than other models I've tried so far (including some qwen2.5 ones). Unfortunately, when I tried to use it with minuet-ai.nvim, I got a message akin to: "qwen3 does not support insert". From some attempts at searching and reading the minuet-ai docs, I gather it may simply not be possible to use it this way; for example, the template at https://ollama.com/library/qwen3-coder:latest/blobs/c6a614465b37 does not seem to contain any "<|fim_prefix|>" markers at all.

Do I understand this correctly? Is it completely impossible to use qwen3 with minuet-ai for code completion, or is there still some way?
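For anyone else confused by the "<|fim_prefix|>" markers mentioned above: a FIM-capable model (such as the qwen2.5-coder family) is prompted with special fill-in-the-middle tokens that its template defines, which is why a template with no such markers suggests the model was not trained for FIM. A minimal sketch of how such a prompt gets assembled (marker names taken from the qwen2.5-coder convention; the helper name here is made up for illustration):

```python
def build_fim_prompt(prefix: str, suffix: str) -> str:
    # Qwen2.5-Coder-style FIM markers: the model is asked to generate
    # the "middle" that fits between the prefix and suffix.
    # qwen3-coder's template defines no such tokens, so this scheme
    # does not apply to it.
    return f"<|fim_prefix|>{prefix}<|fim_suffix|>{suffix}<|fim_middle|>"

# The text before and after the cursor become prefix and suffix:
prompt = build_fim_prompt(
    "def add(a, b):\n    return ",
    "\n\nprint(add(1, 2))",
)
```

A chat-only model never saw prompts in this shape during training, which is presumably why a plugin that relies on FIM endpoints refuses it.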