Releases · undreamai/LLMUnity
Release v3.0.1
28 Jan 14:26
🚀 Features
Add Unity.Nuget.Newtonsoft-Json to the assembly definition (PR: #379)
Update LlamaLib to v2.0.2 (llama.cpp b7777) (PR: #380)
🐛 Fixes
Fix running in the Editor with the Android/iOS platform selected (PR: #378)
Release v3.0.0
12 Jan 22:19
Rewrites the LLM backend, LlamaLib, as a standalone C++/C# library and adapts LLMUnity to it. A streaming usage sketch follows the feature list below.
🚀 Features
Implement LlamaLib as an object-oriented C++/C# library
Update llama.cpp to b7664
Fix the Vulkan GPU backend
Support Android 16KB page sizes
Fix iOS Xcode builds
Fix RAG functionality for iOS
Polish the samples
Optimise streaming and implement callbacks on the C++ side
Remove chat templates from LLMUnity and use the llama.cpp templating
Implement property checks
Common handling for both JSON and GBNF grammars
Simplify integration of tinyBLAS (lightweight GPU backend for Nvidia GPUs)
Move client/server functionality into LlamaLib
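From the Unity side, the streaming and callback work above surfaces through LLMCharacter's partial-reply callback. A minimal sketch, assuming the Chat signature from the public LLMUnity samples (query, partial-reply callback, completion callback):

```csharp
using UnityEngine;
using LLMUnity;

// Minimal usage sketch (not part of the release notes): streaming a completion
// through LLMCharacter. Partial replies are forwarded from LlamaLib's C++
// streaming callbacks; the Chat signature follows the public LLMUnity samples.
public class StreamingExample : MonoBehaviour
{
    public LLMCharacter llmCharacter;  // assign in the Inspector

    async void Start()
    {
        string reply = await llmCharacter.Chat(
            "Describe the weather on Mars in one sentence.",
            partial => Debug.Log(partial),         // called as the reply streams in
            () => Debug.Log("completion finished") // called once generation ends
        );
        Debug.Log($"Full reply: {reply}");
    }
}
```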
Release v2.5.2
29 May 18:31
🚀 Features
Support Android x86-64 architecture (Magic Leap 2) (PR: #344)
Combine ARM and Intel architectures of macOS (PR: #345)
Release v2.5.1
05 May 12:15
🚀 Features
Allow JSON schema grammars (PR: #333); see the sketch below
Add support for Qwen3 models (PR: #335)
Add support for BitNet models (PR: #334)
Upgrade LlamaLib to v1.2.5 (llama.cpp b5261) (PR: #335)
🐛 Fixes
Fix Unity Editor hanging after stopping a completion and restarting the scene (PR: #335)
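A hedged sketch of how a JSON schema grammar could constrain a reply. The schema itself is plain JSON Schema; how it is handed to LLMCharacter is an assumption here (`grammarJSON` is a placeholder name), so check the LLMCharacter fields or docs for the actual member added by PR #333:

```csharp
using UnityEngine;
using LLMUnity;

// Hedged sketch for the JSON schema grammar feature (PR #333). The schema below
// is plain JSON Schema; how it is passed to LLMCharacter is assumed here
// ("grammarJSON" is a placeholder member), so check the LLMCharacter docs.
public class JsonSchemaExample : MonoBehaviour
{
    public LLMCharacter llmCharacter;  // assign in the Inspector

    // Constrain replies to an object with a single string field "answer".
    const string schema = @"{
        ""type"": ""object"",
        ""properties"": { ""answer"": { ""type"": ""string"" } },
        ""required"": [""answer""]
    }";

    async void Start()
    {
        llmCharacter.grammarJSON = schema;  // placeholder member, see note above
        string reply = await llmCharacter.Chat("What is the capital of France?");
        Debug.Log(reply);  // expected to be JSON matching the schema
    }
}
```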
Release v2.5.0
28 Mar 14:55
🚀 Features
VisionOS support (PR: #299)
Add support for Gemma 3 and Phi 4 models (PR: #327)
Fix Android support for older devices (use ARMv8-A instead of ARMv8.4-A) (PR: #325)
Upgrade LlamaLib to v1.2.4 (llama.cpp b4969) (PR: #325)
Default number of predicted tokens (num_predict) to infinity (-1) (PR: #328)
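Since num_predict now defaults to -1 (unlimited), replies are bounded only by the context. A short sketch of capping reply length from code, assuming the LLMCharacter field is named `numPredict` (mirroring llama.cpp's num_predict):

```csharp
using UnityEngine;
using LLMUnity;

// Sketch: with num_predict defaulting to -1 (unlimited), replies are only
// bounded by the context. To cap reply length, set the LLMCharacter field
// (assumed here to be `numPredict`) before issuing a chat call.
public class PredictLimitExample : MonoBehaviour
{
    public LLMCharacter llmCharacter;

    async void Start()
    {
        llmCharacter.numPredict = 256;  // generate at most 256 tokens; -1 = unlimited
        Debug.Log(await llmCharacter.Chat("Summarise the plot of Hamlet."));
    }
}
```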
Release v2.4.2
19 Feb 13:07
🚀 Features
Integrate DeepSeek models (PR: #312)
Update LlamaLib to v1.2.3 (llama.cpp b4688) (PR: #312)
Drop CUDA 11.7.1 support (PR: #312)
Add a warm-up function for a provided prompt (PR: #301); see the sketch below
Add documentation in Unity tooltips (PR: #302)
🐛 Fixes
Fix code signing on iOS (PR: #298)
Persist debug mode and use of extras to the build (PR: #304)
Fix dependency resolution for the full CUDA and Vulkan architectures (PR: #313)
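A sketch of the warm-up flow from PR #301: process a prompt ahead of time so the first real reply starts with lower latency. `Warmup()` is an LLMCharacter method; the overload taking a prompt string is assumed from the PR description:

```csharp
using UnityEngine;
using LLMUnity;

// Sketch of the warm-up flow (PR #301): process a prompt ahead of time so the
// first real reply starts with lower latency. Warmup() exists on LLMCharacter;
// the overload taking a prompt string is assumed from the PR description.
public class WarmupExample : MonoBehaviour
{
    public LLMCharacter llmCharacter;

    async void Start()
    {
        await llmCharacter.Warmup("You are a terse NPC merchant.");  // assumed overload
        Debug.Log("Warm-up done; the first Chat() call should now respond faster.");
    }
}
```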
Release v2.4.1
18 Dec 11:27
🚀 Features
Static library linking on mobile (fixes iOS signing) (PR: #289)
🐛 Fixes
Fix support for extras (flash attention, iQ quants) (PR: #292)
Release v2.4.0
02 Dec 16:40
🚀 Features
iOS deployment (PR: #267)
Improve the building process (PR: #282)
Add a structured output / function calling sample (PR: #281); see the sketch below
Update LlamaLib to v1.2.0 (llama.cpp b4218) (PR: #283)
🐛 Fixes
Clear temp build directory before building (PR: #278)
📦 General
Remove support for extras (flash attention, iQ quants) (PR: #284)
Remove support for the LLM base prompt (PR: #285)
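A hedged illustration of the structured output / function calling pattern the new sample covers: ask for JSON, parse it, and dispatch on the result. This is not the sample's code; the `FunctionCall` type and prompt are invented for illustration:

```csharp
using UnityEngine;
using LLMUnity;

// Hedged illustration of the structured output / function calling pattern that
// the v2.4.0 sample covers (PR #281): ask for JSON, parse it, dispatch on it.
// This is not the sample's code; the FunctionCall type is invented for illustration.
public class FunctionCallingExample : MonoBehaviour
{
    public LLMCharacter llmCharacter;

    [System.Serializable]
    public class FunctionCall { public string name; public string argument; }

    async void Start()
    {
        string reply = await llmCharacter.Chat(
            "Reply ONLY with JSON of the form {\"name\": ..., \"argument\": ...}, " +
            "choosing between the functions 'open_door' and 'play_sound'.");
        FunctionCall call = JsonUtility.FromJson<FunctionCall>(reply);
        if (call != null && call.name == "open_door") Debug.Log($"Opening door: {call.argument}");
        else Debug.Log($"Unhandled function call: {reply}");
    }
}
```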
Release v2.3.0
04 Nov 13:37
🚀 Features
Implement Retrieval Augmented Generation (RAG) in LLMUnity (PR: #246); see the sketch below
🐛 Fixes
Fix build conflict and endless import of resources (PR: #266)
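A hedged sketch of the RAG workflow: index a few phrases, then retrieve the closest ones for a query. The `Add`/`Search` calls follow the LLMUnity README; treat the exact signatures as assumptions:

```csharp
using UnityEngine;
using LLMUnity;

// Hedged sketch of the RAG workflow introduced here (PR #246): index a few
// phrases, then retrieve the closest ones for a query. The Add/Search calls
// follow the LLMUnity README; treat exact signatures as assumptions.
public class RAGExample : MonoBehaviour
{
    public RAG rag;  // RAG component on a GameObject, configured in the Inspector

    async void Start()
    {
        await rag.Add("The dragon sleeps in the northern cave.");
        await rag.Add("The blacksmith sells swords and shields.");

        // retrieve the single closest phrase to the query
        (string[] phrases, float[] distances) = await rag.Search("Where can I buy a weapon?", 1);
        Debug.Log($"Best match: {phrases[0]} (distance {distances[0]})");
    }
}
```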
Release v2.2.4
26 Sep 16:08
🚀 Features
Add Phi-3.5 and Llama 3.2 models (PR: #255)
Speed up LLMCharacter warm-up (PR: #257)
🐛 Fixes
Fix handling of incomplete requests (PR: #251)
Fix Unity locking of DLLs during cross-platform builds (PR: #252)
Allow spaces in LoRA paths (PR: #254)
📦 General
Set the default context size to 8192 and allow adjusting it with a UI slider (PR: #258)
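The context size can also be set from code before the model loads; a small sketch, assuming the LLM field is named `contextSize` (mirroring llama.cpp's n_ctx):

```csharp
using UnityEngine;
using LLMUnity;

// Sketch: the context size now defaults to 8192 and is exposed as a slider on
// the LLM component; it can also be set from code before the model loads.
// `contextSize` is the assumed field name mirroring llama.cpp's n_ctx.
public class ContextSizeExample : MonoBehaviour
{
    public LLM llm;

    void Awake()
    {
        llm.contextSize = 4096;  // smaller context to save memory on mobile
    }
}
```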