Releases · undreamai/LLMUnity
Release v1.2.8
27 May 11:28
🚀 Features
Switch to llamafile v0.8.6 (PR: #155)
Add phi-3 support (PR: #156)
Release v1.2.7
19 Apr 17:04
🚀 Features
Add Llama 3 and Vicuna chat templates (PR: #145)
📦 General
Use the context size of the model by default for longer history (PR: #147)
Release v1.2.6
01 Apr 08:17
🚀 Features
Add documentation (PR: #135)
🐛 Fixes
Add server security against interception by external llamafile servers (PR: #132)
Adapt server security for macOS (PR: #137)
📦 General
Add sample to demonstrate the async functionality (PR: #136)
Release v1.2.5
23 Mar 09:24
🐛 Fixes
Add to chat history only if the response is not null (PR: #123)
Allow SetTemplate function in Runtime (PR: #129)
Release v1.2.4
13 Mar 18:17
🚀 Features
Use llamafile v0.6.2 (PR: #111)
Pure text completion functionality (PR: #115)
Allow change of roles after starting the interaction (PR: #120)
🐛 Fixes
Use Debug.LogError instead of Exception for more verbosity (PR: #113)
Trim chat responses (PR: #118)
Fallback to CPU for macOS with unsupported GPU (PR: #119)
Remove duplicate EditorGUI.EndChangeCheck() (PR: #110)
📦 General
Provide access to LLMUnity version (PR: #117)
Rename to "LLM for Unity" (PR: #121)
Release v1.2.3
09 Mar 10:43
🐛 Fixes
Fix async server 2 (PR: #108)
Release v1.2.2
07 Mar 20:01
🐛 Fixes
Use namespaces in all classes (PR: #104)
Await separately in StartServer (PR: #107)
Release v1.2.1
07 Mar 12:38
🐛 Fixes
Kill server after Unity crash (PR: #101)
Persist chat template on remote servers (PR: #103)
Release v1.2.0
29 Feb 19:39
🚀 Features
LLM server unit tests (PR: #90)
Implement chat templates (PR: #92)
Stop chat functionality (PR: #95)
Keep only the llamafile binary (PR: #97)
🐛 Fixes
Fix remote server functionality (PR: #96)
Fix Mac issue requiring llamafile to be run manually the first time (PR: #98)
📦 General
Async startup support (PR: #89)
Release v1.1.1
19 Feb 13:44
📦 General
Refactoring and small enhancements (PR: #80)