Merged
- Removed outdated headless routes from `next.config.mjs` to streamline navigation.
- Added new documentation for headless features, including `useCopilotEvent` and `useMessageMeta`, to enhance user understanding.
- Introduced client-side and server-side skills documentation, detailing registration and usage.
- Created new `headless` and `skills` sections in the documentation, including examples and best practices for implementation.
- Deleted the `skills.mdx` file as part of the restructuring to improve clarity and organization.
- Modified the `PATH` variable in the pre-commit script to include the user's pnpm directory so the hook works in non-interactive shells.
All four legacy adapters (OpenAI, Azure, Google, Ollama) were yielding `action:call` events but never emitting `action:end`, so the runtime could not tell when a tool call had finished and never triggered tool execution.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
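The shape of the fix can be sketched as follows. The event names come from the commit message, but the adapter signature and helper below are illustrative, not the SDK's actual code: the key point is that a streaming adapter must emit a closing event for every tool call it opened, including the last one when the stream ends.

```typescript
// Illustrative event types; the real SDK's event shapes may differ.
type StreamEvent =
  | { type: "action:call"; name: string; args: string }
  | { type: "action:end"; name: string };

// Adapts a stream of tool-call argument deltas into runtime events.
async function* adaptToolCallStream(
  chunks: { name: string; argsDelta: string }[]
): AsyncGenerator<StreamEvent> {
  let current: { name: string; args: string } | null = null;
  for (const chunk of chunks) {
    if (!current || current.name !== chunk.name) {
      // Close the previous call before starting a new one.
      if (current) yield { type: "action:end", name: current.name };
      current = { name: chunk.name, args: "" };
    }
    current.args += chunk.argsDelta;
    yield { type: "action:call", name: current.name, args: current.args };
  }
  // The piece the legacy adapters were missing: close the final call
  // when the provider's stream ends, so the runtime can execute it.
  if (current) yield { type: "action:end", name: current.name };
}
```

Without that final `yield`, the runtime sees an open call forever, which matches the observed bug: execution was never triggered.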
Add native Together AI provider using OpenAI-compatible API with streaming, tool calling, vision, and JSON mode support. Includes full docs page, Express+Vite demo app, and comprehensive test suite. Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
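Because Together AI's endpoint is OpenAI-compatible, the provider can reuse the OpenAI wire format and swap only the base URL and model catalog. A minimal sketch of that request shape; the helper and option names here are hypothetical, though the `/v1/chat/completions` path and `response_format` switch follow Together's documented OpenAI compatibility:

```typescript
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Builds the request a Together AI provider would send. Hypothetical
// helper for illustration; not the SDK's real API surface.
function buildTogetherRequest(
  model: string,
  messages: ChatMessage[],
  opts: { stream?: boolean; jsonMode?: boolean } = {}
) {
  return {
    // Together exposes an OpenAI-compatible chat-completions endpoint.
    url: "https://api.together.xyz/v1/chat/completions",
    body: {
      model,
      messages,
      stream: opts.stream ?? true,
      // JSON mode uses the same response_format switch as OpenAI.
      ...(opts.jsonMode ? { response_format: { type: "json_object" as const } } : {}),
    },
  };
}
```

Tool calling and vision ride on the same compatibility: the OpenAI `tools` and image-content message shapes pass through unchanged, which is what makes a native provider cheap to build.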
… beta release notes Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Vercel uses npm, which doesn't support pnpm's `workspace:` protocol, causing `EUNSUPPORTEDPROTOCOL` errors during install.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
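The failure mode is mechanical: npm has no notion of the `workspace:` protocol, so any `workspace:*` range must be rewritten to a concrete version before an npm install can succeed (pnpm does this rewrite automatically on `pnpm publish`). A hypothetical sketch of the rewrite, for illustration only:

```typescript
// Rewrites pnpm "workspace:" dependency ranges to concrete versions,
// the way pnpm does when packing/publishing. Illustrative sketch.
function rewriteWorkspaceDeps(
  deps: Record<string, string>,
  localVersions: Record<string, string>
): Record<string, string> {
  const out: Record<string, string> = {};
  for (const [name, range] of Object.entries(deps)) {
    if (!range.startsWith("workspace:")) {
      out[name] = range; // normal ranges pass through untouched
      continue;
    }
    const spec = range.slice("workspace:".length); // "*", "^", "~", or a full range
    const version = localVersions[name];
    if (!version) throw new Error(`unknown workspace package: ${name}`);
    if (spec === "*") out[name] = version;            // workspace:*  -> 1.2.3
    else if (spec === "^" || spec === "~") out[name] = spec + version; // workspace:^ -> ^1.2.3
    else out[name] = spec;                            // workspace:^1.0.0 -> ^1.0.0
  }
  return out;
}
```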
…tProvider

- Add `createTogetherAI()` legacy factory (returns `AIProvider` for `createRuntime`)
- Rebuild demo from Express+Vite to a Next.js app using the proper copilot-sdk pattern: `createRuntime` + `runtime.handleRequest` on the backend, `CopilotProvider` + `CopilotChat` on the frontend
- Sidebar with model selector, setup guide, and links
- 11 verified models across DeepSeek, Llama, Qwen, and more

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
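The backend half of that pattern can be sketched as below. The `createRuntime`/`handleRequest` names mirror the commit message, but these signatures are assumptions for illustration, not the real copilot-sdk API: the runtime wraps a provider, and a single HTTP handler forwards chat requests to it.

```typescript
// Illustrative provider interface; the real AIProvider is richer.
interface AIProvider {
  complete(prompt: string): Promise<string>;
}

// Sketch of the createRuntime + handleRequest pattern (assumed shapes).
function createRuntime(provider: AIProvider) {
  return {
    async handleRequest(req: { prompt: string }): Promise<{ text: string }> {
      // The real runtime streams events; this sketch returns one reply.
      return { text: await provider.complete(req.prompt) };
    },
  };
}
```

In the Next.js demo this handler would sit behind a route handler, while the frontend wraps the UI in `CopilotProvider` and renders `CopilotChat` pointed at that route.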
Rohitjoshi9023 approved these changes on Apr 9, 2026.
What's in this PR
Bug Fix
Missing `action:end` events in streaming tool calls: all four legacy adapters (OpenAI, Azure, Google, Ollama) were yielding `action:call` events but never emitting `action:end`, preventing the runtime from triggering tool execution during streaming.

New Feature: Together AI Provider
- `@yourgpt/llm-sdk/togetherai` provider: native Together AI support using the OpenAI-compatible API with streaming, tool calling, vision, and JSON mode
- Demo (`examples/togetherai-demo/`): Express + Vite demo with model selector dropdown and Copilot Chat UI

Docs Cleanup
Testing
`pnpm build` passes for llm-sdk with the new provider