feat(ai-proxy): feat(agent): add AI integration with addAI customization method #1367
base: main
Conversation
Add support for AI/LLM integration in the agent package.
- New addAI customization method for configuring the AI provider
- AI proxy route handler at /_internal/ai-proxy/:route
- Integration with the MCP server configuration service
- Schema flag for AI-enabled projects
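As a rough illustration of the new entry point (not part of the diff), the sketch below wires addAI into an agent. createAgent and its authSecret/envSecret/isProduction options come from @forestadmin/agent; the addAI option names (provider, model, apiKey) are assumptions based on the configuration types described later in this PR.

```typescript
// Hedged sketch: enabling the AI integration on an agent.
// The addAI option names are assumptions, not the final API.
import { createAgent } from '@forestadmin/agent';

const agent = createAgent({
  authSecret: process.env.FOREST_AUTH_SECRET as string,
  envSecret: process.env.FOREST_ENV_SECRET as string,
  isProduction: false,
}).addAI({
  provider: 'mistral',
  model: 'mistral-small-latest',
  apiKey: process.env.MISTRAL_API_KEY as string,
});
```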
- Update @forestadmin/ai-proxy to 1.0.0 in agent and forestadmin-client
- Add a resolution in the root package.json to force 1.0.0
- Allow a string type for model to support custom models or new versions
- Add MistralConfiguration type with provider, model, and apiKey
- Add MistralUnprocessableError for Mistral-specific errors
- Implement the Mistral dispatcher using @langchain/mistralai
- Return the LangChain AIMessage format directly for frontend compatibility
- Add comprehensive tests for the Mistral provider
- Add MistralConfiguration type with provider, model, and apiKey
- Use the @mistralai/mistralai SDK for Mistral calls
- Convert Mistral responses to the OpenAI format for frontend compatibility
- The frontend always uses ChatOpenAI; the backend handles provider abstraction
- Add comprehensive tests for both providers
- Replace the native Mistral SDK with @langchain/mistralai
- Add @langchain/openai for the OpenAI provider
- Both providers return an AIMessage, converted to the OpenAI format
- Simplifies adding new providers in the future
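A minimal sketch of that provider split, assuming a buildChatModel-style helper (the helper name and config shape are illustrative); ChatMistralAI and ChatOpenAI are the chat model classes exported by @langchain/mistralai and @langchain/openai.

```typescript
// Illustrative provider dispatch; buildChatModel and AIConfig are not the PR's actual API.
import type { BaseChatModel } from '@langchain/core/language_models/chat_models';
import { ChatMistralAI } from '@langchain/mistralai';
import { ChatOpenAI } from '@langchain/openai';

type AIConfig = { provider: 'openai' | 'mistral'; model: string; apiKey: string };

function buildChatModel(config: AIConfig): BaseChatModel {
  // Both chat models return an AIMessage from invoke(), which can then be
  // converted to the OpenAI response format expected by the frontend.
  if (config.provider === 'mistral') {
    return new ChatMistralAI({ model: config.model, apiKey: config.apiKey });
  }

  return new ChatOpenAI({ model: config.model, apiKey: config.apiKey });
}
```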
- Fix the incorrect provider in the example (openai → mistral)
- Add error handling in the handleAiProxy route (convert AI errors to UnprocessableError)
- Add validation for unsupported providers and missing API keys
- Narrow the catch block scope and preserve error context with the cause property
- Pin LangChain dependency versions for consistency
- Add Mistral model autocomplete with a WithAutocomplete utility type
- Add tests for the new validation errors and error handling
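The WithAutocomplete utility mentioned above is presumably the usual TypeScript trick for keeping literal-string suggestions while still accepting arbitrary strings; a sketch, with illustrative KnownMistralModels members:

```typescript
// `string & {}` keeps the literal members visible to editor autocomplete
// while still allowing any string (custom or newly released models).
type WithAutocomplete<T extends string> = T | (string & {});

// Illustrative members only; the PR maintains the real list.
type KnownMistralModels = 'mistral-large-latest' | 'mistral-small-latest' | 'open-mistral-nemo';

type MistralModel = WithAutocomplete<KnownMistralModels>;

const model: MistralModel = 'mistral-large-2411'; // unknown model names still type-check
```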
Update the JSDoc to explain that AI requests are forwarded to the agent and processed locally, so for privacy they are not accessible by Forest Admin.
- Add a detailed description of the privacy benefits
- Document the supported providers (openai, mistral)
- Add @param documentation for the configuration properties
- Add @returns and @throws annotations
- Include examples for both the OpenAI and Mistral providers
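For reference, the two documented configurations presumably look roughly like this; the model names are illustrative and the shape follows the provider/model/apiKey fields described above.

```typescript
// Illustrative shapes of the two provider configurations documented in the JSDoc.
const openAiConfig = {
  provider: 'openai' as const,
  model: 'gpt-4o-mini', // illustrative model name
  apiKey: process.env.OPENAI_API_KEY as string,
};

const mistralConfig = {
  provider: 'mistral' as const,
  model: 'mistral-small-latest', // illustrative model name
  apiKey: process.env.MISTRAL_API_KEY as string,
};
```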
Code review
Found 2 issues, both in agent-nodejs/packages/ai-proxy/src/router.ts, lines 63 to 67 (commit 2130789).
Suggested fix for both issues:

```typescript
if (args.route === 'invoke-remote-tool') {
  // Validate the query parameter and body before invoking the remote tool.
  if (!args.query?.['tool-name']) {
    throw new AIUnprocessableError('Missing required query parameter: tool-name');
  }
  if (!args.body || !('inputs' in args.body)) {
    throw new AIUnprocessableError('Missing or invalid body for invoke-remote-tool');
  }

  return await remoteTools.invokeTool(
    args.query['tool-name'],
    (args.body as InvokeRemoteToolBody).inputs,
  );
}
```

🤖 Generated with Claude Code
- Replace the custom LangChainClient type with BaseChatModel from @langchain/core
- Update KnownMistralModels with the latest model names from the Mistral docs
- Add the magistral (reasoning), pixtral, and open-mistral-nemo models
- Fix the AIError test to verify the UnprocessableError type conversion
- Add validation tests for whitespace-only and undefined apiKey
The Mistral API uses "any" instead of "required" to force tool use. Convert OpenAI's tool_choice: "required" to "any" for Mistral.
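A one-line sketch of that mapping (the function name is illustrative):

```typescript
// Map OpenAI's "required" tool_choice to Mistral's "any"; other values pass through.
function toMistralToolChoice(toolChoice: string): string {
  return toolChoice === 'required' ? 'any' : toolChoice;
}
```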
MCP tools use Zod schemas internally. Convert them to JSON Schema using toJsonSchema() before passing them to the LLM, matching what toolDefinitionsForFrontend does.
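As an illustration of the conversion (the commit refers to a toJsonSchema() helper; this sketch approximates the idea with the zod-to-json-schema package, and the tool itself is invented):

```typescript
// Convert a Zod tool input schema to JSON Schema before handing the tool to the LLM.
import { z } from 'zod';
import { zodToJsonSchema } from 'zod-to-json-schema';

const inputSchema = z.object({
  collection: z.string().describe('Name of the collection to query'),
});

const toolForLlm = {
  type: 'function' as const,
  function: {
    name: 'list_records', // illustrative tool name
    description: 'List records from a collection',
    parameters: zodToJsonSchema(inputSchema),
  },
};
```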
Pass tools and tool_choice directly to invoke() instead of using bindTools(). This ensures tool_choice is properly passed to the LLM API, fixing an issue where Mistral would return empty responses despite tool_choice: "any".
Reverted to using bindTools() since invoke() options don't support a tools parameter. Added debug logging for Mistral to trace tool calls.
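A self-contained sketch of the bindTools() flow these commits converge on, assuming the Mistral chat model from @langchain/mistralai; the tool definition and model name are illustrative.

```typescript
// Bind tools (and tool_choice) to the model, then invoke it.
import { HumanMessage } from '@langchain/core/messages';
import { ChatMistralAI } from '@langchain/mistralai';

const chatModel = new ChatMistralAI({
  model: 'mistral-small-latest',
  apiKey: process.env.MISTRAL_API_KEY as string,
});

const tools = [
  {
    type: 'function' as const,
    function: {
      name: 'list_records', // illustrative tool
      // Mistral rejects tools with an empty description (see the next commit).
      description: 'List records from a collection',
      parameters: {
        type: 'object',
        properties: { collection: { type: 'string' } },
        required: ['collection'],
      },
    },
  },
];

// tool_choice: 'any' is Mistral's equivalent of OpenAI's 'required'.
const response = await chatModel
  .bindTools(tools, { tool_choice: 'any' })
  .invoke([new HumanMessage('Which collections can you list?')]);

console.log(response.tool_calls);
```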
Mistral requires tools to have a non-empty description. When MCP servers provide tools without descriptions, we now add a fallback description.
Add logging before/after the API call and in the catch block to understand where the request is failing with Mistral.
- Remove the $schema field from parameters (Mistral doesn't support it)
- Ensure all tools have non-empty descriptions
Add a timeout to detect whether the Mistral API is hanging.
- Convert tool_choice "required" to "any" for Mistral
- Ensure all tools have non-empty descriptions
- Remove the $schema field from tool parameters
- Clean up debug logging
Remove the unnecessary Mistral-specific cleanup (description fallback, $schema removal); tested, and it works without it.
Auto code review
No issues found. Checked for bugs and CLAUDE.md compliance.
🤖 Generated with Claude Code

Summary
- Uses @langchain/mistralai for Mistral integration

Changes
- MistralConfiguration type with provider: 'mistral'
- MistralUnprocessableError for Mistral-specific errors
- dispatchMistral method in ProviderDispatcher

Usage
Test plan
🤖 Generated with Claude Code