chore: proxy LangChain questions, normalize TF answers, cleanup #15
base: save-chat-history
Conversation
- Proxy question generation to the new LangChain endpoint with a legacy fallback ().
- Normalize True/False answers in the frontend and use normalized pairs ().
- Remove unused variables / minor cleanup in chat UI.
- Keeps frontend API shape unchanged: returns { status: 'success', qa_pairs: [...] }.
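For reference, the response shape the description commits to could be typed roughly as follows. This is a sketch: only `status` and `qa_pairs` are named in the PR, and the `QAPair` fields are taken from the frontend snippets in this review; the guard function name is hypothetical.

```typescript
// Shapes assumed from the PR description: { status: 'success', qa_pairs: [...] }.
interface QAPair {
  question: string;
  answer: string; // 'True' / 'False' for T/F questions after normalization
}

interface QAResponse {
  status: 'success';
  qa_pairs: QAPair[];
}

// Minimal runtime check mirroring the declared shape (hypothetical helper).
function isQAResponse(v: any): v is QAResponse {
  return v?.status === 'success' && Array.isArray(v?.qa_pairs);
}
```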
Pull Request Overview
This PR updates the application to integrate with a new LangChain-based AI question generation service while maintaining backward compatibility. The changes enable handling of mixed MCQ/True-False question formats and implement graceful fallback to legacy endpoints.
- Proxy layer now calls LangChain endpoint with fallback to legacy service
- Frontend normalizes True/False answers and removes redundant code
- Improved error handling and response mapping for different AI server formats
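The fallback flow described above can be sketched as below. The fetcher is injected so the decision logic stays testable; the endpoint URLs come from this PR's summary, while the function names, the `Fetcher` type, and the omission of HTTP method details are assumptions, not the actual controller code in `server/controllers/graphController.ts`.

```typescript
// Hypothetical abstraction over the HTTP client used by the proxy.
type Fetcher = (url: string) => Promise<{
  ok: boolean;
  status: number;
  json: () => Promise<unknown>;
}>;

// Pure decision mirroring the status check reviewed in this PR:
// only 403/404 from the primary endpoint trigger the legacy fallback.
function shouldUseLegacyFallback(status: number | undefined): boolean {
  return status === 403 || status === 404;
}

// Sketch of the proxy flow: try the LangChain endpoint first, then
// fall back to the legacy service on port 5000 (method details omitted).
async function fetchQuestions(graphId: string, get: Fetcher): Promise<unknown> {
  const primary = await get(
    `http://localhost:8000/questions/${graphId}?use_langchain=true`
  );
  if (primary.ok) return primary.json();
  if (shouldUseLegacyFallback(primary.status)) {
    const legacy = await get('http://localhost:5000/generate');
    return legacy.json();
  }
  throw new Error(`Primary questions endpoint failed (status=${primary.status})`);
}
```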
Reviewed Changes
Copilot reviewed 2 out of 2 changed files in this pull request and generated 3 comments.
| File | Description |
|---|---|
| src/pages/Chat.tsx | Normalizes T/F answers to True/False, removes unused helper function, and updates question display logic |
| server/controllers/graphController.ts | Implements LangChain endpoint integration with legacy fallback and response format mapping |
```ts
// If the new AI server is inaccessible or returns 403, try legacy endpoint on port 5000 as a graceful fallback
const status = err?.response?.status;
console.error(`Primary questions endpoint failed (status=${status}). Attempting legacy fallback...`);
if (status === 403 || status === 404 || !response) {
```
Copilot AI (Oct 16, 2025):
The condition !response will always be false here since response is declared but never assigned when the try block fails. This condition should be removed as it's unreachable.
Suggested change:

```diff
- if (status === 403 || status === 404 || !response) {
+ if (status === 403 || status === 404) {
```
```ts
  const qaPair = qaData.find(qa => qa.question === question);
  return qaPair?.answer || 'No answer available';
};
// ...existing code... (no local helper required)
```
Copilot AI (Oct 16, 2025):
This placeholder comment should be removed as it doesn't provide any meaningful information and clutters the code.
Suggested change:

```diff
- // ...existing code... (no local helper required)
```
```ts
const normalizedPairs = qaResponseData.qa_pairs.map((p: QAPair) => ({
  question: p.question,
  answer: (p.answer === 'T' || p.answer === 'True') ? 'True' : (p.answer === 'F' || p.answer === 'False') ? 'False' : p.answer
}));
```
Copilot AI (Oct 16, 2025):
The normalization logic is duplicated across lines 286-289 and 709-712. Consider extracting this into a reusable function to avoid code duplication.
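One possible extraction along the lines Copilot suggests is sketched below. The helper names `normalizeTFAnswer` and `normalizeQAPairs` are hypothetical; the ternary logic is taken verbatim from the duplicated snippet in `src/pages/Chat.tsx`.

```typescript
interface QAPair {
  question: string;
  answer: string;
}

// Hypothetical shared helper: collapses the duplicated ternary into one place.
function normalizeTFAnswer(answer: string): string {
  if (answer === 'T' || answer === 'True') return 'True';
  if (answer === 'F' || answer === 'False') return 'False';
  return answer; // MCQ answers pass through unchanged
}

// Both call sites could then map through the same function.
function normalizeQAPairs(pairs: QAPair[]): QAPair[] {
  return pairs.map((p) => ({ question: p.question, answer: normalizeTFAnswer(p.answer) }));
}
```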
Summary [Planning to remove fallback to legacy before merging]
This PR updates the frontend and middleware to support the AI server’s new LangChain-based mixed MCQ/True/False generation.
It introduces improved error handling, fallback logic, and normalization for True/False responses while maintaining compatibility with legacy endpoints.
🚀 Motivation
With the AI server now generating richer question formats via LangChain, the frontend needed to:
- Work with the new `/generate-questions-with-answers` endpoint

🔄 Backend Proxy Updates (`routes/aiRoutes.ts`)
- Calls `http://localhost:8000/questions/{graph_id}?use_langchain=true`
- Falls back to the legacy service on port `5000` (`/generate`) when the primary endpoint fails
- Maps `{ questions: [...] }` → `{ status, qa_pairs }`
- Combines `text` and `correct_answer` into simple `{ question, answer }` pairs

💬 Frontend Updates (`src/pages/Chat.tsx`)
- Normalizes `'T'`/`'F'` to `'True'`/`'False'` for consistency.
- Uses `normalizedPairs` for rendering and quiz progression.
- Removes `getAnswerForQuestion()` since QA data is already normalized.
- Uses `normalizedPairs` for accurate total count and first question prompt.

✅ Result
- Remains compatible with legacy servers that return the `{ questions: [...] }` format.
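The `{ questions: [...] }` → `{ status, qa_pairs }` mapping in the summary could be sketched as below. The field names `text` and `correct_answer` come from the summary itself; the interface and function names are assumptions for illustration, not the actual code in `routes/aiRoutes.ts`.

```typescript
// Hypothetical shape of one question as returned by the AI server.
interface AIQuestion {
  text: string;
  correct_answer: string;
}

// Map the AI server's { questions: [...] } response into the
// { status, qa_pairs } shape the frontend expects.
function mapToQAPairs(aiResponse: { questions: AIQuestion[] }) {
  return {
    status: 'success' as const,
    qa_pairs: aiResponse.questions.map((q) => ({
      question: q.text,
      answer: q.correct_answer,
    })),
  };
}
```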