
Infinite retry when LLM not working #177

@CliffHan

Description


Describe the bug

Although MAX_RETRY_ATTEMPTS is defined in the code, the app keeps retrying indefinitely when the LLM returns a bad response, for example in:

getConversationTitleFromLocalModel

To Reproduce

Steps to reproduce the behavior:

  1. Set an incorrect LLM URL
  2. Send something to the misconfigured LLM
  3. Watch it retry without ever stopping

Expected behavior

Retry up to MAX_RETRY_ATTEMPTS times, then stop retrying.

Additional context


I checked the code and noticed something, e.g. in getConversationTitleFromLocalModel():

https://github.com/fingerthief/minimal-chat/blob/main/src/libs/api-access/open-ai-api-standard-access.js#L205

Every time this function retries, retryTimes is reset, and I'm not sure whether retryCounters.title is ever updated in the handleRetry() function.
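A minimal sketch of the failure mode described above. The real implementation differs; only the names MAX_RETRY_ATTEMPTS and retryCounters.title come from the repository, and the helper functions here (fetchTitle, getTitleBuggy, getTitleFixed) are hypothetical stand-ins:

```javascript
// Assumed constants mirroring the names used in the issue.
const MAX_RETRY_ATTEMPTS = 3;
const retryCounters = { title: 0 };

let attempts = 0; // instrumentation to observe how many calls actually happen

// Simulates a misconfigured LLM endpoint that always fails.
async function fetchTitle(url) {
  attempts += 1;
  throw new Error("bad LLM url: " + url);
}

// Buggy pattern: the counter is reset on every (re-)entry, so the
// MAX_RETRY_ATTEMPTS check can never trip and the function retries forever.
async function getTitleBuggy(url) {
  retryCounters.title = 0; // the reset defeats the cap
  try {
    return await fetchTitle(url);
  } catch (e) {
    retryCounters.title += 1; // always 1 at this point
    if (retryCounters.title < MAX_RETRY_ATTEMPTS) return getTitleBuggy(url);
    throw e;
  }
}

// Fixed pattern: thread the attempt count through the retry instead of
// relying on shared state that gets reset on each call.
async function getTitleFixed(url, attempt = 0) {
  try {
    return await fetchTitle(url);
  } catch (e) {
    if (attempt + 1 < MAX_RETRY_ATTEMPTS) return getTitleFixed(url, attempt + 1);
    throw e; // gives up after MAX_RETRY_ATTEMPTS tries
  }
}
```

With the fixed pattern the function fails after exactly MAX_RETRY_ATTEMPTS calls; with the buggy one, the counter never reaches the cap.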

Metadata

Assignees: no one assigned
Labels: bug (Something isn't working)
Projects: none
Milestone: none
Development: no branches or pull requests