Is your feature request related to a problem?
The current /chat interface redirects to /evaluations, limiting user interaction. Users need a full chat interface to select configurations, ask questions, and receive responses from the LLM while maintaining conversation history.
Describe the solution you'd like
- Implement a text input/output interface; consider image, audio, and PDF inputs for future versions.
- Support both single-turn and multi-turn chat with server-side conversation IDs.
- Enable webhook-based response delivery for asynchronous LLM API responses.
- Ensure consistent authentication gating; unauthenticated users can access the chat UI but must log in to send messages.
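The request shape and delivery flow described in the bullets above could be sketched as follows. This is a minimal illustration, not a fixed contract: every name here (`build_chat_request`, `callback_url`, `config_id`, the payload field names) is hypothetical and chosen only to show how single-turn vs. multi-turn chat and webhook delivery compose.

```python
import uuid

def build_chat_request(message, callback_url, conversation_id=None, config_id="default"):
    """Build an outbound chat request payload (illustrative field names).

    Omitting conversation_id starts a new conversation; the server mints an
    ID so later turns can reference it (multi-turn). callback_url is where
    the LLM API delivers its response asynchronously via webhook.
    """
    return {
        "conversation_id": conversation_id or str(uuid.uuid4()),
        "config_id": config_id,  # which LLM configuration the user selected
        # Text-only in v1; the typed message shape leaves room to add
        # image/audio/pdf inputs later without reworking the request.
        "message": {"type": "text", "content": message},
        "callback_url": callback_url,
    }

def handle_webhook(payload, history):
    """Append an asynchronously delivered LLM response to the server-side
    history for its conversation, and return the conversation ID."""
    cid = payload["conversation_id"]
    history.setdefault(cid, []).append(payload["response"])
    return cid
```

A follow-up turn would reuse the `conversation_id` from the first request, and the webhook handler keeps per-conversation history server-side, which is what lets the UI show prior turns after an asynchronous delivery.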
Original issue
Is your feature request related to a problem? Please describe.
Replace the root /chat redirect-to-/evaluations with a full chat interface. Users select a configuration, type a question, and get a response back from the configured LLM. Conversation history is kept within the session.
Describe the solution you'd like
- Text input / text output. Image, audio, and PDF inputs are out of scope for v1; the request shape is built so they can be added without rework.
- Single-turn and multi-turn (server-side conversation IDs) chat.
- Webhook-based response delivery (LLM API delivers responses asynchronously via callback).
- Auth gating consistent with the rest of the app: unauthenticated users see the chat UI but are routed through the existing login modal when they try to send.