Replaces batch AI responses with real-time SSE streaming from OpenRouter.
Tokens are buffered client-side and drained via requestAnimationFrame for
a smooth typing effect instead of choppy chunk dumps.
Backend:
- Rewrite openrouter service for SSE streaming with incremental tool call accumulation
- Add AiStreamChunk/AiStreamEnd WebSocket event types
- Stream content deltas to clients during all tool call rounds
- Increase broadcast channel capacity (256 -> 4096) and handle Lagged errors gracefully
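The incremental accumulation above can be sketched roughly as follows. This is an illustrative sketch, not the actual service code: OpenRouter's SSE stream delivers tool calls as partial deltas keyed by index, with the function name and JSON arguments arriving in fragments; the type and function names (ToolCallDelta, accumulate) are assumptions.

```typescript
// Hedged sketch: accumulate partial tool-call deltas from an SSE stream.
// Field names are illustrative, not the actual wire format.
interface ToolCallDelta {
  index: number;      // which tool call this fragment belongs to
  name?: string;      // fragment of the function name, if any
  arguments?: string; // fragment of the JSON arguments, if any
}

interface ToolCall {
  name: string;
  arguments: string; // JSON built up fragment by fragment
}

function accumulate(calls: ToolCall[], delta: ToolCallDelta): void {
  // Deltas arrive keyed by index; grow the array on first sight.
  while (calls.length <= delta.index) {
    calls.push({ name: "", arguments: "" });
  }
  const call = calls[delta.index];
  if (delta.name) call.name += delta.name;
  if (delta.arguments) call.arguments += delta.arguments;
}
```

Once the stream ends, each call's `arguments` string parses as complete JSON and the round of tool calls can be executed.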
Frontend:
- Add StreamBuffer utility with adaptive rAF-based character draining
- Show streaming message bubble with blinking cursor during generation
- Clean up buffer on room switch and final message replacement
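The StreamBuffer idea can be sketched as below, assuming a class and method names (push, drainFrame, reset) that are illustrative rather than the actual utility: tokens append to a pending string, and each requestAnimationFrame tick drains a few characters, more when the backlog grows, so the display stays smooth yet catches up.

```typescript
// Hedged sketch: adaptive per-frame character draining for a typing effect.
class StreamBuffer {
  private pending = ""; // received but not yet displayed
  private shown = "";   // displayed so far

  push(token: string): void {
    this.pending += token;
  }

  // Intended to be called once per requestAnimationFrame tick.
  drainFrame(): string {
    // Adaptive rate: drain roughly 1/32 of the backlog, at least one
    // character, so a backed-up buffer catches up quickly (divisor assumed).
    const n = Math.max(1, Math.ceil(this.pending.length / 32));
    this.shown += this.pending.slice(0, n);
    this.pending = this.pending.slice(n);
    return this.shown;
  }

  reset(): void {
    // Drop buffered text on room switch or final message replacement.
    this.pending = "";
    this.shown = "";
  }
}
```

Decoupling network chunk arrival from display via the frame loop is what turns bursty SSE chunks into a steady typing cadence.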
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Full-stack real-time group chat with Rust/Axum backend and Riot.js frontend.
Features:
- Auth (register/login/JWT), rooms, invites, WebSocket messaging
- AI responses via OpenRouter with tool calling (Brave Search + web fetch)
- Real-time tool usage indicators (searching/reading page)
- Collapsible tool results in message bubbles
- AI stats bar (model, tokens, speed, response time) persisted to DB
- Room soft-delete, /clear command, dynamic model fetching
- Markdown rendering with code highlighting and copy buttons
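The real-time features above imply a tagged event union the frontend can switch on; a minimal sketch, with variant and field names that are assumptions rather than the actual wire format:

```typescript
// Hedged sketch of a server-to-client WebSocket event union.
// Variant names, fields, and status values are illustrative.
type ServerEvent =
  | { type: "message"; roomId: string; userId: string; content: string }
  | { type: "toolUsage"; roomId: string; status: "searching" | "reading" }
  | { type: "aiStats"; roomId: string; model: string; tokens: number; tokensPerSec: number };

// Example consumer: map an event to the indicator text shown in the UI.
function describe(ev: ServerEvent): string {
  switch (ev.type) {
    case "message":
      return ev.content;
    case "toolUsage":
      return ev.status === "searching" ? "Searching..." : "Reading page...";
    case "aiStats":
      return `${ev.model}: ${ev.tokens} tok @ ${ev.tokensPerSec.toFixed(1)} tok/s`;
  }
}
```

A discriminated union like this lets the compiler check that every event kind is handled when new variants are added.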
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>