Jason Tudisco 4a002c85d4 feat: add streaming AI responses with smooth token-by-token rendering
Replaces batch AI responses with real-time SSE streaming from OpenRouter.
Tokens are buffered client-side and drained via requestAnimationFrame for
a smooth typing effect instead of choppy chunk dumps.

Backend:
- Rewrite openrouter service for SSE streaming with incremental tool call accumulation
- Add AiStreamChunk/AiStreamEnd WebSocket event types
- Stream content deltas to clients during all tool call rounds
- Increase broadcast channel capacity (256 -> 4096) and handle Lagged errors gracefully
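The "incremental tool call accumulation" above refers to the OpenAI-style streaming shape that OpenRouter emits: each SSE delta may carry partial tool calls keyed by `index`, with `function.arguments` arriving as string fragments that must be concatenated across chunks. The real service is Rust; this is a hedged JavaScript sketch of the same merge logic (function and field names per the OpenAI streaming format, the helper name is invented):

```javascript
// Accumulate partial tool calls from successive streaming deltas.
// Each delta entry looks like: { index, id?, function: { name?, arguments? } }.
// `arguments` fragments are concatenated; `id` is set once when it appears.
function accumulateToolCalls(acc, deltaToolCalls) {
  for (const part of deltaToolCalls ?? []) {
    const slot = (acc[part.index] ??= { id: '', name: '', arguments: '' });
    if (part.id) slot.id = part.id;
    if (part.function?.name) slot.name += part.function.name;
    if (part.function?.arguments) slot.arguments += part.function.arguments;
  }
  return acc;
}
```

Calling this once per SSE chunk leaves `acc` holding complete tool calls when the stream's finish reason arrives, at which point the arguments strings can be JSON-parsed and dispatched.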

Frontend:
- Add StreamBuffer utility with adaptive rAF-based character draining
- Show streaming message bubble with blinking cursor during generation
- Clean up buffer on room switch and final message replacement
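The frontend items above can be sketched as follows. This is a minimal, hypothetical version of the StreamBuffer idea (all names are assumptions, not the repo's actual API): queued text is drained a few characters per animation frame, and the per-frame count scales with backlog size so bursts catch up without dumping a whole chunk at once. The scheduler is injectable so the drain logic can be tested outside a browser:

```javascript
// Drains buffered stream text character-by-character on animation frames.
class StreamBuffer {
  constructor(onText, schedule = (fn) => requestAnimationFrame(fn)) {
    this.pending = '';
    this.onText = onText;     // receives each drained slice of text
    this.schedule = schedule; // rAF by default; injectable for tests
    this.running = false;
  }
  push(text) {
    this.pending += text;
    if (!this.running) {
      this.running = true;
      this.schedule(() => this.drain());
    }
  }
  // Adaptive rate: roughly 1/30th of the backlog per frame, at least one
  // character, so large bursts clear quickly but a trickle still "types".
  charsPerFrame() {
    return Math.max(1, Math.ceil(this.pending.length / 30));
  }
  drain() {
    const n = this.charsPerFrame();
    this.onText(this.pending.slice(0, n));
    this.pending = this.pending.slice(n);
    if (this.pending.length > 0) {
      this.schedule(() => this.drain());
    } else {
      this.running = false;
    }
  }
}
```

On room switch or when the final message replaces the streamed bubble, clearing `pending` and letting `running` fall back to false is enough to stop the effect.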

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-06 20:23:49 -06:00