- Add comprehensive ModelPricing service with 100+ models and historical pricing
- Create a real-time ConversationCost component displayed in the chat header
- Use actual token counts from model APIs instead of client-side estimation
- Fix BaseClient.js to preserve tokenCount in response messages
- Add tokenCount, usage, and tokens fields to message schema
- Update Header component to include ConversationCost display
- Support OpenAI, Anthropic, Google, and other major model providers
- Include a color-coded cost display based on the conversation's total cost (thresholds sketched below)
- Add 32 unit tests covering the pricing calculation logic (see the sketch after this list)
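
The core cost math works roughly as follows. This is a minimal sketch, not the actual service code: `MODEL_PRICING`, `calculateMessageCost`, the per-million-token rates, and the `usage` shape (`{ prompt_tokens, completion_tokens }`) are illustrative assumptions.

```js
// Illustrative pricing table keyed by model name, USD per 1M tokens.
// The real ModelPricing service covers 100+ models and historical rates.
const MODEL_PRICING = {
  'gpt-4o': { prompt: 2.5, completion: 10 },
  'claude-3-5-sonnet': { prompt: 3, completion: 15 },
};

// Compute the cost of a single exchange from actual API token counts
// (not client-side estimates).
function calculateMessageCost(model, usage) {
  const pricing = MODEL_PRICING[model];
  if (!pricing || !usage) {
    return 0;
  }
  const promptCost = (usage.prompt_tokens / 1_000_000) * pricing.prompt;
  const completionCost = (usage.completion_tokens / 1_000_000) * pricing.completion;
  return promptCost + completionCost;
}

// Example: 1,200 prompt tokens + 350 completion tokens on 'gpt-4o'
// => (1200 / 1e6) * 2.5 + (350 / 1e6) * 10 = $0.0065
```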
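The color coding follows a simple threshold scheme; the helper name, class names, and dollar boundaries below are assumptions for illustration, not the component's actual values.

```js
// Hypothetical thresholds for the color-coded cost display in ConversationCost.
function getCostColor(cost) {
  if (cost < 0.01) return 'text-green-500';  // negligible
  if (cost < 0.1) return 'text-yellow-500';  // moderate
  return 'text-red-500';                     // high
}
```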
🤖 Generated with Claude Code
Co-Authored-By: Claude <noreply@anthropic.com>