feat: add real-time conversation cost tracking with proper token counting

- Add comprehensive ModelPricing service with 100+ models and historical pricing
- Create real-time ConversationCost component that displays in chat header
- Use actual token counts from model APIs instead of client-side estimation
- Fix BaseClient.js to preserve tokenCount in response messages
- Add tokenCount, usage, and tokens fields to message schema
- Update Header component to include ConversationCost display
- Support OpenAI, Anthropic, Google, and other major model providers
- Include color-coded cost display based on amount
- Add 32 unit tests for pricing calculation logic
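The per-message cost calculation described above can be sketched as follows. This is an illustrative example only, not the actual `ModelPricing` service: the function name, the pricing shape, and the dollar figures are all hypothetical assumptions, and real per-token prices vary by provider and date.

```typescript
// Hypothetical sketch of a per-message cost lookup.
// Prices are assumed to be stored in USD per 1M tokens; the values
// below are placeholders, not real provider pricing.
type Pricing = { prompt: number; completion: number };

const pricing: Record<string, Pricing> = {
  'example-model-a': { prompt: 2.5, completion: 10 },
  'example-model-b': { prompt: 3, completion: 15 },
};

function messageCost(
  model: string,
  promptTokens: number,
  completionTokens: number,
): number {
  const p = pricing[model];
  // Unknown model: report zero rather than guessing a price.
  if (!p) return 0;
  return (promptTokens * p.prompt + completionTokens * p.completion) / 1_000_000;
}
```

Using actual token counts from the model APIs (rather than client-side estimation) means `promptTokens` and `completionTokens` here come straight from the provider's usage report.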

🤖 Generated with Claude Code

Co-Authored-By: Claude <noreply@anthropic.com>
Author: constanttime
Date:   2025-08-17 20:13:49 +05:30
Commit: 3edf6fdf6b (parent 543b617e1c)
9 changed files with 2041 additions and 1 deletion


@@ -538,6 +538,9 @@ export const tMessageSchema = z.object({
   unfinished: z.boolean().optional(),
   searchResult: z.boolean().optional(),
   finish_reason: z.string().optional(),
+  tokenCount: z.number().optional(),
+  usage: z.any().optional(),
+  tokens: z.any().optional(),
   /* assistant */
   thread_id: z.string().optional(),
   /* frontend components */
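With `tokenCount` now preserved on each message, a conversation-level total can be derived by summing the optional field across the thread. A minimal sketch, assuming a simplified message shape (this is not the actual ConversationCost component):

```typescript
// Illustrative only: accumulate optional per-message token counts.
// Messages without a tokenCount (e.g. not yet reported by the API)
// contribute zero instead of breaking the sum.
interface Message {
  tokenCount?: number;
}

function totalTokens(messages: Message[]): number {
  return messages.reduce((sum, m) => sum + (m.tokenCount ?? 0), 0);
}
```

Because every new schema field is `.optional()`, older stored messages without these fields still validate, and downstream consumers like this must treat them as possibly absent.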