🧹 chore: Remove Deprecated Gemini 2.0 Models & Fix Mistral-Large-3 Context Window (#12453)

* chore: remove deprecated Gemini 2.0 models from default models list

Remove gemini-2.0-flash-001 and gemini-2.0-flash-lite from the Google
default models array, as they have been deprecated by Google.

Closes #12444

* fix: add mistral-large-3 max context tokens (256k)

Add a mistral-large-3 entry with 255000 max context tokens to the
mistralModels map. Without this entry, the model falls back to the
generic mistral-large key (131k tokens), causing context-window errors
when using tools with Azure AI Foundry deployments.

Closes #12429

* test: add mistral-large-3 token resolution tests and fix key ordering

Add test coverage for mistral-large-3 context token resolution,
verifying exact match, suffixed variants, and longest-match precedence
over the generic mistral-large key. Reorder the mistral-large-3 entry
after mistral-large to follow the file's documented convention of
listing newer models last for reverse-scan performance.
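The precedence the new tests exercise can be sketched as a reverse scan over the token map's insertion order. The names below (`findMatchingPattern`, `mistralModels`) mirror identifiers in the diff, but the values and scan logic are illustrative assumptions, not LibreChat's actual implementation:

```javascript
// Illustrative sketch only: values and scan logic are assumptions,
// not the real implementation.
const mistralModels = {
  'mistral-large': 131000,   // generic fallback key
  'mistral-large-3': 255000, // newer model listed last, per the file's convention
};

// Reverse scan: later (newer, more specific) keys are checked first,
// so a name like 'mistral-large-3-instruct' resolves to
// 'mistral-large-3' instead of the shorter generic 'mistral-large' key.
function findMatchingPattern(modelName, tokensMap) {
  const keys = Object.keys(tokensMap);
  for (let i = keys.length - 1; i >= 0; i--) {
    if (modelName.includes(keys[i])) {
      return keys[i];
    }
  }
  return undefined;
}
```

Under this sketch, 'mistral-large-3-instruct' matches 'mistral-large-3' (255k) while 'mistral-large-latest' falls back to 'mistral-large' (131k), which is the precedence the tests below assert.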
Danny Avila, 2026-03-28 23:44:58 -04:00, committed by GitHub
commit f82d4300a4 (parent fda1bfc3cc)
GPG key ID: B5690EEEBB952194
3 changed files with 55 additions and 3 deletions


@@ -1813,3 +1813,57 @@ describe('GLM Model Tests (Zhipu AI)', () => {
    });
  });
});
describe('Mistral Model Tests', () => {
  describe('getModelMaxTokens', () => {
    test('should return correct tokens for mistral-large-3 (256k context)', () => {
      expect(getModelMaxTokens('mistral-large-3', EModelEndpoint.custom)).toBe(
        maxTokensMap[EModelEndpoint.custom]['mistral-large-3'],
      );
    });
    test('should match mistral-large-3 for suffixed variants', () => {
      expect(getModelMaxTokens('mistral-large-3-instruct', EModelEndpoint.custom)).toBe(
        maxTokensMap[EModelEndpoint.custom]['mistral-large-3'],
      );
    });
    test('should not match mistral-large-3 for generic mistral-large', () => {
      expect(getModelMaxTokens('mistral-large', EModelEndpoint.custom)).toBe(
        maxTokensMap[EModelEndpoint.custom]['mistral-large'],
      );
      expect(getModelMaxTokens('mistral-large-latest', EModelEndpoint.custom)).toBe(
        maxTokensMap[EModelEndpoint.custom]['mistral-large'],
      );
    });
  });
  describe('matchModelName', () => {
    test('should match mistral-large-3 exactly', () => {
      expect(matchModelName('mistral-large-3', EModelEndpoint.custom)).toBe('mistral-large-3');
    });
    test('should match mistral-large-3 for prefixed/suffixed variants', () => {
      expect(matchModelName('mistral/mistral-large-3', EModelEndpoint.custom)).toBe(
        'mistral-large-3',
      );
      expect(matchModelName('mistral-large-3-instruct', EModelEndpoint.custom)).toBe(
        'mistral-large-3',
      );
    });
    test('should match generic mistral-large for non-3 variants', () => {
      expect(matchModelName('mistral-large-latest', EModelEndpoint.custom)).toBe('mistral-large');
    });
  });
  describe('findMatchingPattern', () => {
    test('should prefer mistral-large-3 over mistral-large for mistral-large-3 variants', () => {
      const result = findMatchingPattern(
        'mistral-large-3-instruct',
        maxTokensMap[EModelEndpoint.custom],
      );
      expect(result).toBe('mistral-large-3');
    });
  });
});