feat: add AWS Bedrock provider support #3402
Add AWS Bedrock as a new LLM provider with dynamic model fetching via inference profiles API. Includes support for Claude, Nova, Llama and other Bedrock models.
- Add `type` field to prevent unnecessary API calls to Chatbox AI backend
- Increase `maxResults` to 1000 for faster profile fetching
- Extract accurate token limits from `converse.maxTokensMaximum` field
- Parse context window from model descriptions (e.g., Nova 2 Lite 1M context)
- Support LEGACY status models (e.g., Claude 3.5 Sonnet)
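The description-parsing step mentioned above could look roughly like this (a hypothetical sketch; the exact description format is assumed from the "1M context" example, and the function name is illustrative):

```typescript
// Illustrative sketch: extract a context window size like "1M context" or
// "200K context" from a Bedrock model description string.
function parseContextWindow(description: string): number | undefined {
  const m = description.match(/(\d+(?:\.\d+)?)\s*([KM])\s+context/i)
  if (!m) return undefined
  const n = parseFloat(m[1])
  // "M" → millions of tokens, "K" → thousands.
  return Math.round(n * (m[2].toUpperCase() === 'M' ? 1_000_000 : 1_000))
}
```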
- Add Bedrock credentials check to `needEditSetting`
- Change AWS Region from dropdown to text input for flexibility
Force-pushed from 577ea96 to bbe672c
Hi, I need the Bedrock support.
themez
left a comment
Thanks for the contribution! Here's a detailed review:
Critical Issues
1. Dependencies should be in devDependencies
`@ai-sdk/amazon-bedrock` and `@aws-sdk/client-bedrock` are added to `dependencies`, but per project convention, renderer deps must go in `devDependencies` because electron-vite bundles renderer code. ESM-only packages in `dependencies` will fail with `require() of ES Module not supported`.
2. Multiple any type usages
The project has a strict rule: never use the `any` type. Several violations:
- `bedrock-setting-util.ts`: `const clientConfig: any = {`
- `bedrock-setting-util.ts`: multiple `(model as any).converse` and `(response as any).nextToken` casts
Please define proper types or use unknown with type guards.
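For the `unknown`-with-type-guard approach, something along these lines would work (a sketch only: the `converse`/`maxTokensMaximum` field names come from the snippets above, but the interface shape is an assumption, not the real SDK type):

```typescript
// Illustrative sketch: narrow an unknown record with a type guard instead of `as any`.
interface ConverseMetadata {
  maxTokensMaximum?: number
  reasoningSupported?: boolean
}

function hasConverseMetadata(model: unknown): model is { converse: ConverseMetadata } {
  if (typeof model !== 'object' || model === null || !('converse' in model)) return false
  const converse = (model as { converse: unknown }).converse
  return typeof converse === 'object' && converse !== null
}

function maxTokensOf(model: unknown): number | undefined {
  // After the guard, `model.converse` is safely typed -- no cast needed.
  return hasConverseMetadata(model) ? model.converse.maxTokensMaximum : undefined
}
```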
3. Unrelated changes mixed into this PR
- `package-lock.json`: package name changed from `xyz.chatboxapp.app` to `xyz.chatboxapp.ce` — unrelated to Bedrock
- `index.ts`: adds `OpenRouterSettingUtil` import and registration — unrelated to Bedrock
These should be in separate PRs.
4. Provider type may be incorrect
```
type: ModelProviderType.OpenAI, // Bedrock is not an OpenAI-compatible API
```
Please verify whether this field affects runtime behavior and if a different type is more appropriate.
Should Fix
5. Import paths should use project aliases
```
// ❌
import type { ... } from 'src/shared/types'
// ✅
import type { ... } from '@shared/types'
```
6. Fragile capability detection
```
const hasReasoning =
  (model as any).converse?.reasoningSupported !== undefined ||
  model.modelId.includes('sonnet-4') ||
  model.modelId.includes('opus-4') ||
  model.modelId.includes('claude-3-7')
```
Hardcoded model name patterns will break as new models are released. Prefer relying on structured API response data.
7. @aws-sdk/client-bedrock bundle size concern
This package is only used for `ListInferenceProfiles` and `ListFoundationModels`. The full AWS SDK client is heavy and will significantly increase the renderer bundle. Consider whether direct REST API calls with AWS Signature V4 signing would be lighter, or evaluate the actual post-tree-shaking impact.
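The direct-call alternative hinges on SigV4 request signing. The key-derivation step at its core can be sketched with Node's built-in crypto (a minimal sketch of the spec's HMAC chain, not a full signer; canonical-request construction and the final signature are omitted):

```typescript
import { createHmac } from 'node:crypto'

// Illustrative sketch: AWS Signature V4 signing-key derivation, the chain
// defined by the SigV4 spec (kDate -> kRegion -> kService -> kSigning).
function hmac(key: Buffer | string, data: string): Buffer {
  return createHmac('sha256', key).update(data, 'utf8').digest()
}

function deriveSigningKey(secretKey: string, date: string, region: string, service: string): Buffer {
  // `date` is YYYYMMDD from the request's x-amz-date header.
  const kDate = hmac(`AWS4${secretKey}`, date)
  const kRegion = hmac(kDate, region)
  const kService = hmac(kRegion, service)
  return hmac(kService, 'aws4_request')
}
```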
8. Silent error handling in listProviderModels
The catch block silently returns an empty array. Users cannot distinguish between "no models available" and "invalid credentials". Please surface the error.
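One way to surface the failure is to rethrow with context instead of returning `[]` (a hypothetical sketch; `fetchProfiles` and the error class name stand in for the real listing call and whatever error type the UI layer expects):

```typescript
// Illustrative sketch: propagate listing failures instead of swallowing them.
class BedrockListModelsError extends Error {
  constructor(cause: unknown) {
    super(`Failed to list Bedrock models: ${cause instanceof Error ? cause.message : String(cause)}`)
    this.name = 'BedrockListModelsError'
  }
}

async function listProviderModels(fetchProfiles: () => Promise<string[]>): Promise<string[]> {
  try {
    return await fetchProfiles()
  } catch (err) {
    // Rethrow with context so callers can distinguish "no models available"
    // from "invalid credentials" and show an actionable message.
    throw new BedrockListModelsError(err)
  }
}
```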
9. Inconsistent default model nicknames
```
nickname: 'Global Anthropic Claude Haiku 4.5' // "Global"
nickname: 'GLOBAL Amazon Nova 2 Lite'         // "GLOBAL"
nickname: 'US Anthropic Claude Opus 4.5'      // "US"
```
Pick a consistent casing style.
Minor
- `getCallSettings` has an unused `options` parameter — can be removed
- `awsSessionToken` is supported in code but has no input field in the settings UI — if intentional, consider exposing it
Fixes #1156
Add AWS Bedrock as a new LLM provider with support for Claude, Nova, Llama and other Bedrock models.
Features
Technical Notes