
feat: add AWS Bedrock provider support#3402

Open
zw2497 wants to merge 4 commits into chatboxai:main from zw2497:feat/add-bedrock-provider

Conversation


@zw2497 commented Jan 16, 2026

Fixes #1156

Add AWS Bedrock as a new LLM provider with support for Claude, Nova, Llama and other Bedrock models.

Features

  • Dynamic model fetching via AWS ListInferenceProfiles API
  • Auto-detect model capabilities (vision, tool_use, reasoning)
  • AWS credentials configuration (Access Key + Secret + Region)
  • 4 default models included

Technical Notes

  • Uses @ai-sdk/[email protected] (for compatibility with @ai-sdk/provider v2)
  • Uses inference profile IDs (e.g. global.anthropic.claude-sonnet-4-5-20250929-v1:0) for on-demand throughput
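The dynamic model fetching described above can be sketched roughly as follows. The real implementation would use `BedrockClient` and `ListInferenceProfilesCommand` from `@aws-sdk/client-bedrock`; here a minimal stand-in interface keeps the pagination logic self-contained. `fetchAllInferenceProfiles` and the field shapes are illustrative assumptions, not the PR's exact code.

```typescript
// Hedged sketch: paging through ListInferenceProfiles results until
// nextToken is exhausted. A minimal interface stands in for the real
// @aws-sdk/client-bedrock client so the logic is self-contained.
interface InferenceProfileSummary {
  inferenceProfileId: string
  inferenceProfileName?: string
  status?: string
}

interface ListPage {
  inferenceProfileSummaries: InferenceProfileSummary[]
  nextToken?: string
}

type ListFn = (input: { maxResults: number; nextToken?: string }) => Promise<ListPage>

async function fetchAllInferenceProfiles(
  list: ListFn,
  maxResults = 1000 // larger pages mean fewer round trips
): Promise<InferenceProfileSummary[]> {
  const all: InferenceProfileSummary[] = []
  let nextToken: string | undefined
  do {
    const page = await list({ maxResults, nextToken })
    all.push(...page.inferenceProfileSummaries)
    nextToken = page.nextToken
  } while (nextToken)
  return all
}
```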

Add AWS Bedrock as a new LLM provider with dynamic model fetching via inference profiles API.
Includes support for Claude, Nova, Llama and other Bedrock models.
- Add type field to prevent unnecessary API calls to Chatbox AI backend
- Increase maxResults to 1000 for faster profile fetching
- Extract accurate token limits from converse.maxTokensMaximum field
- Parse context window from model descriptions (e.g., Nova 2 Lite 1M context)
- Support LEGACY status models (e.g., Claude 3.5 Sonnet)
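The "parse context window from model descriptions" step could look something like the sketch below. The regex and the fallback value are illustrative assumptions, not the PR's actual implementation.

```typescript
// Hedged sketch: deriving a context-window size from a model description
// such as "Nova 2 Lite ... 1M context" or "200K context".
function parseContextWindow(description: string, fallback = 128_000): number {
  // Match a number followed by a K/M unit and the word "context".
  const match = description.match(/(\d+(?:\.\d+)?)\s*([KM])\s*context/i)
  if (!match) return fallback
  const value = parseFloat(match[1])
  const unit = match[2].toUpperCase()
  return Math.round(value * (unit === 'M' ? 1_000_000 : 1_000))
}
```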

coderabbitai bot commented Jan 16, 2026

Important

Review skipped

Auto reviews are disabled on this repository.

Please check the settings in the CodeRabbit UI or the .coderabbit.yaml file in this repository. To trigger a single review, invoke the @coderabbitai review command.

You can disable this status message by setting the reviews.review_status to false in the CodeRabbit configuration file.



- Add Bedrock credentials check to needEditSetting
- Change AWS Region from dropdown to text input for flexibility
@ghoshpushpendu

Hi,

I need Bedrock support. Can we get this reviewed ASAP, please?


@themez themez left a comment


Thanks for the contribution! Here's a detailed review:

Critical Issues

1. Dependencies should be in devDependencies
@ai-sdk/amazon-bedrock and @aws-sdk/client-bedrock are added to dependencies, but per project convention, renderer deps must go in devDependencies because electron-vite bundles renderer code. ESM-only packages in dependencies will fail with require() of ES Module not supported.

2. Multiple any type usages
The project has a strict rule: never use any type. Several violations:

  • bedrock-setting-util.ts: const clientConfig: any = {
  • bedrock-setting-util.ts: multiple (model as any).converse, (response as any).nextToken

Please define proper types or use unknown with type guards.
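One way to satisfy this is a narrow type predicate. Note that the `converse` shape here (`maxTokensMaximum`, `reasoningSupported`) is assumed from the PR description, not from published AWS typings, so treat it as a sketch.

```typescript
// Hedged sketch: replacing `(model as any).converse` with a type guard.
interface ConverseCapabilities {
  maxTokensMaximum?: number
  reasoningSupported?: boolean
}

function hasConverse(model: unknown): model is { converse: ConverseCapabilities } {
  return (
    typeof model === 'object' &&
    model !== null &&
    'converse' in model &&
    typeof (model as { converse: unknown }).converse === 'object' &&
    (model as { converse: unknown }).converse !== null
  )
}

// Read the token limit safely, falling back when the field is absent.
function maxTokensOf(model: unknown, fallback = 4096): number {
  return hasConverse(model) && typeof model.converse.maxTokensMaximum === 'number'
    ? model.converse.maxTokensMaximum
    : fallback
}
```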

3. Unrelated changes mixed into this PR

  • package-lock.json: package name changed from xyz.chatboxapp.app to xyz.chatboxapp.ce — unrelated to Bedrock
  • index.ts: adds OpenRouterSettingUtil import and registration — unrelated to Bedrock

These should be in separate PRs.

4. Provider type may be incorrect

type: ModelProviderType.OpenAI,  // Bedrock is not an OpenAI-compatible API

Please verify whether this field affects runtime behavior and if a different type is more appropriate.

Should Fix

5. Import paths should use project aliases

// ❌ 
import type { ... } from 'src/shared/types'
// ✅ 
import type { ... } from '@shared/types'

6. Fragile capability detection

const hasReasoning =
  (model as any).converse?.reasoningSupported !== undefined ||
  model.modelId.includes('sonnet-4') ||
  model.modelId.includes('opus-4') ||
  model.modelId.includes('claude-3-7')

Hardcoded model name patterns will break as new models are released. Prefer relying on structured API response data.
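The suggestion above could be sketched like this: trust the structured field when present and keep name matching only as an explicit, clearly marked fallback. Field and pattern names are illustrative.

```typescript
// Hedged sketch: structured capability data first, heuristic second.
interface ModelInfo {
  modelId: string
  converse?: { reasoningSupported?: boolean }
}

// Fallback heuristic only; will need maintenance as new models ship.
const REASONING_FALLBACK_PATTERNS = ['sonnet-4', 'opus-4', 'claude-3-7']

function supportsReasoning(model: ModelInfo): boolean {
  // Prefer the structured API field when it is actually present.
  if (typeof model.converse?.reasoningSupported === 'boolean') {
    return model.converse.reasoningSupported
  }
  return REASONING_FALLBACK_PATTERNS.some((p) => model.modelId.includes(p))
}
```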

7. @aws-sdk/client-bedrock bundle size concern
This package is only used for ListInferenceProfiles and ListFoundationModels. The full AWS SDK client is heavy and will significantly increase the renderer bundle. Consider whether direct REST API calls with AWS Signature V4 signing would be lighter, or evaluate the actual post-tree-shaking impact.

8. Silent error handling in listProviderModels
The catch block silently returns an empty array. Users cannot distinguish between "no models available" and "invalid credentials". Please surface the error.
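A minimal way to surface the failure is to wrap and rethrow rather than swallow; the error class and wrapper signature below are illustrative assumptions, not the PR's code.

```typescript
// Hedged sketch: instead of `catch { return [] }`, rethrow with context so
// the UI can distinguish "no models available" from "invalid credentials".
class BedrockModelListError extends Error {
  constructor(cause: unknown) {
    super(`Failed to list Bedrock models: ${cause instanceof Error ? cause.message : String(cause)}`)
    this.name = 'BedrockModelListError'
  }
}

async function listProviderModels(fetchModels: () => Promise<string[]>): Promise<string[]> {
  try {
    return await fetchModels()
  } catch (err) {
    // Surface the failure instead of silently returning an empty array.
    throw new BedrockModelListError(err)
  }
}
```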

9. Inconsistent default model nicknames

nickname: 'Global Anthropic Claude Haiku 4.5'   // "Global"
nickname: 'GLOBAL Amazon Nova 2 Lite'            // "GLOBAL"
nickname: 'US Anthropic Claude Opus 4.5'         // "US"

Pick a consistent casing style.

Minor

  • getCallSettings has an unused options parameter — can be removed
  • awsSessionToken is supported in code but has no input field in the settings UI — if intentional, consider exposing it


Development

Successfully merging this pull request may close these issues.

[Feature] Add Bedrock integration to support Amazon Titan, Jurassic AI, Anthropic's Models via AWS Account
