
feat(opencode): add dedicated UI for model token limits (limit.context & limit.output)#2022

Open
wwminger wants to merge 1 commit into farion1231:main from wwminger:feat/opencode-limit-ui

Conversation

@wwminger
Contributor

Summary

Previously, there was no way to edit OpenCode model token limits in the UI. The limit field was excluded from extra fields as a known/reserved key (OPENCODE_KNOWN_MODEL_KEYS), but had no dedicated input. Users could only set limits via presets.

This PR adds a Token Limits section with number inputs for limit.context (context window) and limit.output (max output), placed between Model Properties and SDK Options in the expanded model details.

Changes

  • OpenCodeFormFields.tsx: Added handleModelLimitContextChange and handleModelLimitOutputChange handlers + Token Limits UI section
  • src/i18n/locales/en.json, zh.json, ja.json: Added 6 i18n keys (tokenLimits, limitContext, limitOutput, limitContextPlaceholder, limitOutputPlaceholder, tokenLimitsHint)
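As a rough sketch of the input handling such number fields would need (the function name and exact validation rules here are assumptions, not the actual code in OpenCodeFormFields.tsx):

```typescript
// Hypothetical sketch of the parsing behind the new Token Limits inputs.
// An empty box clears the limit; anything that is not a positive integer
// is treated the same as empty so no invalid value reaches the config.
function parseLimitInput(raw: string): number | undefined {
  const trimmed = raw.trim();
  if (trimmed === "") return undefined;
  const n = Number(trimmed);
  return Number.isInteger(n) && n > 0 ? n : undefined;
}
```

A handler like handleModelLimitContextChange would then write the parsed value into the model's limit.context, removing the field when parsing yields undefined.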

Testing

  • pnpm typecheck passed
  • Manual verification: Token Limits section appears in expanded model details with working number inputs

Screenshots

The Token Limits section appears when expanding a model in the OpenCode provider form, allowing users to set context window and max output token limits directly.

…t & limit.output)

Previously, there was no way to edit OpenCode model token limits in the
UI. The 'limit' field was excluded from extra fields as a known/reserved
key, but had no dedicated input. Users could only set limits via presets.

Add a 'Token Limits' section with number inputs for context window and
max output, placed between Model Properties and SDK Options in the
expanded model details.

Includes i18n for en/zh/ja locales.
@farion1231
Owner

@codex review


@chatgpt-codex-connector (bot) left a comment


💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 72e157728e


Comment on lines +476 to +477

    if (numValue === undefined && currentLimit.output === undefined) {
      const { limit: _, ...rest } = model;


P2: Preserve other limit keys when clearing context/output

When the user clears context here, the branch deletes the whole limit object as soon as output is undefined, which also drops any existing limit.input (supported in OpenCode model metadata) or future limit fields. This causes silent config data loss for models imported from richer configs; the sibling handleModelLimitOutputChange has the same pattern. Only remove limit after checking that no keys remain after applying the edit.
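A minimal sketch of the suggested fix (the types and helper name are hypothetical; the real handlers live in OpenCodeFormFields.tsx): build the edited limit object first, and drop `limit` from the model only when no keys remain, rather than when context and output happen to both be undefined:

```typescript
interface ModelLimit {
  context?: number;
  output?: number;
  input?: number; // e.g. limit.input must survive a clear of context/output
  [key: string]: number | undefined;
}

interface Model {
  name?: string;
  limit?: ModelLimit;
}

// Hypothetical helper: set or clear one limit field without dropping siblings.
function setLimitField(
  model: Model,
  field: string,
  numValue: number | undefined
): Model {
  const nextLimit: ModelLimit = { ...model.limit };
  if (numValue === undefined) {
    delete nextLimit[field];
  } else {
    nextLimit[field] = numValue;
  }
  // Remove `limit` only when it would end up empty, not merely when
  // context and output are both undefined.
  if (Object.keys(nextLimit).length === 0) {
    const { limit: _removed, ...rest } = model;
    return rest;
  }
  return { ...model, limit: nextLimit };
}
```

With this shape, clearing context on a model that also carries limit.input leaves limit.input intact, while clearing the last remaining key removes the limit object entirely.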


