
feat: context aware prompt #7174

Open
Reisenbug wants to merge 3 commits into AstrBotDevs:master from Reisenbug:feat/context-aware-prompt

Conversation

Contributor

@Reisenbug Reisenbug commented Mar 30, 2026

The prompts used by the group chat context-awareness feature were previously hardcoded, so users could not adjust them to match their bot's persona or language style.

Modifications

  • astrbot/core/config/default.py: add two configurable fields under provider_ltm_settings:

  • context_prompt: the context-awareness prompt (the prefix that guides the LLM to understand the chat history; applies in both normal mode and active reply mode)

  • active_reply_suffix_prompt: the active-reply suffix prompt (an instruction appended to the end of the message in active reply mode; shown in the
    Dashboard only when active reply is enabled)

  • astrbot/builtin_stars/astrbot/long_term_memory.py: read the fields above from the config, replacing the original hardcoded strings

  • dashboard/src/i18n/locales/{zh-CN,en-US}/features/config-metadata.json: add the corresponding i18n translations

  • This is NOT a breaking change.
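The configuration flow described above can be sketched as follows. This is a minimal illustration, not the PR's actual code: the two key names (`context_prompt`, `active_reply_suffix_prompt`) come from the PR description, while the function name, default strings, and call shape are assumptions for the example.

```python
# Hypothetical sketch of how long_term_memory.py might assemble the prompt
# from provider_ltm_settings instead of hardcoded strings.

# Placeholder defaults; the real defaults live in astrbot/core/config/default.py.
DEFAULT_CONTEXT_PROMPT = "Below is the recent group chat history:"
DEFAULT_ACTIVE_REPLY_SUFFIX = "Please reply to the conversation above."


def build_prompt(ltm_settings: dict, history: str, active_reply: bool) -> str:
    """Prefix the chat history with the context prompt; in active reply
    mode, also append the suffix instruction."""
    context_prompt = ltm_settings.get("context_prompt", DEFAULT_CONTEXT_PROMPT)
    prompt = f"{context_prompt}\n{history}"
    if active_reply:
        suffix = ltm_settings.get(
            "active_reply_suffix_prompt", DEFAULT_ACTIVE_REPLY_SUFFIX
        )
        prompt = f"{prompt}\n{suffix}"
    return prompt
```

A user who overrides `context_prompt` in the dashboard would see their own phrasing used as the history prefix, while unset fields fall back to the defaults.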

Screenshots or Test Results


Checklist

  • 😊 If there are new features added in the PR, I have discussed it with the authors through issues/emails, etc.

  • 👀 My changes have been well-tested, and "Verification Steps" and "Screenshots" have been provided above.

  • 🤓 I have ensured that no new dependencies are introduced, OR if new dependencies are introduced, they have been added to the appropriate locations in requirements.txt and pyproject.toml.

  • 😮 My changes do not introduce malicious code.

Summary by Sourcery

Make group chat long-term memory prompts configurable and expose them in the provider settings UI.

New Features:

  • Add configurable context prompt for group chat history used in both normal and active reply modes.
  • Add configurable active-reply suffix prompt appended to messages when active reply is enabled.

Enhancements:

  • Replace hard-coded long-term memory prompts with values loaded from provider_ltm_settings, with sensible defaults.
  • Extend provider configuration metadata to surface the new prompt fields in the dashboard only when active reply is enabled.

Documentation:

  • Add i18n entries for the new context and active-reply prompt configuration fields in both Chinese and English locales.

@dosubot dosubot bot added size:M This PR changes 30-99 lines, ignoring generated files. feature:persona The bug / feature is about astrbot AI persona system (system prompt) labels Mar 30, 2026
Contributor

@sourcery-ai sourcery-ai bot left a comment


Hey - I've left some high level feedback:

  • The default values for context_prompt and active_reply_suffix_prompt are now hardcoded both in default.py and as .get(..., default) values in long_term_memory.py; consider relying on the config defaults only to avoid divergence if one side is changed later.
  • In ChatProviderTemplate’s schema definition, provider_ltm_settings.active_reply_suffix_prompt appears to be nested inside the provider_ltm_settings.active_reply block; please double-check the brace/indent structure so that active_reply_suffix_prompt is defined at the intended level and the condition applies correctly in the dashboard.
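Sourcery's first point (defaults duplicated between `default.py` and `.get(..., default)` calls) can be addressed by keeping a single source of truth. A minimal sketch under stated assumptions: only the two config key names are from the PR; `CONFIG_DEFAULTS`, `load_ltm_settings`, and the merge shape are invented for illustration.

```python
# Sketch of the "single source of defaults" approach: define the default
# strings once (standing in for default.py's provider_ltm_settings schema),
# merge user overrides over them at load time, and let consuming code index
# the merged dict directly -- no second hardcoded fallback in the plugin.

CONFIG_DEFAULTS = {
    "context_prompt": "Below is the recent group chat history:",
    "active_reply_suffix_prompt": "Please reply to the conversation above.",
}


def load_ltm_settings(user_config: dict) -> dict:
    """Return schema defaults overlaid with any user-set values."""
    return {**CONFIG_DEFAULTS, **user_config}


settings = load_ltm_settings({"context_prompt": "Chat log:"})
# Consumers use settings["context_prompt"] directly; changing a default in
# one place now changes it everywhere.
```

The design point is that a `.get(key, fallback)` at every call site silently forks the default; merging once at load time makes a later edit to the schema default take effect everywhere.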

Contributor

@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This pull request introduces configurable prompts for long-term memory and active replies, replacing hardcoded strings with context_prompt and active_reply_suffix_prompt. The changes span the core logic, default settings, and dashboard localization. A critical syntax error was identified in astrbot/core/config/default.py, where the new configuration metadata is incorrectly nested within the whitelist dictionary, resulting in a malformed structure that will cause a Python syntax error.
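The nesting error Gemini flags can be illustrated with the intended shape of the metadata dict: the new entries should be siblings of the existing `provider_ltm_settings` entries, not children of the `whitelist` block. This is illustrative only; every key name here except `context_prompt` and `active_reply_suffix_prompt` is an assumption about the schema format, not taken from the PR diff.

```python
# Intended structure: each setting is its own top-level entry under
# provider_ltm_settings. A misplaced closing brace that leaves the new
# fields inside the "whitelist" dict would change the schema shape (or,
# if braces become unbalanced, raise a SyntaxError at import time).

provider_ltm_settings = {
    "whitelist": {
        "description": "Chats where long-term memory is enabled",
        "type": "list",
    },  # <- whitelist closes here; the new fields must NOT sit inside it
    "context_prompt": {
        "description": "Prefix that introduces the chat history to the LLM",
        "type": "string",
    },
    "active_reply_suffix_prompt": {
        "description": "Instruction appended in active reply mode",
        "type": "string",
        # Assumed dashboard gating: shown only when active reply is enabled.
        "condition": {"active_reply.enable": True},
    },
}
```

A quick check that the fields landed at the right level (for example in a unit test) would catch both the Sourcery and Gemini nesting concerns early.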

@Reisenbug Reisenbug changed the title Feat/context aware prompt feat: context aware prompt Mar 30, 2026

Labels

feature:persona The bug / feature is about astrbot AI persona system (system prompt) size:M This PR changes 30-99 lines, ignoring generated files.

Projects

None yet

Development

Successfully merging this pull request may close these issues.

1 participant