Add groq AI mode provider defaults and docs (#2942)
This change adds first-class `groq` provider support to Wave AI mode
resolution and documents it in the Wave AI modes guide. Users can now
configure Groq modes via `ai:provider` with provider defaults applied
automatically.
- **Provider support in backend config resolution**
  - Added `groq` as a recognized AI provider constant.
  - Added Groq provider defaults in mode resolution:
    - `ai:apitype`: `openai-chat`
    - `ai:endpoint`: `https://api.groq.com/openai/v1/chat/completions`
    - `ai:apitokensecretname`: `GROQ_KEY`
- **Schema/config surface update**
  - Extended the `AIModeConfigType` provider enum to include `groq`, so `ai:provider: "groq"` is valid in Wave AI config.
- **Documentation updates (`waveai-modes.mdx`)**
  - Added `groq` to the supported providers.
  - Added a Groq-specific configuration example and default-behavior notes.
  - Updated the provider reference and capability guidance to include Groq.
- **Focused coverage**
  - Added a targeted unit test for Groq provider default application in `applyProviderDefaults`.
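The default-application step can be pictured as a small merge that fills in only the fields the user left unset. The following Go sketch is illustrative, not the actual Wave source: the struct and field names are hypothetical, and the assumption that explicit settings take precedence over defaults is mine; only the three default values come from this change.

```go
package main

import "fmt"

// AIModeConfig holds the subset of mode settings relevant here.
// Field names loosely mirror the JSON keys (illustrative only).
type AIModeConfig struct {
	Provider           string
	APIType            string
	Endpoint           string
	APITokenSecretName string
}

// applyProviderDefaults fills unset fields based on ai:provider,
// using the Groq defaults described above. Explicitly configured
// values are assumed to win over provider defaults.
func applyProviderDefaults(cfg *AIModeConfig) {
	if cfg.Provider != "groq" {
		return
	}
	if cfg.APIType == "" {
		cfg.APIType = "openai-chat"
	}
	if cfg.Endpoint == "" {
		cfg.Endpoint = "https://api.groq.com/openai/v1/chat/completions"
	}
	if cfg.APITokenSecretName == "" {
		cfg.APITokenSecretName = "GROQ_KEY"
	}
}

func main() {
	cfg := AIModeConfig{Provider: "groq"}
	applyProviderDefaults(&cfg)
	fmt.Println(cfg.APIType, cfg.APITokenSecretName)
	// → openai-chat GROQ_KEY
}
```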
Example mode configuration:

```json
{
  "groq-kimi-k2": {
    "display:name": "Groq - Kimi K2",
    "ai:provider": "groq",
    "ai:model": "moonshotai/kimi-k2-instruct"
  }
}
```
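With the provider defaults listed above applied automatically, this mode resolves as if the endpoint, API type, and secret name had been written out in full (an equivalent expanded form, for illustration):

```json
{
  "groq-kimi-k2": {
    "display:name": "Groq - Kimi K2",
    "ai:provider": "groq",
    "ai:model": "moonshotai/kimi-k2-instruct",
    "ai:apitype": "openai-chat",
    "ai:endpoint": "https://api.groq.com/openai/v1/chat/completions",
    "ai:apitokensecretname": "GROQ_KEY"
  }
}
```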
Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: sawka <2722291+sawka@users.noreply.github.com>
**docs/docs/waveai-modes.mdx** (26 additions, 2 deletions)
````diff
@@ -35,6 +35,7 @@ Wave AI now supports provider-based configuration which automatically applies se
 - **`openai`** - OpenAI API (automatically configures endpoint and secret name) [[see example](#openai)]
 - **`openrouter`** - OpenRouter API (automatically configures endpoint and secret name) [[see example](#openrouter)]
 - **`nanogpt`** - NanoGPT API (automatically configures endpoint and secret name) [[see example](#nanogpt)]
+- **`groq`** - Groq API (automatically configures endpoint and secret name) [[see example](#groq)]
 - **`google`** - Google AI (Gemini) [[see example](#google-ai-gemini)]
 - **`azure`** - Azure OpenAI Service (modern API) [[see example](#azure-openai-modern-api)]
 - **`azure-legacy`** - Azure OpenAI Service (legacy deployment API) [[see example](#azure-openai-legacy-deployment-api)]
@@ -295,6 +296,29 @@ NanoGPT is a proxy service that provides access to multiple AI models. You must
 For vision-capable models like `openai/gpt-5`, add `"images"` to capabilities.
 :::
 
+### Groq
+
+[Groq](https://groq.com) provides fast inference for open models through an OpenAI-compatible API. Using the `groq` provider simplifies configuration:
+
+```json
+{
+  "groq-kimi-k2": {
+    "display:name": "Groq - Kimi K2",
+    "ai:provider": "groq",
+    "ai:model": "moonshotai/kimi-k2-instruct"
+  }
+}
+```
+
+The provider automatically sets:
+- `ai:endpoint` to `https://api.groq.com/openai/v1/chat/completions`
+- `ai:apitype` to `openai-chat`
+- `ai:apitokensecretname` to `GROQ_KEY` (store your Groq API key with this name)
+
+:::note
+For Groq, you must manually specify `ai:capabilities` based on your model's features.
+:::
+
 ### Google AI (Gemini)
 
 [Google AI](https://ai.google.dev) provides the Gemini family of models. Using the `google` provider simplifies configuration:
@@ -508,7 +532,7 @@ If you get "model not found" errors:
 | `display:order` | No | Sort order in the selector (lower numbers first) |
 | `display:icon` | No | Icon identifier for the mode (can use any [FontAwesome icon](https://fontawesome.com/search), use the name without the "fa-" prefix). Default is "sparkles" |
 | `display:description` | No | Full description of the mode |
 | `ai:apitype` | No | API type: `openai-chat`, `openai-responses`, or `google-gemini` (defaults to `openai-chat` if not specified) |
 | `ai:model` | No | Model identifier (required for most providers) |
 | `ai:thinkinglevel` | No | Thinking level: `low`, `medium`, or `high` |
@@ -532,7 +556,7 @@ The `ai:capabilities` field specifies what features the AI mode supports:
 
 **Provider-specific behavior:**
 - **OpenAI and Google providers**: Capabilities are automatically configured based on the model. You don't need to specify them.
-- **OpenRouter, Azure, Azure-Legacy, and Custom providers**: You must manually specify capabilities based on your model's features.
+- **OpenRouter, NanoGPT, Groq, Azure, Azure-Legacy, and Custom providers**: You must manually specify capabilities based on your model's features.
 
 :::warning
 If you don't include `"tools"` in the `ai:capabilities` array, the AI model will not be able to interact with your Wave terminal widgets, read/write files, or execute commands. Most AI modes should include `"tools"` for the best Wave experience.
````