docs/copilot/customization/language-models.md
At any time, you can see which model and model multiplier are used by hovering over the model name in the model picker.
## Customize the model picker
If you want to reduce the number of built-in models shown in the model picker, you can customize which models are displayed:
1. Open the model picker and select **Manage Models** or run the **Chat: Manage Language Models** command from the Command Palette.
1. In the provider list, select **Copilot**.
To change the language model that is used for generating code completions in the editor:
1. Select **Change Completions Model...**, and then select one of the models from the list.
> [!NOTE]
> The models that are available for code completions might evolve over time as we add support for more models.
## Bring your own language model key
> [!IMPORTANT]
> This feature is not currently available to Copilot Business or Copilot Enterprise users.
GitHub Copilot in VS Code comes with a variety of built-in language models that are optimized for different tasks. If you want to use a model that is not available as a built-in model, you can bring your own language model API key (BYOK) to use models from other providers.
Using your own language model API key in VS Code has several benefits:
* **Model choice**: access hundreds of models from different providers, beyond the built-in models.
* **Experimentation**: experiment with new models or features that are not yet available in the built-in models.
* **Local compute**: use your own compute for one of the models already supported in GitHub Copilot or to run models not yet available.
* **Greater control**: by using your own key, you can bypass the standard rate limits and restrictions imposed on the built-in models.
VS Code has built-in support for several model providers, and if you have a model that is compatible with the OpenAI API, you can also [configure a custom OpenAI-compatible model](#use-an-openai-compatible-model).
You can add more model providers via VS Code extensions, such as [AI Toolkit for VS Code](https://aka.ms/AIToolkit), [Cerebras Inference](https://aka.ms/vscode/cerebras), [Hugging Face](https://aka.ms/vscode/huggingface), and others. You can find more of these extensions in the [Visual Studio Marketplace](https://marketplace.visualstudio.com/search?term=tag%3Alanguage-models&target=VSCode&category=All%20categories&sortBy=Relevance).
### Configure models using your API key
To configure a language model from a model provider using your own API key:
1. Select **Manage Models** from the language model picker in the Chat view or run the **Chat: Manage Language Models** command from the Command Palette.
1. Hover over a model provider in the list, and select the gear icon to configure the provider details.

1. Enter the provider-specific details, such as the API key or endpoint URL.
1. Depending on the provider, enter the model details or select a model from the list.
The following screenshot shows the model picker for Ollama running locally, with the Phi-4 model deployed.

1. You can now select the model from the model picker in the Chat view and use it for chat conversations.
For a model to be available in [agent mode](/docs/copilot/chat/chat-agent-mode.md), it must support tool calling. If the model doesn't support tool calling, it won't be shown in the model picker when you are using agent chat mode.
### Update the provider details
To update the provider details, such as the API key or endpoint URL:
1. Select **Manage Models** from the language model picker in the Chat view or run the **Chat: Manage Language Models** command from the Command Palette.
1. Hover over a model provider in the list, and select the gear icon to edit the provider details.

1. Update the provider details, such as the API key or endpoint URL.
### Use an OpenAI-compatible model
> [!NOTE]
> Configuring a custom OpenAI-compatible model is currently only available in [VS Code Insiders](https://code.visualstudio.com/insiders/) as of release 1.104.
Alternatively, you can manually add your custom model configuration in the `settings.json` file.
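As a rough sketch of what such a manual entry can look like, the following `settings.json` fragment configures a local OpenAI-compatible endpoint. The setting name, field names, and endpoint URL below are assumptions for illustration and may differ between VS Code releases, so verify them against the settings editor in your version:

```json
{
  // Assumed setting name and schema; verify against your VS Code version.
  "github.copilot.chat.customOAIModels": {
    "my-local-model": {
      "name": "My local model",          // display name in the model picker
      "url": "http://localhost:8000/v1", // example OpenAI-compatible endpoint
      "toolCalling": true,               // needed for the model to appear in agent mode
      "vision": false,
      "maxInputTokens": 128000,
      "maxOutputTokens": 8000,
      "requiresAPIKey": false            // set to true if your endpoint needs a key
    }
  }
}
```

Because user settings are JSONC, the comments above are valid; remove them if you prefer plain JSON.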
There are a number of considerations when using your own language model API key in VS Code:
* Bringing your own model only applies to the chat experience and doesn't impact code completions or other AI-powered features in VS Code, such as commit-message generation.
* The capabilities of each model might differ from the built-in models and could affect the chat experience. For example, some models might not support vision or tool calling.
* The Copilot API is still used for some tasks, such as sending embeddings, repository indexing, query refinement, intent detection, and side queries.
* When using your own model, there is no guarantee that responsible AI filtering is applied to the model's output.
## Frequently asked questions
Bringing your own model key is not available for Copilot Business or Copilot Enterprise users.
Bringing your own model key will come to Copilot Business and Enterprise plans later this year, as we better understand the requirements that organizations have for using this functionality at scale. Copilot Business and Enterprise users can still use the built-in, managed models.
### Can I use locally hosted models with Copilot in VS Code?
You can use locally hosted models in chat via [bring your own model key](#bring-your-own-language-model-key) (BYOK) with a model provider that supports connecting to a local model. You have different options to connect to a local model:
* Use a built-in model provider that supports local models
175
+
* Install an extension from the [Visual Studio Marketplace](https://marketplace.visualstudio.com/search?term=tag%3Alanguage-models&target=VSCode&category=All%20categories&sortBy=Relevance), for example, [AI Toolkit for VS Code with Foundry Local](https://aka.ms/AIToolkit)
176
+
* Configure a [custom OpenAI-compatible model](#use-an-openai-compatible-model)
177
+
178
+
Currently, you cannot connect to a local model for code completions. VS Code provides an extension API [`InlineCompletionItemProvider`](/api/references/vscode-api.md#InlineCompletionItemProvider) that enables extensions to contribute a custom completion provider. You can get started with our [Inline Completions sample](https://github.com/microsoft/vscode-extension-samples/blob/main/inline-completions).
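The extension API above boils down to implementing `provideInlineCompletionItems`. The following minimal sketch keeps the suggestion logic as a plain function so it can run outside VS Code; the trigger strings are made up for illustration, and the provider wiring (which does use the real `registerInlineCompletionItemProvider` API) is shown only in comments:

```typescript
// Toy suggestion logic: map a line prefix to a completion suffix.
// A real extension would call a local model here instead of a lookup table.
const SNIPPETS: Record<string, string> = {
  "def main(": "):",
  "for (let i = 0; i < ": "arr.length; i++) {",
};

function suggestFor(linePrefix: string): string | undefined {
  for (const [trigger, completion] of Object.entries(SNIPPETS)) {
    if (linePrefix.endsWith(trigger)) {
      return completion;
    }
  }
  return undefined; // no suggestion for this prefix
}

// Wiring inside an extension's activate() would look roughly like:
//
//   vscode.languages.registerInlineCompletionItemProvider("*", {
//     provideInlineCompletionItems(document, position) {
//       const prefix = document.lineAt(position.line).text.slice(0, position.character);
//       const text = suggestFor(prefix);
//       return text ? [new vscode.InlineCompletionItem(text)] : [];
//     },
//   });
```

The sample repository linked above shows the full extension scaffolding (manifest, activation events) that this sketch omits.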
> [!NOTE]
> Currently, using a locally hosted model still requires the Copilot service for some tasks. Therefore, your GitHub account needs to have access to a Copilot plan (for example, Copilot Free) and you need to be online. This requirement might change in a future release.
### Can I use a local model without an internet connection?
Currently, using a local model requires access to the Copilot service and therefore requires you to be online. This requirement might change in a future release.
186
+
187
+
### Can I use a local model without a Copilot plan?
No, currently you need to have access to a Copilot plan (for example, Copilot Free) to use a local model. This requirement might change in a future release.
## Related resources
* [Available language models in GitHub Copilot](https://docs.github.com/en/copilot/using-github-copilot/ai-models/changing-the-ai-model-for-copilot-chat?tool=vscode)