AI Assistant supports the following providers:
- OpenAI (openai)
- Google AI (google)
- GitHub Copilot (copilot)
- Ollama (ollama)
The fastest way to set up most providers is through provider-specific environment variables:
OpenAI
- OPENAI_API_KEY (required)
- OPENAI_ORG_ID (optional)
- OPENAI_PROJECT_ID (optional)
- OPENAI_BASE_URL (optional)
Google AI
Ollama
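As an example, the OpenAI provider can be configured from the shell before launching the IDE. The key below is a placeholder, not a real credential, and the base URL shown is the default OpenAI endpoint:

```shell
# Export the OpenAI credentials before starting the IDE.
# The key value is a placeholder -- substitute your own.
export OPENAI_API_KEY="sk-placeholder"

# Optional: route requests through a custom endpoint.
export OPENAI_BASE_URL="https://api.openai.com/v1"
```

Variables exported this way are picked up by processes started from the same shell session.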
GitHub Copilot is available only in VS Code. Setting up GitHub Copilot in VS Code involves the following steps:

1. Install the GitHub Copilot Chat extension.
2. Sign in with your GitHub account.
3. When asked, allow DVT to use the Copilot features.
AI Assistant allows you to refine the configuration for advanced use cases, for example:

- To change the selection of models presented in the AI Assistant's interface.
- To use advanced options with the LLM provider (timeout, max_retries, ...).
- To change the LLM parameters (temperature, top_p, ...).
To accomplish this, the configuration needs to be written in a JSON file using the following structure:

{
  "models": [{
    "provider": "openai",
    "model": "gpt-4*",
    "modelOptions": {
      "temperature": 0.3
      ...
    },
    "providerOptions": {
      "apiKey": "..."
      ...
    }
  }]
}
The "models" array may contain any number of configurations. The "modelOptions" and "providerOptions" are specific to each model and provider; refer to the vendor documentation for the available options.
AI Assistant looks for the JSON configuration in these locations: