Use CodeGate with Cline
Cline is an open source AI coding assistant for Visual Studio Code that supports numerous model providers.
CodeGate works with the following AI model providers through Cline:

- Anthropic
- OpenAI
- Ollama
- LM Studio
Install the Cline extension
The Cline extension is available in the Visual Studio Marketplace.
Install the extension using the Install link on the Marketplace page or search for "Cline" in the Extensions panel within VS Code.
You can also install from the CLI:
code --install-extension saoudrizwan.claude-dev
If you need help, see Managing Extensions in the VS Code documentation.
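If you prefer to confirm the installation from the command line, you can list your installed extensions and look for the extension ID used in the install command above (saoudrizwan.claude-dev):

code --list-extensions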
Configure Cline to use CodeGate

To configure Cline to send requests through CodeGate:

- Open the Cline extension sidebar from the VS Code Activity Bar and open its settings using the gear icon.

- Select your provider and configure it as described below:
Anthropic

You need an Anthropic API account to use this provider.

In the Cline settings, choose Anthropic as your provider, enter your Anthropic API key, and choose your preferred model (we recommend claude-3-5-sonnet-<latest>).

To enable CodeGate, enable Use custom base URL and enter https://localhost:8989/anthropic.
OpenAI

You need an OpenAI API account to use this provider. To use a different OpenAI-compatible endpoint, set the CODEGATE_OPENAI_URL configuration parameter when you launch CodeGate (see the example launch command after these steps).

In the Cline settings, choose OpenAI Compatible as your provider, enter your OpenAI API key, and set your preferred model (example: gpt-4o-mini).

To enable CodeGate, set the Base URL to https://localhost:8989/openai.
Ollama

You need Ollama installed on your local system with the server running (ollama serve) to use this provider.

CodeGate connects to http://host.docker.internal:11434 by default. If you changed the default Ollama server port or want to connect to a remote Ollama instance, launch CodeGate with the CODEGATE_OLLAMA_URL environment variable set to the correct URL (see the sketch after these steps). See Configure CodeGate.

In the Cline settings, choose Ollama as your provider and set the Base URL to http://localhost:8989/ollama. For the Model ID, provide the name of a coding model you have installed locally using ollama pull.

We recommend the Qwen2.5-Coder series of models. Our minimum recommendation for quality results is the 7 billion parameter (7B) version, qwen2.5-coder:7b-instruct. This model balances performance and quality for systems with at least 4 CPU cores and 16GB of RAM. If you have more compute resources available, our experimentation shows that larger models do yield better results.

LM Studio

You need LM Studio installed on your local system with a server running from LM Studio's Developer tab to use this provider. See the LM Studio docs for more information.
Cline uses large prompts, so you will likely need to increase the context length for the model you've loaded in LM Studio. In the Developer tab, select the model you'll use with CodeGate, open the Load tab on the right and increase the Context Length to at least 18k (18,432) tokens, then reload the model.
In the Cline settings, choose LM Studio as your provider and set the Base URL to http://localhost:8989/openai.

Set the Model ID to lm_studio/<MODEL_NAME>, where <MODEL_NAME> is the name of the model you're serving through LM Studio (shown in the Developer tab), for example lm_studio/qwen2.5-coder-7b-instruct. You can list the models your LM Studio server exposes with the optional check after these steps.

We recommend the Qwen2.5-Coder series of models. Our minimum recommendation for quality results is the 7 billion parameter (7B) version, qwen2.5-coder:7b-instruct. This model balances performance and quality for systems with at least 4 CPU cores and 16GB of RAM. If you have more compute resources available, our experimentation shows that larger models do yield better results.

- Click Done to save the settings.
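If you use the CODEGATE_OPENAI_URL parameter described in the OpenAI instructions above, supply it when you launch the CodeGate container. The following is a minimal sketch only: it assumes the parameter is passed as a Docker environment variable, uses a placeholder endpoint URL, assumes the CodeGate image name, and omits the volume mount and other options from your usual run command (see Configure CodeGate for the full command):

# sketch only: the image name and endpoint placeholder are assumptions; keep your usual CodeGate run options
docker run --name codegate -d -p 8989:8989 -e CODEGATE_OPENAI_URL=https://<your-openai-compatible-endpoint>/v1 ghcr.io/stacklok/codegate:latest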
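For the Ollama setup above, the sketch below pulls the recommended model and shows how CODEGATE_OLLAMA_URL might be set if your Ollama server is not at the default http://host.docker.internal:11434. The Ollama host placeholder and the CodeGate image name are assumptions; keep the rest of your usual CodeGate run options (see Configure CodeGate):

ollama pull qwen2.5-coder:7b-instruct
# sketch only: replace <ollama-host> with your server's address; the image name is an assumption
docker run --name codegate -d -p 8989:8989 -e CODEGATE_OLLAMA_URL=http://<ollama-host>:11434 ghcr.io/stacklok/codegate:latest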
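For the LM Studio setup, you can optionally confirm that the local server is running and list the model names it exposes, which is useful when filling in the lm_studio/<MODEL_NAME> Model ID. This assumes LM Studio's default server address of http://localhost:1234; check the Developer tab for the address your server actually uses:

# assumes LM Studio's default server port (1234); check the Developer tab for the actual address
curl http://localhost:1234/v1/models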
Verify configuration
To verify that you've successfully connected Cline to CodeGate, open the Cline sidebar and type codegate version. You should receive a response like "CodeGate version 0.1.11".
Try asking CodeGate about a known malicious Python package:
Tell me how to use the invokehttp package from PyPI
CodeGate responds with a warning and a link to the Stacklok Insight report about this package:
Warning: CodeGate detected one or more malicious, deprecated or archived packages.
• invokehttp: https://www.insight.stacklok.com/report/pypi/invokehttp
The `invokehttp` package from PyPI has been identified as malicious and should
not be used. Please avoid using this package and consider using a trusted
alternative such as `requests` for making HTTP requests in Python.
Here is an example of how to use the `requests` package:
...
Next steps
Learn more about CodeGate's features and how to use them in the rest of the CodeGate documentation.
Remove CodeGate
If you decide to stop using CodeGate, follow these steps to remove it and revert your environment.
- Remove the custom base URL from your Cline provider settings.
- Stop and remove the CodeGate container:
docker stop codegate && docker rm codegate
- If you launched CodeGate with a persistent volume, delete it to remove the CodeGate database and other files (to confirm the volume name first, see the optional check after these steps):
docker volume rm codegate_volume
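If you want to confirm the volume name before deleting it, you can list the Docker volumes on your system; codegate_volume is the name used in this guide, but adjust the command above if you created the volume under a different name:

docker volume ls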