How to Configure Ollama with Continue
Discover Ollama models here
Get started with Ollama
Configuration
```yaml
name: My Config
version: 0.0.1
schema: v1
models:
  - name: <MODEL_NAME>
    provider: ollama
    model: <MODEL_ID>
    apiBase: http://<my endpoint>:11434 # if running a remote instance of Ollama
```
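For reference, here is a minimal filled-in sketch of the same configuration. The display name and model ID are example values, assuming a model such as `llama3.1:8b` has already been pulled with `ollama pull`; substitute whatever model you actually have available.

```yaml
name: My Config
version: 0.0.1
schema: v1
models:
  - name: Llama 3.1 8B   # example display name shown in Continue
    provider: ollama
    model: llama3.1:8b   # assumes this model was pulled locally, e.g. `ollama pull llama3.1:8b`
    # apiBase can usually be omitted when Ollama runs locally on the default port (11434)
```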
Check out a more advanced configuration here
How to Configure Model Capabilities in Ollama
Ollama models usually have their capabilities auto-detected correctly. However, if you're using custom model names, or if tool calls or image inputs aren't working, you can set capabilities explicitly:
```yaml
name: My Config
version: 0.0.1
schema: v1
models:
  - name: <CUSTOM_MODEL_NAME>
    provider: ollama
    model: <CUSTOM_MODEL_ID>
    capabilities:
      - tool_use # Enable if your model supports function calling
      - image_input # Enable for vision models
```
Many Ollama models support tool use by default, and vision models typically also support image input.
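As a concrete sketch, here is what an explicit capability override might look like for a vision model. The model name `llama3.2-vision` and the display name are assumptions for illustration; use the name of the model you have pulled.

```yaml
name: My Config
version: 0.0.1
schema: v1
models:
  - name: Llama 3.2 Vision   # example display name
    provider: ollama
    model: llama3.2-vision   # assumes this vision model has been pulled locally
    capabilities:
      - image_input          # tells Continue this model accepts images, bypassing auto-detection
```

Listing capabilities explicitly takes precedence over auto-detection, so only include entries your model actually supports.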