# Ollama Provider
Run local open-source models via Ollama with zero API costs and complete privacy, or use Ollama Cloud for hosted models.
## Variants

### Ollama (Local)
Run models on your local machine with complete privacy and zero API costs.

Configuration:

- Base URL: `http://localhost:11434`
- Default model: `qwen3-coder`
- API key: Not required
- Context: 32K+ tokens
### Ollama Cloud
Access Ollama's hosted models via their cloud API.

Configuration:

- Base URL: `https://ollama.com`
- Default models: `glm-4.7:cloud`, `minimax-m2.1:cloud`
- API key: Required from ollama.com
- Context: Varies by model
## Prerequisites

### Installing Ollama (Local)

1. **Download Ollama**: Visit [ollama.com](https://ollama.com) and download for your platform.
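On Linux, installation can also be done from the terminal using Ollama's official install script (macOS and Windows use the downloadable app instead):

```shell
# Official one-line installer for Linux (from ollama.com)
curl -fsSL https://ollama.com/install.sh | sh

# Verify the install, then start the server
# (it listens on http://localhost:11434 by default)
ollama --version
ollama serve
```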
### Ollama Cloud Setup

1. **Create Account**: Sign up at [ollama.com](https://ollama.com) and generate an API key.
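Once you have a key, export it so your tooling can pick it up. The variable name below is illustrative — use whatever your CCS profile actually references:

```shell
# Illustrative variable name; match it to your CCS profile
export OLLAMA_API_KEY="your-key-from-ollama.com"
```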
## Configuration

### Local Ollama Setup
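A sketch of what a local profile in `~/.ccs/config.yaml` might look like. The key names here are assumptions for illustration — check the CCS documentation for the exact schema:

```yaml
# Hypothetical CCS profile — key names are illustrative
profiles:
  ollama-local:
    base_url: http://localhost:11434
    model: qwen3-coder
    # No API key required for local Ollama
```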
### Ollama Cloud Setup
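Correspondingly, a hosted profile might point at Ollama Cloud. Again, the schema is illustrative; the API key comes from ollama.com:

```yaml
# Hypothetical CCS profile — key names are illustrative
profiles:
  ollama-cloud:
    base_url: https://ollama.com
    model: glm-4.7:cloud
    api_key: ${OLLAMA_API_KEY}  # required for Ollama Cloud
```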
## Model Selection

### Popular Local Models
| Model | Size | Context | Use Case |
|---|---|---|---|
| `qwen3-coder` | 7B | 32K | Coding (recommended) |
| `deepseek-coder` | 6.7B | 16K | Code completion |
| `codellama` | 7B | 16K | Code generation |
| `mistral` | 7B | 8K | General purpose |
### Pulling Models
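Models are downloaded with the standard `ollama pull` command, for example:

```shell
# Download the recommended coding model
ollama pull qwen3-coder

# See what is installed locally
ollama list

# Remove a model you no longer need
ollama rm mistral
```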
### Cloud Models
| Model | Description |
|---|---|
| `glm-4.7:cloud` | GLM via Ollama Cloud |
| `minimax-m2.1:cloud` | Minimax via Ollama Cloud |
## Usage Examples

### Local Ollama
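With the server running, you can exercise the local endpoint directly. This sketch uses Ollama's standard `/api/generate` REST endpoint:

```shell
# Non-streaming completion against the local server
curl http://localhost:11434/api/generate -d '{
  "model": "qwen3-coder",
  "prompt": "Write a Python function that reverses a string.",
  "stream": false
}'
```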
### Ollama Cloud
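A sketch of the equivalent call against Ollama Cloud, assuming Bearer-token authentication with your ollama.com API key (the environment variable name is illustrative):

```shell
# Chat request against Ollama Cloud (Bearer auth assumed)
curl https://ollama.com/api/chat \
  -H "Authorization: Bearer $OLLAMA_API_KEY" \
  -d '{
    "model": "glm-4.7:cloud",
    "messages": [{"role": "user", "content": "Hello"}],
    "stream": false
  }'
```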
## Troubleshooting

### Connection Refused
**Symptom:** `Error: connect ECONNREFUSED 127.0.0.1:11434`

**Cause:** The Ollama service is not running.

**Solution:**
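Start the server and confirm it is listening:

```shell
# Start Ollama in the foreground...
ollama serve

# ...or via systemd on Linux
sudo systemctl start ollama

# Confirm the server responds
curl http://localhost:11434/api/version
```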
### Model Not Found

**Symptom:** `Error: model 'qwen3-coder' not found`

**Cause:** The model has not been pulled locally.

**Solution:**
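Pull the missing model, then confirm it appears in the local list:

```shell
ollama pull qwen3-coder
ollama list
```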
### Slow Responses

**Symptom:** Long response times

**Causes & Solutions:**

- CPU-only inference: use a smaller model or add GPU support
- Large model: switch to a smaller variant (e.g., `qwen3-coder:3b`)
- Insufficient RAM: close other apps, use quantized models
### Ollama Cloud API Errors

**Symptom:** `401 Unauthorized` or `403 Forbidden`

**Cause:** Missing or invalid API key.

**Solution:**
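Check that a key is actually present in your environment (the variable name depends on your CCS profile); if the key is set but still rejected, regenerate it at ollama.com:

```shell
# Fails loudly if the (illustrative) variable is unset or empty
echo "${OLLAMA_API_KEY:?API key not set}"
```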
## Performance Tuning

### Context Length
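Ollama's default context window can be smaller than a model's maximum; the standard `num_ctx` option raises it for a single request:

```shell
# Request a 32K context window via the options object
curl http://localhost:11434/api/generate -d '{
  "model": "qwen3-coder",
  "prompt": "Summarize the design notes above.",
  "options": { "num_ctx": 32768 },
  "stream": false
}'
```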
### Concurrency

Ollama handles concurrent requests through an internal queue. For better performance, tune the server's scheduling environment variables, e.g. `OLLAMA_NUM_PARALLEL` (parallel requests per model) and `OLLAMA_MAX_LOADED_MODELS` (how many models stay resident in memory).

## Cost Information
| Variant | Cost | Privacy |
|---|---|---|
| Ollama (Local) | $0 (hardware only) | Complete - data never leaves machine |
| Ollama Cloud | Varies by usage | Depends on Ollama Cloud privacy policy |
## Storage Locations
| Path | Description |
|---|---|
| `~/.ollama/models/` | Downloaded model files |
| `~/.ccs/config.yaml` | CCS profile configuration |
| `~/.ccs/ollama.settings.json` | Model preferences (if using Dashboard) |
