How to Run Claude Code with Ollama Models

# Install Ollama
curl -fsSL https://ollama.com/install.sh | sh # for Linux

# Pull the desired model
ollama pull glm-4.7-flash:latest # replace with your model

# Check the model list
ollama ls

# Install Claude Code
curl -fsSL https://claude.ai/install.sh | bash

# Point Claude Code at the local Ollama server
export ANTHROPIC_AUTH_TOKEN=ollama
export ANTHROPIC_BASE_URL=http://localhost:11434
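These exports only last for the current shell session. A small sketch for persisting them, assuming a Bash setup (the `~/.bashrc` path is an assumption; adjust for zsh, fish, or other shells):

```shell
# Append the Ollama connection settings to the shell profile, but only once.
# The profile path is an assumption -- change it if you use a different shell.
PROFILE="$HOME/.bashrc"
if ! grep -q 'ANTHROPIC_BASE_URL' "$PROFILE" 2>/dev/null; then
  echo 'export ANTHROPIC_AUTH_TOKEN=ollama' >> "$PROFILE"
  echo 'export ANTHROPIC_BASE_URL=http://localhost:11434' >> "$PROFILE"
fi
```

After this, new terminal sessions pick the variables up automatically.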

# Add Claude Code to your PATH and reload the shell
echo 'export PATH="$HOME/.local/bin:$PATH"' >> ~/.bashrc
source ~/.bashrc

# Check version
claude --version

# Run Claude Code with an Ollama model
claude --model glm-4.7-flash:latest
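The steps above can be wrapped in a small launcher script. This is a sketch, not part of the official tooling: the default model name is just the example used in this guide, and the script checks that `claude` is on the PATH before launching so it fails gracefully:

```shell
#!/usr/bin/env sh
# claude-ollama.sh -- hypothetical helper that points Claude Code at a local Ollama server.
MODEL="${1:-glm-4.7-flash:latest}"   # default is the example model from this guide

export ANTHROPIC_AUTH_TOKEN=ollama                 # placeholder token; Ollama ignores it
export ANTHROPIC_BASE_URL=http://localhost:11434   # Ollama's default port

echo "Launching Claude Code with model: $MODEL"
if command -v claude >/dev/null 2>&1; then
  exec claude --model "$MODEL"
else
  echo "claude not found on PATH -- install Claude Code first"
fi
```

Save it, `chmod +x claude-ollama.sh`, and pass any model from `ollama ls` as the first argument.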

Enjoy powerful agentic coding with Claude Code and Ollama models, completely free!
