# Models
Cyclops Code uses LiteLLM to call any model. Switch providers by changing one string and setting the matching API key.
## Providers

### Anthropic
```shell
export ANTHROPIC_API_KEY="sk-ant-..."
cyclops --model anthropic/claude-opus-4-5
cyclops --model anthropic/claude-haiku-4-5-20251001
```

### OpenAI
```shell
export OPENAI_API_KEY="sk-..."
cyclops --model gpt-4o
cyclops --model gpt-4o-mini
```

### Groq

Groq offers a generous free tier and fast inference.
```shell
export GROQ_API_KEY="gsk_..."
cyclops --model groq/llama-3.3-70b-versatile
cyclops --model groq/llama-3.1-8b-instant
```

### Google Gemini
```shell
export GEMINI_API_KEY="..."
cyclops --model gemini/gemini-2.0-flash
cyclops --model gemini/gemini-1.5-pro
```

### Ollama (local)
No API key needed. Install Ollama, pull a model, then run:
```shell
ollama pull qwen2.5-coder:7b
cyclops --model ollama/qwen2.5-coder:7b
```

Good models for coding tasks: `qwen2.5-coder:7b`, `qwen2.5-coder:14b`, `deepseek-coder-v2`.
### Together AI

```shell
export TOGETHERAI_API_KEY="..."
cyclops --model together_ai/meta-llama/Meta-Llama-3.1-70B-Instruct-Turbo
```

## Switching models
Pass `--model` on any invocation to override the default for that run:
```shell
cyclops --model gpt-4o "review this PR"
```

Switch mid-session with `/model` in the REPL. Type a model name directly, or use the interactive picker, which supports search and autocomplete.
## Setting a default

The default model is stored in `~/.cyclops/config.json`:
```json
{
  "model": "anthropic/claude-haiku-4-5-20251001"
}
```

On first launch, Cyclops Code prompts you to pick a default.
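If you want to read the configured default from a script, it is a one-line JSON lookup. The sketch below is illustrative, not part of Cyclops Code; only the config path and the `"model"` key come from the snippet above, and the fallback value is an assumption:

```python
import json
from pathlib import Path

def default_model(config_path=Path.home() / ".cyclops" / "config.json",
                  fallback="anthropic/claude-haiku-4-5-20251001"):
    """Return the default model from Cyclops Code's config file.

    Falls back to a hard-coded model string if the file is missing
    or the "model" key is absent. Hypothetical helper for scripting.
    """
    try:
        config = json.loads(Path(config_path).read_text())
    except FileNotFoundError:
        return fallback
    return config.get("model", fallback)
```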
## Model strings

LiteLLM model strings follow the pattern `provider/model-name`. See the LiteLLM provider docs for the full list of supported models and their string formats.