
How do I configure Crush to use my local Ollama model?

2025-08-19

To configure Crush to use a local Ollama model, follow these steps:

  1. Ensure the Ollama service is running (default port 11434).
  2. Create the configuration file ~/.crush/config.json and add the following:
    {
      "providers": {
        "ollama": {
          "type": "openai",
          "base_url": "http://127.0.0.1:11434/v1",
          "api_key": "ollama"
        }
      }
    }
  3. Run crush model switch ollama to switch models.
  4. Enter crush code to test code generation.
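The steps above can be sketched as a short shell script. This is a minimal sketch, not an official installer: it assumes the default config location (~/.crush/config.json) and port (11434) stated in this guide, and the connectivity check is left commented out so the script does not require a live Ollama server.

```shell
#!/bin/sh
# Sketch of the configuration steps from this guide.
# Assumptions: default config path ~/.crush/config.json, default Ollama port 11434.

CONFIG_DIR="$HOME/.crush"
CONFIG_FILE="$CONFIG_DIR/config.json"

# Step 1: check that Ollama answers on its default port.
# Commented out so the sketch runs without a live server; uncomment to verify:
# curl -s http://127.0.0.1:11434/v1/models >/dev/null || echo "Ollama is not running"

# Step 2: write the provider configuration (overwrites any existing file).
mkdir -p "$CONFIG_DIR"
cat > "$CONFIG_FILE" <<'EOF'
{
  "providers": {
    "ollama": {
      "type": "openai",
      "base_url": "http://127.0.0.1:11434/v1",
      "api_key": "ollama"
    }
  }
}
EOF

# Sanity-check that the file is valid JSON before switching models (step 3).
python3 -c "import json, sys; json.load(open(sys.argv[1])); print('config OK')" "$CONFIG_FILE"
```

The "api_key" value is a placeholder: Ollama's OpenAI-compatible endpoint does not validate the key, but OpenAI-style clients require the field to be present.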

With this configuration, Crush runs completely offline, which makes it suitable for privacy-sensitive scenarios.
