
How can I use a local large language model in Sim?

2025-08-19

Integrating a local large language model with Sim requires the following steps:

  1. First, pull the desired model with the helper script: `./apps/sim/scripts/ollama_docker.sh pull <model_name>`
  2. Choose the startup command that matches your hardware:
    • GPU environment: `docker compose --profile local-gpu -f docker-compose.ollama.yml up -d`
    • CPU environment: `docker compose --profile local-cpu -f docker-compose.ollama.yml up -d`
  3. Select the pulled local model in the workflow configuration
  4. Specify GPU or CPU mode as required
  5. Test the model's responses to confirm the workflow is working properly
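Steps 1 and 2 can be run as a single command sequence from the repository root. This is a sketch: `llama3` stands in for whatever model you want, and it assumes the script and compose file paths that ship with the Sim repository.

```shell
# Pull the model into the local Ollama store ("llama3" is an example name)
./apps/sim/scripts/ollama_docker.sh pull llama3

# Start Ollama with GPU acceleration...
docker compose --profile local-gpu -f docker-compose.ollama.yml up -d

# ...or, on machines without a GPU, use the CPU profile instead:
# docker compose --profile local-cpu -f docker-compose.ollama.yml up -d
```

The `-d` flag runs the containers in the background; `docker compose ... down` stops them again.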

Note that local models require substantial storage space and computational resources, and a GPU environment delivers noticeably better performance than CPU-only inference.
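For step 5, you can check that the local model responds before wiring it into a workflow by calling Ollama's HTTP API directly (it listens on port 11434 by default). A minimal sketch, assuming the GPU or CPU container from step 2 is already running; the model name `llama3` is an example:

```python
import json
import urllib.request

# Default Ollama endpoint for single-shot (non-streaming) generation
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> bytes:
    # Ollama's /api/generate takes a JSON body; stream=False makes it
    # return one complete JSON object instead of a stream of chunks.
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()

def query(model: str, prompt: str) -> str:
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_payload(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # The generated text is in the "response" field of the reply
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(query("llama3", "Reply with the single word: ready"))
```

If this prints a reply, the model is up and the same model name can be selected in the Sim workflow configuration.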
