For development scenarios without network access, Crush offers a complete offline workflow:
- Local Model Integration: Support for locally hosted LLMs such as those served by Ollama, by pointing the provider at a local endpoint, e.g. `"base_url": "http://127.0.0.1:11434/v1"`
- Offline LSP Services: Pre-installed language servers such as gopls provide code intelligence without a network connection
- Context Caching: Session management keeps a history of interactions locally
- Preloaded Configurations: Common dependencies are bundled when installing via Nix or Homebrew
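
As a sketch of the local-model setup above, a provider entry in Crush's JSON configuration might look like the following. Only the `base_url` value comes from this article; the surrounding field names (`providers`, `type`, `models`) and the model ID `llama3` are assumptions for illustration and should be checked against Crush's own configuration reference:

```json
{
  "providers": {
    "ollama": {
      "type": "openai",
      "base_url": "http://127.0.0.1:11434/v1",
      "models": [
        { "id": "llama3", "name": "Llama 3 (local)" }
      ]
    }
  }
}
```

Ollama exposes an OpenAI-compatible API under `/v1`, which is why a generic OpenAI-style provider block can target it with no API key required.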
In offline environments such as on airplanes or at classified sites, developers retain full code generation, completion, and debugging capabilities, and all data processing happens locally.
This answer comes from the article "Crush: terminal AI programming assistant with integrated LSP and multi-model switching".