
What types of large language models can MaxKB integrate with? How do you configure a local private model?

2025-09-10

MaxKB uses an open model-integration architecture and supports three main types of model access:

  • Public cloud models: including the OpenAI GPT series, Anthropic Claude, Zhipu AI, and other mainstream domestic and international APIs.
  • Open-source local models: self-hosted models such as Llama 3, ChatGLM3, and Qwen, accessed via Ollama or vLLM.
  • Enterprise private models: any locally deployed model that exposes an OpenAI-compatible API (see the sketch after this list).
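Because the third category only requires OpenAI API compatibility, such an endpoint can be exercised with the standard OpenAI Python client pointed at a local base URL. The snippet below is a minimal sketch, not MaxKB code: it assumes an Ollama server exposing its OpenAI-compatible endpoint at http://localhost:11434/v1, and the model name is a placeholder for whatever you actually serve.

```python
# Minimal sketch: calling a locally hosted, OpenAI-compatible endpoint.
# Assumes `pip install openai` and a local server (e.g. Ollama) at
# http://localhost:11434/v1; the model name is illustrative.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # local endpoint, not api.openai.com
    api_key="ollama",                      # placeholder; local servers usually ignore it
)

response = client.chat.completions.create(
    model="qwen2.5",  # replace with the model you actually serve
    messages=[{"role": "user", "content": "Summarize what MaxKB does in one sentence."}],
)
print(response.choices[0].message.content)
```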

Specific steps for configuring a local private model:

  1. Select the "Customize Model" option in the Model Management module.
  2. Fill in the model API address (e.g. http://localhost:11434/api/generate).
  3. Set parameters such as the model name and context length.
  4. Run a connectivity test, then save the configuration (a minimal test sketch follows this list).
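A quick way to verify step 4 before saving is to call the API address from step 2 directly. This is a hedged sketch assuming that address is Ollama's /api/generate endpoint and that a non-streaming request is acceptable; the model name is an assumption and should match what you configured in MaxKB.

```python
# Connectivity test sketch for the local endpoint from step 2.
# Assumes an Ollama server at localhost:11434 and requires `pip install requests`.
# The model name "qwen2.5" is an assumption; use the one configured in MaxKB.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={"model": "qwen2.5", "prompt": "ping", "stream": False},
    timeout=30,
)
resp.raise_for_status()                 # fail loudly on HTTP errors
print(resp.json().get("response", ""))  # a non-empty reply means the model answered
```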

The system also supports model hot swapping and A/B testing, so requests can be routed to the most suitable model for each business scenario. For example, customer service can use a lower-cost 7B model, while technical document parsing can be switched to a 70B-parameter model.
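MaxKB's routing is configured through its own UI, so the snippet below is only a conceptual sketch of scenario-based model selection, not MaxKB internals; the scenario keys and model names are assumptions chosen to mirror the 7B/70B example above.

```python
# Conceptual sketch of scenario-based model routing (not MaxKB internals).
# Model names are illustrative placeholders for a small 7B and a large 70B model.
MODEL_BY_SCENARIO = {
    "customer_service": "qwen2.5:7b",   # cheap, fast small model
    "technical_docs": "llama3.1:70b",   # larger model for document parsing
}

def pick_model(scenario: str) -> str:
    """Return the model configured for a scenario, falling back to the small one."""
    return MODEL_BY_SCENARIO.get(scenario, MODEL_BY_SCENARIO["customer_service"])

print(pick_model("technical_docs"))  # -> llama3.1:70b
```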
