AgentVerse is compatible with both mainstream cloud-hosted large language model services and local deployments. In the cloud, it supports the OpenAI API (e.g., GPT-3.5) and the Azure OpenAI Service; locally, it supports open-source models such as LLaMA and Vicuna. For local deployment, the project provides a complete support chain, including a dedicated dependency list (requirements_local.txt) and a local model server startup script (run_local_model_server.sh). Users can flexibly switch between model types by modifying the YAML configuration file: for example, setting llm_type to local invokes a locally deployed LLaMA-2-7B model. In addition, the project integrates vLLM support to handle large-scale inference tasks.
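As a rough sketch of this switch, a configuration fragment might look like the following. Only llm_type and the general idea of a YAML config come from the description above; the surrounding field names (llm, model, temperature, max_tokens) are illustrative assumptions, not verbatim AgentVerse schema:

```yaml
# Illustrative agent config fragment (field layout assumed, not verbatim)
llm:
  llm_type: local         # switch to e.g. "openai" to use the OpenAI API instead
  model: llama-2-7b       # model served by run_local_model_server.sh
  temperature: 0.7
  max_tokens: 512
```

In practice, keeping the model backend as a single config field like this lets the same agent definitions run against either the cloud API or the local server without code changes.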
This answer comes from the article "AgentVerse: An Open Source Framework for Deploying Multi-Agent Collaboration and Simulation".