To simplify the process of switching between different language models, you can use the Any-LLM Unified Interface Tool by following these steps:
- Standardized model call format: every call goes through the `completion()` function, so switching models only requires changing the `model` parameter (in the `provider/model_id` format).
- Centralized API key management: configure the keys for all platforms once via environment variables (e.g. `export OPENAI_API_KEY='your_key'`) instead of repeating the setup at every call site.
- Encapsulation of common logic: create a standard call template that includes error handling and presets commonly used parameters such as `temperature`, reducing repetitive code.
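The value of the `provider/model_id` convention is that switching backends becomes a one-string change. A minimal sketch of that idea (the helper and the model names below are illustrative, not part of the Any-LLM API):

```python
# Sketch: the 'provider/model_id' naming convention selects both the
# backend and the model with a single string. parse_model_string() is
# a hypothetical helper for illustration only.

def parse_model_string(model: str) -> tuple[str, str]:
    """Split 'provider/model_id' into its two parts."""
    provider, _, model_id = model.partition('/')
    if not model_id:
        raise ValueError(f"expected 'provider/model_id', got {model!r}")
    return provider, model_id

# Switching models is just switching strings:
for name in ['mistral/mistral-small-latest', 'openai/gpt-4o-mini']:
    provider, model_id = parse_model_string(name)
    print(f'{provider} -> {model_id}')
```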
Typical implementation example:

```python
from any_llm import completion  # unified entry point provided by Any-LLM

def query_llm(question, model_name='mistral/mistral-small-latest'):
    """Query any supported model through the unified completion() interface."""
    try:
        return completion(
            model=model_name,
            messages=[{'role': 'user', 'content': question}],
            temperature=0.8,
        )
    except Exception as e:
        print(f'Call failed: {e}')
        return None
```
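Key configuration can be centralized the same way for each provider you plan to test. A sketch (the `OPENAI_API_KEY` name is from the article; the Mistral variable name is an assumption following the same convention and should be checked against that provider's docs):

```shell
# Set each provider's key once per shell session (or in a .env file)
# so every completion() call can pick it up automatically.
export OPENAI_API_KEY='your_key'
# Assumed naming convention for other providers; verify in their docs:
export MISTRAL_API_KEY='your_key'
```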
This approach can cut the cost of switching models by more than 90%, which makes it especially suitable for scenarios that require frequently comparing the output of different models.
This answer is based on the article "Any-LLM: Open Source Tool for Unified Interface Invocation of Multilingual Models".