The repository offers two models: gpt-oss-120b (117B parameters) and gpt-oss-20b (21B parameters). Hardware constraints should guide model selection:
- High-performance GPUs (e.g., H100): gpt-oss-120b is recommended; it requires at least 80 GB of GPU memory.
- Consumer-grade hardware (e.g., devices with 16 GB of RAM): gpt-oss-20b is recommended, as its resource footprint is much lower.
In the script, you select the model by modifying the model_path variable, for example:
```python
model_path = "openai/gpt-oss-20b"    # select the 20B model
# model_path = "openai/gpt-oss-120b" # select the 120B model
```
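If you prefer to pick the model programmatically, a minimal sketch (not part of the original script, and assuming PyTorch is installed) could choose based on detected GPU memory, using the 80 GB figure above as a rough threshold:

```python
import torch

# Rough heuristic: use the 120B model only when a GPU with ~80 GB of memory
# is available (e.g., an H100); otherwise fall back to the 20B model.
if torch.cuda.is_available():
    gpu_mem_gb = torch.cuda.get_device_properties(0).total_memory / 1024**3
else:
    gpu_mem_gb = 0

model_path = "openai/gpt-oss-120b" if gpu_mem_gb >= 80 else "openai/gpt-oss-20b"
print(f"Detected {gpu_mem_gb:.0f} GB of GPU memory; using {model_path}")
```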
The script automatically configures device mapping and optimization settings based on model size.
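As a rough illustration of what such automatic configuration can look like (this is a hedged sketch, not the repository's exact code), a Hugging Face Transformers load call might use `device_map="auto"` together with a reduced-precision dtype so the weights are spread across whatever devices are available:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Minimal loading sketch; the actual script may use different settings.
tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    torch_dtype=torch.bfloat16,  # reduced precision to lower memory use
    device_map="auto",           # place layers automatically across available devices
)
```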
This answer comes from the article *Collection of scripts and tutorials for fine-tuning OpenAI GPT OSS models*.