The hardware requirements of the project depend mainly on the version of the gpt-oss model used:
- gpt-oss-20b: runs on an average desktop computer with at least 16GB of RAM
- gpt-oss-120b: requires professional-grade hardware, i.e. a high-performance GPU (e.g. an NVIDIA H100) with at least 80GB of video memory
The main differences are:
- Model size: 120b has a much larger parameter count, giving better understanding but at a far higher resource cost
- Response quality: 120b produces more sophisticated dialogue responses, while 20b is tuned for low latency
- Usage scenarios: 20b suits fast interactive gameplay; 120b suits in-depth conversational experiences
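As a rough illustration, the smaller model can be loaded on a single consumer GPU via the Hugging Face transformers library. This is a minimal sketch, not the project's actual loading code: the model id `openai/gpt-oss-20b`, the prompt, and the dtype/device settings are assumptions.

```python
# Minimal sketch: load the smaller model for low-latency local play.
# Assumes the Hugging Face transformers stack and the model id
# "openai/gpt-oss-20b"; swap in "openai/gpt-oss-120b" only on hardware
# with ~80GB of video memory.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "openai/gpt-oss-20b"  # assumption: not confirmed by the article
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # let the checkpoint decide the numeric precision
    device_map="auto",    # spread layers across GPU/CPU as memory allows
)

prompt = "You are the ship's computer. Status report, please."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```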
If you run into problems loading the model, try adjusting the PYTORCH_CUDA_ALLOC_CONF memory-allocator setting or lowering the inference precision (for example, by running a quantized variant).
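For example, the allocator option can be set before PyTorch initializes CUDA; this is a common PyTorch workaround for fragmentation-related out-of-memory errors, not something specific to this project:

```python
import os

# Must be set before torch initializes CUDA: "expandable_segments" reduces
# fragmentation, "max_split_size_mb" caps the size of cached blocks.
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "expandable_segments:True,max_split_size_mb:128"

import torch  # imported after the env var so the allocator picks it up
```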
This answer comes from the article "gpt-oss-space-game: a local voice-interactive space game built using open-source AI models".