
Hunyuan-A13B is an efficient large language model open-sourced by Tencent's Hunyuan team

2025-08-23

Hunyuan-A13B: open-source background and core features

Hunyuan-A13B is a large language model officially open-sourced by Tencent's Hunyuan team on June 27, 2025, built on a Mixture-of-Experts (MoE) architecture. The model has 80 billion total parameters, of which 13 billion are active per token, a design that significantly reduces computational cost while maintaining strong performance. It is released on GitHub and Hugging Face as a pre-trained model, an instruction fine-tuned model, and optimized quantized versions (including FP8 and GPTQ-Int4) to meet the deployment needs of different hardware environments. The release also includes the training code, a technical report, and operation manuals, reflecting Tencent's commitment to sharing AI technology with the community.
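To make the hardware trade-off between the released formats concrete, here is a minimal back-of-the-envelope sketch. It assumes the 80B total parameter count stated above and the nominal bytes-per-weight of each format (BF16 for the full-precision release is an assumption); activations, KV cache, and runtime overhead are not included, so real memory needs will be higher.

```python
# Rough weight-memory estimate for the released model formats.
# 80B total parameters comes from the article; bytes-per-weight values
# are the nominal storage sizes of each numeric format.

TOTAL_PARAMS = 80e9  # total MoE parameters (all experts)

BYTES_PER_WEIGHT = {
    "bf16": 2.0,       # assumed full-precision release format
    "fp8": 1.0,        # FP8 quantized version
    "gptq-int4": 0.5,  # GPTQ-Int4 quantized version
}

def weight_memory_gb(total_params: float, fmt: str) -> float:
    """Approximate GB needed just to hold the weights in the given format."""
    return total_params * BYTES_PER_WEIGHT[fmt] / 1e9

for fmt in BYTES_PER_WEIGHT:
    print(f"{fmt:>9}: ~{weight_memory_gb(TOTAL_PARAMS, fmt):.0f} GB")
```

Under these assumptions, GPTQ-Int4 roughly quarters the weight footprint relative to BF16, which is why the quantized releases matter for smaller GPU setups.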

In terms of technical implementation, Hunyuan-A13B particularly emphasizes:

  • Balancing high performance and low cost: the MoE architecture activates only a fraction of the parameters per token
  • Comprehensive deployment options: multiple quantized versions available to accommodate different hardware
  • Complete developer support: full openness from model weights to training code
