
Leaked Microsoft paper: only 8B for GPT-4o-mini and 100B for o1-mini?

2025-01-02

There has been ongoing discussion about the parameter sizes of mainstream closed-source LLMs. In the final days of 2024, a Microsoft paper introducing MEDEC, a benchmark for the detection and correction of medical errors in clinical notes, accidentally leaked parameter estimates for several of them: o1-preview, GPT-4, GPT-4o, and Claude 3.5 Sonnet.

Paper address: https://arxiv.org/pdf/2412.19260v1

[Figure: Microsoft says GPT-4o-mini is only 8B, o1-mini only 100B]

The experimental section of the paper also groups the models' parameter scales into three tiers: 7-8B, ~100-300B, and ~1.7T. But placing GPT-4o-mini in the first tier, at only 8B, is a bit hard to believe.
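For context on what the paper actually evaluates: MEDEC asks a model to decide whether a clinical note contains a medical error and, if so, to locate and correct it. The sketch below shows how such a check might be scripted against the OpenAI chat API; the prompt wording, the toy note, and the use of `gpt-4o-mini` as the target model are illustrative assumptions, not the paper's actual evaluation harness.

```python
# Minimal sketch of a MEDEC-style error check (illustrative; not the
# paper's harness). Assumes the OpenAI Python SDK v1.x is installed and
# OPENAI_API_KEY is set in the environment.
from openai import OpenAI

client = OpenAI()

PROMPT = (
    "The following clinical note may contain one medical error.\n"
    "If it does, quote the erroneous sentence and give a correction.\n"
    "If it does not, answer exactly: CORRECT.\n\n"
    "Note:\n{note}"
)

def check_note(note: str, model: str = "gpt-4o-mini") -> str:
    """Ask the model to detect and correct an error in one clinical note."""
    resp = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": PROMPT.format(note=note)}],
        temperature=0,  # keep output as deterministic as possible
    )
    return resp.choices[0].message.content

if __name__ == "__main__":
    # Toy note with a deliberately wrong drug indication.
    note = "Patient with type 2 diabetes was started on warfarin for glycemic control."
    print(check_note(note))
```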

 

Summary


  • Claude 3.5 Sonnet (2024-10-22), ~175B
  • ChatGPT, ~175B
  • GPT-4, ~1.76T
  • GPT-4o, ~200B
  • GPT-4o-mini (gpt-4o-mini-2024-07-18), only 8B
  • o1-mini (o1-mini-2024-09-12), only 100B
  • o1-preview (o1-preview-2024-09-12), ~300B
