# Unsloth: An Open-Source Tool for Efficiently Fine-Tuning and Training Large Language Models

Unsloth is an open-source project that provides tools for efficiently fine-tuning and training large language models (LLMs). It supports many well-known model families, including Llama, Mistral, Phi, and Gemma. Unsloth's headline advantage is that it significantly reduces memory usage and speeds up training, letting users fine-tune and train models in much less time. The project also ships extensive documentation and tutorials to help users get started quickly and make full use of its features.

![Unsloth Zoo: a free fine-tuning toolkit for large models](https://www.kdjingpai.com/wp-content/uploads/2025/01/20f9934fd213804.png)
## Features

- **Efficient fine-tuning**: supports Llama, Mistral, Phi, Gemma, and other models, with 2-5x faster fine-tuning and 50-80% lower memory usage.
- **Free to use**: free notebooks are provided; add your dataset, run all the cells, and you get a fine-tuned model.
- **Multiple export formats**: export fine-tuned models to GGUF, Ollama, or vLLM, or upload them to Hugging Face.
- **Dynamic quantization**: dynamic 4-bit quantization improves model accuracy while adding less than 10% extra VRAM.
- **Long-context support**: an 89K context window for Llama 3.3 (70B) and a 342K context window for Llama 3.1 (8B).
- **Vision model support**: supports vision models such as Llama 3.2 Vision (11B), Qwen 2.5 VL (7B), and Pixtral (12B).
- **Inference optimization**: multiple inference-optimization options that significantly speed up inference.

## Usage Guide

### Installation

1. **Install the dependencies**: make sure Python 3.8 or later is installed, then install the following packages:
   ```bash
   pip install torch transformers datasets
   ```
2. **Clone the repository**: clone the Unsloth repository with Git:
   ```bash
   git clone https://github.com/unslothai/unsloth.git
   cd unsloth
   ```
3. **Install Unsloth**: run the following command:
   ```bash
   pip install -e .
   ```

### Tutorial

1. **Load a model**: load a pretrained model in a Python script:
   ```python
   from transformers import AutoModelForCausalLM, AutoTokenizer

   model_name = "unslothai/llama-3.3"  # illustrative model id; check Unsloth's Hugging Face page for exact checkpoint names
   model = AutoModelForCausalLM.from_pretrained(model_name)
   tokenizer = AutoTokenizer.from_pretrained(model_name)
   ```
2. **Fine-tune the model**: use the notebooks Unsloth provides, or a standard Hugging Face `Trainer`. A simple example (note that `Trainer` and `TrainingArguments` come from `transformers`, not `unsloth`):
   ```python
   from transformers import Trainer, TrainingArguments

   training_args = TrainingArguments(
       output_dir="./results",
       num_train_epochs=3,
       per_device_train_batch_size=4,
       save_steps=10_000,
       save_total_limit=2,
   )
   trainer = Trainer(
       model=model,
       args=training_args,
       train_dataset=train_dataset,  # assumes train/eval datasets are already prepared
       eval_dataset=eval_dataset,
   )
   trainer.train()
   ```
3. **Export the model**: after fine-tuning, the model can be exported in several formats:
   ```python
   model.save_pretrained("./finetuned_model")
   tokenizer.save_pretrained("./finetuned_model")
   ```

### Feature Details

- **Dynamic quantization**: during fine-tuning, Unsloth supports dynamic 4-bit quantization, which improves model accuracy while adding less than 10% extra VRAM. In Unsloth this is enabled when the model is loaded, not through a `TrainingArguments` option (the original snippet passed a `quantization` argument that `TrainingArguments` does not accept):
  ```python
  from unsloth import FastLanguageModel

  model, tokenizer = FastLanguageModel.from_pretrained(
      model_name="unslothai/llama-3.3",  # illustrative model id
      load_in_4bit=True,  # load the weights 4-bit quantized
  )
  ```
- **Long-context support**: Unsloth supports an 89K context window for Llama 3.3 (70B) and a 342K context window for Llama 3.1 (8B), which makes the models far better at handling long texts. The maximum sequence length is specified when loading the model (Unsloth uses `max_seq_length` rather than a `context_window` argument):
  ```python
  model, tokenizer = FastLanguageModel.from_pretrained(
      model_name="unslothai/llama-3.3",  # illustrative model id
      max_seq_length=89_000,
  )
  ```
- **Vision model support**: Unsloth also supports several vision models, such as Llama 3.2 Vision (11B), Qwen 2.5 VL (7B), and Pixtral (12B), which can be used for image understanding and processing tasks.
  For example:
  ```python
  from unsloth import FastVisionModel  # vision models load via FastVisionModel; transformers has no AutoModelForImageGeneration class

  model, tokenizer = FastVisionModel.from_pretrained(
      "unslothai/llama-3.2-vision",  # illustrative model id
  )
  ```
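The memory claims above (50-80% savings and under 10% overhead for dynamic 4-bit quantization) follow from simple precision arithmetic. As a rough sanity check, here is a small standalone helper (my own illustration, not part of the Unsloth API) that estimates the memory taken by the model weights alone at a given bit width:

```python
def weight_memory_gb(n_params_billions: float, bits_per_weight: float) -> float:
    """Approximate memory for the model weights alone,
    ignoring activations, KV cache, and optimizer state."""
    total_bytes = n_params_billions * 1e9 * bits_per_weight / 8
    return total_bytes / 1e9

# An 8B-parameter model at different precisions:
fp16_gb = weight_memory_gb(8, 16)   # 16.0 GB in half precision
int4_gb = weight_memory_gb(8, 4)    # 4.0 GB at 4-bit, a 75% reduction
print(f"fp16: {fp16_gb:.1f} GB, 4-bit: {int4_gb:.1f} GB")
```

A real training run additionally needs room for activations, gradients, and optimizer state, which is where Unsloth's other optimizations matter, but the weight arithmetic explains the headline savings.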
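The "add your dataset and run" workflow described above assumes each dataset row is rendered into a single training string before it reaches the trainer. A common convention for instruction data is the Alpaca-style template; the sketch below (the template wording and field names are assumptions for illustration, not Unsloth's exact prompt) shows the idea:

```python
# Alpaca-style instruction template (illustrative wording).
ALPACA_TEMPLATE = (
    "Below is an instruction that describes a task, paired with an input "
    "that provides further context. Write a response that completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n"
    "### Input:\n{input}\n\n"
    "### Response:\n{output}"
)

def format_example(row: dict) -> str:
    """Render one dataset row into a single training string."""
    return ALPACA_TEMPLATE.format(
        instruction=row["instruction"],
        input=row.get("input", ""),
        output=row["output"],
    )

row = {"instruction": "Translate to French.", "input": "Hello", "output": "Bonjour"}
text = format_example(row)
```

Whatever template you choose, apply it consistently at training and inference time, since the model learns to expect that exact structure.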