{"id":19586,"date":"2025-01-28T13:23:30","date_gmt":"2025-01-28T05:23:30","guid":{"rendered":"https:\/\/www.aisharenet.com\/?p=19586"},"modified":"2025-01-28T13:23:30","modified_gmt":"2025-01-28T05:23:30","slug":"qwen25-vl-fabu","status":"publish","type":"post","link":"https:\/\/www.kdjingpai.com\/de\/qwen25-vl-fabu\/","title":{"rendered":"Qwen2.5-VL \u53d1\u5e03\uff1a\u652f\u6301\u957f\u89c6\u9891\u7406\u89e3\u3001\u89c6\u89c9\u5b9a\u4f4d\u3001\u7ed3\u6784\u5316\u8f93\u51fa\uff0c\u5f00\u6e90\u53ef\u5fae\u8c03"},"content":{"rendered":"<h2>1.<strong>\u6a21\u578b\u4ecb\u7ecd<\/strong><\/h2>\n<p>\u81ea Qwen2-VL \u53d1\u5e03\u4ee5\u6765\u7684\u4e94\u4e2a\u6708\u91cc\uff0c\u4f17\u591a\u5f00\u53d1\u8005\u5728 Qwen2-VL \u89c6\u89c9\u8bed\u8a00\u6a21\u578b\u4e0a\u6784\u5efa\u4e86\u65b0\u6a21\u578b\uff0c\u4e3aQwen\u56e2\u961f\u63d0\u4f9b\u4e86\u5b9d\u8d35\u7684\u53cd\u9988\u3002\u5728\u6b64\u671f\u95f4\uff0cQwen\u56e2\u961f\u4e13\u6ce8\u4e8e\u6784\u5efa\u66f4\u6709\u7528\u7684\u89c6\u89c9\u8bed\u8a00\u6a21\u578b\u3002\u4eca\u5929\uff0cQwen\u56e2\u961f\u5f88\u9ad8\u5174\u5411\u5927\u5bb6\u4ecb\u7ecd Qwen \u5bb6\u65cf\u7684\u6700\u65b0\u6210\u5458\uff1aQwen2.5-VL\u3002<\/p>\n<p>&nbsp;<\/p>\n<p><strong>\u4e3b\u8981\u589e\u5f3a\u529f\u80fd\uff1a<\/strong><\/p>\n<ul>\n<li>\u89c6\u89c9\u7406\u89e3\u4e8b\u7269\uff1aQwen2.5-VL\u4e0d\u4ec5\u80fd\u591f\u719f\u7ec3\u8bc6\u522b\u82b1\u3001\u9e1f\u3001\u9c7c\u3001\u6606\u866b\u7b49\u5e38\u89c1\u7269\u4f53\uff0c\u800c\u4e14\u8fd8\u80fd\u591f\u5206\u6790\u56fe\u50cf\u4e2d\u7684\u6587\u672c\u3001\u56fe\u8868\u3001\u56fe\u6807\u3001\u56fe\u5f62\u548c\u5e03\u5c40\u3002<\/li>\n<li>\u4ee3\u7406\u6027\uff1aQwen2.5-VL\u76f4\u63a5\u626e\u6f14\u89c6\u89c9\u4ee3\u7406\u7684\u89d2\u8272\uff0c\u5177\u6709\u63a8\u7406\u548c\u52a8\u6001\u6307\u6325\u5de5\u5177\u7684\u529f\u80fd\uff0c\u53ef\u7528\u4e8e\u7535\u8111\u548c\u624b\u673a\u3002<\/li>\n<li>\u7406\u89e3\u957f\u89c6\u9891\u5e76\u6355\u6349\u4e8b\u4ef6\uff1aQwen2.5-VL \u53ef\u4ee5\u7406\u89e3\u8d85\u8fc7 1 \u5c0f\u65f6\u7684\u89c6\u9891\uff0c\u8fd9\u6b21\u5b83\u8fd8\u5177\u6709\u901a\u8fc7\u7cbe\u786e\u5b9a\u4f4d\u76f8\u5173\u89c6\u9891\u7247\u6bb5\u6765\u6355\u6349\u4e8b\u4ef6\u7684\u65b0\u529f\u80fd\u3002<\/li>\n<li>\u80fd\u591f\u8fdb\u884c\u4e0d\u540c\u683c\u5f0f\u7684\u89c6\u89c9\u5b9a\u4f4d\uff1aQwen2.5-VL \u53ef\u4ee5\u901a\u8fc7\u751f\u6210\u8fb9\u754c\u6846\u6216\u70b9\u6765\u51c6\u786e\u5b9a\u4f4d\u56fe\u50cf\u4e2d\u7684\u5bf9\u8c61\uff0c\u5e76\u4e14\u53ef\u4ee5\u4e3a\u5750\u6807\u548c\u5c5e\u6027\u63d0\u4f9b\u7a33\u5b9a\u7684 JSON \u8f93\u51fa\u3002<\/li>\n<li>\u751f\u6210\u7ed3\u6784\u5316\u8f93\u51fa\uff1a\u5bf9\u4e8e\u53d1\u7968\u3001\u8868\u683c\u3001\u8868\u683c\u7b49\u626b\u63cf\u4ef6\u6570\u636e\uff0cQwen2.5-VL \u652f\u6301\u5176\u5185\u5bb9\u7684\u7ed3\u6784\u5316\u8f93\u51fa\uff0c\u6709\u5229\u4e8e\u91d1\u878d\u3001\u5546\u4e1a\u7b49\u9886\u57df\u7684\u7528\u9014\u3002<\/li>\n<\/ul>\n<p>&nbsp;<\/p>\n<p><strong>\u6a21\u578b\u67b6\u6784\uff1a<\/strong><\/p>\n<ul>\n<li>\u7528\u4e8e\u89c6\u9891\u7406\u89e3\u7684\u52a8\u6001\u5206\u8fa8\u7387\u548c\u5e27\u901f\u7387\u8bad\u7ec3\uff1a<\/li>\n<\/ul>\n<p>\u901a\u8fc7\u91c7\u7528\u52a8\u6001 FPS \u91c7\u6837\u5c06\u52a8\u6001\u5206\u8fa8\u7387\u6269\u5c55\u5230\u65f6\u95f4\u7ef4\u5ea6\uff0c\u4f7f\u6a21\u578b\u80fd\u591f\u7406\u89e3\u5404\u79cd\u91c7\u6837\u7387\u7684\u89c6\u9891\u3002\u76f8\u5e94\u5730\uff0cQwen\u56e2\u961f\u5728\u65f6\u95f4\u7ef4\u5ea6\u4e0a\u7528 ID \u548c\u7edd\u5bf9\u65f6\u95f4\u5bf9\u9f50\u66f4\u65b0 
- Streamlined, efficient vision encoder:

The Qwen team improved training and inference speed by strategically introducing window attention into the ViT. The ViT architecture is further optimized with SwiGLU and RMSNorm, keeping its structure consistent with the Qwen2.5 LLM.

This release open-sources three models, with 3 billion, 7 billion, and 72 billion parameters. This repo contains the instruction-tuned 72B [Qwen2.5-VL](https://www.kdjingpai.com/ja/qwen25-vl/) model.

**Model collection:**

https://www.modelscope.cn/collections/Qwen25-VL-58fbb5d31f1d47

**Try the model:**

https://chat.qwenlm.ai/

**Technical blog:**

https://qwenlm.github.io/blog/qwen2.5-vl/

**Code:**

https://github.com/QwenLM/Qwen2.5-VL

## 2. Model Performance

Model evaluation:

![Qwen2.5-VL benchmark results](https://www.kdjingpai.com/wp-content/uploads/2025/01/78cfc4d09926d1b.png)

## 3. Model Inference

### Inference with transformers

The Qwen2.5-VL code is already in the latest transformers; it is recommended to build from source with:

```bash
pip install git+https://github.com/huggingface/transformers
```

A toolkit is provided to handle various types of visual input more conveniently, as if using an API. This includes base64, URLs, and interleaved images and videos. It can be installed with:

```bash
pip install qwen-vl-utils[decord]==0.0.8
```

Inference code:

```python
from transformers import Qwen2_5_VLForConditionalGeneration, AutoTokenizer, AutoProcessor
from qwen_vl_utils import process_vision_info
from modelscope import snapshot_download
import torch  # only needed if enabling the flash_attention_2 option below

# Download and load the model
model_dir = snapshot_download("Qwen/Qwen2.5-VL-3B-Instruct")

# Default: load the model on the available device(s)
model = Qwen2_5_VLForConditionalGeneration.from_pretrained(
    model_dir, torch_dtype="auto", device_map="auto"
)

# Optional: enable flash_attention_2 for better acceleration and memory saving
# model = Qwen2_5_VLForConditionalGeneration.from_pretrained(
#     model_dir,
#     torch_dtype=torch.bfloat16,
#     attn_implementation="flash_attention_2",
#     device_map="auto",
# )

# Load the default processor
processor = AutoProcessor.from_pretrained(model_dir)

# Optional: set custom min and max pixels for the visual token range
# min_pixels = 256 * 28 * 28
# max_pixels = 1280 * 28 * 28
# processor = AutoProcessor.from_pretrained(
#     model_dir, min_pixels=min_pixels, max_pixels=max_pixels
# )

# Define input messages
messages = [
    {
        "role": "user",
        "content": [
            {
                "type": "image",
                "image": "https://qianwen-res.oss-cn-beijing.aliyuncs.com/Qwen-VL/assets/demo.jpeg",
            },
            {"type": "text", "text": "Describe this image."},
        ],
    }
]

# Prepare inputs for inference
text = processor.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
image_inputs, video_inputs = process_vision_info(messages)
inputs = processor(
    text=[text],
    images=image_inputs,
    videos=video_inputs,
    padding=True,
    return_tensors="pt",
)
inputs = inputs.to("cuda")

# Inference: generate output
generated_ids = model.generate(**inputs, max_new_tokens=128)
generated_ids_trimmed = [
    out_ids[len(in_ids):] for in_ids, out_ids in zip(inputs.input_ids, generated_ids)
]
output_text = processor.batch_decode(
    generated_ids_trimmed, skip_special_tokens=True, clean_up_tokenization_spaces=False
)

# Print the generated output
print(output_text)
```
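For video input, only the messages need to change; the rest of the pipeline (chat template, process_vision_info, processor, generate) stays the same. The sketch below follows the video message format documented in the qwen-vl-utils README; the file path is a placeholder, and exact support for the `max_pixels` and `fps` keys should be checked against the installed qwen-vl-utils version.

```python
# A minimal video message, assuming the qwen-vl-utils video entry format;
# replace the path with a real local file or URL.
messages = [
    {
        "role": "user",
        "content": [
            {
                "type": "video",
                "video": "file:///path/to/video.mp4",  # placeholder path
                "max_pixels": 360 * 420,                # cap resolution per frame
                "fps": 1.0,                             # sampling frame rate
            },
            {"type": "text", "text": "Describe this video."},
        ],
    }
]

# The remaining steps are identical to the image example above:
# text = processor.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
# image_inputs, video_inputs = process_vision_info(messages)
# inputs = processor(text=[text], images=image_inputs, videos=video_inputs,
#                    padding=True, return_tensors="pt").to("cuda")
```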
### Calling the model directly via ModelScope API-Inference

ModelScope's API-Inference also provides day-one support for the Qwen2.5-VL series. ModelScope users can use the models directly through API calls. For details on how to use API-Inference, see the instructions on the model page (for example, https://www.modelscope.cn/models/Qwen/Qwen2.5-VL-72B-Instruct):

![API-Inference instructions on the ModelScope model page](https://www.kdjingpai.com/wp-content/uploads/2025/01/189ed6fc6f4afe8.png)

Or see the API-Inference documentation:

https://www.modelscope.cn/docs/model-service/API-Inference/intro

Taking the image below as an example, call the API with the Qwen/Qwen2.5-VL-72B-Instruct model:

![Example image: flock of birds](https://www.kdjingpai.com/wp-content/uploads/2025/01/5122e08a3c140b8.jpeg)

```python
from openai import OpenAI

# Initialize the OpenAI client
client = OpenAI(
    api_key="<MODELSCOPE_SDK_TOKEN>",  # ModelScope Token
    base_url="https://api-inference.modelscope.cn/v1"
)

# Create a chat completion request
response = client.chat.completions.create(
    model="Qwen/Qwen2.5-VL-72B-Instruct",  # ModelScope Model-Id
    messages=[
        {
            "role": "user",
            "content": [
                {
                    "type": "image_url",
                    "image_url": {
                        "url": "https://modelscope.oss-cn-beijing.aliyuncs.com/demo/images/bird-vl.jpg"
                    }
                },
                {
                    "type": "text",
                    "text": (
                        "Count the number of birds in the figure, including those that "
                        "are only showing their heads. To ensure accuracy, first detect "
                        "their key points, then give the total number."
                    )
                },
            ],
        }
    ],
    stream=True
)

# Stream the response
for chunk in response:
    print(chunk.choices[0].delta.content, end='', flush=True)
```

![Model response to the bird-counting prompt](https://www.kdjingpai.com/wp-content/uploads/2025/01/5aead7f78935550.png)
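To send a local image instead of a remote URL, OpenAI-compatible chat APIs generally accept a base64 data URL in the `image_url` field. The snippet below is a minimal sketch under that assumption; the file path is a placeholder, and whether this particular endpoint accepts data URLs should be confirmed against the API-Inference documentation.

```python
import base64

# Read a local image and wrap it as a data URL; the path is a placeholder.
with open("/path/to/local_image.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

image_content = {
    "type": "image_url",
    "image_url": {"url": f"data:image/jpeg;base64,{image_b64}"},
}
# Use image_content in place of the URL-based image entry in the messages above.
```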
## 4. Model Fine-Tuning

We describe fine-tuning Qwen/Qwen2.5-VL-7B-Instruct with ms-swift. ms-swift is the official fine-tuning and deployment framework for large language models and multimodal models provided by the ModelScope community. ms-swift is open-sourced at:

https://github.com/modelscope/ms-swift

Here we show a runnable fine-tuning demo and give the format for custom datasets.

Before starting fine-tuning, make sure your environment is properly set up:

```bash
git clone https://github.com/modelscope/ms-swift.git
cd ms-swift
pip install -e .
```

The image-OCR fine-tuning script is as follows:

```bash
MAX_PIXELS=1003520 \
CUDA_VISIBLE_DEVICES=0 \
swift sft \
    --model Qwen/Qwen2.5-VL-7B-Instruct \
    --dataset AI-ModelScope/LaTeX_OCR:human_handwrite#20000 \
    --train_type lora \
    --torch_dtype bfloat16 \
    --num_train_epochs 1 \
    --per_device_train_batch_size 1 \
    --per_device_eval_batch_size 1 \
    --learning_rate 1e-4 \
    --lora_rank 8 \
    --lora_alpha 32 \
    --target_modules all-linear \
    --freeze_vit true \
    --gradient_accumulation_steps 16 \
    --eval_steps 50 \
    --save_steps 50 \
    --save_total_limit 5 \
    --logging_steps 5 \
    --max_length 2048 \
    --output_dir output \
    --warmup_ratio 0.05 \
    --dataloader_num_workers 4
```

Training GPU memory usage:

![GPU memory usage during image-OCR fine-tuning](https://www.kdjingpai.com/wp-content/uploads/2025/01/28be694a5985372.png)

The video fine-tuning script is as follows:

```bash
# For the meaning of VIDEO_MAX_PIXELS and related parameters, see:
# https://swift.readthedocs.io/zh-cn/latest/Instruction/%E5%91%BD%E4%BB%A4%E8%A1%8C%E5%8F%82%E6%95%B0.html#id18

nproc_per_node=2
CUDA_VISIBLE_DEVICES=0,1 \
NPROC_PER_NODE=$nproc_per_node \
VIDEO_MAX_PIXELS=100352 \
FPS_MAX_FRAMES=24 \
swift sft \
    --model Qwen/Qwen2.5-VL-7B-Instruct \
    --dataset swift/VideoChatGPT:all \
    --train_type lora \
    --torch_dtype bfloat16 \
    --num_train_epochs 1 \
    --per_device_train_batch_size 1 \
    --per_device_eval_batch_size 1 \
    --learning_rate 1e-4 \
    --lora_rank 8 \
    --lora_alpha 32 \
    --target_modules all-linear \
    --freeze_vit true \
    --gradient_accumulation_steps $(expr 16 / $nproc_per_node) \
    --eval_steps 50 \
    --save_steps 50 \
    --save_total_limit 5 \
    --logging_steps 5 \
    --max_length 2048 \
    --output_dir output \
    --warmup_ratio 0.05 \
    --dataloader_num_workers 4 \
    --deepspeed zero2
```

Training GPU memory usage:

![GPU memory usage during video fine-tuning](https://www.kdjingpai.com/wp-content/uploads/2025/01/7547e73b9fca83f.png)
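As a rough sanity check on these limits, assuming (as in the Qwen2-VL preprocessing defaults) that each 28×28-pixel region becomes one visual token and that frames are merged temporally in pairs, the settings above bound the video token budget well inside `--max_length 2048`. Both assumptions are about the default preprocessing, not values stated in the script:

```python
# Hypothetical token-budget estimate for the video fine-tuning settings above.
VIDEO_MAX_PIXELS = 100352          # = 128 * 28 * 28, per the script
FPS_MAX_FRAMES = 24                # per the script
PIXELS_PER_TOKEN = 28 * 28         # assumed: one visual token per 28x28 region
TEMPORAL_MERGE = 2                 # assumed: frames merged in pairs

tokens_per_frame = VIDEO_MAX_PIXELS // PIXELS_PER_TOKEN              # 128
video_tokens = FPS_MAX_FRAMES // TEMPORAL_MERGE * tokens_per_frame   # 1536
print(tokens_per_frame, video_tokens)  # leaves room for text within --max_length 2048
```

Under the same assumption, MAX_PIXELS=1003520 (= 1280 × 28 × 28) in the image scripts caps a single image at roughly 1280 visual tokens.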
The custom dataset format is as follows (the system field is optional); just specify `--dataset <dataset_path>`:

```
{"messages": [{"role": "user", "content": "浙江的省会在哪？"}, {"role": "assistant", "content": "浙江的省会在杭州。"}]}
{"messages": [{"role": "user", "content": "<image><image>两张图片有什么区别"}, {"role": "assistant", "content": "前一张是小猫，后一张是小狗"}], "images": ["/xxx/x.jpg", "xxx/x.png"]}
{"messages": [{"role": "system", "content": "你是个有用无害的助手"}, {"role": "user", "content": "<video>视频中是什么"}, {"role": "assistant", "content": "视频中是一只小狗在草地上奔跑"}], "videos": ["/xxx/x.mp4"]}
```
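If your data lives in another form, a few lines of Python are enough to emit this JSONL format. The sketch below is illustrative: the field names follow the examples above, while the sample records and the output filename are placeholders.

```python
import json

# Hypothetical (question, answer, image path) records to convert.
records = [
    ("What is in this picture?", "A cat sitting on a sofa.", "/data/images/0001.jpg"),
    ("What is in this picture?", "A dog running on grass.", "/data/images/0002.jpg"),
]

with open("custom_dataset.jsonl", "w", encoding="utf-8") as f:
    for question, answer, image_path in records:
        row = {
            "messages": [
                # One <image> tag per entry in the "images" list.
                {"role": "user", "content": f"<image>{question}"},
                {"role": "assistant", "content": answer},
            ],
            "images": [image_path],
        }
        f.write(json.dumps(row, ensure_ascii=False) + "\n")
```

The resulting file can then be passed as `--dataset custom_dataset.jsonl`, the same way as the built-in datasets.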
The grounding-task fine-tuning script is as follows:

```bash
CUDA_VISIBLE_DEVICES=0 \
MAX_PIXELS=1003520 \
swift sft \
    --model Qwen/Qwen2.5-VL-7B-Instruct \
    --dataset 'AI-ModelScope/coco#20000' \
    --train_type lora \
    --torch_dtype bfloat16 \
    --num_train_epochs 1 \
    --per_device_train_batch_size 1 \
    --per_device_eval_batch_size 1 \
    --learning_rate 1e-4 \
    --lora_rank 8 \
    --lora_alpha 32 \
    --target_modules all-linear \
    --freeze_vit true \
    --gradient_accumulation_steps 16 \
    --eval_steps 100 \
    --save_steps 100 \
    --save_total_limit 2 \
    --logging_steps 5 \
    --max_length 2048 \
    --output_dir output \
    --warmup_ratio 0.05 \
    --dataloader_num_workers 4 \
    --dataset_num_proc 4
```

Training GPU memory usage:

![GPU memory usage during grounding fine-tuning](https://www.kdjingpai.com/wp-content/uploads/2025/01/e03202aae563e5d.png)

The custom dataset format for grounding tasks is as follows:

```
{"messages": [{"role": "system", "content": "You are a helpful assistant."}, {"role": "user", "content": "<image>描述图像"}, {"role": "assistant", "content": "<ref-object><bbox>和<ref-object><bbox>正在沙滩上玩耍"}], "images": ["/xxx/x.jpg"], "objects": {"ref": ["一只狗", "一个女人"], "bbox": [[331.5, 761.4, 853.5, 1594.8], [676.5, 685.8, 1099.5, 1427.4]]}}
{"messages": [{"role": "system", "content": "You are a helpful assistant."}, {"role": "user", "content": "<image>找到图像中的<ref-object>"}, {"role": "assistant", "content": "<bbox><bbox>"}], "images": ["/xxx/x.jpg"], "objects": {"ref": ["羊"], "bbox": [[90.9, 160.8, 135, 212.8], [360.9, 480.8, 495, 532.8]]}}
```

After training completes, use the following command to run inference on the validation set used during training.

Here `--adapters` needs to be replaced with the last checkpoint folder generated by training. Since the adapters folder contains the training parameter file, there is no need to additionally specify `--model`:

```bash
CUDA_VISIBLE_DEVICES=0 \
swift infer \
    --adapters output/vx-xxx/checkpoint-xxx \
    --stream false \
    --max_batch_size 1 \
    --load_data_args true \
    --max_new_tokens 2048
```

Push the model to ModelScope:

```bash
CUDA_VISIBLE_DEVICES=0 \
swift export \
    --adapters output/vx-xxx/checkpoint-xxx \
    --push_to_hub true \
    --hub_model_id '<your-model-id>' \
    --hub_token '<your-sdk-token>'
```