# MegaPairs: BGE's New Multimodal Vector Embedding Model

MegaPairs is an open-source project from the VectorSpaceLab team on GitHub. It uses large-scale data synthesis to build multimodal embedding models for image-text-to-image retrieval. On a dataset of more than 26 million heterogeneous KNN triplets, the project trains the BGE-VL model series, comprising BGE-VL-CLIP (base and large) and BGE-VL-MLLM (S1 and S2). BGE-VL-MLLM-S1 improves the CIRCO zero-shot composed image retrieval benchmark by 8.1% (mAP@5) and also performs strongly on the MMEB multimodal embedding benchmark. The code and models are released on GitHub and Hugging Face under the MIT license; the dataset is planned for a later release and is sourced from Recap-Datacomp (CC BY 4.0).

![MegaPairs overview](https://www.kdjingpai.com/wp-content/uploads/2025/03/8c4bae6a30f7477.jpg)
## Feature List

- **Large-scale dataset generation**: provides more than 26 million heterogeneous KNN triplets for training multimodal embedding models.
- **BGE-VL-CLIP embedding models**: base and large versions that produce image and text embeddings for efficient retrieval.
- **BGE-VL-MLLM embedding models**: S1 and S2 versions that produce high-performance multimodal embeddings and support zero-shot retrieval.
- **Zero-shot retrieval**: generates embeddings and completes image-text retrieval tasks without any training.
- **Open models and extensibility**: pretrained models are available on Hugging Face for download, use, and fine-tuning.

## Usage Guide

MegaPairs distributes its code and models through GitHub and Hugging Face, so users can quickly generate multimodal embeddings and run retrieval tasks. The walkthrough below follows the official BGE-VL-MLLM-S1 instructions on Hugging Face.

### Download and Installation

1. **Visit the GitHub repository**: open `https://github.com/VectorSpaceLab/MegaPairs` to view the project details.
2. **Clone the repository**: run the following in a terminal:

```shell
git clone https://github.com/VectorSpaceLab/MegaPairs.git
cd MegaPairs
```

3. **Install dependencies**: with Python 3.10, create a virtual environment and install the required libraries:

```shell
python -m venv venv
source venv/bin/activate  # Linux/Mac
venv\Scripts\activate     # Windows
pip install torch transformers==4.41.2 sentencepiece
```

The Hugging Face model card requires `transformers==4.41.2` and `sentencepiece`.

4. **Download the model**: get BGE-VL-MLLM-S1 from Hugging Face:
   - visit https://huggingface.co/BAAI/BGE-VL-MLLM-S1, or
   - let the Python script below download it automatically on first use.

### Main Features
#### 1. Using the Dataset

The MegaPairs dataset contains 26 million triplets for training multimodal embedding models. It has not been fully released yet; distribution is planned through [Hugging Face](https://huggingface.co/datasets/BAAI/MegaPairs).

- **How to get it**: watch the official channels for updates; once downloaded, it can be used for model training or validation.
- **Data format**: triplets of (query image, text description, target image), suitable for embedding generation and retrieval.

#### 2. Generating Multimodal Embeddings (BGE-VL-MLLM-S1)

BGE-VL-MLLM-S1 is the core embedding model for producing image and text embeddings and running retrieval. The official code:

- **Load the model**:

```python
import torch
from transformers import AutoModel, AutoProcessor

model_name = "BAAI/BGE-VL-MLLM-S1"
processor = AutoProcessor.from_pretrained(model_name, trust_remote_code=True)
model = AutoModel.from_pretrained(model_name, trust_remote_code=True)
model.eval()
model.cuda()  # use GPU acceleration
```

- **Generate embeddings and retrieve**:

```python
from PIL import Image

# Prepare the inputs
query_image = Image.open("./cir_query.png").convert("RGB")
query_text = "Make the background dark, as if the camera has taken the photo at night"
candidate_images = [
    Image.open("./cir_candi_1.png").convert("RGB"),
    Image.open("./cir_candi_2.png").convert("RGB"),
]

# Process the query
query_inputs = processor(
    text=query_text,
    images=query_image,
    task_instruction=(
        "Retrieve the target image that best meets the combined criteria by "
        "using both the provided image and the image retrieval instructions: "
    ),
    return_tensors="pt",
    q_or_c="q",
)
query_inputs = {k: v.cuda() for k, v in query_inputs.items()}

# Process the candidates
candidate_inputs = processor(
    images=candidate_images,
    return_tensors="pt",
    q_or_c="c",
)
candidate_inputs = {k: v.cuda() for k, v in candidate_inputs.items()}

# Generate embeddings and compute similarity
with torch.no_grad():
    query_embs = model(**query_inputs, output_hidden_states=True).hidden_states[-1][:, -1, :]
    candi_embs = model(**candidate_inputs, output_hidden_states=True).hidden_states[-1][:, -1, :]
    query_embs = torch.nn.functional.normalize(query_embs, dim=-1)
    candi_embs = torch.nn.functional.normalize(candi_embs, dim=-1)
    scores = torch.matmul(query_embs, candi_embs.T)
print(scores)  # similarity scores
```

- **Interpreting the results**: `scores` holds the similarity between the query embedding and each candidate embedding; a higher score means a better match.
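The scoring step above is plain cosine similarity between L2-normalized embeddings. A self-contained sketch of the same ranking logic on toy vectors may make it concrete (the helper name and shapes are ours, not part of the BGE-VL API):

```python
import torch
import torch.nn.functional as F

def rank_candidates(query_emb: torch.Tensor, cand_embs: torch.Tensor) -> torch.Tensor:
    """Return candidate indices sorted by cosine similarity to the query.

    query_emb: (1, d) raw embedding; cand_embs: (n, d) raw embeddings.
    """
    q = F.normalize(query_emb, dim=-1)  # unit-length query
    c = F.normalize(cand_embs, dim=-1)  # unit-length candidates
    scores = q @ c.T                    # cosine similarities, shape (1, n)
    return scores.argsort(dim=-1, descending=True).squeeze(0)

# Toy example: the second candidate is nearly parallel to the query.
query = torch.tensor([[1.0, 0.0, 0.0]])
cands = torch.tensor([[0.0, 1.0, 0.0],
                      [2.0, 0.1, 0.0]])
order = rank_candidates(query, cands)
print(order.tolist())  # -> [1, 0]: candidate 1 ranks first
```

Because both sides are normalized, the raw magnitudes of the embeddings do not affect the ranking, only their directions do.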
#### 3. Generating Embeddings with BGE-VL-CLIP

BGE-VL-CLIP (base/large) can also generate multimodal embeddings:

- **Load and run**:

```python
import torch
from transformers import AutoModel

model_name = "BAAI/BGE-VL-base"
model = AutoModel.from_pretrained(model_name, trust_remote_code=True)
model.set_processor(model_name)
model.eval()

with torch.no_grad():
    query = model.encode(images="./cir_query.png", text="Make the background dark")
    candidates = model.encode(images=["./cir_candi_1.png", "./cir_candi_2.png"])
    scores = query @ candidates.T
print(scores)
```

#### 4. Fine-Tuning the Models

Users can fine-tune the models with the dataset:

- **Data preparation**: prepare image-text pairs or triplets.
- **Fine-tuning workflow**: the official fine-tuning code has not been released yet; the `Trainer` API from `transformers` is a reasonable reference in the meantime.
- **Validation**: evaluate the results on the CIRCO or MMEB benchmarks.

### Highlighted Capabilities

#### Zero-Shot Embedding Generation and Retrieval

BGE-VL-MLLM-S1 supports zero-shot operation:

- feed in an image and text, generate embeddings, and retrieve directly, with no training required;
- it improves mAP@5 on CIRCO by 8.1%.

#### Performance and Scalability

- **Performance**: produces strong multimodal embeddings on MMEB, with the S2 version further optimized.
- **Scalability**: embedding quality improves as the data grows; 500,000 samples are already enough to surpass traditional models.

### Notes

- **Hardware requirements**: a GPU with at least 16 GB of VRAM is recommended.
- **Dependency versions**: use `transformers==4.41.2` and `sentencepiece`.
- **Documentation**: see the GitHub and Hugging Face pages.
- **Community support**: ask questions in GitHub Issues or Hugging Face Discussions.

With the steps above, users can generate multimodal embeddings and complete retrieval tasks.
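Each MegaPairs triplet (query image, text description, target image) is exactly the unit that contrastive training consumes: the fused query embedding should land close to its target-image embedding and far from other targets in the batch. The project's training recipe is not yet published, so the sketch below only illustrates the standard in-batch InfoNCE-style objective such triplets typically feed; the function name, batch layout, and temperature value are our assumptions, not MegaPairs code:

```python
import torch
import torch.nn.functional as F

def info_nce_loss(query_embs: torch.Tensor,
                  target_embs: torch.Tensor,
                  temperature: float = 0.07) -> torch.Tensor:
    """In-batch contrastive loss: each query should match its own target.

    query_embs, target_embs: (batch, d). Row i of each pair comes from one
    triplet: the fused (image + text) query embedding and the target-image
    embedding. Every other row in the batch serves as a negative.
    """
    q = F.normalize(query_embs, dim=-1)
    t = F.normalize(target_embs, dim=-1)
    logits = q @ t.T / temperature            # (batch, batch) similarity matrix
    labels = torch.arange(q.size(0))          # positives sit on the diagonal
    return F.cross_entropy(logits, labels)

# Matched query/target pairs give a much lower loss than mismatched ones.
torch.manual_seed(0)
embs = torch.randn(4, 16)
print(info_nce_loss(embs, embs).item(), info_nce_loss(embs, torch.randn(4, 16)).item())
```

A low temperature sharpens the softmax over candidates, which is why the official retrieval code normalizes embeddings before taking dot products: the loss and the retrieval score are the same cosine similarity.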