
claude-worker-proxy is a proxy service deployed on Cloudflare Workers. Its core function is to convert the request formats of various large-model APIs, such as Google Gemini and OpenAI, into the Anthropic Claude format. This allows client applications built against the Claude API, such as Claude Code, to request models from Gemini or OpenAI directly, reducing development and maintenance costs. The project supports streaming and non-streaming responses as well as tool calls, and can be deployed with a single command for out-of-the-box use.
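To make the idea of "format conversion" concrete, here is a minimal, illustrative sketch of the kind of translation such a proxy performs. The field names follow the public Anthropic Messages and Gemini generateContent REST formats, but this simplified function is an assumption for illustration, not the project's actual code:

```python
# Illustrative sketch of the request translation the proxy performs.
# Field names follow the public Anthropic and Gemini REST formats,
# but this is a simplified example, not the project's actual code.

def claude_to_gemini(claude_request: dict) -> dict:
    """Convert a Claude-style /v1/messages body into a Gemini generateContent body."""
    contents = []
    for msg in claude_request.get("messages", []):
        # Claude uses the role "assistant"; Gemini's equivalent role is "model".
        role = "model" if msg["role"] == "assistant" else "user"
        contents.append({"role": role, "parts": [{"text": msg["content"]}]})
    body = {"contents": contents}
    if "max_tokens" in claude_request:
        body["generationConfig"] = {"maxOutputTokens": claude_request["max_tokens"]}
    return body

converted = claude_to_gemini({
    "model": "gemini-1.5-flash",
    "max_tokens": 1024,
    "messages": [{"role": "user", "content": "Hello"}],
})
```

The real proxy also translates the response (including streamed chunks and tool calls) back into Claude's format, which this sketch omits.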

Function List

  • API format conversion: Seamlessly convert API request and response formats for models such as Gemini and OpenAI to the Claude API format.
  • Compatible with the Claude ecosystem: Enables clients designed for the Claude API (e.g., Claude Code) to use other vendors' AI models directly.
  • One-Click Deployment: Users can quickly deploy through the Cloudflare Workers platform without complex server configurations.
  • Streaming response support: Compatible with streaming API response modes for scenarios that require data to be returned in real time.
  • Tool call support: Allows AI models to interact with external tools or functions.
  • Zero-configuration startup: The project is designed to be ready to use right after deployment, with no additional configuration required.

Usage Guide

Deploying and using claude-worker-proxy is very simple; the process relies mainly on the Cloudflare Workers platform and the npm package manager.

Prerequisites

  1. A Cloudflare account: You need to register and log in to Cloudflare.
  2. Node.js and npm: Make sure you have Node.js and npm installed in your development environment.
  3. Wrangler CLI: Wrangler is the official Cloudflare command line tool for managing the Workers project. It can be installed globally via npm if not already installed:
    npm install -g wrangler@latest
    

Deployment process

  1. Cloning Project Code
    First, clone the project's source code locally from GitHub.

    git clone https://github.com/glidea/claude-worker-proxy
    
  2. Go to the project directory and install the dependencies
    cd claude-worker-proxy
    npm install
    
  3. Log in to Wrangler
    Execute the login command and it will open a browser window for you to authorize access to your Cloudflare account.

    wrangler login
    
  4. Execute the deployment command
    In the project root directory, run the deployment script. This script will package the code and deploy it to a new Worker service under your Cloudflare account.

    npm run deploycf
    

    After successful deployment, the command line outputs a URL ending in .workers.dev. This is the address where your proxy service can be accessed.

How to use

After successful deployment, you can use the proxy by sending HTTP POST requests to your Worker URL. There are specific requirements for the URL format and the request headers.

URL format

The URL structure of the request is as follows:
{your Worker URL}/{target model type}/{target API base URL}/v1/messages

  • {your Worker URL}: The address obtained after successful deployment, e.g. https://claude-worker-proxy.xxxx.workers.dev.
  • {target model type}: Currently gemini and openai are supported.
  • {target API base URL}: The official API base URL of the target vendor. It must include the API version number. For example, Gemini's base URL is https://generativelanguage.googleapis.com/v1beta.
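Putting the three parts together, a hypothetical helper can assemble the full request URL following the {worker}/{type}/{base}/v1/messages scheme described above:

```python
# Hypothetical helper that assembles a proxy request URL from its three parts,
# following the {worker}/{type}/{base}/v1/messages scheme described above.

def build_proxy_url(worker_url: str, model_type: str, api_base: str) -> str:
    return f"{worker_url}/{model_type}/{api_base}/v1/messages"

url = build_proxy_url(
    "https://claude-worker-proxy.xxxx.workers.dev",
    "gemini",
    "https://generativelanguage.googleapis.com/v1beta",
)
```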

Request Header

The request must include a specific request header to pass the target vendor's API key:

  • x-api-key: The value is your target vendor's API key (e.g. your Gemini API Key).

Request example: requesting Gemini using curl

The following example shows how to send a request to a Google Gemini model through a proxy, in Claude's format:

curl -X POST "https://claude-worker-proxy.xxxx.workers.dev/gemini/https://generativelanguage.googleapis.com/v1beta/v1/messages" \
  -H "x-api-key: YOUR_GEMINI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gemini-1.5-flash",
    "messages": [
      {"role": "user", "content": "Hello"}
    ]
  }'

Replace https://claude-worker-proxy.xxxx.workers.dev with your actual Worker address, and YOUR_GEMINI_API_KEY with your valid Gemini API key.
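The same request can be built with Python's standard library. This is a sketch equivalent to the curl example above; the Worker address and API key are placeholders you must replace:

```python
import json
import urllib.request

# The same request as the curl example, built with Python's standard library.
# Replace the Worker address and API key with your own values.
url = ("https://claude-worker-proxy.xxxx.workers.dev/gemini/"
       "https://generativelanguage.googleapis.com/v1beta/v1/messages")
payload = {
    "model": "gemini-1.5-flash",
    "messages": [{"role": "user", "content": "Hello"}],
}
req = urllib.request.Request(
    url,
    data=json.dumps(payload).encode("utf-8"),
    headers={"x-api-key": "YOUR_GEMINI_API_KEY",
             "Content-Type": "application/json"},
    method="POST",
)
# response = urllib.request.urlopen(req)  # uncomment to actually send
```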

Use in Claude Code

For Claude Code users, API requests can be directed to this proxy service by modifying their configuration file.

  1. Edit the configuration file ~/.claude/settings.json.
  2. Modify the environment variables in it, as shown in the example below:
    {
      "env": {
        "ANTHROPIC_BASE_URL": "https://claude-worker-proxy.xxxx.workers.dev/gemini/https://generativelanguage.googleapis.com/v1beta",
        "ANTHROPIC_CUSTOM_HEADERS": "x-api-key: YOUR_GEMINI_API_KEY",
        "ANTHROPIC_MODEL": "gemini-1.5-pro",
        "ANTHROPIC_SMALL_FAST_MODEL": "gemini-1.5-flash",
        "API_TIMEOUT_MS": "600000"
      }
    }
    
    • ANTHROPIC_BASE_URL: Set this to your proxy URL with the target model type and API base URL appended.
    • ANTHROPIC_CUSTOM_HEADERS: Sets the API key header.
    • ANTHROPIC_MODEL: The primary model you wish to use.
    • ANTHROPIC_SMALL_FAST_MODEL: The smaller, faster model you wish to use.
  3. After saving the configuration file, run the claude command in a terminal to use the configured model through the proxy.
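Editing the settings file can also be scripted. The sketch below merges the env block shown above into a settings.json file while preserving any existing keys; the function name and merge behavior are assumptions for illustration, and for a real setup the path would be ~/.claude/settings.json:

```python
import json
from pathlib import Path

# Hypothetical helper that merges the proxy settings shown above into a
# Claude Code settings.json file, preserving any keys already present.
def apply_proxy_settings(settings_path: str, env: dict) -> dict:
    path = Path(settings_path)
    settings = json.loads(path.read_text()) if path.exists() else {}
    settings.setdefault("env", {}).update(env)
    path.write_text(json.dumps(settings, indent=2))
    return settings

proxy_env = {
    "ANTHROPIC_BASE_URL": "https://claude-worker-proxy.xxxx.workers.dev/gemini/"
                          "https://generativelanguage.googleapis.com/v1beta",
    "ANTHROPIC_CUSTOM_HEADERS": "x-api-key: YOUR_GEMINI_API_KEY",
    "ANTHROPIC_MODEL": "gemini-1.5-pro",
}

# For a real setup this would be "~/.claude/settings.json" (expanded);
# here we write to a local file to keep the example self-contained.
result = apply_proxy_settings("settings.json", proxy_env)
```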

Application Scenarios

  1. Unified development interface
    For applications that have adapted the Claude API, developers don't need to rewrite the API request logic for different models such as Gemini or OpenAI. Simply point requests to this proxy to quickly switch and use models from different vendors.
  2. Use with specific tools
    Tools like Claude Code, which are natively designed for the Claude API, can seamlessly switch to other models that are more cost-effective or better suited to the task through this proxy, such as using a Gemini model for code-related tasks.
  3. Streamline front-end application development
    This Worker proxy can serve as a unified backend API gateway when developing web or desktop applications. The front-end application only needs to follow one API format, while the proxy layer handles the communication details with different AI service providers.

FAQ

  1. Is this program free?
    The project source code is open source and free. However, deployment on Cloudflare Workers consumes platform resources, and Cloudflare has a free quota beyond which you need to pay. Also, the target model APIs you call (e.g. Gemini) are paid per usage to their providers (e.g. Google).
  2. Why do I need to provide the target vendor's API base address in the URL?
    This design provides greater flexibility. It allows the user to dynamically specify any API service endpoint compatible with OpenAI or Gemini formats, rather than hard-coding it into the code. This means that if the vendor updates the API version in the future or you use a third-party compatible API, simply change the request URL.
  3. Will this proxy store my API key or request data?
    According to the project's open source code, it only serves as an intermediate layer for request forwarding and format conversion, and does not record or store the user's API key and request content. However, when deployed on a third-party platform, it is still subject to that platform's data and privacy policies.
  4. What are the advantages of using this proxy over calling Gemini or OpenAI's API directly?
    The main advantage is API compatibility for existing Claude clients. If your application or toolchain is built around the Claude API, using this proxy lets you swap the backend AI model for Gemini or OpenAI without modifying client code.
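The "API base address in the URL" design from question 2 can be sketched as a small parsing step: the proxy reads the target type and base URL out of the request path, so the upstream endpoint never has to be hard-coded. The parsing logic below is an assumed illustration, not the project's actual implementation:

```python
# Sketch of how such a proxy can recover the target type and base URL from
# the request path. Assumed parsing logic for illustration, not the
# project's actual implementation.

def parse_proxy_path(path: str) -> tuple[str, str]:
    """Split '/{type}/{base_url}/v1/messages' into (type, base_url)."""
    model_type, rest = path.lstrip("/").split("/", 1)
    base_url = rest[: -len("/v1/messages")]
    return model_type, base_url

model_type, base_url = parse_proxy_path(
    "/gemini/https://generativelanguage.googleapis.com/v1beta/v1/messages"
)
```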