
AI Proxy Worker is a serverless proxy script deployed on Cloudflare Workers. It solves one core problem: how to call AI model services securely from front-end applications or clients without exposing the underlying API keys. The tool stores the user's API key securely in Cloudflare's server-side environment, and the front-end application reaches the AI model indirectly through this deployed proxy service. This way, no sensitive key material needs to ship in client code, eliminating the risk of key compromise. The project leverages Cloudflare's worldwide edge network for very fast request responses. It currently supports DeepSeek's series of models and plans to support more mainstream AI providers such as OpenAI and Claude in the future, giving developers a unified, secure, and efficient way to call AI interfaces.

Function List

  • API key security isolation: The API keys provided by AI service providers are stored on the Cloudflare server side, and the client does not need to touch them, completely eliminating the risk of front-end key leakage.
  • Fast response in milliseconds: Backed by Cloudflare's global edge network, the proxy handles each request at the edge node closest to the user, achieving extremely low latency.
  • Streaming support: Full support for SSE (Server-Sent Events) streaming responses allows AI chat apps to get typewriter-like real-time conversations.
  • Production-ready out of the box: The project ships with robust error handling, security protections, and log monitoring for high stability.
  • Zero cost to start: It can be run entirely with Cloudflare Workers' free credits, with essentially no server costs for individual developers or small projects.
  • Multi-model support: The current version has full support for DeepSeek's general-purpose dialog and complex inference models, with future plans to expand support for a variety of AI services such as OpenAI, Claude and Gemini.
  • Custom access control: Users can set a separate access key (PROXY_KEY) for their proxy service, ensuring that only authorized applications can use it.

Using Help

The design goal of AI-Proxy-Worker is to make the deployment and usage process as simple as possible. Instead of managing complex back-end servers, developers need only a Cloudflare account and an API key provided by the AI service provider to build a secure and efficient private AI proxy in minutes.

Pre-deployment preparation

Before you begin, make sure you have the following two things ready:

  1. A Cloudflare account: Cloudflare Workers is the basis for deploying this proxy service. If you don't have an account yet, you can head over to the Cloudflare website and sign up for free.
  2. DeepSeek API Key: You need to get your API key from DeepSeek Open Platform. This is the credential used by the proxy service to request official models.

Deployment method one: using the command line (recommended)

Deploying with Cloudflare's wrangler command-line tool is the most flexible and fastest method.

Step 1: Install and login to Wrangler
Wrangler is the official Cloudflare command line tool for managing Workers projects. If you don't have a Node.js environment on your computer, install it first.

  1. Open your terminal (command line tool).
  2. Execute the following command to install wrangler globally:
    npm install -g wrangler
    
  3. Once the installation is complete, execute the login command and it will open a browser window for you to authorize login to your Cloudflare account:
    wrangler login
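    You can confirm the installation at any time by checking the version:
    wrangler --version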
    

Step 2: Get the project code

  1. Use git to clone this project to your local computer:
    git clone https://github.com/imnotnoahhh/AI-Proxy-Worker.git
    
  2. Go to the directory of the project you just cloned:
    cd AI-Proxy-Worker
    

Step 3: Configure the keys
The proxy service requires two pieces of key information, which we configure securely into the Cloudflare environment using wrangler.

  1. Configure the DeepSeek API key: Execute the following command. When prompted, paste the API key you obtained from DeepSeek and press Enter.
    wrangler secret put DEEPSEEK_API_KEY
    
  2. Configure the proxy access key (recommended): To prevent abuse of your proxy service, it is highly recommended to set a custom access key. This is a value you choose yourself, such as a long random string.
    wrangler secret put PROXY_KEY
    

    When prompted, enter the access key you chose and press Enter.
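
    To double-check that both values were saved, you can list the configured secrets (their values are never displayed):
    wrangler secret list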

Step 4: Deploy to Cloudflare
After completing the above steps, the proxy service can be published to Cloudflare's global network with a single command:

wrangler publish

(On newer versions of Wrangler, the equivalent command is wrangler deploy.)

After a successful deployment, the terminal will display your proxy service address, usually in the format https://ai-proxy-worker.<your-subdomain>.workers.dev. Make a note of this address.

Deployment method two: one-click deployment from the web interface

If you are not familiar with command line operations, you can also complete the deployment directly through Cloudflare's web interface.

  1. Visit the project's GitHub page.
  2. Click the "Deploy to Cloudflare Workers" button on the page.
  3. The page will take you to the Cloudflare deployment interface, where the system automatically creates a copy of the project repository under your GitHub account.
  4. Follow the page prompts to authorize Cloudflare to access your GitHub repository.
  5. On the Deployment Settings page, find the Environment Variables or Keys configuration section.
  6. Add two key entries:
    • DEEPSEEK_API_KEY: The value is your official DeepSeek key.
    • PROXY_KEY: The value is the proxy access password you set for yourself.
  7. Click the "Deploy" button and wait for Cloudflare to finish building and publishing.

How to call and test proxy services

After successful deployment, your proxy service is available globally. You can call it with any HTTP client (such as curl or Postman) or from your own application code.

The proxy service's endpoint path is /chat. Two HTTP headers are required when calling it:

  • Content-Type: application/json
  • Authorization: Bearer <your PROXY_KEY> (replace <your PROXY_KEY> with the access key you set during deployment)

Example test with curl:
Open a terminal, replace https://your-worker.workers.dev with your actual proxy address, and replace YOUR_PROXY_KEY with the access key you set.

curl -X POST https://your-worker.workers.dev/chat \
-H "Authorization: Bearer YOUR_PROXY_KEY" \
-H "Content-Type: application/json" \
-d '{
"model": "deepseek-chat",
"messages": [{"role": "user", "content": "你好!请介绍一下你自己。"}]
}'

If all is well, you will receive a response from the DeepSeek model.
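
The proxy also advertises streaming support (see the Function List). The sketch below assumes the proxy forwards the DeepSeek-compatible "stream": true field unchanged; the -N flag stops curl from buffering the output:

curl -N -X POST https://your-worker.workers.dev/chat \
-H "Authorization: Bearer YOUR_PROXY_KEY" \
-H "Content-Type: application/json" \
-d '{
"model": "deepseek-chat",
"stream": true,
"messages": [{"role": "user", "content": "Hello!"}]
}'

If streaming works, the output arrives as a series of data: lines (Server-Sent Events) rather than a single JSON object.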

Integration in a JavaScript front-end application:
You can call the proxy service with the fetch function.

async function callMyAIProxy() {
  const proxyUrl = 'https://your-worker.workers.dev/chat'; // Your proxy address
  const proxyKey = 'YOUR_PROXY_KEY'; // Your access key
  const response = await fetch(proxyUrl, {
    method: 'POST',
    headers: {
      'Authorization': `Bearer ${proxyKey}`,
      'Content-Type': 'application/json'
    },
    body: JSON.stringify({
      model: 'deepseek-chat',
      messages: [{ role: 'user', content: 'Hello!' }]
    })
  });
  const data = await response.json();
  console.log(data.choices[0].message.content);
}
callMyAIProxy();
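
Streaming integration: for the typewriter-style output described in the Function List, you can read the response incrementally instead of waiting for the full JSON. The following is a minimal sketch, assuming the proxy passes the "stream": true field through to DeepSeek and returns a standard SSE stream; the line handling is simplified and assumes each chunk ends on a line boundary.

async function streamMyAIProxy() {
  const response = await fetch('https://your-worker.workers.dev/chat', { // Your proxy address
    method: 'POST',
    headers: {
      'Authorization': 'Bearer YOUR_PROXY_KEY', // Your access key
      'Content-Type': 'application/json'
    },
    body: JSON.stringify({
      model: 'deepseek-chat',
      stream: true, // Assumption: the proxy forwards this field to DeepSeek
      messages: [{ role: 'user', content: 'Hello!' }]
    })
  });
  // Read the SSE stream chunk by chunk and print tokens as they arrive.
  const reader = response.body.getReader();
  const decoder = new TextDecoder();
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    // Simplified parsing: assumes each chunk ends on a line boundary.
    for (const line of decoder.decode(value, { stream: true }).split('\n')) {
      if (!line.startsWith('data: ') || line.includes('[DONE]')) continue;
      const delta = JSON.parse(line.slice(6)).choices[0].delta;
      if (delta && delta.content) console.log(delta.content);
    }
  }
}
streamMyAIProxy();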

With these steps, you now have an AI proxy service that is completely under your control, secure, and high-performing.

Application Scenarios

  1. Web and mobile application integration
    This is an ideal solution for developers who need to integrate AI chat, content generation, and similar features into a web or mobile app. Developers can embed AI features directly into their products without worrying that client-side code will expose API keys, keeping their accounts secure.
  2. Internal tools and automated processes
    Enterprises or teams can use this proxy to provide a unified AI capability entry point for internal tools (e.g., customer service systems, document analysis tools). By setting the proxy access key, each internal application's access to AI resources can be controlled individually, making management easier.
  3. Resolving Network Access Restrictions
    For regions or network environments without direct access to a specific AI service (e.g., OpenAI), this proxy can be deployed on Cloudflare's network, which remains reachable. Client applications can then use the target AI service indirectly by accessing the unrestricted proxy address.
  4. Education and prototyping
    Students, researchers or startup teams prototyping AI applications can use this zero-cost solution to quickly validate ideas. The deployment process is simple, with no servers to maintain, allowing you to focus entirely on application innovation.

FAQ

  1. Is this proxy tool free?
    The AI-Proxy-Worker project itself is open source and free. It runs on Cloudflare Workers, whose free tier (e.g., 100,000 requests per day) is perfectly adequate for the vast majority of individual developers and small projects, so it can be used at zero cost.
  2. Why not just call the official AI API on the front end?
    The main reason is security. If you write API keys directly into front-end code, anyone who visits your website can easily see them through the browser's developer tools. Once a key is compromised, others can steal your credits, incur unnecessary fees, or even abuse the service. AI-Proxy-Worker avoids this problem entirely by keeping the key on the server side.
  3. How does it differ from Cloudflare's official AI Gateway?
    Cloudflare AI Gateway is a more full-featured AI gateway offering advanced capabilities such as caching, rate limiting, analytics, and logging, and is better suited to large enterprises managing complex AI applications. AI-Proxy-Worker, by contrast, is a lightweight, security-focused proxy and API-forwarding solution that is extremely easy to deploy and completely free, making it better suited to developers who need a quick, low-cost answer to the core problem of calling AI APIs securely.
  4. Can I use it to proxy for other models besides DeepSeek?
    The current version is mainly adapted to the DeepSeek API. However, according to the project's development roadmap, support for more mainstream AI models such as OpenAI, Claude, and Gemini is planned, with the ultimate goal of becoming a universal AI API proxy.
  5. Does deploying this Worker violate Cloudflare's Terms of Service?
    No. This type of use is application-layer data forwarding: relaying legitimate API requests to third-party services that you are authorized to access. It is fundamentally different from the generic traffic proxying (e.g., VPN-style traffic obfuscation) that Cloudflare prohibits, and it is a normal usage scenario allowed by the platform.