中文 | English


OpenAI Forward

OpenAI API forwarding service
The fastest way to deploy an OpenAI API forwarding proxy


This project addresses the problem that some regions cannot directly access OpenAI. The service is deployed on a server that can reach the OpenAI API and forwards OpenAI requests through it; in other words, it acts as a reverse proxy.

Test access: https://caloi.top/v1/chat/completions is equivalent to https://api.openai.com/v1/chat/completions

Table of Contents

  • Features
  • Usage
  • Service Deployment
  • Service Usage

Features

  • Supports forwarding of all OpenAI API endpoints
  • Supports request IP verification
  • Supports streaming forwarding
  • Supports default API key
  • pip installation and deployment
  • Docker deployment
  • Chat content safety: streaming filtering of chat content

Usage

The author's own proxy deployment, https://caloi.top, is used as the example address below.
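Chat Completions: the same proxy address works for the chat endpoint. Below is a minimal sketch using Python's requests library; the model name and API key are placeholders:

import requests

# Send a chat completion request through the forwarding proxy instead of api.openai.com.
resp = requests.post(
    "https://caloi.top/v1/chat/completions",
    headers={"Authorization": "Bearer sk-******"},
    json={
        "model": "gpt-3.5-turbo",
        "messages": [{"role": "user", "content": "Hello!"}],
    },
)
print(resp.json())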

Image Generation (DALL-E):

curl --location 'https://caloi.top/v1/images/generations' \
--header 'Authorization: Bearer sk-******' \
--header 'Content-Type: application/json' \
--data '{
    "prompt": "A photo of a cat",
    "n": 1,
    "size": "512x512"
}'

Modify OPENAI_API_BASE_URL in the Docker Compose configuration to the address of the proxy service we set up:

OPENAI_API_BASE_URL: https://caloi.top 

Or, for ChatGPT-Next-Web, replace BASE_URL in the docker startup command with the address of the proxy service we set up:

docker run -d -p 3000:3000 -e OPENAI_API_KEY="sk-******" -e CODE="<your password>" -e BASE_URL="caloi.top" yidadaa/chatgpt-next-web 

Using in a module

Used in JS/TS

import { Configuration } from "openai";

const configuration = new Configuration({
    basePath: "https://caloi.top",
    apiKey: "sk-******",
});

Used in Python

import openai 
openai.api_base = "https://caloi.top" 
openai.api_key = "sk-******" 
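If you use the newer openai Python package (v1.x) instead, the base URL is passed to the client constructor. A sketch, assuming the proxy exposes the same paths as api.openai.com (note the explicit /v1 suffix, which the v1 client expects):

from openai import OpenAI

# Sketch for openai>=1.0: point the client at the proxy instead of the default api.openai.com/v1.
client = OpenAI(base_url="https://caloi.top/v1", api_key="sk-******")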

Service Deployment

Two deployment methods are provided; choose either one.

Method 1: pip

Installation

pip install openai-forward 

Run the forwarding service. The port can be specified with --port (default 8000) and the number of worker processes with --workers (default 1).

openai_forward run --port=9999 --workers=1 

The service is now up; to use it, replace https://api.openai.com with the service address http://{ip}:{port}.

OPENAI_API_KEY can also be passed in as an environment variable to serve as the default API key, so that clients do not need to send an Authorization header when requesting the relevant routes. Startup command with a default API key:

OPENAI_API_KEY="sk-xxx" openai_forward run --port=9999 --workers=1 

Note: if both a default API key and an API key in the request header are present, the API key in the request header overrides the default.
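For example, once the service is started with a default key as above, a client can omit the Authorization header entirely. A minimal sketch with the requests library, assuming the service runs locally on port 9999 as in the startup command; the model name is a placeholder:

import requests

# No Authorization header: the forwarding service injects the default OPENAI_API_KEY.
resp = requests.post(
    "http://localhost:9999/v1/chat/completions",
    json={
        "model": "gpt-3.5-turbo",
        "messages": [{"role": "user", "content": "Hello!"}],
    },
)
print(resp.json())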

Method 2: Docker (recommended)

docker run --name="openai-forward" -d -p 9999:8000 beidongjiedeguang/openai-forward:latest 

Port 9999 on the host is mapped to the service, which can then be accessed at http://{ip}:9999. Note: the environment variable OPENAI_API_KEY=sk-xxx can also be passed in the startup command to set a default API key.

Service Usage

Simply replace the OpenAI API address with the address of the service we set up. For example, replace:

https://api.openai.com/v1/chat/completions 

with:

http://{ip}:{port}/v1/chat/completions 
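Streaming responses are forwarded as well (see the feature list above). A sketch with the requests library, where 127.0.0.1:9999 stands in for your deployment's {ip}:{port} and the model name is a placeholder:

import requests

# Stream a chat completion through the self-hosted forwarder; chunks arrive as SSE "data: ..." lines.
with requests.post(
    "http://127.0.0.1:9999/v1/chat/completions",
    headers={"Authorization": "Bearer sk-******"},
    json={
        "model": "gpt-3.5-turbo",
        "messages": [{"role": "user", "content": "Hello!"}],
        "stream": True,
    },
    stream=True,
) as resp:
    for line in resp.iter_lines():
        if line:
            print(line.decode("utf-8"))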
