API Documentation

East Signal provides an OpenAI-compatible REST API. If you've used OpenAI's API before, you already know how to use East Signal — just change the base URL and API key.

Overview

East Signal is an AI API gateway based in Hong Kong, offering low-latency access to Chinese AI models through a unified, OpenAI-compatible interface.

East Signal supports both the Chat Completions API (/v1/chat/completions) and the new Responses API (/v1/responses). We are the only platform offering Responses API access to Chinese AI models.

Authentication

All API requests require an API key passed via the Authorization header:

Authorization: Bearer nvai-your-api-key-here

Get your API key by signing up at aiapi-pro.com. Registration requires only an email address.

Base URL

https://aiapi-pro.com/v1

Replace https://api.openai.com/v1 with the URL above in any existing OpenAI integration.

Available Models

Free model available! Start building immediately with GLM-4.6V-Flash, no credit card required.

Chat Models

| Model ID | Provider | Input | Output | Context | Notes |
| --- | --- | --- | --- | --- | --- |
| glm-4.6v-flash | Zhipu AI | Free | Free | 128K | Free multimodal |
| glm-5 | Zhipu AI | $0.70/1M | $2.28/1M | 32K | Latest flagship model |
| glm-5-turbo | Zhipu AI | $0.90/1M | $3.15/1M | 128K | Faster GLM-5 variant |
| minimax-text-01 | MiniMax | $0.20/1M | $1.60/1M | 1M | 1M context window |
| glm-4.6v | Zhipu AI | $0.40/1M | $1.20/1M | 128K | Multimodal vision |

Quick Start Example

curl https://aiapi-pro.com/v1/chat/completions \
  -H "Authorization: Bearer nvai-your-key" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "glm-4.6v-flash",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'

Chat Completions

POST /v1/chat/completions

Creates a model response for the given conversation.

Request Body

| Parameter | Type | Required | Description |
| --- | --- | --- | --- |
| model | string | Required | Model ID to use (see table above) |
| messages | array | Required | List of messages in the conversation |
| temperature | float | Optional | Sampling temperature (0-2). Default: 1.0 |
| max_tokens | integer | Optional | Maximum tokens to generate |
| stream | boolean | Optional | Enable streaming responses. Default: false |
| top_p | float | Optional | Nucleus sampling parameter. Default: 1.0 |

Message Object

| Field | Type | Description |
| --- | --- | --- |
| role | string | system, user, or assistant |
| content | string | The message content |
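Putting the two tables together, a request body is a plain JSON object. A minimal sketch in Python of assembling one (the helper name and defaults here are illustrative, not part of the API):

```python
# Assemble a Chat Completions request body from the documented parameters.
def build_chat_payload(model, messages, temperature=1.0, max_tokens=None,
                       stream=False, top_p=1.0):
    payload = {
        "model": model,              # required: model ID from the table above
        "messages": messages,        # required: list of {"role", "content"} objects
        "temperature": temperature,  # optional, 0-2, default 1.0
        "stream": stream,            # optional, default false
        "top_p": top_p,              # optional, default 1.0
    }
    if max_tokens is not None:
        payload["max_tokens"] = max_tokens
    return payload

payload = build_chat_payload(
    "glm-5",
    [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
    temperature=0.7,
    max_tokens=500,
)
```

The resulting dictionary can be JSON-encoded and sent as the request body, as in the curl example below.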

Example Request

curl https://aiapi-pro.com/v1/chat/completions \
  -H "Authorization: Bearer nvai-your-key" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "glm-5",
    "messages": [
      {"role": "system", "content": "You are a helpful assistant."},
      {"role": "user", "content": "Hello!"}
    ],
    "temperature": 0.7,
    "max_tokens": 500
  }'

Example Response

{
  "id": "chatcmpl-abc123",
  "object": "chat.completion",
  "created": 1709942400,
  "model": "glm-5",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "Hello! How can I help you today?"
      },
      "finish_reason": "stop"
    }
  ],
  "usage": {
    "prompt_tokens": 20,
    "completion_tokens": 9,
    "total_tokens": 29
  }
}
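The reply text and token usage can be read straight out of this payload. A minimal sketch, assuming exactly the response shape shown above:

```python
# Extract the assistant reply and token usage from a Chat Completions response.
response = {
    "id": "chatcmpl-abc123",
    "object": "chat.completion",
    "created": 1709942400,
    "model": "glm-5",
    "choices": [
        {
            "index": 0,
            "message": {"role": "assistant",
                        "content": "Hello! How can I help you today?"},
            "finish_reason": "stop",
        }
    ],
    "usage": {"prompt_tokens": 20, "completion_tokens": 9, "total_tokens": 29},
}

reply = response["choices"][0]["message"]["content"]
finish_reason = response["choices"][0]["finish_reason"]    # "stop" = natural end
total_tokens = response["usage"]["total_tokens"]           # billed token count
```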

Responses API

POST /v1/responses

East Signal is the first and only API platform supporting OpenAI's new Responses API for Chinese AI models. This enables tools like Open Cowork, the OpenAI Agents SDK, and any Responses API-based application to use GLM-5, MiniMax, and other China-exclusive models.

Why this matters: Chinese AI providers (Zhipu, MiniMax) only support the legacy /v1/chat/completions endpoint. East Signal automatically translates Responses API format to Chat Completions format, making Chinese models accessible to all modern agent tools.

Request Body

| Parameter | Type | Required | Description |
| --- | --- | --- | --- |
| model | string | Required | Model ID to use |
| input | string \| array | Required | Text prompt or message array |
| instructions | string | Optional | System instructions for the model |
| stream | boolean | Optional | Enable SSE streaming. Default: false |
| temperature | float | Optional | Sampling temperature (0-2). Default: 0.7 |
| max_output_tokens | integer | Optional | Maximum tokens to generate |
| tools | array | Optional | Function/tool definitions |
| tool_choice | string | Optional | auto, required, or none |

Example: Simple Request

curl https://aiapi-pro.com/v1/responses \
  -H "Authorization: Bearer nvai-your-key" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "glm-5",
    "input": "Explain quantum computing simply.",
    "stream": false
  }'

Example: Conversation with Instructions

curl https://aiapi-pro.com/v1/responses \
  -H "Authorization: Bearer nvai-your-key" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "glm-5",
    "instructions": "You are a senior code reviewer.",
    "input": [
      {"role": "user", "content": "Review: def add(a,b): return a+b"}
    ],
    "stream": true
  }'

Example Response

{
  "id": "resp_abc123...",
  "object": "response",
  "status": "completed",
  "model": "glm-5",
  "output": [
    {
      "type": "message",
      "role": "assistant",
      "content": [
        {"type": "output_text", "text": "Quantum computing uses..."}
      ]
    }
  ],
  "usage": {
    "input_tokens": 12,
    "output_tokens": 150,
    "total_tokens": 162
  }
}
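Unlike Chat Completions, a Responses API payload carries an output array that may contain several items; concatenating the output_text parts recovers the full reply. A minimal sketch, assuming the payload shape shown above (the helper name is illustrative):

```python
# Collect all output_text fragments from a Responses API payload.
def extract_output_text(response):
    parts = []
    for item in response.get("output", []):
        if item.get("type") != "message":
            continue  # skip non-message items such as tool calls
        for block in item.get("content", []):
            if block.get("type") == "output_text":
                parts.append(block["text"])
    return "".join(parts)

sample = {
    "id": "resp_abc123",
    "object": "response",
    "status": "completed",
    "model": "glm-5",
    "output": [
        {
            "type": "message",
            "role": "assistant",
            "content": [{"type": "output_text",
                         "text": "Quantum computing uses..."}],
        }
    ],
}
```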

Compatible Tools

Tools built on the Responses API, including Open Cowork and the OpenAI Agents SDK, work out of the box with East Signal's /v1/responses endpoint.

Read the full guide: Responses API for Chinese Models →

Streaming

Set "stream": true to receive responses as Server-Sent Events (SSE):

curl https://aiapi-pro.com/v1/chat/completions \
  -H "Authorization: Bearer nvai-your-key" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "glm-5",
    "messages": [{"role": "user", "content": "Hello"}],
    "stream": true
  }'

Each SSE event contains a JSON chunk with delta.content for the new token. The stream ends with [DONE].
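Reassembling the full reply amounts to concatenating each chunk's delta.content until [DONE] arrives. A minimal sketch over already-parsed chunks (the chunk shape follows OpenAI's streaming format; a real client would additionally read and decode the SSE lines from the HTTP response):

```python
# Accumulate streamed delta.content fragments into the final reply.
def collect_stream(chunks):
    text = []
    for chunk in chunks:
        for choice in chunk.get("choices", []):
            piece = choice.get("delta", {}).get("content")
            if piece:  # the first and last deltas may carry no content
                text.append(piece)
    return "".join(text)

# Example chunks as they would arrive, one JSON object per SSE event.
chunks = [
    {"choices": [{"delta": {"role": "assistant"}}]},
    {"choices": [{"delta": {"content": "Hel"}}]},
    {"choices": [{"delta": {"content": "lo!"}}]},
    {"choices": [{"delta": {}, "finish_reason": "stop"}]},
]
```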

List Models

GET /v1/models

Returns a list of all available models.

curl https://aiapi-pro.com/v1/models \
  -H "Authorization: Bearer nvai-your-key"

Error Codes

| HTTP Code | Meaning | Description |
| --- | --- | --- |
| 400 | Bad Request | Invalid request body or missing required fields |
| 401 | Unauthorized | Invalid or missing API key |
| 402 | Insufficient Balance | Account balance too low (does not apply to free models) |
| 404 | Not Found | Invalid model ID |
| 429 | Rate Limited | Too many requests, please slow down |
| 500 | Server Error | Internal error, please retry |
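Of these, 429 and 500 are worth retrying; the others indicate a problem with the request or account. A minimal retry sketch with exponential backoff (the send callable is a stand-in for any HTTP call that returns a status code and body; the helper is illustrative, not part of the API):

```python
import time

RETRYABLE = {429, 500}  # rate limited / transient server error

def request_with_retry(send, max_attempts=4, base_delay=0.5, sleep=time.sleep):
    """Call send() until it returns 200, backing off on retryable errors."""
    for attempt in range(max_attempts):
        status, body = send()
        if status == 200:
            return body
        if status in RETRYABLE and attempt < max_attempts - 1:
            sleep(base_delay * (2 ** attempt))  # 0.5s, 1s, 2s, ...
            continue
        raise RuntimeError(f"request failed with HTTP {status}")
```

Injecting the sleep function keeps the helper testable; in production code the default time.sleep applies.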

Rate Limits

Rate limits apply per API key and may be adjusted over time. If you need higher limits, contact us.

SDKs & Libraries

East Signal works with any OpenAI-compatible SDK. Just set the base URL:

Python

pip install openai

from openai import OpenAI
client = OpenAI(api_key="nvai-...", base_url="https://aiapi-pro.com/v1")

Node.js

npm install openai

import OpenAI from 'openai';
const client = new OpenAI({ apiKey: 'nvai-...', baseURL: 'https://aiapi-pro.com/v1' });

LangChain

from langchain_openai import ChatOpenAI
llm = ChatOpenAI(model="glm-5", openai_api_key="nvai-...", openai_api_base="https://aiapi-pro.com/v1")

Cursor IDE / Continue

Set base URL to https://aiapi-pro.com/v1 in your IDE settings. See our Cursor setup guide.

More examples: GitHub Repository | Blog & Tutorials