API Documentation
East Signal provides an OpenAI-compatible REST API. If you've used OpenAI's API before, you already know how to use East Signal — just change the base URL and API key.
Overview
East Signal is an AI API gateway based in Hong Kong, offering low-latency access to Chinese AI models through a unified, OpenAI-compatible interface.
Authentication
All API requests require an API key passed via the Authorization header:
```
Authorization: Bearer nvai-your-api-key-here
```
Get your API key by signing up at aiapi-pro.com. Registration requires only an email address.
Base URL
```
https://aiapi-pro.com/v1
```
Replace https://api.openai.com/v1 with the URL above in any existing OpenAI integration.
Available Models
Chat Models
| Model ID | Provider | Input | Output | Context | Notes |
|---|---|---|---|---|---|
| glm-4.6v-flash | Zhipu AI | Free | Free | 128K | Free multimodal model |
| glm-5 | Zhipu AI | $0.70/1M | $2.28/1M | 32K | Latest flagship model |
| glm-5-turbo | Zhipu AI | $0.90/1M | $3.15/1M | 128K | Faster GLM-5 variant |
| minimax-text-01 | MiniMax | $0.20/1M | $1.60/1M | 1M | 1M-token context window |
| glm-4.6v | Zhipu AI | $0.40/1M | $1.20/1M | 128K | Multimodal vision |
Quick Start Example
```bash
curl https://aiapi-pro.com/v1/chat/completions \
  -H "Authorization: Bearer nvai-your-key" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "glm-4.6v-flash",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
```
Chat Completions
POST /v1/chat/completions
Creates a model response for the given conversation.
Request Body
| Parameter | Type | Required | Description |
|---|---|---|---|
| model | string | Required | Model ID to use (see table above) |
| messages | array | Required | List of messages in the conversation |
| temperature | float | Optional | Sampling temperature (0-2). Default: 1.0 |
| max_tokens | integer | Optional | Maximum tokens to generate |
| stream | boolean | Optional | Enable streaming responses. Default: false |
| top_p | float | Optional | Nucleus sampling parameter. Default: 1.0 |
Message Object
| Field | Type | Description |
|---|---|---|
| role | string | system, user, or assistant |
| content | string | The message content |
Example Request
```bash
curl https://aiapi-pro.com/v1/chat/completions \
  -H "Authorization: Bearer nvai-your-key" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "glm-5",
    "messages": [
      {"role": "system", "content": "You are a helpful assistant."},
      {"role": "user", "content": "Hello!"}
    ],
    "temperature": 0.7,
    "max_tokens": 500
  }'
```
Example Response
```json
{
  "id": "chatcmpl-abc123",
  "object": "chat.completion",
  "created": 1709942400,
  "model": "glm-5",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "Hello! How can I help you today?"
      },
      "finish_reason": "stop"
    }
  ],
  "usage": {
    "prompt_tokens": 20,
    "completion_tokens": 9,
    "total_tokens": 29
  }
}
```
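Since usage token counts come back with every response, you can estimate spend per request from the per-million-token prices in the model table above. A minimal sketch, assuming glm-5 pricing ($0.70/1M input, $2.28/1M output) and the usage block from the example response:

```python
# Sketch: estimate the USD cost of one chat completion from its usage
# block, using the glm-5 prices from the model table above.
GLM5_INPUT_PER_M = 0.70    # $ per 1M input tokens
GLM5_OUTPUT_PER_M = 2.28   # $ per 1M output tokens

def estimate_cost(usage: dict) -> float:
    """Return the estimated USD cost for one glm-5 response."""
    input_cost = usage["prompt_tokens"] / 1_000_000 * GLM5_INPUT_PER_M
    output_cost = usage["completion_tokens"] / 1_000_000 * GLM5_OUTPUT_PER_M
    return input_cost + output_cost

# The usage block from the example response above:
usage = {"prompt_tokens": 20, "completion_tokens": 9, "total_tokens": 29}
print(f"${estimate_cost(usage):.8f}")  # roughly $0.00003452
```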
Responses API
POST /v1/responses
East Signal is the first API platform to support OpenAI's new Responses API for Chinese AI models. This lets tools like Open Cowork, the OpenAI Agents SDK, and any Responses API-based application use GLM-5, MiniMax, and other models available only from Chinese providers.
Request Body
| Parameter | Type | Required | Description |
|---|---|---|---|
| model | string | Required | Model ID to use |
| input | string or array | Required | Text prompt or message array |
| instructions | string | Optional | System instructions for the model |
| stream | boolean | Optional | Enable SSE streaming. Default: false |
| temperature | float | Optional | Sampling temperature (0-2). Default: 0.7 |
| max_output_tokens | integer | Optional | Maximum tokens to generate |
| tools | array | Optional | Function/tool definitions |
| tool_choice | string | Optional | auto, required, or none |
Example: Simple Request
```bash
curl https://aiapi-pro.com/v1/responses \
  -H "Authorization: Bearer nvai-your-key" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "glm-5",
    "input": "Explain quantum computing simply.",
    "stream": false
  }'
```
Example: Conversation with Instructions
```bash
curl https://aiapi-pro.com/v1/responses \
  -H "Authorization: Bearer nvai-your-key" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "glm-5",
    "instructions": "You are a senior code reviewer.",
    "input": [
      {"role": "user", "content": "Review: def add(a,b): return a+b"}
    ],
    "stream": true
  }'
```
Example Response
```json
{
  "id": "resp_abc123...",
  "object": "response",
  "status": "completed",
  "model": "glm-5",
  "output": [
    {
      "type": "message",
      "role": "assistant",
      "content": [
        {"type": "output_text", "text": "Quantum computing uses..."}
      ]
    }
  ],
  "usage": {
    "input_tokens": 12,
    "output_tokens": 150,
    "total_tokens": 162
  }
}
```
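Note that the generated text lives inside nested output items rather than in a single field. A minimal sketch of collecting it from a response shaped like the example above (extract_output_text is an illustrative helper, not part of any SDK):

```python
# Sketch: concatenate all output_text parts from a /v1/responses result
# shaped like the example above (output -> message items -> content parts).
def extract_output_text(response: dict) -> str:
    parts = []
    for item in response.get("output", []):
        if item.get("type") != "message":
            continue
        for part in item.get("content", []):
            if part.get("type") == "output_text":
                parts.append(part["text"])
    return "".join(parts)

# The shape of the example response above:
response = {
    "status": "completed",
    "output": [
        {
            "type": "message",
            "role": "assistant",
            "content": [{"type": "output_text", "text": "Quantum computing uses..."}],
        }
    ],
}
print(extract_output_text(response))  # Quantum computing uses...
```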
Compatible Tools
The following tools work out of the box with East Signal's Responses API endpoint:
- Open Cowork — Set Base URL to https://aiapi-pro.com/v1
- OpenAI Agents SDK — Pass East Signal client to Runner
- OpenAI Python SDK (responses mode) — Set base_url
- Any custom application using POST /v1/responses
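If you are not using one of the tools above, the endpoint is plain HTTPS. A minimal sketch of building the simple request from earlier with only Python's standard library (build_responses_request is an illustrative helper; the request is constructed but not sent, since sending requires a real key):

```python
import json
import urllib.request

BASE_URL = "https://aiapi-pro.com/v1"

def build_responses_request(api_key: str, model: str, prompt: str) -> urllib.request.Request:
    """Build (but do not send) a POST /v1/responses request."""
    payload = {"model": model, "input": prompt, "stream": False}
    return urllib.request.Request(
        f"{BASE_URL}/responses",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_responses_request("nvai-your-key", "glm-5", "Hello!")
# With a real key, send it with: urllib.request.urlopen(req)
```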
Read the full guide: Responses API for Chinese Models →
Streaming
Set "stream": true to receive responses as Server-Sent Events (SSE):
```bash
curl https://aiapi-pro.com/v1/chat/completions \
  -H "Authorization: Bearer nvai-your-key" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "glm-5",
    "messages": [{"role": "user", "content": "Hello"}],
    "stream": true
  }'
```
Each SSE event carries a JSON chunk; the newly generated text is in choices[0].delta.content. The stream ends with a final data: [DONE] event.
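A minimal sketch of parsing those SSE data lines and reassembling the text (delta_from_sse_line is an illustrative helper; the sample lines are hypothetical wire data):

```python
import json

def delta_from_sse_line(line: str):
    """Extract the new text from one SSE data line, or None.

    Returns None for non-data lines, chunks without content, and the
    final [DONE] sentinel that ends the stream.
    """
    if not line.startswith("data: "):
        return None
    payload = line[len("data: "):].strip()
    if payload == "[DONE]":
        return None
    chunk = json.loads(payload)
    return chunk["choices"][0]["delta"].get("content")

# Three lines as they might arrive over the wire:
lines = [
    'data: {"choices": [{"delta": {"content": "Hel"}}]}',
    'data: {"choices": [{"delta": {"content": "lo"}}]}',
    "data: [DONE]",
]
text = "".join(d for d in (delta_from_sse_line(l) for l in lines) if d)
print(text)  # Hello
```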
List Models
GET /v1/models
Returns a list of all available models.
```bash
curl https://aiapi-pro.com/v1/models \
  -H "Authorization: Bearer nvai-your-key"
```
Error Codes
| HTTP Code | Meaning | Description |
|---|---|---|
| 400 | Bad Request | Invalid request body or missing required fields |
| 401 | Unauthorized | Invalid or missing API key |
| 402 | Insufficient Balance | Account balance too low (does not apply to free models) |
| 404 | Not Found | Invalid model ID |
| 429 | Rate Limited | Too many requests, please slow down |
| 500 | Server Error | Internal error, please retry |
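Since 429 and 500 are transient, production clients should retry them with exponential backoff while failing fast on the 4xx errors that won't go away on retry. A minimal sketch (APIError and call_with_retries are illustrative names, not part of any SDK; sleep is injectable so the logic can be tested without waiting):

```python
import time

RETRYABLE = {429, 500}  # Rate Limited and Server Error, per the table above

class APIError(Exception):
    def __init__(self, status: int):
        super().__init__(f"HTTP {status}")
        self.status = status

def call_with_retries(send, max_attempts: int = 4, base_delay: float = 1.0,
                      sleep=time.sleep):
    """Call send(), retrying 429/500 with exponential backoff (1s, 2s, 4s...)."""
    for attempt in range(max_attempts):
        try:
            return send()
        except APIError as err:
            # Non-retryable status, or out of attempts: propagate the error.
            if err.status not in RETRYABLE or attempt == max_attempts - 1:
                raise
            sleep(base_delay * (2 ** attempt))

# Usage: call_with_retries(lambda: post_chat_completion(payload))
```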
Rate Limits
Current rate limits per API key:
- Free models (glm-4.6v-flash): 30 requests/minute
- Paid models: 60 requests/minute
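To stay under these limits proactively rather than reacting to 429s, you can pace requests client-side. A minimal sketch (RequestPacer is an illustrative helper; at the free-tier limit of 30 requests/minute it leaves at least 2 seconds between calls):

```python
import time

class RequestPacer:
    """Client-side pacing: keep at most N requests per minute."""

    def __init__(self, requests_per_minute, clock=time.monotonic, sleep=time.sleep):
        self.min_interval = 60.0 / requests_per_minute  # e.g. 60/30 = 2.0s
        self.clock = clock
        self.sleep = sleep
        self.last = None

    def wait(self):
        """Block until the next request is allowed, then record its time."""
        now = self.clock()
        if self.last is not None:
            remaining = self.min_interval - (now - self.last)
            if remaining > 0:
                self.sleep(remaining)
                now = self.clock()
        self.last = now

# Usage: call pacer.wait() before each request.
# pacer = RequestPacer(30)  # free models
# pacer.wait(); send_request()
```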
SDKs & Libraries
East Signal works with any OpenAI-compatible SDK. Just set the base URL:
Python
```bash
pip install openai
```

```python
from openai import OpenAI

client = OpenAI(api_key="nvai-...", base_url="https://aiapi-pro.com/v1")
```
Node.js
```bash
npm install openai
```

```javascript
import OpenAI from 'openai';

const client = new OpenAI({ apiKey: 'nvai-...', baseURL: 'https://aiapi-pro.com/v1' });
```
LangChain
```python
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="glm-5", openai_api_key="nvai-...", openai_api_base="https://aiapi-pro.com/v1")
```
Cursor IDE / Continue
Set base URL to https://aiapi-pro.com/v1 in your IDE settings. See our Cursor setup guide.
More examples: GitHub Repository | Blog & Tutorials