z-node / docs
Aider
Setup guide for using Z-Node with Aider.
Official docs
Start with Aider's official docs for OpenAI-compatible APIs and advanced model settings. They cover the OPENAI_API_BASE override, the openai/ model-name prefix, .aider.model.metadata.json, and .aider.model.settings.yml:
- https://github.com/Aider-AI/aider
- https://aider.chat/docs/llms/openai-compat.html
- https://aider.chat/docs/config/adv-model-settings.html
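As a quick alternative to config files, the same override can be set through environment variables. A minimal sketch; the endpoint URL and key variable match the Z-Node example in this guide:

```shell
# Point Aider at Z-Node's OpenAI-compatible endpoint via the environment.
export OPENAI_API_BASE="https://api.z-node.com/v1"
export OPENAI_API_KEY="$ZNODE_ISSUED_API_KEY"

# The openai/ prefix tells Aider to treat the model as an
# OpenAI-compatible one and route it through OPENAI_API_BASE.
aider --model openai/gpt-5.5
```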
Z-Node example (.aider.conf.yml, .aider.model.metadata.json, .aider.model.settings.yml)
Copy these three files into your project root. The setup keeps your API key in the environment, defaults to gpt-5.5, and pins Aider's local model metadata and settings for Z-Node:
cat > .aider.conf.yml <<'YAML'
model: openai/gpt-5.5
openai-api-base: https://api.z-node.com/v1
model-metadata-file: .aider.model.metadata.json
model-settings-file: .aider.model.settings.yml
YAML
cat > .aider.model.metadata.json <<'JSON'
{
  "openai/gpt-5.5": {
    "max_tokens": 128000,
    "max_input_tokens": 1050000,
    "max_output_tokens": 128000,
    "litellm_provider": "openai",
    "mode": "chat"
  },
  "openai/gpt-5.4": {
    "max_tokens": 128000,
    "max_input_tokens": 1050000,
    "max_output_tokens": 128000,
    "litellm_provider": "openai",
    "mode": "chat"
  },
  "openai/gpt-5.4-mini": {
    "max_tokens": 128000,
    "max_input_tokens": 400000,
    "max_output_tokens": 128000,
    "litellm_provider": "openai",
    "mode": "chat"
  },
  "openai/gpt-5.3-codex": {
    "max_tokens": 128000,
    "max_input_tokens": 400000,
    "max_output_tokens": 128000,
    "litellm_provider": "openai",
    "mode": "chat"
  },
  "openai/gpt-5.2": {
    "max_tokens": 128000,
    "max_input_tokens": 400000,
    "max_output_tokens": 128000,
    "litellm_provider": "openai",
    "mode": "chat"
  }
}
JSON
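If you maintain this metadata file by hand, a small check catches the usual mistakes (a missing field, or an output window larger than the input window). This is a hypothetical helper, not part of Aider:

```python
import json

# Fields Aider/LiteLLM expect in each .aider.model.metadata.json entry.
REQUIRED = {"max_tokens", "max_input_tokens", "max_output_tokens",
            "litellm_provider", "mode"}

def validate_metadata(text: str) -> list[str]:
    """Return a list of problems found in a metadata JSON blob."""
    problems = []
    for model, entry in json.loads(text).items():
        missing = REQUIRED - entry.keys()
        if missing:
            problems.append(f"{model}: missing {sorted(missing)}")
        elif entry["max_output_tokens"] > entry["max_input_tokens"]:
            problems.append(f"{model}: output window exceeds input window")
    return problems
```

Run it against the file above; an empty list means every entry passed.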
cat > .aider.model.settings.yml <<'YAML'
- name: openai/gpt-5.5
  edit_format: diff
  weak_model_name: openai/gpt-5.4-mini
  use_repo_map: true
  use_temperature: false
  accepts_settings: ["reasoning_effort"]
- name: openai/gpt-5.4
  edit_format: diff
  weak_model_name: openai/gpt-5.4-mini
  use_repo_map: true
  use_temperature: false
  accepts_settings: ["reasoning_effort"]
- name: openai/gpt-5.4-mini
  edit_format: diff
  weak_model_name: openai/gpt-5.4-mini
  use_repo_map: true
  use_temperature: false
- name: openai/gpt-5.3-codex
  edit_format: diff
  weak_model_name: openai/gpt-5.4-mini
  use_repo_map: true
  use_temperature: false
  accepts_settings: ["reasoning_effort"]
  overeager: true
- name: openai/gpt-5.2
  edit_format: diff
  weak_model_name: openai/gpt-5.4-mini
  use_repo_map: true
  use_temperature: false
  accepts_settings: ["reasoning_effort"]
- name: openai/codex-auto-review
  edit_format: diff
  weak_model_name: openai/gpt-5.4-mini
  use_repo_map: true
  use_temperature: false
YAML
Acceptable Z-Node model ids
Use these as openai/<model id> in Aider:
- gpt-5.5 (default)
- gpt-5.4
- gpt-5.4-mini
- gpt-5.3-codex
- gpt-5.2
- codex-auto-review
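Any of these can be selected per invocation without editing the config file. A sketch using Aider's standard --model and --weak-model flags:

```shell
# Override the default model for one session; --weak-model sets the
# cheaper model Aider uses for commit messages and summarization.
aider --model openai/gpt-5.3-codex --weak-model openai/gpt-5.4-mini
```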
Smoke test
Run this from the project root to confirm Aider can reach Z-Node with your key and config:
OPENAI_API_KEY="$ZNODE_ISSUED_API_KEY" \
aider --message "Reply with exactly: Z-NODE-AIDER-OK" \
--yes-always --no-git --no-auto-commits --no-dirty-commits --no-pretty
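If the smoke test fails, it helps to check the endpoint directly before debugging Aider. A sketch that assumes Z-Node exposes the standard OpenAI-compatible /v1/models route:

```shell
# List available models straight from the API; a JSON model list here
# means the endpoint and key are fine and the problem is on Aider's side.
curl -sS https://api.z-node.com/v1/models \
  -H "Authorization: Bearer $ZNODE_ISSUED_API_KEY"
```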