Use GLM 4.7 with Directus AI chat instead of OpenAI or Anthropic for lower cost

Since the launch of the Directus AI chat sidebar, several people have mentioned wanting to use lower-cost models with AI chat.

Z.ai’s GLM 4.7 model supports an Anthropic-compatible endpoint. You can use it with Directus AI chat by setting the ANTHROPIC_BASE_URL environment variable to https://api.z.ai/api/anthropic/v1, then entering your Z.ai API key as the Anthropic API key in the Directus AI settings.

```
ANTHROPIC_BASE_URL=https://api.z.ai/api/anthropic/v1
```
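If you run Directus in Docker, the variable can go straight into your Compose file. A minimal sketch, assuming the standard `directus/directus` image (service name and image tag here are just placeholders for your own setup):

```yaml
services:
  directus:
    image: directus/directus:latest  # placeholder tag, pin your own version
    environment:
      # Point the Anthropic client at Z.ai's Anthropic-compatible endpoint
      ANTHROPIC_BASE_URL: "https://api.z.ai/api/anthropic/v1"
```

After restarting the container, paste your Z.ai key into the Anthropic API key field in the Directus AI settings.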

I have tested this and tool calling/reasoning works great!

Access to GLM 4.7 currently costs less than USD $30 for the first year.


I’ve configured it like you suggested, but I’m not sure how to use it correctly.
I added the env variable, restarted the container, and added my API key.

In the AI assistant, I can only choose between Claude Haiku and Sonnet.
Asking the AI who it is, it confirms it is not GLM :slight_smile:

If the AI responds at all, you’re using GLM. This doesn’t change the options in the list; it just proxies everything to GLM (which is already trying to impersonate the Claude models through their Anthropic-compatible endpoint).

Yes, you’re right. It works fine.
I can see tokens in usage chart.

It is just funny how it doesn’t know who it is when asked.
But it does what I ask, although it’s a bit schizophrenic :smiley:

Thanks for the tip. Very helpful and I can enjoy my cheap AI.