Since the launch of the Directus AI chat sidebar, several people have asked about using lower-cost models with it.
Z.ai’s GLM 4.7 model exposes an Anthropic-compatible endpoint, so it can be used with Directus AI chat: set the ANTHROPIC_BASE_URL environment variable to https://api.z.ai/api/anthropic/v1, then enter your z.ai key as the Anthropic API key in the Directus AI settings.
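As a minimal sketch, the setup amounts to one environment variable on the Directus process (the base URL is the one from this post; the key value is a placeholder you'd replace with your own z.ai key, entered via the Directus AI settings):

```shell
# Point the Anthropic SDK used by Directus at Z.ai's
# Anthropic-compatible endpoint instead of api.anthropic.com.
export ANTHROPIC_BASE_URL="https://api.z.ai/api/anthropic/v1"

# Placeholder: your actual z.ai key goes in the Directus AI settings
# as the "Anthropic" API key.
echo "$ANTHROPIC_BASE_URL"
```

If you run Directus in Docker, the same variable goes in the `environment:` section of your compose file or your `.env` file instead.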
If the AI responds at all, you’re talking to GLM. This doesn’t change the model options shown in the list; it simply proxies everything to GLM (which is already trying to impersonate the Claude models through Z.ai’s Anthropic-compatible endpoint).