Since the soft launch of our new webshop in August 2025 we have been facing severe indexing issues

Hi guys,

I’m really out of options and knowledge, so I’m hoping other users who may have faced similar issues can help. Below I’ve written an issue description that may not be directly related to Directus itself, but it could nonetheless come down to how we are using Directus (are we doing it wrong?).

Issue Description

Since the soft launch of our new webshop in August 2025 (over six months ago), we have been facing severe indexing issues. Despite a technically sound setup using Nuxt with Server-Side Rendering (SSR) on top of the Directus API, Google refuses to index the majority of our product pages and blog posts.

Current status in Google Search Console (GSC):

  • 5xx Server Errors: GSC reports server errors that do not appear in our actual server logs or monitoring tools.

  • “Crawled - currently not indexed”: The bulk of our URLs are stuck in this status.

  • Live Test Success: The “Live Test” in the URL Inspection Tool almost always returns a green “URL is available to Google” status. However, manual indexing requests do not lead to actual indexing.

Timeline & Actions Taken

1. History & Structure

  • Test Environment: Early in the project, a test environment on a subdomain was accidentally indexed. A manual removal request was filed and successful. That subdomain now returns a 410 Gone status to ensure Google drops it.

  • URL Consistency: All internal links and canonicals have been standardized with a trailing slash (/) to maintain 100% consistency with the XML sitemap.

  • Domain Property: We are monitoring both the Domain property and the specific URL-prefix property.
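The trailing-slash consistency mentioned above can be sketched as a small helper. This is illustrative only, not the poster’s actual code; the function name and the file-extension heuristic are assumptions:

```typescript
// Sketch: normalize internal URLs to a trailing slash so that links,
// canonicals, and the XML sitemap all agree on one canonical form.
// Query strings are preserved; file-like paths (e.g. /sitemap.xml) are skipped.
function withTrailingSlash(url: string): string {
  const [path, query] = url.split(/(?=\?)/); // keep "?..." as a separate piece
  if (path.endsWith("/")) return url;         // already canonical
  if (/\.[a-z0-9]+$/i.test(path)) return url; // looks like a file, leave as-is
  return query ? `${path}/${query}` : `${path}/`;
}
```

Enforcing this in one place (e.g. a shared link builder or a server-side redirect) avoids the duplicate-URL signals that split crawl equity between `/page` and `/page/`.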

2. Technical Optimization (Nuxt SSR)

  • SSR Improvements: Category and search pages now deliver full HTML (H1, intro text, and product grid) on the initial request.

  • HTML Internal Links: Carousels and related products now use fallback grids in the source code, ensuring Googlebot sees the links without needing to execute JavaScript.

  • Fail-soft Data: Implemented fallbacks for navigation and footers in case of API latency to prevent “empty” renders.

  • Nuxt Config: Disabled prefetching to eliminate “speculation refused” (503) noise on the CDN/WAF layer.
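For reference, disabling `<NuxtLink>` prefetching in Nuxt 3 can look roughly like the sketch below. The exact option path is taken from the Nuxt 3 configuration reference and may differ by version, so treat it as an assumption to verify against your Nuxt release:

```typescript
// nuxt.config.ts — sketch, assuming Nuxt 3.
// Turning off automatic link prefetching avoids speculative requests that a
// CDN/WAF may answer with 503s, which otherwise shows up as crawl noise.
export default defineNuxtConfig({
  experimental: {
    defaults: {
      nuxtLink: {
        prefetch: false, // no route/payload prefetch on hover or viewport entry
      },
    },
  },
});
```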

3. Content & Validation

  • Structured Data: Full JSON-LD implementation (BreadcrumbList, ItemList, and Product/AggregateOffer schema) – all validated.

  • Audit Results (400 URLs):

    • 200 OK: 98.5%

    • Noindex signals: 0%

    • Thin SSR HTML: < 2%

  • Active Logging: We have enabled specific logging for the Googlebot User-Agent to catch any 4xx/5xx errors in real-time. So far, no major blocks have been found on our end.
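The Googlebot-specific logging described above can be approximated with a small predicate like the following (a sketch, not the poster’s implementation; note that the User-Agent alone is trivially spoofable, so real verification should also reverse-DNS the requesting IP against Google’s published ranges):

```typescript
// Sketch: classify a request as (claimed) Googlebot from its User-Agent.
// In a Nuxt/Nitro app this would run in server middleware, tagging requests
// so 4xx/5xx responses served to Googlebot can be logged separately.
const GOOGLEBOT_UA = /Googlebot|Google-InspectionTool/i;

function isClaimedGooglebot(userAgent: string | undefined): boolean {
  return userAgent !== undefined && GOOGLEBOT_UA.test(userAgent);
}
```

`Google-InspectionTool` is included because the GSC URL Inspection live test crawls with its own User-Agent, and it is useful to see those requests in the same log.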

Questions for the Community

Given that the site is technically healthy (confirmed by Screaming Frog and GSC Live Tests), there are no manual actions (penalties), and the content is unique:

  1. “Ghost” 5xx Errors: Why would GSC report 5xx errors that don’t exist in our server logs? Could this be related to rate-limiting of Googlebot IP ranges at the firewall/WAF level that only triggers during high-intensity crawls?

  2. Legacy Impact: Could the previous manual removal of the test subdomain be “throttling” the crawl priority or trust of the main domain?

  3. Property Reset: Is there any merit in deleting and recreating the GSC property, or is that likely a waste of time?

  4. Nuxt/SSR Specifics: Are there known cases where specific Nuxt headers or the way Nuxt handles hydration causes Google to crawl but hesitate to index?

Hi @prikr, welcome to the community! :wave:

Honestly, nothing in your post is raising any red flags for me on the Directus side of things.

The indexing issues you’re experiencing really shouldn’t be related to Directus itself. It’s more about how you’re rendering the content you’re pulling from the Directus API — and it sounds like you’re using Nuxt with SSR, which is actually a good setup for SEO.

As long as your pages aren’t returning error codes (404s, 500s, etc.), Google should be able to index them just fine with SSR or static rendering.

Things I’d Check First

  • robots.txt — Make sure nothing in there is accidentally blocking Googlebot
  • Sitemap — Verify it’s accurate, up to date, and submitted in Google Search Console
  • Meta tags — Double check you don’t have any noindex tags slipping through on pages you want indexed
  • WAF / Middleware — Sometimes security layers can block crawlers without you realizing it
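For the meta-tag check in particular, a quick way to spot a stray noindex in rendered HTML is a small scan like this (illustrative only; a crawler such as Screaming Frog does this more robustly and also checks the `X-Robots-Tag` response header, which this sketch ignores):

```typescript
// Sketch: detect a robots noindex directive in a rendered HTML document.
// Looks for <meta name="robots"|"googlebot" ... content="...noindex...">,
// case-insensitively, regardless of attribute order.
function hasNoindexMeta(html: string): boolean {
  const metaTags = html.match(/<meta\b[^>]*>/gi) ?? [];
  return metaTags.some(
    (tag) =>
      /name\s*=\s*["'](robots|googlebot)["']/i.test(tag) &&
      /content\s*=\s*["'][^"']*noindex/i.test(tag)
  );
}
```

Running this against the server-rendered HTML (not the hydrated DOM) is the important part, since that is what Googlebot evaluates first.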

Beyond That

If you’ve gone through all of the above and things still aren’t improving, I’d honestly reach out directly to Google through Search Console. They have tools to inspect URLs and request indexing that can give you more specific insight into what’s going on.

One other thing worth considering — domain trust can play a big role here. If your domain previously had low-quality or spammy content getting indexed, it can take time to rebuild that trust with Google even after you’ve cleaned things up.


That’s about as far as I can take it without more specifics. Hopefully someone else in the community has been through something similar and can share their experience. :crossed_fingers: