Hi guys,
I'm really out of options and knowledge, so I'm trying to get help from other users who may have faced similar issues. Below I've written an issue description that may not be directly related to Directus itself; nonetheless, it could come down to how we are using Directus (are we doing it wrong?).
Issue Description
Since the soft launch of our new webshop in August 2025 (over 6 months ago), we have been facing severe indexing issues. Despite a technically sound setup using Nuxt with Server-Side Rendering (SSR) on top of the Directus API, Google refuses to index the majority of our product pages and blog posts.
Current status in Google Search Console (GSC):
- 5xx Server Errors: GSC reports server errors that do not appear in our actual server logs or monitoring tools.
- “Crawled - currently not indexed”: The bulk of our URLs are stuck in this status.
- Live Test Success: The “Live Test” in the URL Inspection Tool almost always returns a green “URL is available to Google” status. However, manual indexing requests do not lead to actual indexing.
Timeline & Actions Taken
1. History & Structure
- Test Environment: Early in the project, a test environment on a subdomain was accidentally indexed. A manual removal request was filed and succeeded. That subdomain now returns a 410 Gone status to ensure Google drops it.
- URL Consistency: All internal links and canonicals have been standardized with a trailing slash (/) to maintain 100% consistency with the XML sitemap.
- Domain Property: We are monitoring both the Domain property and the specific URL-prefix property.
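For reference, the trailing-slash rule can be expressed as one small pure function that every emitted URL passes through. This is a sketch, not our actual code; the function name and the file-extension check are mine:

```typescript
// Sketch of the trailing-slash normalization applied to internal links,
// canonicals, and sitemap entries. The extension check keeps file-like
// URLs (e.g. /sitemap.xml) untouched.
export function withTrailingSlash(url: string): string {
  const [path, query = ''] = url.split('?')
  // Already normalized, or a file-like path: leave it alone.
  if (path.endsWith('/') || /\.[a-z0-9]+$/i.test(path)) {
    return url
  }
  return query ? `${path}/?${query}` : `${path}/`
}
```

Funneling links, canonicals, and sitemap generation through a single function like this is what keeps all three signals in 100% agreement.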
2. Technical Optimization (Nuxt SSR)
- SSR Improvements: Category and search pages now deliver full HTML (H1, intro text, and product grid) on the initial request.
- HTML Internal Links: Carousels and related products now use fallback grids in the source code, ensuring Googlebot sees the links without needing to execute JavaScript.
- Fail-soft Data: Implemented fallbacks for navigation and footers in case of API latency to prevent “empty” renders.
- Nuxt Config: Disabled prefetching to eliminate “speculation refused” (503) noise on the CDN/WAF layer.
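The fail-soft pattern above boils down to racing the API call against a timeout and substituting static data on timeout or error. A hedged sketch (`withFallback` is an illustrative name, not our real helper):

```typescript
// Fail-soft wrapper: race the API call against a timeout, and on either
// timeout or error return static fallback data (e.g. a cached navigation
// tree) so SSR never emits an "empty" render.
export async function withFallback<T>(
  fetcher: () => Promise<T>,
  fallback: T,
  timeoutMs = 2000,
): Promise<T> {
  const timeout = new Promise<T>((resolve) =>
    setTimeout(() => resolve(fallback), timeoutMs),
  )
  try {
    return await Promise.race([fetcher(), timeout])
  } catch {
    return fallback
  }
}
```

In a Nuxt page this kind of wrapper would sit around the Directus call inside `useAsyncData`, with the fallback being a build-time snapshot of the navigation/footer data.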
3. Content & Validation
- Structured Data: Full JSON-LD implementation (BreadcrumbList, ItemList, and Product/AggregateOffer schema) – all validated.
- Audit Results (400 URLs):
  - 200 OK: 98.5%
  - Noindex signals: 0%
  - Thin SSR HTML: < 2%
- Active Logging: We have enabled specific logging for the Googlebot User-Agent to catch any 4xx/5xx errors in real time. So far, no major blocks have been found on our end.
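The Googlebot logging is essentially a filter on User-Agent plus response status. A minimal sketch of that filter (illustrative names, not our production code):

```typescript
// Flag any 4xx/5xx response whose request claims a Googlebot User-Agent.
// Note: UA strings can be spoofed, so this over-reports rather than misses.
export function shouldLogForGooglebot(
  userAgent: string | undefined,
  status: number,
): boolean {
  return status >= 400 && /googlebot/i.test(userAgent ?? '')
}
```

In a Nuxt/Nitro app a check like this can run from server middleware or a response hook, writing matches to a dedicated log stream so crawl-time errors stand out from normal traffic.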
Questions for the Community
Given that the site is technically healthy (confirmed by Screaming Frog and GSC Live Tests), there are no manual actions (penalties), and the content is unique:
- “Ghost” 5xx Errors: Why would GSC report 5xx errors that don’t exist in our server logs? Could this be related to rate-limiting of Googlebot IP ranges at the firewall/WAF level that only triggers during high-intensity crawls?
- Legacy Impact: Could the previous manual removal of the test subdomain be “throttling” the crawl priority or trust of the main domain?
- Property Reset: Is there any merit in deleting and recreating the GSC property, or is that likely a waste of time?
- Nuxt/SSR Specifics: Are there known cases where specific Nuxt headers or the way Nuxt handles hydration causes Google to crawl but hesitate to index?
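On question 1, one way to test the WAF/rate-limiting theory ourselves is to check whether the IPs being throttled are genuine Googlebot. Google documents a reverse-then-forward DNS check for this; below is a hedged sketch of it (IPv4 forward-confirmation only, and the helper names are mine):

```typescript
import { promises as dns } from 'node:dns'

// True if a reverse-DNS hostname belongs to Google's crawler domains.
export function isGooglebotHost(hostname: string): boolean {
  return /\.googlebot\.com$/i.test(hostname) || /\.google\.com$/i.test(hostname)
}

// Reverse-then-forward verification: resolve the IP to a hostname, require
// a Google crawler domain, then confirm the hostname resolves back to the
// same IP. Anything unresolvable counts as "not Googlebot".
export async function isGooglebotIp(ip: string): Promise<boolean> {
  try {
    for (const host of await dns.reverse(ip)) {
      if (!isGooglebotHost(host)) continue
      const addrs = await dns.resolve4(host)
      if (addrs.includes(ip)) return true
    }
  } catch {
    /* treat lookup failures as non-Googlebot */
  }
  return false
}
```

Cross-referencing WAF block/challenge events against a check like this (or against Google's published crawler IP ranges) would show whether the “ghost” 5xx responses are the WAF answering Googlebot before requests ever reach the origin, which would explain why they never appear in our server logs.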