Is there a Nuxt config preset to do SSR only if the user agent is a crawler or robot (Google etc.)?
SSR is reducing the performance of my app; I don't want it, but I need it for SEO.
Hi @julienreszka. I'm kapa.ai, your friendly support bot. I will try to answer your question by searching through the knowledge base. :wave:
----------------------
Short answer: there’s no single “preset” toggle to do SSR only for crawlers. However, Nuxt/Nitro supports hybrid rendering and route rules, and you can conditionally disable SSR via an experimental header or custom logic.
What you can use from the docs:
- Hybrid rendering via routeRules (per-route prerender/SSR/static), but not user-agent–based out of the box. [Server]
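As an illustration of hybrid rendering via `routeRules`, a `nuxt.config.ts` sketch could look like this (the route paths below are hypothetical examples, not part of the original question):

```typescript
// nuxt.config.ts — a sketch of per-route rendering rules.
// Route paths are hypothetical; adapt them to your app.
export default defineNuxtConfig({
  routeRules: {
    // Prerendered at build time, served as static HTML (good for SEO)
    '/': { prerender: true },
    '/blog/**': { prerender: true },
    // Client-side rendered only — no SSR cost for these pages
    '/dashboard/**': { ssr: false },
    // Regenerated in the background (stale-while-revalidate, seconds)
    '/products/**': { swr: 3600 },
  },
})
```

Note that these rules are keyed by route, not by user agent, which is why they don't solve the crawler-only-SSR case on their own.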
- Experimental `respectNoSSRHeader` lets you turn off SSR when a request carries `x-nuxt-no-ssr`, which you could set via upstream middleware/edge logic based on user agent. [respectNoSSRHeader]

Context you may find helpful:
- Selective prerendering (SSG) for most pages, keeping API/runtime rendering for others using Nitro prerender config or routeRules. This can minimize SSR usage while preserving SEO for key pages. [Prerendering; Deployment]
- A community video shows “dynamic rendering” (SSR for crawlers, SPA for users) as a concept in Nuxt, but it’s not an official preset; you’d implement it yourself (e.g., a middleware that detects bots and toggles SSR vs. SPA). [Dynamic rendering]

If you want SSR only for crawlers today:
- Implement user-agent detection (at a proxy/edge or in Nitro middleware) and set `x-nuxt-no-ssr` for non-bot traffic with `experimental.respectNoSSRHeader: true`, so humans get CSR and bots get SSR. [respectNoSSRHeader]
- Combine with prerendering key routes for performance where possible. [Prerendering]

----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------
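To make the user-agent detection step concrete, here is a minimal sketch. The regex is deliberately incomplete (a maintained list such as the `isbot` npm package is usually a better choice), and the middleware filename and wiring shown in the comments are assumptions, not an official Nuxt recipe:

```typescript
// A sketch of user-agent bot detection. The pattern below is a
// minimal, incomplete example — production code should use a
// maintained bot list (e.g. the "isbot" npm package) instead.
const BOT_UA = /bot|crawler|spider|googlebot|bingbot|slurp|baiduspider|yandex|duckduckbot/i;

function isBot(userAgent: string | undefined): boolean {
  return userAgent !== undefined && BOT_UA.test(userAgent);
}

// In a Nuxt app this could back a Nitro server middleware
// (e.g. server/middleware/dynamic-render.ts — hypothetical filename):
//
//   export default defineEventHandler((event) => {
//     const ua = getRequestHeader(event, 'user-agent');
//     if (!isBot(ua)) {
//       // With experimental.respectNoSSRHeader enabled, Nuxt skips
//       // SSR when this request header is present — so humans get
//       // CSR while crawlers still get server-rendered HTML.
//       event.node.req.headers['x-nuxt-no-ssr'] = 'true';
//     }
//   });
//
// paired with `experimental: { respectNoSSRHeader: true }` in nuxt.config.ts.
```

Keep in mind that serving different rendering modes to bots vs. humans is the "dynamic rendering" pattern; verify it fits your SEO requirements before relying on it.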
<@313741476951752705> Kapa.ai is still learning and improving, please let me know how I did by reacting below.