Liquid theme: detect search-engine bots to serve server-rendered collection HTML vs JS-rendered content?

Short description of issue

On our collection pages, the product grid and facets are updated by JavaScript, which fetches a Liquid-rendered section (?section_id=…) and replaces parts of the DOM. On initial page load, users see a loader/skeleton first, and the full product grid appears only after JS runs. We're concerned some crawlers may not execute JS reliably, so important content may not be indexed for SEO.
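For reference, our update flow follows the standard Section Rendering API pattern. A simplified sketch (the section ID "main-collection" and the #product-grid selector are illustrative placeholders, not our exact theme code):

```javascript
// Build the Section Rendering API URL for the current collection page.
// "sectionId" is the ID of the Liquid section to render server-side.
function buildSectionUrl(pathname, params, sectionId) {
  const search = new URLSearchParams(params);
  search.set("section_id", sectionId);
  return `${pathname}?${search.toString()}`;
}

// Fetch the Liquid-rendered section HTML and swap the grid into the DOM.
async function refreshCollectionGrid(pathname, params) {
  const res = await fetch(buildSectionUrl(pathname, params, "main-collection"));
  const html = await res.text();
  const doc = new DOMParser().parseFromString(html, "text/html");
  const fresh = doc.querySelector("#product-grid");
  document.querySelector("#product-grid").replaceWith(fresh);
}
```

So on first paint the grid container is empty until refreshCollectionGrid runs, which is exactly the crawlability concern.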

Reproduction steps

  1. Open a collection page with JavaScript disabled (or simulate a crawler).

  2. The initial HTML shows mainly a loader/skeleton (or incomplete content).

  3. The product grid appears only after JS fetches the section HTML and injects it.

Additional info

Questions:

  1. Can Shopify themes/Liquid access request headers (like User-Agent) to detect Googlebot/Bingbot and conditionally render server-side HTML?

  2. If theme/Liquid can’t read User-Agent, what’s the recommended Shopify-native approach to keep collection content crawlable?

  3. If bot detection must happen outside Shopify (CDN/edge/app proxy), what’s the recommended method to verify a “real” Googlebot vs a spoofed user agent?

What type of topic is this

Troubleshooting

Hey @yzzaj - thanks for reaching out. Right now, the only place a user agent is exposed in Liquid is the user_agent object (see Liquid objects: user_agent), and it's only available within robots.txt.liquid, so there's no way to read request headers like User-Agent from theme Liquid alone.
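For context, that user_agent object appears on the robots.default_groups entries inside robots.txt.liquid. A sketch along the lines of Shopify's default template:

```liquid
{%- comment -%} templates/robots.txt.liquid {%- endcomment -%}
{%- for group in robots.default_groups -%}
  {{ group.user_agent }}
  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}
{%- endfor -%}
```

That output describes rules *for* crawlers; it doesn't let the theme branch on the *incoming* request's User-Agent.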

The easiest workaround I'd suggest is to render the full product grid in the initial Liquid page, then let JS take over afterwards for dynamic filter updates via the Section Rendering API if possible. I definitely get that's not ideal, so let me know if I can clarify anything on our end here.
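A minimal sketch of that approach, rendering the grid server-side in the collection section's Liquid (the section filename, the 'product-card' snippet name, and the #product-grid wrapper are placeholders for whatever your theme uses):

```liquid
{%- comment -%} sections/main-collection.liquid (sketch) {%- endcomment -%}
{% paginate collection.products by 24 %}
  <div id="product-grid">
    {% for product in collection.products %}
      {% render 'product-card', product: product %}
    {% endfor %}
  </div>
  {{ paginate | default_pagination }}
{% endpaginate %}
```

JS can then re-fetch this same section via ?section_id=… when filters change and replace only #product-grid, so crawlers that don't run JS still get the full grid in the initial HTML.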