Hi, I need some help with the robots.txt file. I want to allow certain collection filters to be crawled, but I'm not sure how to go about it. Has anyone faced a similar task?
Hey @Petra_Markovic - this link might help: https://help.shopify.com/en/manual/promoting-marketing/seo/editing-robots-txt
Currently, we only allow modifications to robots.txt that do the following (there's a sketch of what these look like after the list):
- allow or disallow certain URLs from being crawled
- add crawl-delay rules for certain crawlers
- add extra sitemap URLs
- block certain crawlers
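To make those concrete, here's a minimal sketch of a robots.txt.liquid template that does all four, based on the default template structure described in the help article above. The crawler name, path, delay value, and sitemap URL are placeholders, not recommendations:

```liquid
{% for group in robots.default_groups %}
  {{- group.user_agent }}

  {%- for rule in group.rules %}
    {{ rule }}
  {%- endfor %}

  {%- comment %} Extra rules for the catch-all group only {% endcomment %}
  {%- if group.user_agent.value == '*' %}
    {%- comment %} Placeholder collection path {% endcomment %}
    {{ 'Disallow: /collections/example-collection' }}
    {{ 'Crawl-delay: 10' }}
  {%- endif %}

  {%- if group.sitemap != blank %}
    {{ group.sitemap }}
  {%- endif %}
{% endfor %}

{%- comment %} Block a specific crawler entirely (hypothetical bot name) {% endcomment %}
User-agent: ExampleBot
Disallow: /

{%- comment %} Extra sitemap URL (placeholder) {% endcomment %}
Sitemap: https://your-store.com/extra-sitemap.xml
```

The key thing is to keep the `robots.default_groups` loop intact so Shopify's default rules still render, and only append your custom lines inside or after it.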
For the collection filters, do you mean the filters set up on the storefront through Search and Discovery? https://help.shopify.com/en/manual/online-store/search-and-discovery/filters
Right now, there's no way to block specific filters directly, but you can disallow specific collection URLs (including their filtered variants) from being crawled, which might be a decent workaround; see the sketch below.
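For example, assuming your filtered collection URLs carry the `filter.` query parameter (which is what Search and Discovery storefront filters append), you could add something like this inside the `'*'` branch of robots.txt.liquid; the collection handle here is just a placeholder:

```liquid
{%- if group.user_agent.value == '*' %}
  {%- comment %} Placeholder handle; swap in the collection you want blocked {% endcomment %}
  {{ 'Disallow: /collections/summer-sale' }}
  {%- comment %} Assumption: filtered URLs include a "filter." query parameter {% endcomment %}
  {{ 'Disallow: /collections/*filter.*' }}
{%- endif %}
```

One caveat: wildcard support in robots.txt isn't universal. Googlebot honors `*`, but not every crawler does, so it's worth verifying how your target crawlers handle the pattern.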
Let me know if you hit any blockers, or if you've already taken a look at that, and I'd be happy to keep digging into this with you here.