Robots.txt: A Quick Guide for Shopify Merchants

When optimizing your store’s SEO, you have plenty of tools for shaping how search engines understand your store. One of the simplest available to merchants is the robots.txt file. It lets you tell bots how to crawl your site, and used correctly it can be a powerful part of your SEO strategy.

What is robots.txt?

Robots.txt is a file that tells search engine bots how to crawl your website and which URLs they can access. Crawlers discover content on your store by following links: they land on a page, then follow the links on it to find more pages to index. If your site has a robots.txt file, crawlers read it first and follow its instructions about where they should and shouldn’t crawl.
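
In practice, a robots.txt file is just a plain-text list of rules grouped by user agent. A minimal, purely illustrative example (the sitemap URL is a placeholder) might look like this:

    # Applies to all crawlers
    User-agent: *
    # Don't crawl the checkout
    Disallow: /checkout
    # Everything else is allowed
    Allow: /

    Sitemap: https://your-store.example.com/sitemap.xml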


So why is this important? If you have a fairly small store with a few hundred URLs, there’s a good chance that Google will find and index all of your content. However, if you have a large catalog, or your site has been around for a long time and has built up a lot of content, it becomes harder for crawlers to get to everything. In that case you want to make sure search engines spend their time on the content you actually want indexed. Your robots.txt can act as a guide, pointing crawlers at your priority content and keeping them away from resources and pages that aren’t as valuable.

Shopify and robots.txt

All Shopify stores start with the same default robots.txt, and this is generally fine for most stores. However, for a long time merchants weren’t able to edit their robots.txt file at all, and even Shopify’s SEO Lead, Jackson Lo, commented that this was the platform’s most requested SEO feature. In June 2021, Shopify CEO Tobi Lutke announced via Twitter that merchants could now edit their robots.txt file. This gives merchants even more control over their SEO and over how search engine bots crawl their store.


How to edit your robots.txt in Shopify

While Shopify says that the default robots.txt works for most stores, there are some cases in which you may want to customize your store’s robots.txt file, for example to:


  • Maximize your crawl budget and focus crawlers on your most important content
  • Stop bots from crawling duplicate content
  • Keep sections of your site private, e.g. staging pages
  • Prevent crawling of internal search results pages
  • Keep certain files and resources, e.g. images or PDFs, out of search results (a few of these cases are sketched in the example below)
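
A customized robots.txt aimed at a couple of these cases might look something like this. It’s only an illustrative sketch: the /pages/staging path is made up, and wildcard patterns such as /*.pdf$ are supported by major crawlers like Googlebot rather than being part of the original robots.txt standard.

    User-agent: *
    # Keep internal search results out of the crawl
    Disallow: /search
    # Hypothetical staging section you don't want crawled
    Disallow: /pages/staging
    # Stop crawlers from fetching PDF files
    Disallow: /*.pdf$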

Whatever the reason, it’s important to know how to edit your robots.txt should you need to, and thankfully the process is straightforward.


  • From your Shopify admin, go to Online Store > Themes.
  • Click Actions > Edit Code.
  • Add a new template, and then select “robots”.
  • Click Create template.
  • Make the changes that you want to the default template (a small example is sketched below).
  • Save changes to the robots.txt.liquid file.
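
As a rough illustration of the kind of change you can make, here’s a sketch based on the Liquid objects Shopify exposes in robots.txt.liquid (robots.default_groups, group.rules, and so on). The extra Disallow rule and its /*?q=* path are purely an example, not a recommendation for your store:

    {% for group in robots.default_groups %}
      {{- group.user_agent }}

      {%- for rule in group.rules -%}
        {{ rule }}
      {%- endfor -%}

      {%- comment -%} Example: add one extra rule to the group that applies to all crawlers {%- endcomment -%}
      {%- if group.user_agent.value == '*' -%}
        {{ 'Disallow: /*?q=*' }}
      {%- endif -%}

      {%- if group.sitemap != blank -%}
        {{ group.sitemap }}
      {%- endif -%}
    {% endfor %}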

We’ll be covering exactly what you can change in a future article, but for now you can check out Google’s own guide on creating rules for your robots.txt file.

Best practices for robots.txt

Without getting into the specifics of different rules and instructions, there are some general best practices you should take note of if you’re considering customizing your robots.txt file.

1 - Don’t use robots.txt to keep content off Google

A key thing to understand is that robots.txt controls how crawlers crawl your store, not whether a page gets indexed: a URL you’ve blocked can still end up in search results if it’s linked from elsewhere on your site or from an external source. Google has said that robots.txt should not be used as a mechanism for keeping a web page off Google. That should instead be done with a noindex directive or password protection.
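
If you do want a specific page kept out of search results, noindex usually means adding a robots meta tag to that page’s head (and making sure the page isn’t also blocked in robots.txt, since Google has to crawl it to see the tag). As a rough sketch for a Shopify theme, assuming a made-up page template called page.private-info, something like this could sit in the theme’s layout file:

    {% if template contains 'page.private-info' %}
      {%- comment -%} Ask search engines not to index this page {%- endcomment -%}
      <meta name="robots" content="noindex">
    {% endif %}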

2 - Check it twice, check it three times, then check it again

Before you edit your robots.txt, it’s worth noting that Shopify considers this an unsupported customization, so their support team won’t be able to assist should something go wrong. It’s therefore important to check your robots.txt file thoroughly before putting it live; a mistake could even run the risk of your store being deindexed altogether. If you need an extra hand, Google has a robots.txt testing tool that will flag any errors or warnings.

3 - Make it easy for search engines to find your robots.txt file

Your robots.txt should always sit in the topmost directory of your site, i.e. yoursite.com/robots.txt, not in a subdirectory, so that search engine bots can find it quickly and easily when they come to crawl your store. On Shopify this is handled for you: the robots.txt.liquid template is automatically served from your store’s root.


-----


Your robots.txt is a simple yet highly effective SEO tool. With updates allowing Shopify merchants to edit and customize their robots.txt, you can now take full advantage of this tool, and better control how search engines crawl and index your store’s most valuable content.