Robots.txt Now Editable on Shopify
Shopify quietly announced this week that merchants can now edit the robots.txt file on their Shopify websites. Until now, Shopify has restricted access to the robots.txt file, much to the frustration of SEOs across the world. This is a very welcome change: merchants now have more control over which areas of their stores are and aren’t crawled, which brings several SEO benefits.
What Is Robots.txt?
The robots.txt is a text file located in the root of a website.
It instructs search engine bots/crawlers which areas of a website they should and shouldn’t crawl. The file specifies a ‘User-agent’ (the bot the rules apply to) and then lists the parts of the website that bot can and cannot access.
The purpose of the robots.txt file is to prevent bots from crawling parts of your site that shouldn’t be accessed. This could be areas like your checkout, admin area, or specific types of URL parameters.
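As a sketch of the syntax, a minimal robots.txt covering cases like these might look as follows (the paths here are illustrative examples, not Shopify’s actual defaults):

```
# Rules for all crawlers
User-agent: *
Disallow: /checkout
Disallow: /admin
# Block a URL parameter used for sorting
Disallow: /*?sort_by=
```

Each `Disallow` line blocks one path pattern for the named user-agent; `*` acts as a wildcard.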
Robots.txt is a simple text file, but it has a significant influence on how search engines interact with your website. A mistake in your robots.txt file could inadvertently prevent Google from accessing your website, which would have a terrible effect on your search engine rankings.
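To illustrate how small that mistake can be, this two-line file tells every compliant crawler to stay away from the entire site:

```
User-agent: *
Disallow: /
```

A stray `/` is the difference between blocking nothing (`Disallow:` with an empty path allows everything) and blocking every URL on the site.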
What Has Shopify Changed?
Until now, every Shopify website has used the same default robots.txt file, with no way to edit it. This has now changed.
There is a post on the Shopify website developers section that outlines how to edit the Shopify robots.txt. The annual Shopify Unite conference is at the end of this month and it may be discussed in more detail then.
What Does This Mean for Shopify Merchants?
The default Shopify robots.txt is effective at controlling which parts of a Shopify website bots can crawl. But Shopify merchants haven’t been able to edit the robots.txt so have had to work around the limitations of the default robots.txt.
This has been really frustrating for SEOs trying to improve the organic performance of their Shopify websites. Common frustrations caused by the lack of access to Shopify’s robots.txt included:
- Not having the ability to add additional robots.txt instructions
- Issues created by the default robots.txt disallow rules. For example, the default Shopify robots.txt disallows bots from crawling collection URLs that contain a ‘+’ symbol. There are occasions when you might want search engines to crawl one of these URLs, but the robots.txt blocks this.
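For instance, Shopify’s default file contains wildcard rules along these lines (paraphrased from the default output) that catch any collection URL with a ‘+’, or its URL-encoded form, in it:

```
User-agent: *
Disallow: /collections/*+*
Disallow: /collections/*%2B*
```

Before this update, there was no way to lift these rules for an individual collection you did want crawled.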
Now that Shopify has made the robots.txt editable, merchants can optimise the file: SEOs can amend the default robots.txt rules and create new crawling instructions.
You can now theoretically give search engines access to parts of your Shopify store that were previously uncrawlable. You can also block search engines from crawling specific types of URLs on Shopify. This update introduces fantastic new ways to improve the SEO on your Shopify website.
How Do You Edit Shopify’s Robots.txt?
An extreme word of caution first. You should only make changes to your robots.txt if you know exactly what you’re doing. A simple typo in the robots.txt could block search engines from accessing your site and have serious SEO implications.
Full details on how to edit your Shopify robots.txt can be found on the developers section of the Shopify website.
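In brief, Shopify’s documented approach is to add a robots.txt.liquid template to your theme, which renders all the default rules via Liquid and lets you append your own. A minimal sketch, based on the examples in Shopify’s developer docs (the extra /search disallow is purely illustrative), looks something like this:

```liquid
{% for group in robots.default.groups %}
  {{- group.user_agent }}

  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}

  {%- comment -%} Append a custom rule for the wildcard user-agent {%- endcomment -%}
  {%- if group.user_agent.value == '*' -%}
    {{ 'Disallow: /search' }}
  {%- endif -%}

  {%- if group.sitemap != blank -%}
    {{ group.sitemap }}
  {%- endif -%}
{% endfor %}
```

Because the template loops over the default rule groups rather than replacing them, the standard protections (checkout, admin and so on) stay in place unless you deliberately remove them.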
Of course, we’re available if you’d like to talk to us about how our SEO team can help improve your Shopify store’s search visibility.