When running a Shopify store, one of the essential tools for SEO is the robots.txt file. This file tells search engine crawlers which parts of your site they should or shouldn't crawl. Modifying or adding rules to this file can help you manage your store's visibility on search engines like Google.
In this guide, we'll show you how to add or modify the robots.txt file in your Shopify store.
What is robots.txt?
The robots.txt file is a standard used by websites to communicate with web crawlers. It tells search engine bots like Google, Bing, and others which pages they should or should not crawl. This file can improve SEO by keeping crawlers away from duplicate content and by controlling their access to certain parts of your site.
Why Modify robots.txt?
You might want to modify robots.txt for several reasons:
- Control Crawling: Prevent search engines from indexing certain pages (e.g., checkout, cart, account pages).
- Improve SEO: Optimize crawl budgets by guiding search engines to focus on your important pages.
- Exclude Duplicate Content: Prevent crawlers from indexing duplicate content, ensuring better ranking for your key pages.
Shopify's robots.txt.liquid
As of 2021, Shopify lets merchants customize the robots.txt file through the robots.txt.liquid template, so you can modify it without coding the file from scratch. This template file allows you to:
- Add or remove rules for different bots.
- Disallow certain URLs from being crawled.
- Specify crawl-delay for certain bots.
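In practice, you usually keep Shopify's default rules and layer changes on top rather than hand-writing the whole file. The sketch below follows the pattern from Shopify's documentation for robots.txt.liquid: it loops over the built-in robots object to reproduce the default groups, then appends one extra Disallow rule to the catch-all (*) group. The /search path is just an illustrative choice, not a recommendation:

```liquid
{%- comment -%}
  Reproduce Shopify's default robots.txt rules, adding one
  custom Disallow rule for the catch-all (*) user agent.
{%- endcomment -%}
{% for group in robots.default_groups %}
  {{- group.user_agent }}

  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}

  {%- if group.user_agent.value == '*' %}
    {{- 'Disallow: /search' }}
  {%- endif %}

  {%- if group.sitemap != blank %}
    {{ group.sitemap }}
  {%- endif %}
{% endfor %}
```

Working from the default groups like this keeps Shopify's sensible built-in rules intact while still letting you add or remove individual directives.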
Steps to Add or Modify robots.txt in Shopify
1. Access the robots.txt.liquid File
To modify the robots.txt file in Shopify, follow these steps:
- Login to your Shopify Admin.
- Navigate to Online Store > Themes.
- Click on Actions > Edit Code for the theme you are using.
- In the Templates section, click Add a new template, select robots.txt, and name the new file robots.txt.liquid.
This will create a customizable robots.txt file in your theme.
2. Add Rules to Your robots.txt.liquid File
The robots.txt.liquid file works like any other Liquid file in Shopify, allowing you to add or modify rules. Below are some examples of what you might want to include.
- Blocking Specific Pages: To block certain pages like the cart or checkout page, add the following rules:

```
User-agent: *
Disallow: /cart
Disallow: /checkout
```
- Blocking Specific Search Engine Crawlers: If you want to block a specific crawler (e.g., Bingbot), specify its user-agent:

```
User-agent: Bingbot
Disallow: /
```
- Allowing Specific Crawlers: You may also want to let certain bots crawl your site freely while setting a crawl-delay for others:

```
User-agent: Googlebot
Allow: /

User-agent: *
Crawl-delay: 10
```
3. Save the File
After making the necessary changes, click Save in the top right corner to apply the modifications to your store's robots.txt.
4. Check Your robots.txt File
Once your changes are saved, you can check the live robots.txt file by navigating to https://yourstore.com/robots.txt. This URL shows the robots.txt file that search engines will use when crawling your site.
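Beyond eyeballing the live file, you can verify your rules programmatically. Here is a minimal sketch using Python's standard-library urllib.robotparser; the example rules and store URL are assumptions standing in for your actual file:

```python
from urllib.robotparser import RobotFileParser

# Example rules as they might appear at https://yourstore.com/robots.txt
ROBOTS_TXT = """\
User-agent: *
Disallow: /cart
Disallow: /checkout
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Googlebot falls under the catch-all (*) group in this example.
print(parser.can_fetch("Googlebot", "https://yourstore.com/cart"))      # False
print(parser.can_fetch("Googlebot", "https://yourstore.com/products"))  # True
```

To check the deployed file instead of a local string, RobotFileParser also accepts a URL via set_url() followed by read().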
Common Use Cases for Modifying robots.txt
- Blocking Internal Pages: Pages such as /cart, /checkout, or /account are internal and should not be indexed by search engines.

```
User-agent: *
Disallow: /cart
Disallow: /checkout
Disallow: /account
```
- Blocking Duplicate Content: If you have duplicate content that you don't want crawled, you can disallow the URLs:

```
User-agent: *
Disallow: /collections/all?sort_by=best-selling
```
- Preventing Image Indexing: If you don't want your product images to appear in search results, you can block crawlers from your image paths:

```
User-agent: *
Disallow: /images/
```
- Allowing Specific Bots: Sometimes you might want to let only certain bots, like Googlebot, crawl your site:

```
User-agent: Googlebot
Allow: /

User-agent: *
Disallow: /
```
Best Practices
- Use Specific Rules: Avoid blocking too much content, as it may hurt your store’s visibility. Only block the pages that shouldn’t be indexed (like cart and checkout pages).
- Test Changes: After modifying robots.txt, test the changes with Google Search Console's robots.txt report (the successor to the retired robots.txt Tester) to make sure the rules work as expected.
- Use noindex for SEO Control: If you want more precise control over indexing, add a noindex meta tag to the page itself. Note that crawlers must be able to fetch a page to see its noindex tag, so don't also disallow that page in robots.txt.
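As a sketch of the noindex approach in a Shopify theme, you could emit the meta tag conditionally inside the head of theme.liquid. The choice of the search template here is an illustrative assumption; swap in whatever template you want kept out of search results:

```liquid
{%- comment -%}
  Hypothetical example: emit a noindex meta tag only on the
  search results template, inside theme.liquid's <head>.
{%- endcomment -%}
{% if template.name == 'search' %}
  <meta name="robots" content="noindex">
{% endif %}
```

Because the tag lives on the page itself, crawlers that fetch the page will see it and drop the page from their index, which robots.txt alone cannot guarantee.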
Conclusion
Modifying your Shopify store's robots.txt file allows you to control how search engines crawl and index your website. With the robots.txt.liquid template, Shopify provides an easy way to customize the file to suit your SEO strategy. Follow the steps in this guide to make sure your store is properly optimized for search engines while keeping sensitive or irrelevant pages out of search results.
Stay tuned to Dawnify for more tips on optimizing your Shopify store!