How to edit the robots.txt file in Shopify


Your robots.txt file gives instructions to search engine crawlers, telling them which pages they can or can’t access on your Shopify site. Shopify automatically generates a default robots.txt file that works well for most stores, but you may want to customise it to fine-tune how search engines crawl your store’s content, which can have a significant impact on your SEO performance. This guide will show you how to edit the robots.txt file in Shopify to suit your needs.



To view the current file

Every Shopify store automatically serves a default robots.txt file, whether or not your theme includes a custom robots.txt.liquid template.

To see what yours currently contains, simply go to your store’s main website and add “/robots.txt” to the end of your domain.

For example, visit yourstore.com/robots.txt (just replace yourstore.com with your actual website address).
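To give you a sense of what to expect, here’s a representative excerpt of the default file Shopify serves. The exact rules vary by store, so treat this as an illustration rather than your store’s exact contents:

User-agent: *
Disallow: /admin
Disallow: /cart
Disallow: /orders
Disallow: /checkout
Disallow: /search
Sitemap: https://yourstore.com/sitemap.xml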

To edit the file

The robots.txt.liquid template lives in the templates directory of your theme files. If your theme doesn’t include one yet, you can create it from the Shopify admin: go to Online Store > Themes > Edit code, then under Templates select Add a new template and choose robots.txt. Before editing, take a copy of the original contents so that it’s easy for you to revert if you make a mistake.

You can add or change rules in your robots.txt.liquid template using Liquid objects. These objects let you customise the default rules and add your own specifications.

  • robots
  • group
  • rule
  • user_agent
  • sitemap

To add a new rule, you need to modify the Liquid code that outputs the default rules.
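For context, Shopify’s default robots.txt.liquid template loops over robots.default_groups and prints each group’s user agent, rules, and sitemap. The examples below all build on this structure:

{% for group in robots.default_groups %}
  {{- group.user_agent }}

  {%- for rule in group.rules %}
    {{ rule }}
  {%- endfor %}

  {%- if group.sitemap != blank %}
    {{ group.sitemap }}
  {%- endif %}
{% endfor %}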

Example use cases:

1. Blocking a blog directory:

To block all crawlers from accessing the /blog directory, you can add this rule:

{% for group in robots.default_groups %}
  {{- group.user_agent }}

  {%- for rule in group.rules %}
    {{ rule }}
  {%- endfor %}

  {%- comment -%} Append an extra rule to the catch-all (*) group {%- endcomment %}
  {%- if group.user_agent.value == '*' %}
    {{ 'Disallow: /blog/' }}
  {%- endif %}

  {%- if group.sitemap != blank %}
    {{ group.sitemap }}
  {%- endif %}
{% endfor %}
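With that change in place, the rendered file’s catch-all group picks up the extra line alongside the defaults. The output would look something like this (default rules shown here are illustrative):

User-agent: *
Disallow: /admin
Disallow: /cart
Disallow: /blog/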

2. Blocking the ChatGPT crawler

If you want to block OpenAI’s crawler specifically, target its user agent, which is GPTBot (OpenAI uses ChatGPT-User for user-initiated browsing). Add this group outside the default Liquid loop:

User-agent: GPTBot
Disallow: /

You can also add custom rules outside of the default groups. Any plain text outside the Liquid tags is output verbatim, so standard robots.txt directives work as-is. For example:

To block a specific crawler:

User-agent: discobot
Disallow: /

To allow a specific crawler:

User-agent: discobot
Allow: /

To add a sitemap URL:

Sitemap: [your-sitemap-url]

Make sure to replace [your-sitemap-url] with your actual sitemap URL.
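For a Shopify store, the primary sitemap normally lives at the root of your domain, so the finished line would look like this (substituting your own domain):

Sitemap: https://yourstore.com/sitemap.xml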

You can check out the full Liquid documentation for robots.txt on the Shopify website.
