How To Stop Bots From Crawling Cart Links

Posted by AJ Morris

Having search engines crawl add-to-cart links and other unwanted pages can damage your SEO rankings. Add-to-cart links cause particular problems because those pages are not cached, so repeated bot hits can also drive up your server's CPU and memory usage.

Fortunately, it is very simple to adapt your site's robots.txt file to make sure Google and other search engines crawl only the pages you want. To address the add-to-cart links specifically, you can add these lines to the site's robots.txt file:

User-agent: *
Disallow: /*add-to-cart=*

When you add these lines to the robots.txt file, it tells any search engine that hits the site not to crawl your add-to-cart links.
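
If you want a quick local sanity check of how that wildcard rule matches URLs, the short Python sketch below imitates Google's documented wildcard semantics, where * matches any run of characters and $ anchors the end of the path. The function name and sample paths are hypothetical, and this is a rough approximation rather than an official parser:

import re

def disallow_matches(pattern: str, url_path: str) -> bool:
    # Roughly imitate Google's robots.txt matching: '*' matches any
    # run of characters and '$' anchors the end of the URL path.
    regex = "".join(
        ".*" if ch == "*" else "$" if ch == "$" else re.escape(ch)
        for ch in pattern
    )
    return re.match(regex, url_path) is not None

# Hypothetical WooCommerce-style paths to test against the rule.
for path in ["/?add-to-cart=123", "/shop/?add-to-cart=42", "/shop/"]:
    blocked = disallow_matches("/*add-to-cart=*", path)
    print(path, "->", "blocked" if blocked else "allowed")

Running this prints "blocked" for the first two paths and "allowed" for /shop/, which is the behavior the rule is meant to produce. (A hand-rolled matcher is used here because Python's built-in urllib.robotparser follows the original robots.txt spec and does not understand these wildcards.)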

We also recommend adapting your robots.txt file to disallow crawling of the cart, checkout, and my-account pages, which you can do by adding the lines below to the same file.

Disallow: /cart/
Disallow: /checkout/
Disallow: /my-account/
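
Putting it all together, the relevant block of your robots.txt would look like this (assuming the file contains no other rules you need to keep):

User-agent: *
Disallow: /*add-to-cart=*
Disallow: /cart/
Disallow: /checkout/
Disallow: /my-account/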

About the Author: AJ Morris

AJ Morris is the Product Innovation and Marketing Manager at iThemes. He’s been involved in the WordPress community for over a decade focusing on building, designing and launching WordPress websites and businesses.
