How to Deal with Layered Navigation in Magento

There are lots of different ways to prevent layered navigation in Magento from causing SEO issues, and they vary in difficulty, effectiveness and the amount of traffic the pages can generate.

Why is Layered Navigation an Issue for SEO?

Layered or faceted navigation is known for causing issues from an SEO point of view: new URLs are generated when filters are applied, and these usually replicate the content featured on the original category page. In addition, having thousands of pages without unique content accessible to search engines will leave a website susceptible to Google’s Panda update.

If you’re struggling with other duplicate content issues in Magento, I’ve also written this post, which focuses on technical SEO for Magento.

Eliminating the Issue:

1. Meta or X-Robots-Tag Rules

My preferred option for dealing with layered navigation pages is to apply meta robots or X-Robots-Tag rules (noindex, follow – this tells search engines not to index the page but to continue crawling through it), as this is simple and doesn’t require much development time.
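For reference, this is what the two variants look like; a minimal sketch, with the meta tag sitting in the <head> of each filtered page and the X-Robots-Tag being sent as an HTTP response header (how you inject them will depend on your theme or extension):

    <meta name="robots" content="noindex, follow" />

    X-Robots-Tag: noindex, follow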

Applying these rules also helps to meet the requirements for removal requests in Google Webmaster Tools.

2. AJAX Navigation

AJAX navigation is great for SEOs in this scenario, as the filters being applied change the products shown, but a new URL isn’t generated. This is also a great option from a UX perspective, as it’s much faster.
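As a rough illustration of the concept (the selectors and the fragment response here are assumptions, not Magento’s actual markup), the filter links can be intercepted so the product list is refreshed in place without the browser ever requesting a new URL:

    <script>
    // Hypothetical sketch: intercept clicks on layered navigation links and
    // load the filtered results via AJAX instead of navigating away.
    // (Assumes this runs after the DOM has loaded.)
    document.querySelectorAll('.layered-nav a').forEach(function (link) {
      link.addEventListener('click', function (event) {
        event.preventDefault(); // stop the browser requesting the filter URL
        fetch(link.href, { headers: { 'X-Requested-With': 'XMLHttpRequest' } })
          .then(function (response) { return response.text(); })
          .then(function (fragment) {
            // Assumes the server returns just the product list fragment for
            // AJAX requests; the address bar never changes, so there's no
            // new URL for search engines to crawl.
            document.querySelector('.product-list').innerHTML = fragment;
          });
      });
    });
    </script>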

The main issue with this tactic is that it’s likely to require quite a lot of development time/resource in comparison to some of the other options.

3. Canonical Tag

The canonical tag, in theory, is the obvious choice, as it shows Google that the filter URLs are variations of the original category or sub-category page. However, in my experience, the canonical tag is a bit too flaky and is often ignored when a page has internal or external links pointing at it.
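For example, a filtered URL such as /watches?colour=black (an illustrative URL, not one from a real store) would reference the main category page from its <head> like this:

    <link rel="canonical" href="https://www.example.com/watches" />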

That said, I would still use the canonical tag when using the other methods, as it has a very small overhead in terms of time.

4. Parameter handling in GWT

The parameter handling feature in Google Webmaster Tools is another less resource-intensive option; however, as with the canonical tag, I’ve seen varied results.

A lot of SEOs have been raving about this option recently; however, I’d still recommend using it alongside meta robots or X-Robots-Tag rules.

When you’re using parameter handling, it’s important to ensure that you configure it correctly, as you’re ultimately telling Google how to deal with the pages. You can find out more about using parameter handling in Webmaster Tools here.

Other Interesting Things That I Wouldn’t Recommend:

1. Hashtags in URLs

In the past, search engines treated parameters placed after a hash (e.g. /watches#colour=black) as content that the webmaster didn’t want indexed, and they were used in scenarios like affiliate URLs and layered navigation. These days, however, Google (and other search engines) don’t treat URLs with hashes in them like this, and they’re crawled and indexed like any other URL.

I wouldn’t recommend this, as it won’t make any difference and will simply result in more duplicate variations.

2. Not serving layered navigation options to search engines

I recently watched a video from a UK-based SEO agency suggesting that webmasters use their module to serve layered navigation to users but not to search engines. The logic was that, because crawlers won’t accept a cookie policy, the module simply doesn’t serve this content to any visitor who hasn’t accepted it.

This is not a good idea: not all eCommerce websites have an ‘accept cookies’ option, and you’re still effectively cloaking.

3. Nofollowing internal links

Although I’d recommend doing this in addition to other tactics, on its own it will not prevent the pages from being indexed; search engines are very good at finding these types of pages, and any external links pointing at them will result in them being indexed anyway.
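For reference, a nofollowed filter link just carries an extra attribute (the URL here is illustrative); it hints that the link shouldn’t pass equity, but it doesn’t stop the destination being discovered elsewhere:

    <a href="/watches?colour=black" rel="nofollow">Black</a>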

4. Rewriting the URLs so they appear search-friendly

This is the one that winds me up the most (and it’s really common)! Many webmasters choose to rewrite query string parameters based on the name of the page, so watches?colour=black would become watches-black/, for example. This doesn’t solve the problem, as you’ve still got thousands of URLs featuring non-unique content and are still leaving yourself open to the Panda update.