
Sunday, April 23, 2017

STRUCTURED NAVIGATION OF WEBSITES


Ecommerce SEO Basics: Structured Navigation
Structured navigation is a way to customize and filter a site's data. It reduces customer effort by directly showing filtered results.

The issue with large sites such as ecommerce, restaurant, and dining sites is that they carry thousands of live, indexable URLs; ecommerce sites can even carry millions. Below we explain how Google sees this kind of navigation and which factors impact your rankings.
Brief overview of structured navigation
Structured navigation (also called faceted navigation) is a way to filter the data on a website: it surfaces your desired category, and from those results you can conveniently narrow down your search. Here is an example:
[Figure: Filtered view on mobile phones]
Because every possible facet combination typically generates at least one unique URL, faceted navigation can create a few problems for SEO:
1.   It creates a lot of duplicate content, which is bad for SEO for various reasons.
2.   It eats up valuable crawl budget, increases the time Google spends indexing, and can send Google incorrect signals.
3.   It dilutes link equity and can pass equity to pages we don't even want indexed, which hurts the site's rankings.
It’s worth taking a few minutes and looking at some examples of faceted navigation that are probably hurting SEO. These are simple examples that illustrate how faceted navigation can (and usually does) become an issue.
The illustration below depicts the difference between a site with properly applied SEO techniques and one without.

[Figure: "black dress" search results on Flipkart, eBay, and Amazon]

When somebody searches for a black dress on Flipkart or Amazon, the sites return results in the hundreds or thousands, but eBay shows results in the lakhs (577,000). eBay's navigation is not properly optimized, and it needs SEO work to filter and present its data in a structured way.
You can go to most large-scale ecommerce websites and find issues with their navigation. The point is, many large websites that use faceted navigation could be doing better for SEO purposes.
Structured navigation solutions
When choosing a structured navigation solution, you must decide what you want in the index, what can go, and then how to make that happen. Let's take a look at the options.
"Noindex, follow"
The first solution that comes to mind is to use noindex tags. A noindex tag's sole purpose is to let bots know to exclude a specific page from the index. If we simply want a page out of the index, this is the best method, and it easily removes the duplicate content problem.
However, crawl budget is still wasted here, and link equity flowing to these pages is also lost with this method.
Example: If we wanted to include our page for “black dresses” in the index, but we didn’t want to have “black dresses under $100” in the index, adding a noindex tag to the latter would exclude it. However, bots would still be coming to the page (which wastes crawl budget), and the page(s) would still be receiving link equity (which would be a waste).
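For instance, a page like the hypothetical /black-dresses-under-100/ could carry this tag in its <head> (a minimal sketch; the URL pattern is illustrative):

```html
<!-- On /black-dresses-under-100/: keep this page out of the index,
     but let bots still follow its links -->
<meta name="robots" content="noindex, follow">
```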
Canonicalization
Most sites approach this issue by using canonical tags. With a canonical tag, Google knows that in a collection of similar pages you have a preferred version that should receive the credit. This method is good at resolving duplicate content, and link equity is consolidated to the canonical page. However, Google will still waste crawl budget on the non-canonical pages.
Example: /black-dresses?under-100/ would have its canonical URL set to /black-dresses/. In this instance, Google would give the canonical page the authority and link equity, and would not treat the "under $100" page as problematic duplicate content.
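In markup, that example looks like this (a minimal sketch; the domain is a placeholder):

```html
<!-- On /black-dresses?under-100/: point credit at the preferred version -->
<link rel="canonical" href="https://www.example.com/black-dresses/">
```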
Disallow via robots.txt
Disallowing sections of the website can be a great solution for removing duplicate content. It's quick, easy, and customizable. But it has its shortcomings: link equity is trapped and unable to move anywhere on your website (even if it arrives from an external source). Another shortcoming is that even if you tell Google not to visit a certain page (or section) of your site, Google can still index it.

Example: We could disallow *?under-100* in our robots.txt file. This would tell Google to not visit any page with that parameter. However, if there were any "follow" links pointing to any URL with that parameter in it, Google could still index it.
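In robots.txt syntax, that rule would look roughly like this (the URL pattern is the illustrative one from above):

```
# robots.txt: keep bots out of every "under $100" facet URL
User-agent: *
Disallow: /*under-100
```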
"Nofollow" internal links to undesirable structures of websites
An option for solving the crawl budget issue is to "nofollow" all internal links to facets that aren’t important for bots to crawl. Unfortunately, "nofollow" tags don’t solve the issue entirely. Duplicate content can still be indexed, and link equity will still get trapped.
Example: If we didn’t want Google to visit any page that had two or more facets indexed, adding a "nofollow" tag to all internal links pointing to those pages would help us get there.
Avoiding the issue altogether
If you are currently building or rebuilding your navigation or website, you should consider building your structured navigation in a way that limits the URL being changed when products are filtered (this can be done with JavaScript). The reason: it preserves the ease of browsing and filtering products while potentially generating only a single URL. However, this approach requires extra work: you will need to manually ensure that indexable landing pages exist for key facet combinations (e.g. black dresses).
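As a rough illustration, here is a minimal client-side sketch of this idea; the element classes and data attributes are assumptions, not a prescribed setup:

```javascript
// Filter the product grid in the browser without generating new URLs.
document.querySelectorAll('.facet-checkbox').forEach((box) => {
  box.addEventListener('change', () => {
    // Collect the currently selected facets, e.g. ["color:black", "price:under-100"]
    const active = [...document.querySelectorAll('.facet-checkbox:checked')]
      .map((b) => b.value);
    document.querySelectorAll('.product').forEach((card) => {
      // Each card declares its facets, e.g. data-facets="color:black brand:express"
      const tags = card.dataset.facets.split(' ');
      const visible = active.every((facet) => tags.includes(facet));
      card.style.display = visible ? '' : 'none';
    });
    // The address bar never changes, so bots only ever see one URL.
  });
});
```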
Here’s a table outlining what I wrote above in a more digestible way.
| Option | Solves duplicate content? | Solves crawl budget? | Recycles link equity? | Passes equity from external links? | Allows internal link equity flow? | Other notes |
|---|---|---|---|---|---|---|
| "Noindex, follow" | Yes | No | No | Yes | Yes | |
| Canonicalization | Yes | No | Yes | Yes | Yes | Can only be used on pages that are similar. |
| Robots.txt | Yes | Yes | No | No | No | Technically, pages that are blocked in robots.txt can still be indexed. |
| Nofollow internal links to undesirable facets | No | Yes | No | Yes | No | |
| JavaScript setup | Yes | Yes | Yes | Yes | Yes | Requires more work to set up in most cases. |
Source: https://moz.com/blog

But what’s the best way?
The truth is that no single method achieves this feat on its own. You will have to use a combination of methods to get the result you want, and the right combination will vary from website to website depending on how the site and its URLs are structured. You will often have to weigh link equity against crawl budget, and the best answer usually lies somewhere between the two.
Consider this: You have a website that has a structured navigation that allows the indexation and discovery of every single facet and facet combination. You aren’t concerned about link equity, but clearly Google is spending valuable time crawling millions of pages that don’t need to be crawled. What we care about in this scenario is crawl budget.
In this specific scenario, I would recommend the following solution.
1.   Category, subcategory, and sub-subcategory pages should remain discoverable and indexable. (e.g. /clothing/, /clothing/womens/, /clothing/womens/dresses/)
2.   For each category page, only allow versions with one facet selected to be indexed (see the sketch after this list).
1.   On pages that have one or more facets selected, all facet links become "nofollow" links (e.g. /clothing/womens/dresses?color=black)
2.   On pages that have two or more facets selected, a "noindex" tag is added as well (e.g. /clothing/womens/dresses?color=black&brand=express)
3.   Determine which facets could have an SEO benefit (for example, “color” and “brand”) and whitelist them. Essentially, throw them back in the index for SEO purposes.
4.   Ensure your canonical tags and rel=prev/next tags are set up appropriately.
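To make these rules concrete, here is a minimal sketch of how a site might derive its robots directives from the selected facets. The function and facet names are illustrative assumptions, not a prescribed implementation:

```javascript
// Decide indexing directives per page, following the rules above:
// category pages index + follow, one whitelisted facet stays indexable,
// two or more facets (or any non-whitelisted facet) get noindexed,
// and facet links on any faceted page are nofollowed.
function robotsDirectivesFor(selectedFacets, whitelist = ['color', 'brand']) {
  const nonWhitelisted = selectedFacets.filter((f) => !whitelist.includes(f.name));
  if (selectedFacets.length >= 2 || nonWhitelisted.length > 0) {
    return { meta: 'noindex, follow', facetLinkRel: 'nofollow' };
  }
  if (selectedFacets.length === 1) {
    return { meta: 'index, follow', facetLinkRel: 'nofollow' };
  }
  return { meta: 'index, follow', facetLinkRel: '' }; // plain category page
}

// Example: two facets selected -> page is noindexed, facet links nofollowed.
console.log(robotsDirectivesFor([{ name: 'color' }, { name: 'brand' }]));
```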
This solution will (in time) start to solve our issues with unnecessary pages being in the index due to the navigation of the site. Also, notice how in this scenario we used a combination of the possible solutions. We used “nofollow,” “noindex, nofollow,” and proper canonicalization to achieve a more desirable result.
Other things to consider
There are many more variables to consider on this topic — I want to address two that I believe are the most important.
Breadcrumbs (and markup) help a lot
If you don't have breadcrumbs on each category/subcategory page on your website, you’re doing yourself a disservice. Please go implement them! Furthermore, if you have breadcrumbs on your website but aren’t marking them up with microdata, you’re missing out on a huge win.
The reason why is simple: You have a complicated site navigation, and bots that visit your site might not be reading the hierarchy correctly. By adding accurate breadcrumbs (and marking them up), we’re effectively telling Google, “Hey, I know this navigation is confusing, but please consider crawling our site in this manner.”
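For illustration, here is a minimal breadcrumb trail marked up with schema.org's BreadcrumbList microdata; the paths reuse the example category URLs from earlier:

```html
<ol itemscope itemtype="https://schema.org/BreadcrumbList">
  <li itemprop="itemListElement" itemscope itemtype="https://schema.org/ListItem">
    <a itemprop="item" href="/clothing/"><span itemprop="name">Clothing</span></a>
    <meta itemprop="position" content="1">
  </li>
  <li itemprop="itemListElement" itemscope itemtype="https://schema.org/ListItem">
    <a itemprop="item" href="/clothing/womens/"><span itemprop="name">Women's</span></a>
    <meta itemprop="position" content="2">
  </li>
  <li itemprop="itemListElement" itemscope itemtype="https://schema.org/ListItem">
    <a itemprop="item" href="/clothing/womens/dresses/"><span itemprop="name">Dresses</span></a>
    <meta itemprop="position" content="3">
  </li>
</ol>
```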
Enforcing a URL order for facet combinations
In extreme situations, you can come across a site that has a unique URL for every facet combination and ordering. For example, if you are on a laptop page and choose "red" and "SSD" (in that order) from the filters, the URL could be /laptops?color=red&type=ssd. Now imagine choosing the same filters in the opposite order (first "SSD", then "red") and the site generates /laptops?type=ssd&color=red: two different URLs for the same result.
This is really bad because it multiplies the number of URLs your site exposes; every ordering of the same facets becomes its own URL. Avoid this by enforcing a specific parameter order for URLs, as in the sketch below.
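A minimal sketch of such normalization, assuming facets arrive as a key/value object (the function name and parameters are illustrative):

```javascript
// Normalize facet parameters into one canonical (alphabetical) order
// so every selection order produces the same URL.
function canonicalFacetUrl(path, params) {
  const query = Object.keys(params)
    .sort() // enforce a fixed parameter order
    .map((key) => `${encodeURIComponent(key)}=${encodeURIComponent(params[key])}`)
    .join('&');
  return query ? `${path}?${query}` : path;
}

// Both selection orders collapse to the same URL:
console.log(canonicalFacetUrl('/laptops', { type: 'ssd', color: 'red' }));
console.log(canonicalFacetUrl('/laptops', { color: 'red', type: 'ssd' }));
// -> "/laptops?color=red&type=ssd" in both cases
```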
Conclusions
To summarize, here are the main takeaways:

  •   Structured navigation can be great for users, but it is usually set up in a way that negatively impacts SEO.
  •   There are many reasons why faceted navigation can negatively impact SEO, but the top three are:
        •   Duplicate content
        •   Wasted crawl budget
        •   Link equity not being used as effectively as it should be
  •   Boiled down further, the question we want to answer to begin approaching a solution is: "What are the ways we can control what Google crawls and indexes?"
  •   When it comes to a solution, there is no "one-size-fits-all" fix. There are numerous solutions (and combinations) that can be used. Most commonly:
        •   "Noindex, follow"
        •   Canonicalization
        •   Robots.txt
        •   Nofollow internal links to undesirable facets
        •   Avoiding the problem with an AJAX/JavaScript solution
  •   When trying to think of an ideal solution, the most important question you can ask yourself is: "What's more important to our website: link equity or crawl budget?" The answer can help focus your possible solutions.


