When complex site technology blocks search engines’ crawl paths, it also blocks natural search revenue. But there are ways to make sure your site welcomes search engines rather than locking them out.
Many ecommerce sites count on natural search traffic, so ensuring that bots can access your site’s content and URLs should be a critical concern. This follow-on article addresses solutions that compensate for bots’ limitations, to help drive more natural search traffic and revenue.
To be sure, halting innovation in ecommerce is not an option. Instead, discuss these workarounds with your developers so that bots can index your innovative site.
Anchors and HREFs
To rank your site, search engines must crawl links to the pages on your site. No crawl means no indexation, which in turn means no ranking, no natural search-referred traffic, and no revenue from what could be your largest channel. Focus first on the “crawl” piece of the equation. For search engine optimization, nothing else matters unless the bots can crawl your pages and index them.
A crawlable link is an anchor tag with an href attribute that contains an actual URL. If you want to be certain, right-click the link and select “Inspect.” If you don’t see an anchor tag with an href and an actual URL wrapped around the link text, it isn’t a crawlable link. If you don’t have an option to inspect, you may need to enable developer tools in your browser’s settings or try a free plug-in such as Firebug.
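If you’d rather not inspect links one at a time, a quick script run in the browser’s developer console can flag likely offenders. This is a rough sketch under simple assumptions; the selectors for “clickable but not a real link” elements are illustrative and will vary by site.

```ts
// Sketch: flag clickable elements that search engine bots cannot follow.
// The selectors below are illustrative examples, not an exhaustive list.
function findUncrawlableLinks(): void {
  const clickable = document.querySelectorAll<HTMLElement>(
    '[onclick], [role="link"], span.link, div.link'
  );

  clickable.forEach((el) => {
    const anchor = el.closest('a');
    const href = anchor?.getAttribute('href') ?? '';
    // A crawlable link needs an <a> tag whose href is an actual URL,
    // not empty, not "#", and not a javascript: pseudo-URL.
    const crawlable =
      href !== '' && href !== '#' && !href.startsWith('javascript:');
    if (!crawlable) {
      console.warn('Not crawlable:', el.outerHTML.slice(0, 120));
    }
  });
}

findUncrawlableLinks();
```

Anything the script reports is worth a manual look, since a link that only works through a JavaScript click handler is invisible to a crawler.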
Crawlable with pushState()
If the page being linked to isn’t even a “page” to a search engine, the link won’t get crawled. Many ecommerce sites use AJAX to load increasingly specific product sets for each filter combination. It’s a compelling user experience, but one that can keep search engines from indexing pages of products that consumers want to buy.
For example, someone searching Google for a black dress won’t likely find one on The Gap because black dresses are not crawlable as a distinct page of content. Macy’s, however, does have a crawlable black dress page.
One easy way to tell if content is generated with AJAX is to look for a hash mark (#) in the URL. Google ignores everything after the hash, so filter states that exist only in a URL fragment will not be crawled and indexed as distinct pages.
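The HTML5 History API’s pushState() method offers a way around this: as each filter loads via AJAX, the site can update the address bar to a real, hash-free URL that search engines can crawl, provided the server can also return a full page when that URL is requested directly. The sketch below illustrates the pattern; the loadProducts() helper, the /api/products endpoint, the #product-grid container, and the /dresses/black path are all hypothetical placeholders.

```ts
// Sketch: give each AJAX-loaded filter state a real, hash-free URL via pushState().
// loadProducts(), /api/products, #product-grid, and /dresses/black are hypothetical.
async function loadProducts(path: string): Promise<void> {
  // Fetch the filtered product set and render it client-side.
  const response = await fetch(`/api/products?path=${encodeURIComponent(path)}`);
  const markup = await response.text();
  const grid = document.querySelector('#product-grid');
  if (grid) {
    grid.innerHTML = markup;
  }
}

async function applyFilter(path: string): Promise<void> {
  await loadProducts(path);
  // Update the visible URL to a clean path instead of appending "#black" or "#!black".
  history.pushState({ path }, '', path);
}

// Keep the back and forward buttons working: each URL restores its product set.
window.addEventListener('popstate', (event: PopStateEvent) => {
  const path = (event.state as { path?: string } | null)?.path ?? location.pathname;
  void loadProducts(path);
});

// Example: the click handler for the "Black" facet, which should still be a real <a href> link.
// applyFilter('/dresses/black');
```

The popstate handler matters for shoppers, too: it keeps the back button working so each faceted URL restores the product set it represents.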
Prerendering
Faster page loads mean higher conversion rates. To deliver faceted content more quickly, many ecommerce sites have switched to client-side rendering techniques that limit the number of trips back and forth to the server to load a page of content. But client-side rendering can slow indexation by months for an ecommerce site, as described in last week’s article.
That delay can hurt revenue. Ensure that search engines can index all of your content, and index it sooner, by “prerendering” client-side content.
Prerendering is especially critical when a site uses a framework such as Angular or React. Yes, Google is behind Angular’s development. But that doesn’t mean Google can efficiently index Angular sites; in my experience, quite the opposite.
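One common way to prerender, offered here as a sketch rather than a prescription, is to detect bot user agents at the web server and return fully rendered HTML to them, while shoppers continue to get the client-rendered app. In the Node/Express example below, BOT_PATTERN and renderToHtml() are illustrative placeholders for a real bot list and for whatever prerendering service or server-side rendering step your platform uses.

```ts
// Sketch: serve prerendered HTML to known bots and the client-rendered app to everyone else.
// BOT_PATTERN and renderToHtml() are illustrative placeholders.
import express, { NextFunction, Request, Response } from 'express';

const app = express();
const BOT_PATTERN = /googlebot|bingbot|baiduspider|yandex|duckduckbot/i;

// Stand-in for a prerendering service or a server-side render of the requested URL.
async function renderToHtml(url: string): Promise<string> {
  return `<html><body><!-- fully rendered markup for ${url} --></body></html>`;
}

app.use(async (req: Request, res: Response, next: NextFunction) => {
  const userAgent = req.headers['user-agent'] ?? '';
  if (BOT_PATTERN.test(userAgent)) {
    // Bots get complete HTML up front; no client-side rendering is required to index it.
    res.send(await renderToHtml(req.originalUrl));
    return;
  }
  next(); // Shoppers fall through to the normal client-rendered app.
});

// The built client-side app (bundle, assets) served as usual for human visitors.
app.use(express.static('dist'));

app.listen(3000);
```

Frameworks such as Angular and React also offer server-side rendering options that reach the same end without user-agent detection; either way, the goal is that a bot’s first request comes back with complete, indexable HTML.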