SEO How-to, Part 2: Understanding Search Engines

Editor’s note: This post continues our weekly primer in SEO, touching on all of the foundational aspects. In the end, you’ll be able to practice SEO more confidently and converse about its challenges and opportunities. 

In “SEO How-to, Part 1: Why Do You Need It?,” I addressed the importance of search engine optimization for ecommerce businesses. In this post, the second installment of my “SEO How-to” series, I’ll explain the basic workings of search engines. Before you can optimize your site to earn more natural search traffic and drive additional sales, you should understand search engines’ underlying principles.

Search engines are a gating technology. When shoppers want to buy something, nearly half of them turn to Google.

Your products might be the highest quality, the lowest priced, or the trendiest. There might be a dozen ways in which your ecommerce business is superior. But if your site does not contain elements that search engines such as Google can recognize, those engines will not drive shoppers to your site. Unless you rank on the first page of search results, your SEO-savvy competitors are likely selling to those shoppers, not you.

How Search Engines Work

Search engines are complex pieces of software supported by vast networks of datacenters. They do two things well: (a) crawl websites to discover content and (b) index that content for quick retrieval, returning results ranked by algorithms.

The Internet contains tens of trillions of pages. Each page a search engine indexes has been discovered and stored with the help of a crawler, or bot. These crawlers examine the code on a web page for relevant information, collect the links that the page points to, and send the information home to be stored for later use.
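To make that loop concrete, here is a minimal sketch in Python of a crawler’s core behavior: fetch a page, extract its text and links, and queue those links for later visits. The seed URL is hypothetical, and real crawlers add much more, including robots.txt compliance, politeness delays, and large-scale deduplication.

```python
# A minimal sketch of a crawler's fetch-extract-follow loop.
# Assumes the requests and beautifulsoup4 packages are installed.
from collections import deque
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

def crawl(seed_url: str, max_pages: int = 10) -> dict[str, str]:
    """Return a tiny 'index' mapping each visited URL to its page text."""
    index: dict[str, str] = {}
    queue = deque([seed_url])
    while queue and len(index) < max_pages:
        url = queue.popleft()
        if url in index:
            continue  # skip pages already crawled
        response = requests.get(url, timeout=5)
        if response.status_code != 200:
            continue  # skip pages that fail to load
        soup = BeautifulSoup(response.text, "html.parser")
        # "Send the information home": store the page's visible text.
        index[url] = soup.get_text(" ", strip=True)
        # Collect the links the page points to and queue them for later.
        for link in soup.find_all("a", href=True):
            queue.append(urljoin(url, link["href"]))
    return index

# Hypothetical usage: pages = crawl("https://www.example-store.com/")
```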

In the 1990s, bots were simple. They were programmed to identify only plain HTML text and links. Simple bots were sufficient because the web was simple.

As the web has become more sophisticated, so have the crawlers that analyze websites for relevant information and connections to other pieces of information. But as technology leaps ahead, there is inevitably a lag in bots’ ability to index the new information, because crawlers have to be programmed to find it in new ways.

For example, Google can now process some JavaScript as it crawls. However, elements that require user input before displaying information are still unlikely to be crawled and may, in some cases, be inaccessible to search engine bots.
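As a small illustration, the Python sketch below stands in for a simple, non-rendering bot: it parses the HTML exactly as delivered and never executes scripts. The markup is invented for the example, but it shows why text that appears only after a user’s click never reaches a bot that does not render JavaScript.

```python
# A static parser (standing in for a simple bot) sees only the HTML as
# delivered. Text that a script would inject after a click is invisible.
# The markup below is hypothetical, for illustration only.
from bs4 import BeautifulSoup

html = """
<div id="size-chart"></div>
<button onclick="
  document.getElementById('size-chart').innerHTML = 'S / M / L / XL';
">Show size chart</button>
"""

soup = BeautifulSoup(html, "html.parser")
print(soup.get_text(" ", strip=True))  # prints: Show size chart
# The sizes S / M / L / XL never appear in the output: a bot that does
# not execute the click will never index them.
```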

A page will not rank if a bot cannot or does not index it. If a search engine cannot capture a specific URL for the page containing the information that answers a searcher’s request, that page will never appear in the results.

The information a bot identifies is stored in a datacenter for analysis and retrieval. Some of a search engine’s secrets go into its bots’ ability to identify information, but much of the “magic” resides in the algorithms that determine how a search engine ranks that information in its search results.

Like a bot, an algorithm is just a complex piece of software. It is so complex that even the individuals who develop it struggle to comprehend all of it, which makes it seemingly impossible for marketers to know exactly what makes a search engine tick. What we know about search engines’ algorithms derives from the patents they file, the news they release, and the performance evidence we gather as we work with websites striving to rank.

We do know that hundreds of signals combine to determine each page of search engine results. They include the quality of the websites that link to a page, where the web server is hosted, how quickly the page loads, whether it is easy to use on mobile devices, how relevant its content is to the searcher’s query, and which results the searcher clicked on previously.

All of these signals fall into three areas of focus: relevance, authority, and technical. As a result, the discipline of search engine optimization also focuses on those three areas.
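As a rough illustration only, and emphatically not Google’s actual formula, which is unpublished, the Python sketch below shows how many normalized signals could be combined into a single comparable score per page. The signal names, groupings, and weights are invented for the example.

```python
# Toy model of combining ranking signals into one score per page.
# Signal names, weights, and example scores are hypothetical.
SIGNALS = {
    # relevance
    "query_match":     (0.35, 0.9),   # (weight, example score from 0 to 1)
    "content_quality": (0.15, 0.7),
    # authority
    "link_quality":    (0.25, 0.6),
    # technical
    "page_speed":      (0.15, 0.8),
    "mobile_friendly": (0.10, 1.0),
}

def page_score(signals: dict[str, tuple[float, float]]) -> float:
    """Weighted sum of normalized signals yields one comparable score."""
    return sum(weight * value for weight, value in signals.values())

print(round(page_score(SIGNALS), 2))  # prints: 0.79
```

A results page would then be assembled by sorting candidate pages on such a score, with each query producing different per-signal values.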

Working with Search Engines to Drive Shoppers

Marketers tend to grow wary at this point in the conversation. They may be eager to get started maximizing a site’s natural search performance, but they may also be concerned that doing so means addressing the needs of bots instead of shoppers.

But rest easy. In modern SEO, focusing solely on bots would be detrimental to performance. Search engines exist to fulfill searchers’ requests, not bots’.

Searchers are people, and people don’t like outdated, over-optimized digital experiences. When searchers land on a website that looks like it was optimized in 2005 with lists of keywords, they leave.

But when searchers land on a site that they enjoy, one that rewards them with the products and information they need, they stay. Search engines learn from that data and reward the sites that deliver positive experiences and lower bounce rates.

Thus, SEO walks a fine line between the digital experience that shoppers prefer and the one that bots need in order to access and index content for relevance and authority.

It’s a difficult line to walk. A modern ecommerce business juggles many competing priorities, and compromises are inevitable. The trick, for SEO, is knowing which compromises will strengthen performance without hurting the shopper experience or the needs of the business.

Read the next installment of our “SEO How-to” series: “Part 3: Staffing and Planning.”

Jill Kocher Brown
