A Day in the Life of an SEO

Search engine optimizers are like the black sheep of the family. Our work is often misunderstood. Colleagues wonder what we do or how we make money for the company. They sometimes ask why we read blog posts all day.

Success in organic search can be elusive. It’s often hard to measure. Other digital marketing channels produce more data for tracking and forecasting. With search engine optimization, however, we frequently guess. We run tests. We experiment.

I’ve been an SEO practitioner since the late 1990s. My agency now manages SEO for large and small companies worldwide. My day generally consists of the four activities below.

SEO: A Day in the Life

Learn via self-study. SEO was born in forums and blogs. Before SEO was an acronym, we were sharing and debating tactics. One forum in particular was my textbook in 1996, when I began trying to improve my employer’s organic search performance. This was before Google, incidentally.

We used avatars and user names to hide our identities. But we were part of a community. We shared what we learned and argued theories.

Google launched in September 1998. A user named “GoogleGuy” — who, we found out later, was Matt Cutts of Google — would pop in and gently guide our suspicions.

Fast forward to 2020, and forums aren’t nearly as popular. But the SEO community lives on — in blogs, Facebook Groups, Twitter, and the like. Self-study is part of the job. We spend hours reading and discussing with other optimizers.

Screenshot of the forum’s home page, circa 2003, via the Wayback Machine. The forum was an early repository of SEO info.

Create experiments. A downside to SEO is that we’re often speculating. Google shares plenty about its algorithm. But it keeps many more details secret.

This gap — the known versus the unknown — is a big part of my job. That’s why we run tests to determine the tactics and strategies that work best for a client or industry. And, yes, search algorithms apply different ranking factors based on the industry.

Testing can prevent catastrophes. For example, over a decade ago I worked for GSI Commerce (now eBay Enterprise), a leading ecommerce platform. We had hundreds of clients making millions of dollars each day. Much of their traffic came from organic search. An SEO mistake would have been dire.

The common belief at the time was that contextual URLs were important for higher rankings. Rather than numbers and symbols, a URL should include meaningful, descriptive text — or so we thought. But updating every client’s URLs (to text, from numbers and symbols) would have been hugely expensive and, potentially, disruptive. So we tested the change on a few brands that volunteered.

We found that changing the URLs was actually counterproductive. Organic rankings did not increase, and the participating clients lost substantial sales during the experiment.

In other words, testing is critical for a meaningful SEO program. Absent tests, implementing “best practices” can be disastrous.

Monitor results. For SEO, data create a hypothesis. That hypothesis is tested. The results of that test drive implementation. But the results also drive the next test. At Greenlane (my agency), we don’t build long roadmaps for clients. Instead, our activities are more like sprints. Each month we optimize based on the incoming data. We abandon ship if a campaign is not driving the expected results.

Conversions (revenue, typically) from organic search traffic are our guiding light. It’s not what SEO tools or Google Search Console report. It’s what’s occurring on the ground with our clients. Thus responsiveness is key to SEO success.

To be sure, we’ll use tools such as SEMrush and Ahrefs for competitive intelligence. We’ll use rank-tracking software to estimate our organic visibility. But conversions and revenue are what matters.

Recommendations. We recommend tactics to our clients. They decide whether to take the advice. Our recommendations typically answer:

  • What is the problem?
  • Why is it a problem?
  • How can it be fixed?

Clients sometimes do not implement our recommendations. The reasons vary, from no budget to limited internal staff. When that occurs, there is little more we can do.

For example, I once worked with a large, prominent retailer that had suffered declining revenue from organic search. I made a few recommendations based on a careful assessment. The recommendations came with development costs. The chief marketing officer declined my suggestions as I could not guarantee success. The result is that the company’s organic traffic has not improved, and, importantly, its long-term potential is limited.

In other words, the best we can do is provide the what, why, and how.

Bill Sebald