The risk to organic search performance from altering a website usually comes from changes to content, linking structures, or underlying technology. With careful planning and execution, however, you can mitigate the risk from those changes and increase the likelihood of better rankings.
This is the 11th installment in my “SEO How-to” series. Previous installments are:
- “Part 1: Why Use It?”;
- “Part 2: Understanding Search Engines”;
- “Part 3: Strategy and Planning”;
- “Part 4: Keyword Research Concepts”;
- “Part 5: Analyzing Keyword Data”;
- “Part 6: Optimizing On-page Elements”;
- “Part 7: Mapping Keywords to Content”;
- “Part 8: Architecture and Internal Linking”;
- “Part 9: Diagnosing Crawler Issues”;
- “Part 10: Redesigns, Migrations, URL Changes.”
Changing Content
In “Part 6,” I explained how keyword research can identify potential ranking improvements. But changing content — title tags, body copy, descriptions — based on that research carries risk. It’s essential to mitigate that risk and improve the odds of a performance increase.
First, identify all the pages that rank for the words and phrases you plan to change. Then create a keyword map, assigning variations of the new keyword theme to each page based on the research.
Avoid radical keyword changes to pages that are already ranking well.
Every page should have a reason for existing — something that no other page says. If it duplicates a keyword theme, perhaps the page isn’t necessary. Consider merging with its equivalent via a 301 redirect.
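The mapping and de-duplication steps above can be sketched in code. Here is a minimal illustration — the URLs and keyword themes are hypothetical — that flags pages assigned the same primary theme as candidates for merging via a 301 redirect:

```python
from collections import defaultdict

# Hypothetical keyword map: each URL is assigned a primary keyword theme
# based on the research. A real map would come from your keyword data.
keyword_map = {
    "/womens-trail-shoes": "women's trail running shoes",
    "/ladies-trail-runners": "women's trail running shoes",
    "/mens-trail-shoes": "men's trail running shoes",
}

def merge_candidates(mapping):
    """Group URLs by theme; themes claimed by 2+ URLs are merge candidates."""
    by_theme = defaultdict(list)
    for url, theme in mapping.items():
        by_theme[theme].append(url)
    return {theme: urls for theme, urls in by_theme.items() if len(urls) > 1}

print(merge_candidates(keyword_map))
# {"women's trail running shoes": ['/womens-trail-shoes', '/ladies-trail-runners']}
```

Each theme that appears in the output is claimed by more than one page, suggesting that one of those pages may not be necessary.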
The risk you’re assuming is the page’s existing traffic from organic search: it’s possible to attempt to optimize a page and lose all of that traffic. The reward you’re hoping for is the monthly search volume of the new keywords.
Identifying risk from removing pages is easier. Organic search traffic to those pages will stop. To mitigate, 301 redirect the removed URLs to relevant remaining pages. That will preserve the authority of the deleted pages while strengthening the remaining ones.
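As a sketch, those redirects might look like the following in an nginx configuration. The URLs are hypothetical, and Apache, IIS, and most ecommerce platforms offer equivalents:

```nginx
# Permanently redirect each removed URL to the most relevant remaining page.
# A 301 status tells search engines to transfer the old page's authority.
location = /discontinued-widgets/ {
    return 301 /widgets/;
}
location = /old-blog-post/ {
    return 301 /blog/updated-post/;
}
```

Redirect each removed URL to the closest relevant page rather than sending everything to the home page, which search engines may treat as a soft 404.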
Changing Linking Structures
The primary navigation structures across your site — such as in the header and footer — are critical for passing link authority, which helps your pages rank organically. Thus removing links from those navigation structures requires caution bordering on obsession.
It’s difficult to predict how removing a link will affect a page’s performance. It’s best to prepare for the worst case. Identify the traffic and revenue of the destination page. The higher the amounts, the more cautious you must be.
Mitigating that risk can be difficult. It often requires discussion with the staff seeking the removal — typically user experience, creative, or management teams. Determine why the link needs to change or be removed: Is there a strong business reason? If not, don’t be shy about showing your performance data and keyword research to insist that the link remain.
If the removal is required — after all, sometimes other priorities outweigh organic search — consider ways to offset the loss.
For example, if user experience data suggests that simpler header navigation could drive a 10-percent increase in sales, perhaps the critical links could be included in a new drop-down menu, thereby simplifying the navigation while preserving the links. Another option is to move the links to the footer.
However, if they must be entirely removed from the header and footer, can the links be inserted in other places, such as related content? You probably wouldn’t save 100 percent of your organic performance, but it should help.
It could also be an excellent time to boost your content marketing efforts to encourage new external links to the pages.
Changing Technology
Technical changes can likewise introduce risk to organic search performance. These include everything from switching ecommerce platforms to the everyday decisions of your developers, such as code upgrades. Even design choices such as where to place text and how to manage it can impact organic search performance.
As with content and linking changes, it’s important to understand what the technical change will affect.
Some technology impacts the ability of search bots to find, crawl, and index your content. Platforms and code, as examples, can lock bots out of a site — and thus lock the site out of search results.
Examples of these gating technologies include:
- Links coded without anchor tags, href attributes, and URLs.
- Links that return error codes, such as 404 Not Found and 500 Internal Server Error.
- Accidental disallow commands in the robots.txt file.
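The first and third bullets can be illustrated with hypothetical snippets. For a search engine to follow a link reliably, it must be a real anchor tag with an href attribute and a URL:

```html
<!-- Crawlable: a standard anchor with an href attribute. -->
<a href="/category/widgets/">Widgets</a>

<!-- Not reliably crawlable: a JavaScript click handler with no
     anchor tag or href for bots to follow. -->
<span onclick="window.location='/category/widgets/'">Widgets</span>
```

And a single stray rule in robots.txt can lock crawlers out of the whole site — a common accident when a staging configuration ships to production:

```
# Hypothetical robots.txt. This blocks ALL crawlers from the ENTIRE site:
User-agent: *
Disallow: /
```

By contrast, `Disallow: /checkout/` under the same `User-agent` line would block only the checkout pages. Verify the live robots.txt file after every deployment.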
After they crawl your site, search engine bots need to index the pages. All text must reside in HTML, not locked inside images, audio, or video.
Describe your images in alt attributes. Offer transcripts of audio and video content. And above all, don’t embed text in images without repeating it in HTML or CSS.
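A hypothetical product snippet illustrates these points in HTML:

```html
<!-- The heading and description are real HTML text, so search
     engines can index them. -->
<h1>Waterproof Trail Running Shoes</h1>
<p>Lightweight, waterproof shoes for muddy trails.</p>

<!-- The image is described in its alt attribute rather than carrying
     the only copy of the text inside the image file itself. -->
<img src="/images/trail-shoe.jpg" alt="Waterproof trail running shoe, side view">

<!-- The video is paired with an indexable transcript. -->
<video src="/media/shoe-review.mp4" controls></video>
<p><a href="/transcripts/shoe-review">Read the video transcript</a></p>
```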
Consider an Audit
For more on uncovering technical SEO glitches, see my instructions for a crawl audit in eight steps and an indexation audit in six.
See “Part 12: Technical Tools.”