If you’re about to dive into a comprehensive technical SEO audit for a new campaign, or with a new agency, what I’m about to tell you could save you tens of thousands of dollars.
First, think about the last time you wasted a huge chunk of time or money on an investment that didn’t pan out.
Maybe it sounded like a great idea at first.
Somewhere along the line, between planning, ballooning budgets, unfocused execution, and weak follow-through, things went off the rails.
When all was said and done, you were out a lot of money, with nothing to show for your hard work.
That is exactly what your next Technical Audit might look like.
Dozens of hours, tens of thousands of dollars. Poof. Gone.
The truth is most technical SEO audits return a laundry list of time-consuming, tail-chasing recommendations, and they result in little to no rank gain.
Now here comes the good news: there’s a better way.
Don’t get me wrong, some Technical SEO needs to be done near-perfectly. But to win Technical SEO in 2022, you only need to nail 7 simple categories. You can ignore almost everything else.
Technical SEO follows the Pareto Principle: 80% of the results are driven by only 20% of the process.
Instead of some crazy forensic technical SEO audit, I’m going to explain the 80-20 of technical SEO.
This new-school Technical SEO audit is so consistent, and saves so much time, we completely ripped out our old process and built a remarkable new tool just for this job. Today it’s the cornerstone of all of our growth campaigns.
We’ll show you exactly how we do this for our own clients, so you can do it too.
Let’s get started.
Not long ago, our agency moved from a traditional technical SEO audit to a more practical, cost-effective, and goal-oriented approach.
A traditional technical SEO audit should really only be for enterprises that have large websites (100k+ pages), a custom technology stack, and limitless time and budget.
You have bigger priorities.
We found that the vast majority of our clients didn’t have the time or the budget to waste on a 1,000 point checklist that they really didn’t need. Once we realized this, we could never turn back. We had to change our process to solve the problems we saw:
Most Technical SEO is a wormhole of diminishing returns. Countless businesses have run this race to nowhere, making infinitely smaller improvements to their websites with no end in sight.
The truth is, most of these “improvements” do absolutely nothing for your bottom line.
We tossed the old way out the window. Our solution was a custom report we call the Website Quality Audit. What makes it so special?
In a nutshell, the Website Quality Audit is a technical audit wrapped up with a content audit.
It uncovers the technical bits that really matter – HTTP status codes, 404s, traffic, links, and more – and pushes those findings into an actionable report.
But that’s not all: we’ve also paired it with a content analysis, keyword tracking, and competitive analysis. The result is a single report that tells you what to do, for every facet of your website, in order to rank for your target keywords.
To do this, we blend data from multiple sources and tools in one spreadsheet. By looking at this custom-curated data side-by-side, the exact actions to take to optimize your website become clear.
The WQA allows us to do 20 hours’ worth of SEO work in less than 4.
That means more time and budget to spend where it matters…
You won’t be able to brag about spending 6 months fixing 2,000 technical items on your website. With the WQA, you get an audit that only addresses the most important technical issues, the ones that move the needle.
| | WQA | Traditional technical SEO audit |
|---|---|---|
| Focus | ✅ Fix what moves the needle | ❌ Unlimited and comprehensive |
| Scope | ✅ 80-20 analysis of important updates | ❌ Hundreds or thousands of errors identified, no clear plan of action |
| Time estimate | ✅ 20 hours | ❌ 200 hours |
| Output | ✅ List of actions to take | ❌ List of problems identified |
| ROI | ✅ Positive | ❌ Negative |
Here’s exactly what you need to check (and nothing more) in your Technical Audits for 2022.
This is the exact list of technical checks we run for our own clients, and what we do about them.
HTTP Status codes tell your visitor’s computer (or Google’s crawlbot) the status of a resource. Is the page available? Has it moved somewhere else?
The 4 HTTP Codes to Check for SEO: 200, 301, 302, and 404.
200 is A-OK. All systems go, this worked perfectly. The other codes mean something is unusual, but not necessarily wrong.
The Website Quality Audit checks each page’s status code and spits out an action that will tell you if you need to do something about it.
If you’re checking the codes yourself, use this table to assign tasks to get those actions done:
| Condition | Action |
|---|---|
| 302 | 301 redirect (302s should only be used for temporary redirects) 🟠 |
| 404 w/ external links | Add a 301 redirect to the appropriate page; add it to the Final URL column 🔴 |
| 404 w/ external links + internal links | 301 redirect to the appropriate page, remove internal links 🟠 |
| 301 and redirect_chain = TRUE | Assign 301 redirect to fix redirect chain 🔴 |
| 301 and redirect_chain = FALSE | Leave as is 🔵 |
| 301 w/ redirect is indexable = FALSE | Assign 301 redirect to an indexable URL (add in Final URL column) 🟠 |
| 404 w/ broken internal link | Remove internal links from found_at_url 🟠 |
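The table above is really just a triage function, so it's easy to automate. Here's a minimal sketch in Python; the row shape (`status`, `redirect_chain`, `target_indexable`, `external_links`, `internal_links`) is a hypothetical crawler export, not a standard format:

```python
def http_status_action(row):
    """Return a WQA-style action for one crawled URL.

    `row` keys (hypothetical crawl-export shape): status (int),
    redirect_chain (bool), target_indexable (bool),
    external_links (int), internal_links (int).
    """
    status = row["status"]
    if status == 200:
        return "Leave as is"
    if status == 302:
        return "Change to a 301 (302s are for temporary redirects only)"
    if status == 404:
        # Pages with inbound links are worth redirecting; dead internal
        # links just need to be removed at the source.
        if row.get("external_links", 0) and row.get("internal_links", 0):
            return "301 redirect to the appropriate page + remove internal links"
        if row.get("external_links", 0):
            return "301 redirect to the appropriate page (fill in Final URL)"
        return "Remove internal links pointing at this URL"
    if status == 301:
        if row.get("redirect_chain"):
            return "Assign a direct 301 to fix the redirect chain"
        if not row.get("target_indexable", True):
            return "Repoint the 301 at an indexable URL (fill in Final URL)"
        return "Leave as is"
    return "Review manually"
```

Run something like this over every row of your crawl export and the Action column of your audit fills itself in.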
Sitemaps have existed since Web 1.0, and they're still critical to the functionality of your website, for both users and for SEO. On any given website, they're normally found at /sitemap, /site-map.xml, or /sitemap.xml.
In 2022, sitemaps are a huge factor in how Google determines which pages are “important” and which are not. They add context for the computer, basically serving as a notice to Google from you, stating: “these are the pages I want to be found and indexed on my website.”
The problem with most sitemaps is that they don’t do this well. They show way too much, and not all of it is helpful.
Worse still, some sites don’t have a sitemap at all, or don’t provide one to Google Search Console.
Check your website sitemap (often located at /site-map.xml, sitemap.xml, or /sitemap). If you don’t have one at all, you’ll need to get one. WordPress plugins are your friend.
Now, you need to know if that sitemap is showing off the proper web pages.
Does it include thin content, old URLs, or no-indexed pages? Those need to be removed from your sitemap.
You also want to check to confirm that important pages aren’t missing.
We check this with Ahrefs. The Website Audit report will tell you if a no-indexed page is included in a sitemap, or, vice versa, if an indexable page gets traffic but is missing from the sitemap.
| Condition | Action |
|---|---|
| Page has a non-200 HTTP status OR is noindex'd OR has < 200 word count, AND is found in the sitemap | Remove from sitemap 🔵 |
| Page receives organic traffic AND is not noindex'd AND has an http_status_code of 200, but is not found in the crawl or sitemap | Add to sitemap 🟠 |
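Both checks boil down to comparing the sitemap's URL list against your crawl data. A minimal sketch using only the standard library; the crawl-data shape (`status`, `noindex`, `word_count`, `traffic`) is made up for the example:

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(sitemap_xml):
    """Extract every <loc> URL from a sitemap.xml string."""
    root = ET.fromstring(sitemap_xml)
    return {loc.text.strip() for loc in root.findall(".//sm:loc", SITEMAP_NS)}

def sitemap_actions(sitemap_xml, crawl):
    """crawl maps URL -> dict with status, noindex, word_count, traffic
    (a hypothetical export shape). Returns URL -> action."""
    in_sitemap = sitemap_urls(sitemap_xml)
    actions = {}
    for url, page in crawl.items():
        low_quality = (page["status"] != 200 or page["noindex"]
                       or page["word_count"] < 200)
        if url in in_sitemap and low_quality:
            actions[url] = "Remove from sitemap"
        elif url not in in_sitemap and not low_quality and page["traffic"] > 0:
            actions[url] = "Add to sitemap"
    return actions
```

Feed it your live sitemap and a crawl export and you get the same two actions as the table above.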
Canonical tags are a snippet of code (rel="canonical") that tells Google which copy of a page is the "master" version. It's another way of communicating which pages are important, and which aren't.
Don’t bother digging through all the different scenarios where canonicalization comes up. Here’s the gist: canonicals point at a URL and tell the crawlbot:
“Here is the version of this page you should check.”
Sometimes you want that URL to be exactly the same as the URL that’s typed into the browser, but sometimes you do not.
In general, if a page is important to your visitor and has mostly unique content, you want that page to be the canonical version.
For more specific, actionable information, use the table below:
| Condition | Action |
|---|---|
| Page is missing a canonical URL | Tag "Canonicalize" and add the correct canonical URL in the Final URL column of the WQA 🔵 |
| A paginated page has a canonical URL that is not itself | Tag "Canonicalize" and add a self-canonical URL in the Final URL column of the WQA 🔵 |
| A URL's protocol doesn't match its canonical URL's protocol (e.g., the page canonicalizes to HTTP) | Tag "Canonicalize" and update the canonical to HTTPS in the Final URL column 🟠 |
| A page has a correct canonical URL (including self-canonicalization) | Leave as is 🔵 |
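The first and third checks can be scripted with the standard library's `html.parser`; the helper names below are ours, not a standard API:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Grab the href of the first <link rel="canonical"> tag on a page."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical" and self.canonical is None:
            self.canonical = a.get("href")

def canonical_action(page_url, html_text):
    """Return a WQA-style canonical action for one page's HTML."""
    finder = CanonicalFinder()
    finder.feed(html_text)
    canonical = finder.canonical
    if canonical is None:
        return "Canonicalize: add a canonical URL"
    if page_url.startswith("https://") and canonical.startswith("http://"):
        return "Canonicalize: update the canonical to HTTPS"
    return "Leave as is"
```

The pagination case needs your URL patterns, so it's left out here, but it follows the same shape: compare the canonical href against the page's own URL.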
Google uses a program (a “crawlbot”, or a spider) to search the web for pages. When it finds your page, some code from your page (a “robots” file or tag) will either tell that program “This is a useful page, you should check it out” or “No, don’t check this page, move onto something else.”
Indexable, crawlable pages are shown in Search Results. If pages aren’t indexable or blocked from the crawlbot, they (normally) won’t be shown at all.
For every page on your website, ask yourself: "Do I want people to find this in search engines?" Your important pages should be easy to find and indexable. Unimportant pages should be no-indexed or blocked so Google won't waste time crawling them, and will not show them in Search Engine Results Pages (SERPs).
You can check this en masse with a tool like Screaming Frog and/or Ahrefs.
| Condition | Action |
|---|---|
| Page is not found in the crawl but gets organic traffic | Add internal links so that the page can be found and crawled 🟠 |
| A subdomain or subdirectory with content that is not relevant for searchers is being indexed | Block crawl via robots.txt (after manually reviewing that the content is not relevant for search) 🔴 |
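If you want to sanity-check a robots.txt rule before shipping it, Python's built-in `urllib.robotparser` will tell you exactly what a compliant crawler would do. The rules below are an example file, not a recommendation:

```python
from urllib import robotparser

# An example robots.txt that blocks an irrelevant subdirectory.
rules = """\
User-agent: *
Disallow: /internal/
"""

parser = robotparser.RobotFileParser()
parser.parse(rules.splitlines())

print(parser.can_fetch("*", "https://example.com/internal/report"))  # blocked
print(parser.can_fetch("*", "https://example.com/blog/post"))        # crawlable
```

This only covers the crawl side; remember that blocking a crawl is not the same as noindexing, since a blocked page can still appear in the index if other sites link to it.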
Internal linking refers to the way you add hyperlinks to your own webpages.
Google counts how many links within your website are pointing to individual pages. It considers the most linked pages more important than pages with few or no internal links.
It also creates relationships between pages with common internal links, in “clusters”, as a way of approximating quality: if topics are linked together well, search engines guess that they must be in-depth and of higher quality.
Do all the important pages on your site have internal links from somewhere? Does the number of internal links align with how important each page is? Do you link between related articles to help readers and search engines find related content?
| Condition | Action |
|---|---|
| Page generates traffic but was not crawled, OR was found to have 0 internal inlinks in the crawl | Add internal links (this is an orphaned page that needs link equity) 🔴 |
| Page has high (top 25%) internal inlinks AND low (bottom 25%) traffic or referring domains | The page may not need such prominent placement in navigation/content. Reduce internal inlinks, unless the page has an alternative purpose (critical info, legal necessity, etc.) 🔵 |
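The orphaned-page check is a simple pass over your crawl's link graph. A sketch, with hypothetical data shapes (a URL-to-outlinks map and a URL-to-sessions map):

```python
def orphaned_pages(link_graph, traffic):
    """link_graph maps each crawled URL to the set of internal URLs it
    links to; traffic maps URL -> organic sessions (hypothetical shapes).
    Returns pages that earn traffic but have zero internal inlinks."""
    inlink_counts = {}
    for source, targets in link_graph.items():
        for target in targets:
            inlink_counts[target] = inlink_counts.get(target, 0) + 1
    return sorted(
        url for url, sessions in traffic.items()
        if sessions > 0 and inlink_counts.get(url, 0) == 0
    )
```

Any URL this returns is earning traffic with no internal link equity at all, which is exactly the 🔴 case in the table above.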
You might sense a theme by now: most of these Technical SEO best practices boil down to communicating with search engines.
Website schema is one of the best ways to do this, but it’s terribly misunderstood. In 2022 and beyond, schema will be increasingly important.
By adding schema markup to content on your page, you help search engines find and present info to users through engaging rich results.
Schema markup is code that translates your human-readable content into a language that search engines can understand.
Search engines love schema, because they use this code to generate useful and engaging content on SERPs.
Key takeaway: be the teacher’s pet. Google is asking you to go the extra mile and add Schema markup to your websites because it makes their job easier. Listen to them.
The websites that “play ball” with Schema reap huge competitive advantages: Featured Placements, Knowledge Boxes, and stand-out features like reviews.
Schema scares a lot of people, and it can be confusing. Fortunately, the basics are simple, and there are plenty of tools to help you understand schema in plain English.
It’s important to remember that the Rich Results Test only validates schema.org types that are eligible for rich results in search.
With the RRTT, you’ll sometimes get recommendations to improve your Schema in the Additional Resources section. You can also use our agency’s process to fast track this:
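To see what schema markup actually looks like in practice, here's a minimal JSON-LD Article block generated with Python's json module. The headline, author, and date are placeholder values; swap in your real page data:

```python
import json

# Placeholder values -- replace with your real page data.
article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "The 80-20 of Technical SEO",
    "author": {"@type": "Person", "name": "Jane Doe"},
    "datePublished": "2022-01-15",
}

# Schema markup ships inside a <script type="application/ld+json"> tag,
# usually in the page's <head>.
snippet = (
    '<script type="application/ld+json">\n'
    + json.dumps(article, indent=2)
    + "\n</script>"
)
print(snippet)
```

Paste the output into your page template, then run the page through the Rich Results Test to confirm the type is eligible and the required fields are present.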
It’s just what it sounds like: Google scores your site based on speed, mobile responsive design, and user-friendliness, plus a few other categories. It bundles these together into a single rank factor it calls “Page Experience,” as in: how good (or bad) is the user experience on this webpage?
Page Experience is a bit complex, which is why we wrote a whole article about it. Here’s the 2-minute version:
The Page Experience algorithm checks many different factors at once. There are 4 big categories:
Core Web Vitals and mobile-friendliness are the big problem areas for most websites. HTTPS is table stakes – if your website is still served over HTTP instead of HTTPS, you aren’t even in the game. Non-intrusive interstitials means no aggressive pop-ups.
Aim for speed: the gold-standard page load time is under 1.2s. That’s not easy, and most sites perform well enough at 2s or less. Make sure you check this on mobile.
The main culprits for a slow website are usually oversized images, render-blocking JavaScript and CSS, and slow server response times.
Reduce LCP: Largest Contentful Paint is a Core Web Vital, and somewhat similar to a pagespeed check. It’s the score for: “how long does it take to see most of the important stuff on your page?” Again, check this for mobile.
For more info on Core Web Vitals: read our article on Page Experience.
No Pop-Ups: Don’t use pop-ups, anything that looks like a pop-up, or anything that could be misconstrued as a pop-up. A common offender is on-site chat boxes that open immediately and crowd the screen. Use a small bubble icon instead.
That’s it! If you have run this process and delegated all the fixes you need to make, you’ve just finished your 2022 Technical SEO Audit.
If you’re looking for proof, look no further than our case studies.
We run this exact same process for every single client Webris brings on.
Because this audit saves us so much time, we can spend more time working on the growth phase of your campaign. You know, the stuff that drives real results.
The results are consistently undeniable.
So, is technical SEO dead? The short answer is no, but the days of long checklists, countless hours of busy work, and budget-sucking, inconsequential fixes are.
Technical SEO fixes are important, but they’re a means to an end. We run these reports to identify issues holding your business growth back.
We know your business doesn’t care about vanity metrics. You don’t want more busy work. You don’t want traffic, and you don’t even really want leads. You want bottom-line revenue, ROI, expansion, and growth.
Webris is a Growth Agency. We skyrocket KPIs with expertly-crafted marketing strategies designed to catapult B2B businesses toward their goals.
Hop on a call with us to gain a birds-eye view of the route we’ll chart to reach your goals.
Find out how much organic traffic your website should be getting through our Traffic Projection Analysis.
Using data from your website, our Traffic Projection Analysis can accurately forecast how much traffic (and revenue) your website could be getting from Google.