How Accurate Are They?
Since the very beginning of the search engine optimization (SEO) industry, marketers and business owners have looked for shortcuts and automation tools to expedite the massive undertaking of “get Google to give us that first spot.” There are hundreds of “SEO audit” tools available that claim to instantly size up your site and tell you what you’re doing wrong. A common industry sales tactic is to send out automated emails with “audit results” for recipients’ websites in hopes they will see red X’s and ask for help.
Advantages of Automated Audit Tools
- Scan through hundreds of pages in seconds
- Detect missing elements and broken links
- Catch duplicate content that may have been copy/pasted from a different website (a big “no-no” in SEO)
- Strength: Identify simple technical issues
Disadvantages of Automated Audit Tools
- The tools have no idea what keywords you are targeting or which terms are the most valuable
- Pass/fail criteria for the majority of their checks are arbitrarily set by the test creator and may not reflect a Google standard or ranking factor
- Tools treat all pages the same and don’t understand the broader structure of your website
- Weakness: No understanding of strategic components
The importance of keyword strategy
In SEO, everything revolves around the search terms you are going after. The end goal is always to ensure your website appears prominently when a potential customer searches for those terms. When an audit tool looks at your website, it’s either guessing or not even considering which words/phrases are going to drive business. Without understanding your SEO goals, it’s impossible for an automated tool to actually evaluate your SEO campaign.
A properly designed SEO strategy will involve hours of initial keyword research and ongoing tweaking based on analytics data, search patterns, and revenue trends. Automated tools don’t have access to any of that.
Let’s take a few automated tools for a spin!
For the sake of this blog post, we ran mta360.com through a handful of the available audit tools and found many of their suggestions were extremely misleading. Here are some examples:
Remove iFrames
This is a great example of the shortcomings of automated SEO tools. Because these tools don't understand the nuances of SEO, they often produce false readings like this one.
- What the tool is referring to: In general, iFrames are not a great way to display content. An iFrame is basically a window inside your webpage that displays the contents of another webpage. For textual content, Google much prefers that you include the actual verbiage on your own website and cite the source rather than ask the browser to load a second URL within your page.
- Why the tool is incorrect: The tool was flagging a YouTube video! When you embed a video on your webpage from YouTube, Vimeo, etc., the code that allows you to do that is an iFrame (see the sample embed after this list). A YouTube iFrame is actually the best way to place a video player on your website. Google owns YouTube and instructs creators to share their videos in this way. Embedding a YouTube video with an iFrame will keep your page load time down and can even help Google understand your page better.
- Assessment: Following this suggestion could actually harm your SEO rankings. Obviously Google would never punish you for placing their code on your site!
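For reference, this is roughly what a standard YouTube embed looks like; the video ID, title, and dimensions below are placeholders rather than the actual code on mta360.com:

```html
<!-- A standard YouTube embed: the player is delivered inside an iFrame,
     which is exactly how YouTube instructs creators to share videos.
     VIDEO_ID, the title, and the dimensions are placeholders. -->
<iframe
  width="560"
  height="315"
  src="https://www.youtube.com/embed/VIDEO_ID"
  title="YouTube video player"
  frameborder="0"
  allowfullscreen>
</iframe>
```

An audit tool that flags this iFrame is flagging the very embed method YouTube itself provides.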
Reduce length of title tag
Title length has been a long-debated topic in the SEO industry. If you look up at the tab on the top of your browser, you’ll see a truncated version of the title of this page: “Automated SEO Tools and Audits.”
- What the tool is referring to: This particular tool said title tags need to be “between 10 and 70 characters.” Google search results typically include 65-70 characters of the page’s title tag as a blue link, and the rest gets cut off with an ellipsis.
- Why the tool is incorrect: Even though search results only display a limited number of characters, Googlebot actually reads much further than that. A longer title can be both appropriate and advantageous in many situations, especially when the most important terms come first (see the sample title tag after this list). Google will automatically choose the most important chunk of text and display it on the search results page.
- Assessment: This suggestion could harm your rankings by removing important terms from a prominent position. Title tags are one of the most important page elements when it comes to SEO.
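As a rough illustration (the business name and wording here are invented, not mta360.com’s actual tag), a longer title can still work well as long as the most valuable terms are front-loaded:

```html
<!-- Hypothetical title tag: longer than the ~70 characters most tools allow,
     but the key terms come first, so a truncated search listing
     still shows what matters. -->
<title>HVAC Marketing Services &amp; SEO for Contractors | Example Company - Web Design, PPC, Local Search</title>
```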
Shorten meta description
Once again, the automated tool has evaluated part of our homepage’s SEO with arbitrary criteria.
- What the tool is referring to: In search results, those blocks of black text underneath the blue links are typically (but not always) pulled from a special tag called a “meta description.” Web developers place this chunk of code into the hidden head section of your webpages to describe the purpose of each page to bots like Google (a sample tag appears after this list).
- Why the tool is incorrect: Google’s Webmaster guidelines specifically say “There’s no limit on how long a meta description can be.” The creator of this automated tool simply set an arbitrary limit based on their own preferences.
- Assessment: Rewriting meta descriptions for every page on your website could take hours and would likely have no effect on search performance.
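For context, here is a sketch of what a meta description tag looks like; the wording is invented for illustration:

```html
<!-- Hypothetical meta description: lives in the page's <head> and describes
     the page for search engines. Its length should be driven by what the page
     needs, not by an audit tool's character limit. -->
<meta name="description" content="Automated SEO audit tools can flag issues that don't actually matter. Here's how to tell which suggestions are worth acting on and which ones to ignore.">
```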
Implement an XML sitemap file
An XML sitemap is a special file that webmasters and SEOs can place on the server to guide Googlebot toward new pages and make sure the crawler doesn’t miss any (a minimal example appears at the end of this section).
- What the tool is referring to: Google recommends but does not require sitemaps. For extremely large websites with thousands of pages, it is a best practice to include a dynamic sitemap that will update itself anytime new pages are published.
- Why the tool is incorrect: For small websites where the pages all link to each other, Google has no trouble finding and indexing the content. When new pages are created, it is actually much faster to request indexing within Google Search Console than to wait for Google to re-check your sitemap. A Search Console request will get the page crawled within a few hours, while waiting for a sitemap crawl could delay it by a few weeks.
- Assessment: While XML sitemaps are fine to include, relying on them as the only means of notifying Google about new pages can cause major delays in new content getting indexed. This suggestion wouldn’t produce any tangible benefit.
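For readers who haven’t seen one, a minimal sitemap file looks like the sketch below; the URLs and dates are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal XML sitemap sketch: each <url> entry points Googlebot to a page.
     The URLs and lastmod dates are placeholders. -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/blog/automated-seo-tools/</loc>
    <lastmod>2024-02-01</lastmod>
  </url>
</urlset>
```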
Increase page text content
Google has long told webmasters to create content for people, not rankings. They will reward the best and most relevant webpages with top positioning. Google’s Danny Sullivan recently spoke out about the myth of there being a minimum word count.
- What the tool is referring to: SEOs have long lived by the motto “content is king.” In general, a more thorough page with lots of great written content will outrank a thin page about the same topic.
- Why the tool is incorrect: The automated tool has no understanding of the topic(s) being discussed and can’t actually assess how much content is required to cover them sufficiently. The developer of that tool is simply assigning a pass/fail grade based on their own personal preferences.
- Assessment: Adding content is always a good idea if it will serve a purpose and benefit your readers. Increasing your word count solely to stuff more keywords into a page or meet the threshold of an SEO tool is a waste of time.
Remove noindex tags / stop blocking spiders in robots.txt
Webmasters can place a “noindex” tag on any pages they don’t want showing up in Google. Alternatively, a separate “robots.txt” file can be used to block crawlers. The noindex tag tells Google not to include a page in search results, while robots.txt tells Googlebot not to visit certain URLs at all (examples of both appear after this section’s list).
- What the tool is referring to: If your entire website had an accidental “noindex” tag applied (which is an easy mistake to make when first launching from a demo server), that would prevent Google from crawling and ranking it.
- Why the tool is incorrect: There are plenty of valid reasons why SEOs may choose to block Google from certain pages. Examples include articles that are only available to logged-in users, special offers that are not yet available, or a “thanks for submitting our contact form” page.
- Assessment: Keeping Google away from certain pages can help your rankings by preventing wasted “crawl budget” on non-SEO pages. Doing so will also prevent users from finding irrelevant pages in Google search results.
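To show the difference between the two methods, here are hedged sketches of each; the paths are hypothetical examples, not mta360.com’s actual configuration:

```html
<!-- Method 1: a robots meta tag in the page's <head>.
     Google can still crawl the page but will keep it out of search results. -->
<meta name="robots" content="noindex">
```

```text
# Method 2: robots.txt at the site root.
# Tells crawlers not to visit these URLs at all (the paths are placeholders).
User-agent: *
Disallow: /thank-you/
Disallow: /members-only/
```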
Duplicate page titles
It is a best practice to give every page and blog post on your website a unique title. Audit tools assume that if two pages share a title, their content must be redundant and the pages should be combined.
- What the tool is referring to: Our website has paginated content, such as our blog. A category-filtered view of our Blog page is technically the same page as the unfiltered view, just served at a different URL.
- Why the tool is incorrect: When a given page can be viewed in multiple ways, Google is perfectly fine with that and just asks that a “canonical” tag be included to help them understand which view is the default. Our pages are all properly canonicalized.
- Assessment: The tool is failing to see the big picture. It treats each URL as a separate page and doesn’t understand the technical elements, like the canonical tag shown below, that Google themselves recommend and read as part of crawling a page.
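For illustration, a canonical tag is a single line in the page’s head; the URL here is a placeholder:

```html
<!-- Hypothetical canonical tag: the filtered and unfiltered views of the blog
     both point Google to the same default URL. -->
<link rel="canonical" href="https://www.example.com/blog/">
```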
Summary
Automated SEO audit tools can be helpful, especially when first launching a website. The better tools are able to find broken code, typos, and missing elements much faster than a human reviewer ever could. However, the strategic suggestions and “red flags” are often based on uninformed criteria and should never be blindly implemented.