Using Google Search Console to Find Errors on Your Site

Most webmasters know that errors can kill conversion rates. You should find errors before your users do, and one way to monitor your site is Google Search Console. Formerly known as Webmaster Tools, Search Console lets you see your site through the eyes of a crawler. It won’t surface every error, but it does catch common issues you would otherwise never know are there. Using the dashboard, you can catch potential traffic problems early and deal with page errors proactively.

Activating Your Account on Search Console

Before you can get stats on your site, you need to sign up on Search Console. Your Search Console account is tied to your Google account, and you can have as many registered sites as you need.

To register your site, you need to verify that you own it. The easiest way is to take the verification token that Search Console gives you and either add it to a meta tag on your home page or upload the provided HTML file to the root directory of your site. Either option works, and Search Console walks you through both.
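
As a sketch, the meta-tag option is a single tag inside the head of your home page. The content value below is a placeholder; Search Console generates the exact tag for you:

    <head>
      <!-- Placeholder token: use the one Search Console gives you -->
      <meta name="google-site-verification" content="YOUR-TOKEN-HERE" />
    </head>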

After you sign up, it takes a few days before any data appears. Once it does, you’ll see several sections in your account where you can view errors.

Messages

After you log in to your account, check the messages logged on it. Google sends important messages regarding your site, and it’s especially important to watch these for warnings or bans. If Google has taken manual action against your site, it might filter your site out of search results, and without these messages you may not realize it until you notice a severe drop in sales and search engine traffic.

If you have an email address attached to your Search Console dashboard, any messages sent to the account are sent to your email as well.

Crawl Errors

The Crawl Errors report is where you find out whether your site is returning errors to the bots. Several types of errors on this page deserve attention.

404: These errors mean the bot can’t find a page. A 404 is legitimate if you purposely deleted the file; in that case you have nothing to worry about. But if you didn’t mean to delete the page, this section gives you a heads-up that an important page was accidentally removed.

503: This error can be a serious issue. It means “Service Unavailable,” and it can point to real problems with your site’s performance. It can mean it’s time to upgrade your hardware or find a new web host, or even that your site is under a DDoS attack. If you receive several 503 errors, check your site’s health, as in the sketch below.
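
One quick way to check is to request the flagged URLs yourself and look at the status codes they return. Here’s a minimal Python sketch using only the standard library; the domain and paths are placeholders for pages from your own crawl errors report:

    import urllib.request
    import urllib.error

    # Placeholder URLs: substitute pages flagged in your own report.
    URLS = [
        "https://example.com/",
        "https://example.com/deleted-page",
    ]

    for url in URLS:
        try:
            with urllib.request.urlopen(url, timeout=10) as response:
                print(url, response.status)  # 200 means the page is healthy
        except urllib.error.HTTPError as e:
            print(url, e.code)  # e.g., 404, 500, or 503
        except urllib.error.URLError as e:
            print(url, "unreachable:", e.reason)  # DNS or connection problem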

500: An “Internal Server Error” can mean almost anything, but for the most part it means you have a coding error. If this error shows up for bots but not for your human visitors, it can remain silent for months. It usually means something is wrong with code that uses session variables, cookies, or conditional statements that throw an error when they can’t fall back to a default action. Some people hit this error with custom reports when the code reads the user’s information and doesn’t account for bots.
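
As a hedged illustration of that pattern, here’s a tiny Python sketch; the cookie name and report logic are hypothetical. Code that assumes a cookie always exists crashes for cookieless crawlers, while a safe default keeps the page rendering:

    # Hypothetical report code. Crawlers send no cookies, so code written
    # as cookies["user_id"] raises a KeyError for bots, which the server
    # surfaces as a 500.
    def build_report(cookies):
        # Safe version: fall back to a default so the page still renders.
        user_id = cookies.get("user_id", "anonymous")
        return f"Report for {user_id}"

    print(build_report({}))                    # a bot's request
    print(build_report({"user_id": "alice"}))  # a logged-in user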

DNS Errors: Most sites experience this error every once in a while. When it occurs, the bots are unable to resolve your DNS information and can’t crawl your site. If it happens rarely, it can be ignored, but if it happens frequently, it’s time to take a look at your DNS settings.

Robots.txt Errors: The robots.txt file is a directive for search engines. It tells them which pages you want crawled and blocks the ones you don’t. If the file doesn’t exist or search engines can’t reach it, you leave your site open to crawl errors, and it can also signal a server configuration error. Some search engines assume you want all pages crawled if no robots.txt file is found; others crawl nothing as a precaution, so make sure the file is available. A minimal example follows.
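
As a sketch, a simple robots.txt served from the root of your site might look like this; the blocked path and sitemap URL are placeholders:

    # Served from https://example.com/robots.txt
    User-agent: *
    Disallow: /admin/

    Sitemap: https://example.com/sitemap.xml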

Index Status

The final report to check is index status, which tells you the number of URLs from your site indexed in Google. Google doesn’t usually index every page on a site, but you want to make sure your major pages can be found in search. When you have plenty of good URLs but the indexed count keeps dropping, that’s a clue that Google is having a hard time crawling your site. You can use this report as an early warning for more serious errors.

Cleaning Up Your Site

Cleaning up errors can make a huge difference in your search engine rank as well as your conversion rates. Even if you have thousands of errors, a single fix can clear hundreds of them at a time. Working through all of your errors is tedious and can take several weeks, but it’s necessary if you want to avoid losing customers to broken pages.

Bonus Tip: One surefire way to kill your website’s search engine rank is downtime. You can plan ahead for growth; see our tips on how to test your website’s scalability.
