Site health

You can use Google Search Console to check your site's health. This lesson walks through five sections from the main navigation: Performance, Indexing, Experience, Enhancements, and Security & Manual Actions.

In addition, the Crawl Stats report is tucked away under Settings; it shows how many URLs on your website Googlebot crawls per day.

Log in to Google Search Console > Select your website

1. Performance > Search Results

Use the Search Results report to understand the website's overall traffic trend.

What to do:

Adjust the date range to the last 16 months (the maximum) to get a bird's-eye view of how Google traffic to the website has changed over time. Analyze the traffic changes in context: check whether a change happens every year because of seasonality or holidays, or whether it lines up with a suspected Google algorithm update. This will help you strategize what to do next for the website.

You might notice traffic changes due to seasonality (e.g., a shop that sells summer houses will get less traffic from autumn onwards). Such traffic drops are completely normal.

Make a note of any traffic movements that are out of the ordinary and the dates when they happened.

You might notice traffic changes due to Google algorithm updates. Cross-reference Google's algorithm update timeline with the dates when the website's traffic started to change.
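
If you prefer to pull the same trend data outside the UI, the Search Console API exposes the Search Analytics data behind this report. Below is a minimal sketch in Python, assuming the google-api-python-client package and a service account that has been added as a user on the property; the key file path, property URL and dates are placeholders.

```python
# A minimal sketch, not the report itself: pull the same daily traffic data
# through the Search Console API. Assumes google-api-python-client is installed
# and a service account (key.json is a placeholder path) has access to the property.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
creds = service_account.Credentials.from_service_account_file("key.json", scopes=SCOPES)
service = build("searchconsole", "v1", credentials=creds)

response = service.searchanalytics().query(
    siteUrl="https://www.example.com/",   # your verified property
    body={
        "startDate": "2023-01-01",        # Search Console keeps roughly 16 months of data
        "endDate": "2024-04-30",
        "dimensions": ["date"],
    },
).execute()

# One row per day: look for drops or spikes and note the dates they happened.
for row in response.get("rows", []):
    print(row["keys"][0], "clicks:", row["clicks"], "impressions:", row["impressions"])
```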

2. Indexing > Pages

The Indexing > Pages report tells you how many pages on your site are indexed by Google over time.

What to look for:

You should see the number of indexed pages stay the same, or increase as your site grows and you publish more pages. This is good.

But if you notice that the number of indexed pages has declined over time, it means gradually fewer of your pages are indexed on Google. This can cause a decline in organic impressions: your website is being seen less on Google, which can result in traffic drops.

To find out why pages are not indexed, scroll down to the table below the chart.
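
The bulk Pages report has no direct API equivalent, but if you want to check why specific URLs are not indexed without clicking through each one, the URL Inspection API in the same Search Console API can help. A minimal sketch, assuming the same service-account setup as the previous example; the URLs and property are placeholders.

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
creds = service_account.Credentials.from_service_account_file("key.json", scopes=SCOPES)
service = build("searchconsole", "v1", credentials=creds)

PROPERTY = "https://www.example.com/"
urls_to_check = [
    "https://www.example.com/",
    "https://www.example.com/blog/some-post",
]

for url in urls_to_check:
    result = service.urlInspection().index().inspect(
        body={"inspectionUrl": url, "siteUrl": PROPERTY}
    ).execute()
    status = result["inspectionResult"]["indexStatusResult"]
    # coverageState mirrors the reasons listed in the Pages report,
    # e.g. "Submitted and indexed" or "Crawled - currently not indexed".
    print(url, "->", status.get("coverageState"))
```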

3. Page Experience

This report shows whether a website provides a good page experience. The page experience signals Google considers for rankings are Core Web Vitals, Mobile Usability and HTTPS.

What to look for:

Ideally, all signals in the report should show Good.

Page experience affects rankings. Google wants to serve users the most helpful information first, delivered with a good experience. So if two pages have similarly helpful content, the page with the better page experience will rank higher.
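
If you want a quick way to spot-check Core Web Vitals field data for a page outside Search Console, the public PageSpeed Insights API returns the same Chrome UX Report data Google uses. A rough sketch; the URL is a placeholder, and an API key may be needed for heavier use.

```python
import requests

url_to_test = "https://www.example.com/"
resp = requests.get(
    "https://www.googleapis.com/pagespeedonline/v5/runPagespeed",
    params={"url": url_to_test, "strategy": "mobile"},
    timeout=60,
).json()

# loadingExperience holds real-user (CrUX) field data when the page has
# enough traffic; each metric is rated FAST, AVERAGE or SLOW.
metrics = resp.get("loadingExperience", {}).get("metrics", {})
for name, data in metrics.items():
    print(name, "->", data.get("category"), "| p75:", data.get("percentile"))
```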

4. Enhancements

The Enhancements report tells you what types of rich results your website is showing up for. You'll also find any errors in your schema markup implementation here.

What to look for:

Ideally, your count of invalid items should be 0. Errors prevent the rich result from showing up on Google.

Warnings are less serious: they do not disqualify your rich result from showing, but they might reduce the amount of information displayed.
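
Before digging into individual Enhancements errors, it can help to confirm that a page's structured data is at least present and parses as valid JSON. This is only a rough sanity check, not a substitute for the report or the Rich Results Test; it assumes the requests and beautifulsoup4 packages, and the URL is a placeholder.

```python
import json

import requests
from bs4 import BeautifulSoup

page = "https://www.example.com/blog/some-post"
html = requests.get(page, timeout=30).text
soup = BeautifulSoup(html, "html.parser")

for script in soup.find_all("script", type="application/ld+json"):
    try:
        data = json.loads(script.string or "")
    except json.JSONDecodeError as exc:
        print("JSON-LD block does not parse:", exc)
        continue
    # A block can hold a single object or a list of objects.
    items = data if isinstance(data, list) else [data]
    for item in items:
        print("Declared schema type:", item.get("@type"))
```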

5. Security & Manual Actions

This is an important report to check for peace of mind. The Manual Actions report tells you if Google found any manipulative, spammy tactics on your site. The Security Issues report tells you if your site has been hacked.

What to look for:

Both reports should show "No issues detected."


6. Crawl Stats

This report helps you understand how Google currently crawls your site. It is not part of the main navigation; you need to go to Settings to find it.

Settings > Crawling > Crawl stats > Open Report

What to do:

One of the most basic checks is to see how many pages on your website Googlebot visits per day. Crawl requests are counted per URL.

Hover over the chart and you'll find the total crawl requests per day. You can also estimate the daily average by dividing the total crawl requests by 90 days; for example, 4,500 total requests over 90 days averages out to about 50 crawl requests per day. (If you want to verify these numbers outside Search Console, see the log-checking sketch at the end of this section.)

If the number of pages Googlebot crawls is much lower than the number of pages on your website, that means Googlebot is not crawling the majority of your pages. This is not a good sign. It could mean:

  1. Your web hosting server is not able to handle Googlebot's crawl requests.
  2. Googlebot has lost interest in your website due to low-quality or technical issues, so the crawl demand for your site is low.
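
If you want to cross-check the Crawl Stats numbers independently, your server access logs record every Googlebot request. Below is a minimal sketch that counts Googlebot hits per day; it assumes a standard combined log format, and the log path will differ on your server. Note that user agents can be spoofed, so treat this as an estimate rather than an exact figure.

```python
import re
from collections import Counter
from datetime import datetime

LOG_PATH = "/var/log/nginx/access.log"   # placeholder; adjust for your server

per_day = Counter()
with open(LOG_PATH, encoding="utf-8", errors="ignore") as log:
    for line in log:
        if "Googlebot" not in line:
            continue
        # Pull the date out of a timestamp like [10/Oct/2024:13:55:36 +0000].
        match = re.search(r"\[(\d{2}/\w{3}/\d{4})", line)
        if match:
            per_day[match.group(1)] += 1

for day, hits in sorted(per_day.items(), key=lambda kv: datetime.strptime(kv[0], "%d/%b/%Y")):
    print(day, hits)

if per_day:
    print("Average Googlebot requests per day:", sum(per_day.values()) // len(per_day))
```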
