Overview

In this lesson, we’ll cover the Response Codes tab in Screaming Frog. We believe this is one of the most useful areas of the tool, because it lets us quickly identify problem areas on our website.

On the Response Codes tab, we can filter crawled URLs by a few different response code categories (a small status-check sketch follows the list):

  • Blocked by Robots.txt: URLs that are blocked from crawling by instructions in robots.txt
  • No Response: The server did not respond at all (for example, a connection timeout or error)
  • Success (2xx): The page responded just fine (typically a 200 OK)
  • Redirection (3xx): The page redirects elsewhere, usually via a 301 (permanent) or 302 (temporary)
  • Client error (4xx): The request failed on the client side, such as a 404 (not found) or 403 (forbidden)
  • Server error (5xx): The server failed to fulfill the request, such as a 500 (internal server error) or 503 (service unavailable)
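
To make these categories concrete at the HTTP level, here is a minimal Python sketch (using the third-party requests library and example.com as a placeholder URL, not anything from Screaming Frog itself) that fetches a URL and buckets its status code roughly the way the tab does:

    import requests

    def classify(url):
        """Bucket a URL's HTTP response roughly the way the Response Codes tab does."""
        try:
            # Don't follow redirects, so 3xx responses show up as redirects.
            code = requests.get(url, allow_redirects=False, timeout=10).status_code
        except requests.RequestException:
            return "No Response"  # timeout, DNS failure, refused connection, etc.
        if 200 <= code < 300:
            return f"Success ({code})"
        if 300 <= code < 400:
            return f"Redirection ({code})"
        if 400 <= code < 500:
            return f"Client Error ({code})"
        return f"Server Error ({code})"

    print(classify("https://example.com/"))

Screaming Frog runs this kind of check for every URL in the crawl; the sketch is only meant to show what each bucket corresponds to at the protocol level.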

Usage Tips

Find Broken Links

First, the Response Codes tab is great for finding internal broken links.

How To Find Them

By reviewing the No Response, Client Error, and Server Error filters with a search filter (ourdomain.com) applied, we can quickly find places where we link to internal content that, for whatever reason, is not loading correctly. If we remove the search filter (ourdomain.com), we can also find broken external resources that we link to.
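
Outside of Screaming Frog, the same check can be approximated with a short script. The sketch below (Python with the requests library; the URL list and ourdomain.com are placeholders standing in for a crawl or link export) flags anything that returns a 4xx/5xx or does not respond at all:

    import requests

    # Placeholder list; in practice these URLs would come from a crawl or link export.
    internal_urls = [
        "https://ourdomain.com/",
        "https://ourdomain.com/old-page/",
    ]

    for url in internal_urls:
        try:
            status = requests.get(url, timeout=10).status_code
        except requests.RequestException:
            print(f"{url} -> no response")
            continue
        if status >= 400:
            print(f"{url} -> broken ({status})")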

Find Internal Redirects

Next, the Response Codes tab is great for finding internal redirects: internal links that point to URLs which are redirected for some reason (not great for SEO or site efficiency).

How To Find Them

By reviewing the Redirection filter with a search filter (ourdomain.com) applied, we can quickly find places where we link to internal content that is redirected. If we remove the search filter (ourdomain.com), we can also find redirected external resources that we link to.
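
At the HTTP level, a redirect check looks like the sketch below (again Python with requests, using a placeholder URL). Disabling automatic redirect following lets us read the 301/302 status and the Location header, which is the same information the Redirection filter surfaces:

    import requests

    url = "https://ourdomain.com/old-page/"  # placeholder internal link target

    # allow_redirects=False keeps the 3xx response instead of following it.
    response = requests.get(url, allow_redirects=False, timeout=10)

    if 300 <= response.status_code < 400:
        kind = "permanent" if response.status_code == 301 else "temporary or other"
        print(f"{url} redirects ({response.status_code}, {kind}) "
              f"to {response.headers.get('Location')}")
    else:
        print(f"{url} returned {response.status_code} (no redirect)")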

Identify Robots.txt Problems

Lastly, we can use the Blocked by Robots.txt filter to browse the content that is blocked from search engine access. This is important because we don’t want to block valuable content or resources needed to render the page (CSS files, images, or any other content we want crawled).

How To Find Them

By reviewing the Blocked by Robots.txt filter with a search filter (ourdomain.com) applied, we can quickly find internal content that we are blocking via instructions in our robots.txt. Make sure that valuable public-facing content, images, CSS files, JS files, etc. are not included here!
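
For a quick sanity check outside the crawler, Python’s standard urllib.robotparser module can answer the same “is this URL blocked?” question. The sketch below uses a made-up Disallow rule and placeholder URLs purely for illustration; swap in your real robots.txt contents and URLs:

    from urllib.robotparser import RobotFileParser

    # Hypothetical robots.txt rules; a rule like this would block all CSS assets.
    robots_lines = [
        "User-agent: *",
        "Disallow: /assets/css/",
    ]

    parser = RobotFileParser()
    parser.parse(robots_lines)

    for url in ["https://ourdomain.com/assets/css/main.css",
                "https://ourdomain.com/pricing/"]:
        allowed = parser.can_fetch("*", url)
        print(f"{url} -> {'allowed' if allowed else 'blocked by robots.txt'}")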