4XX Status Code Definition For Webpages

A 4XX status code is returned when there is a client error, meaning the server cannot understand or fulfill the browser’s request.

Typically, these errors are considered “dead ends”: the user lands on an error page instead of their intended destination.

These errors can be particularly frustrating for end users, so a redirect fallback should be in place for any situation where a user might encounter one.

The 404 & 403 error are the most common; 404 being a “Not Found” error, when the URL has no associated page & 403 being “Forbidden”, when a user does not have the appropriate credentials to access the desired URL.

Other client error code types include:

400 – “Bad Request”

401 – “Unauthorized”

402 – “Payment Required”

405 – “Method Not Allowed”

406 – “Not Acceptable”

407 – “Proxy Authentication Required”

408 – “Request Timeout”

409 – “Conflict”

410 – “Gone”

411 – “Length Required”

412 – “Precondition Failed”

413 – “Request Entity Too Large”

414 – “Request-URI Too Long”

415 – “Unsupported Media Type”

416 – “Requested Range Not Satisfiable”

417 – “Expectation Failed” 
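If you want to see which of these codes a given URL actually returns, a quick spot-check is easy to script. Below is a minimal sketch using only Python’s standard library; the URLs are placeholders for your own pages.

```python
# Spot-check the status code each URL returns. The URLs are hypothetical.
import urllib.error
import urllib.request

URLS = [
    "https://example.com/",
    "https://example.com/no-such-page",  # hypothetical broken URL
]

for url in URLS:
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            print(url, "->", resp.status)
    except urllib.error.HTTPError as err:
        # urlopen raises HTTPError for 4XX/5XX; the code is on the exception.
        print(url, "->", err.code, err.reason)
    except urllib.error.URLError as err:
        print(url, "-> request failed:", err.reason)
```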

404 Redirects & 4XX Errors Are Bad For SEO & UX

404 Errors & all other 4XX Errors are like dead ends for users when they are on your site.

Think of your site as a highway, where each link is an on/off ramp or exit.

Pages with 4XX errors are essentially closed exits: the user is led off the highway only to find that the road dead-ends, and has to back up onto the highway to look for the next closest exit.

4XX errors encountered in site crawls should either be fixed in your source code (by correcting the broken links) or resolved with 301 redirects that send users & link equity to the proper destination URL.
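As a rough sketch of the kind of check a site crawl performs, the script below fetches one page, collects its links, and flags any that answer with a 4XX status. The starting URL is a placeholder, and a real crawler would also respect robots.txt and walk every page of the site.

```python
# Fetch one page, collect its <a href> links, and flag 4XX responses.
import urllib.error
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin

START_URL = "https://example.com/"  # hypothetical starting page

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(START_URL, value))

with urllib.request.urlopen(START_URL, timeout=10) as resp:
    collector = LinkCollector()
    collector.feed(resp.read().decode("utf-8", errors="replace"))

for link in collector.links:
    try:
        with urllib.request.urlopen(link, timeout=10):
            pass  # 2XX/3XX responses are fine
    except urllib.error.HTTPError as err:
        if 400 <= err.code < 500:
            print(f"4XX dead end: {link} ({err.code})")
```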

This also applies to backlinks that point at misspelled or otherwise malformed URLs on your site.

Creating a URL that matches the backlink and 301-redirecting it to the proper destination page passes users & link equity through, which can help improve rankings, performance & user experience.
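Here is a minimal sketch of that idea as a 301 redirect map, again using only Python’s standard library. The misspelled source paths and their canonical targets are hypothetical; in practice this mapping usually lives in your web server’s rewrite rules rather than application code.

```python
# A minimal 301 redirect map. The REDIRECTS entries are hypothetical.
from http.server import BaseHTTPRequestHandler, HTTPServer

REDIRECTS = {
    "/servces": "/services",  # hypothetical misspelled backlink target
    "/old-blog": "/blog",     # hypothetical retired URL
}

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        target = REDIRECTS.get(self.path)
        if target:
            # 301 sends users & link equity on to the canonical URL.
            self.send_response(301)
            self.send_header("Location", target)
            self.end_headers()
        else:
            self.send_error(404)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), RedirectHandler).serve_forever()
```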

Another issue occurs when a bot trying to index your site encounters these dead-end pages.

This wastes crawl budget, as bots tend to crawl only a set number of your site’s pages per day.