How to Respond to Google Thin Content Update
Google Thin Content Update Solutions
Google has updated how it handles thin content and soft 404s: pages that Google considers "thin content" now show up under the Soft 404s tab in Google Webmaster Tools.
What Does This Mean?
A regular 404 is a page that does not exist or no longer exists; the server tells search engines so by returning a 404 HTTP status code. A Soft 404 is a page that Google believes should return a 404 status code but does not. Examples include search results pages that have no results, or a page that was previously on one topic but now 301 redirects to an unrelated one. Google says the latter should be a 404 because an off-topic redirect does not provide a positive user experience. With the new Google Webmaster Tools update, thin content pages are also being shown in the Soft 404s tab.
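Because the distinction hinges entirely on the HTTP status code a URL actually returns, you can check it directly. Here is a minimal sketch using Python's standard library (the function name is my own):

```python
import urllib.request
import urllib.error

def fetch_status(url):
    """Return the HTTP status code a URL actually serves."""
    try:
        with urllib.request.urlopen(url) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        # urllib raises for 4xx/5xx responses; the code is on the exception.
        return err.code
```

A page that displays a "not found" message to visitors but returns 200 through this check is a soft 404 candidate.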
Unlike regular 404s, Soft 404s are always a negative ranking signal. Regular 404s can simply be the result of removing pages, which is perfectly fine. Soft 404s, by contrast, are always the result of an error on the site, and they lower the site's perceived quality.
How To Find Out If You’re Affected
Open the Google Webmaster Tools account for your website and go to Crawl Errors, then to the Soft 404s tab. If you have no soft 404s, the tab may not appear at all.
Since the update reportedly rolled out on October 25th, look around that date to see if there is a rise in Soft 404s. A spike there means you have thin content issues that should be addressed immediately. Even if there isn't a rise, all Soft 404s should be dealt with promptly.
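Beyond the Webmaster Tools report, you can audit your own pages for soft-404 symptoms: a page that returns 200 but whose body reads like an error or empty-results page is a likely candidate. A rough heuristic sketch (the phrase list is illustrative and should be tuned to your own templates, not anything Google publishes):

```python
def looks_like_soft_404(status_code, body_text):
    """Flag pages that return 200 but read like an error or
    empty-results page -- the classic soft-404 symptom."""
    # Illustrative phrases only; adjust for your site's templates.
    error_phrases = ("not found", "no results", "0 results",
                     "page does not exist")
    if status_code != 200:
        # Real 404s and redirects are reported separately.
        return False
    text = body_text.lower()
    return any(phrase in text for phrase in error_phrases)
```

Running this over a crawl of your URLs gives a worklist of pages to fix before Google flags them.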
What To Do
Where possible, make the soft 404 pages return genuine 404 response codes. If a 301 redirect from an off-topic page to another is causing the issue, either remove the 301 redirect or point it at a page that has relevant content and would still serve the user's intent from the original page.
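The key detail is that the server must send the 404 status code itself, not just render an error message with a 200. A minimal sketch using Python's standard-library HTTP server (the PAGES dict is a stand-in for your real content lookup):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical content store; in practice this is your CMS or database.
PAGES = {"/": "Welcome", "/about": "About us"}

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = PAGES.get(self.path)
        if body is None:
            # Send a real 404 status, not a 200 "not found" page.
            self.send_response(404)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(b"<h1>Page not found</h1>")
            return
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(body.encode())

    def log_message(self, *args):
        pass  # keep the example quiet
```

Whatever framework you actually use, the same rule applies: look the content up first, and set the status code before writing the error body.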
In cases where it would be awkward to show a 404 page, for instance if you are using faceted navigation but there are no results, adding a <meta name="robots" content="noindex"> tag to the head section of the page or excluding those pages in robots.txt will solve the issue as well.
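For reference, the noindex tag goes in the page's head like so:

```html
<!-- In the <head> of pages that should stay out of the index -->
<meta name="robots" content="noindex">
```

For the robots.txt route, a `Disallow:` line covering the faceted-navigation paths (for example `Disallow: /search`, with the path adjusted to your own URL structure) keeps those pages from being crawled at all.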