Saturday, December 17, 2022

Robots.txt That Returns A 500/503 HTTP Status Code For An Extended Period Of Time Will Remove Your Site From Google


Gary Illyes from Google said on LinkedIn that if your server returns a 500/503 HTTP status code for an extended period of time for your robots.txt file, then Google may remove your site completely from Google Search.

That is even if the rest of your site is accessible and not returning a 500 or 503 status code.

It isn't just a 500/503 HTTP status code you need to worry about; it is also a problem if your site has network timeout issues.

Again, it has to be for an "extended period of time," which was not defined, but I assume it is more than just a day or two.

Gary wrote, "A robots.txt file that returns a 500/503 HTTP status code for an extended period of time will remove your site from search results, even if the rest of the site is accessible to Googlebot." "Same goes for network timeouts," Gary added.

Gary referred to the HTTP docs and added, "if we can't determine what's in the robotstxt file and the server doesn't tell us a robotstxt file doesn't exist, it would be much more hurtful to crawl as if everything was allowed (eg. we might index martin's awkward hat pictures accidentally)."
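To make the behavior described above concrete, here is a minimal sketch of how robots.txt fetch outcomes map to crawl behavior, based on the article and Google's published robots.txt handling. The function name and return labels are illustrative, not any real API:

```python
def robots_fetch_outcome(status_code: int, prolonged: bool = False) -> str:
    """Rough, illustrative classification of how the HTTP status of a
    robots.txt fetch affects Googlebot's crawling (simplified sketch)."""
    if 200 <= status_code < 300:
        # robots.txt was fetched successfully; its rules are parsed and obeyed
        return "parsed"
    if 400 <= status_code < 500:
        # 4xx is treated as "no robots.txt exists": everything may be crawled
        return "crawl-all"
    if status_code >= 500:
        # 5xx (or timeouts): Google can't tell what the rules are, so it
        # assumes the site is disallowed rather than crawl everything; if
        # the errors persist for an extended period, the site risks being
        # dropped from search results entirely
        return "site-removal-risk" if prolonged else "crawl-nothing"
    return "unknown"
```

Note the asymmetry Gary describes: a 404 means "crawl freely," while a long-running 500/503 means the opposite, because crawling as if everything were allowed would be the more harmful guess.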

We know that Google recommends using a 503 server status code for when your website goes offline or down temporarily for less than a day (hours, not multiple days). If it goes offline for longer, then try to put up a static version of your site instead.
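For a short planned outage, the recommendation above amounts to answering every request with a 503 and a Retry-After hint. A minimal WSGI-style sketch (the app name and message are hypothetical):

```python
def maintenance_app(environ, start_response):
    """Illustrative WSGI app: answer every request with 503 + Retry-After
    during a brief outage (hours, not days), per Google's guidance."""
    start_response(
        "503 Service Unavailable",
        [
            ("Content-Type", "text/plain"),
            ("Retry-After", "3600"),  # hint to crawlers: retry in an hour
        ],
    )
    return [b"Temporarily down for maintenance."]
```

The Retry-After header tells crawlers the outage is temporary; the key point from the article is not to leave this in place for days, especially for robots.txt.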

Just be careful with long downtime, not that you likely have a choice.

Forum discussion at LinkedIn.

