When migrating from HTTP to HTTPS, Google says to use 301 redirects


In a webmaster video hangout yesterday, Google Webmaster Trends Analyst John Mueller strongly recommended that people migrating from HTTP to HTTPS do so with 301 redirects on a per-URL basis. He said you should not use other types of redirects, such as 303s; you should stick with 301 redirects for these migrations.
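A per-URL 301 simply sends each HTTP URL to the same path on the HTTPS version of the site. As a rough illustration only (this sketch is not from Google or the hangout; the fallback hostname and the port choices are assumptions), here is a minimal Python listener that answers every plain-HTTP request with a 301 to its HTTPS counterpart:

from http.server import BaseHTTPRequestHandler, HTTPServer

class RedirectToHTTPS(BaseHTTPRequestHandler):
    def _redirect(self):
        # Keep the host plus the full path and query string so every old URL
        # maps one-to-one onto its HTTPS counterpart (the "per-URL" part).
        host = self.headers.get("Host", "example.com").split(":")[0]
        self.send_response(301)  # 301 Moved Permanently, not a 303 or 302
        self.send_header("Location", f"https://{host}{self.path}")
        self.end_headers()

    do_GET = do_HEAD = _redirect  # handle GET and HEAD requests the same way

if __name__ == "__main__":
    # Binding to port 80 usually requires elevated privileges; the HTTPS site
    # itself is assumed to be served separately on port 443.
    HTTPServer(("", 80), RedirectToHTTPS).serve_forever()

In practice, most sites handle this in their web server or CDN configuration rather than in application code, but the behavior Mueller describes is the same: one clean 301 per URL, with nothing removed, noindexed or disallowed differently on the HTTPS side.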

Back in August 2014, Google announced the HTTPS ranking boost, and since then, we have published several HTTPS migration plans and site migration tips.

Mueller explained that “if you start using other kinds of HTTPS result codes for redirects, then… we kind of have to reconsider and think ‘well, are they doing something unique here that’s not just a generic site move?’” He said that then leads Google down the path of reprocessing each and every URL, which will make “these moves take a lot longer and make it a lot harder for us to just pass on all of the signals to the new version of the site.”

Here is the transcript from the video:

Can we also use a 303 status code after moving from HTTP to HTTPS, or is only 301 recommended?

We strongly recommend using clean 301 redirects on a per-URL basis for HTTP-to-HTTPS migrations.

So you can use other types of redirects, but the 301 redirect is really the one that we watch out for. And if we can recognize that it’s really a clean migration from HTTP to HTTPS, that all of the old URLs have moved to the new ones, that you’re not removing things, that you’re not noindexing or robots.txt-disallowing pages differently on HTTPS, then that makes it a lot easier for us to trust that as kind of this one big thing of a site move from HTTP to HTTPS.

So the more clearly you can tell us that this is really just a generic move and that we don’t have to think about any of the details, the more likely it is that we can just switch that over without you seeing any big change at all.

So if you start using other kinds of HTTPS result codes for redirects, then that makes it such that we kind of have to reconsider and think, ‘well, are they doing something unique here that’s not just a generic site move?’ And then, at that point, we have to reprocess really each URL individually and think, ‘well, what is the webmaster trying to do here in this specific case?’ And that makes these moves take a lot longer and makes it a lot harder for us to just pass on all of the signals to the new version of the site.

Below is the video embed. He starts talking about this 23:20 into the video:


About The Author

Barry Schwartz is Search Engine Land’s News Editor and owns RustyBrick, a NY-based web consulting firm. He also runs Search Engine Roundtable, a popular search blog on SEM topics.




