Business Advice | 9 MINUTE READ

Your 9-Step Website Migration Process to Avoid an SEO Nightmare

Pat Ahern
Pat Ahern is a partner at Junto - the digital marketing agency that is powered by vetted freelancers from across the world. Pat loves rock climbing, craft beer and helping startups grow.

“I can’t wait to migrate this 10,000-page website to a new CMS!”

Said no one, ever.

Website migrations are the ugly stepchild of the web development world. They're not glamorous, and one small oversight can have massive consequences for a company's search traffic.

That being said, migrations can be easy to complete with the right website migration process.

Unfortunately, far too many web developers overlook one critical element in the process: search engine optimization.

We get it: you’re not an SEO. Your job is to make a great website.

But what if launching that website results in the company losing half of its online business? Will they still see that website as great?

Half of its online business, you say?

Yep. For the average website, roughly half of all traffic comes from organic search.

Far too many developers overlook simple SEO elements that will destroy a company’s search rankings.

Today, we’re going to delve into the SEO side of the website migration process that we live by at Junto. We’re going to give you our step-by-step website migration checklist to ensure that you cover the most essential SEO elements.

The goal of this post isn’t to double your company’s sales. The goal of this post is to ensure that you know the basic SEO boxes to check off during a migration. Following this guide will give you the framework to prevent a migration project from destroying your business.

Please note, if you’re changing your domain name, there will be a few additional steps to take care of.

1. Scrape all existing site data

Far too many developers fail to save a copy of the most essential information from the original site.

So what “information” are we referring to?

  • A backup of MySQL and WP-Content files
  • A full list of the indexed pages on the original site (all working pages, 301 and 302 redirects, 404 errors, and 410 headers)
  • Title tags and meta descriptions for all working pages
  • Canonical tags for all working pages


Why do you need to save a copy of this site data?

Short answer: this is your safeguard in case anything goes wrong.

We’ve seen websites that lost all their title tags and meta descriptions because the development team forgot to copy these to the new site.

If these developers had saved a copy of the original meta data, they could have copied and pasted this into the new site. Instead, those sites saw their search rankings plummet. Google no longer understood what each page on their site was actually about.

Failing to note any redirects on the original site will result in a massive spike in broken pages on the new site. This hurts both search rankings and user experience.

404 errors are broken pages; these old URLs should be redirected to working pages on the new site. From there, carry over any 410 rules to the .htaccess file on the new site. A 410 status code tells search engines that a page is gone for good and should be removed from search results. Flagging these pages helps search engines dedicate their crawl resources to the most valuable pages on your site.
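For reference, here's what those rules might look like in an Apache .htaccess file (the paths are hypothetical):

```apache
# Permanently redirect an old URL to its replacement on the new site
Redirect 301 /old-page /new-page

# Mark a retired page as gone for good (returns a 410 status)
Redirect gone /retired-page
```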

So how do you pull a copy of this data?

Our suggested tool is Screaming Frog. The free version allows you to pull all the above data for up to 500 pages on a website. The paid version (~$160/year) removes the page limit.
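If you also want a lightweight, scriptable backup of titles, meta descriptions, and canonical tags alongside a Screaming Frog export, a short Python sketch using only the standard library can pull them from raw page HTML. (The extract_meta helper and the sample markup below are our own illustration, not part of any tool.)

```python
# Sketch: extract the page metadata worth backing up before a migration.
from html.parser import HTMLParser

class MetaExtractor(HTMLParser):
    """Collects the <title>, meta description, and canonical URL from a page."""

    def __init__(self):
        super().__init__()
        self.title = None
        self.description = None
        self.canonical = None
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name") == "description":
            self.description = attrs.get("content")
        elif tag == "link" and attrs.get("rel") == "canonical":
            self.canonical = attrs.get("href")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title = (self.title or "") + data

def extract_meta(html):
    """Return the title tag, meta description, and canonical tag for one page."""
    parser = MetaExtractor()
    parser.feed(html)
    return {"title": parser.title, "description": parser.description,
            "canonical": parser.canonical}

# Hypothetical page markup for demonstration
sample = (
    '<html><head><title>Pricing</title>'
    '<meta name="description" content="Our pricing plans.">'
    '<link rel="canonical" href="https://example.com/pricing"></head></html>'
)
print(extract_meta(sample))
```

Run it against each URL in your crawl list and save the output, and you have a copy-paste-ready record if the migration goes sideways.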

2. Discourage search engines from crawling your staging site

We’ve seen dozens of web dev teams forget to hide their staging site from search engines. Far too often, these staging sites will end up getting indexed while the new site is still in progress.

This is a serious issue for 2 reasons:

1. Short term: customers who search for your company could land on the staging site. This can hurt your company’s brand image.

2. Long term: if Google indexes the staging site with the new website copy, Google will view the staging site as the original source of that copy. As a result, the staging site will appear in search results instead of the new site.

Yep, that’s right. If the staging site is indexed, your brand new site could be removed from search results.

Ok, so how do I see if Google indexed my staging site?

Type the following into Google, replacing "yourdomain.com" with the domain that you host your staging sites on:

site:yourdomain.com

Do any URLs from your staging sites pop up in the search results?

If so, remove those URLs from search results using Google Search Console. See “How to Remove Low-Quality Content” in this SEO errors guide that we put together for more on how to do this.

So how do I prevent this from happening during my next site migration?

Short answer: set up a robots.txt file.


The “Disallow: /” line tells search engines to ignore all pages on that particular subdomain.

NOTE: search engines occasionally ignore this file, so it's always a good idea to check whether Google indexed your staging site before launching the new site. For those who want to take this a step further, set the meta robots tag to "noindex, nofollow" on each page of the staging site.
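For reference, a staging-site robots.txt that blocks every crawler needs only two lines:

```
User-agent: *
Disallow: /
```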

Related: see how we helped one client to see a 118% increase in conversions after a website redesign.

3. Compare the existing vs. new page hierarchy

Compare the page hierarchy for the new site to the structure of the original site. While comparing the two versions, ask yourself the following questions:

  • Which URLs will change from the existing site to the new site?
  • Which pages will be removed from the original site?

Pull up the list of URLs that you scraped from the original site with Screaming Frog, and paste them into a spreadsheet that maps each old URL to its new destination.

Take the full list of pages that will not carry over to the new site. Decide whether to redirect each of those URLs to a working page on the new site, or remove it completely using a 410 header.

Not sure whether to redirect or 410 these pages? Shoot me a note and I’ll be happy to share my feedback.
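Once the spreadsheet is filled in, generating the server rules can be automated. Here's a Python sketch (the URLs and the build_rules helper are hypothetical, not from any plugin) that turns each removed URL into either a 301 redirect or a 410 rule:

```python
# Sketch: turn an old-URL -> new-URL mapping into Apache .htaccess rules.
# A value of None marks a page to remove entirely (410 Gone);
# anything else becomes a 301 redirect to the new location.
def build_rules(url_map):
    rules = []
    for old, new in sorted(url_map.items()):
        if new is None:
            rules.append(f"Redirect gone {old}")
        else:
            rules.append(f"Redirect 301 {old} {new}")
    return rules

# Hypothetical mapping pulled from the migration spreadsheet
url_map = {
    "/old-services": "/services",    # content moved to a new URL
    "/2016-holiday-promo": None,     # gone for good
}
for rule in build_rules(url_map):
    print(rule)
```

Generating the rules from one spreadsheet keeps the redirect list and the 410 list in a single place, which makes the post-launch crawl in step 8 much easier to verify.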

4. Set up redirects on the staging site

Note the pages in the above list that will need to be redirected. We recommend that those working with WordPress sites use Yoast Premium (paid) or Redirection (free) to set up these redirects to the new URLs.

These redirects can also be set up in the .htaccess file. However, making a mistake in your .htaccess file can lead to serious site issues.

NOTE: we won’t spend long talking about migrating copy, QA testing, etc. for the staging site. However, here are a few secrets that we’ve learned from our website launches that save time and keep our clients way happier:

  1. If working with WordPress, create a duplicate version of the WordPress database or use the WordPress Importer function to import all existing site copy (rather than trying to manually copy over data).
  2. Use a keyword ranking tool to evaluate keyword ranking changes of your most valuable keywords during this migration.

Our team uses SEMRush for monitoring keyword ranking changes. However, there are dozens of other great tools for doing this as well.

After the migration, evaluate any pages that see a massive ranking drop first. These pages often contain technical SEO errors.

5. Set up your XML sitemap

A sitemap is a roadmap that tells search engines how to navigate through a site. Creating a sitemap for the new version of your site is essential.

So how do you create a sitemap?

If you’re building the site on WordPress, save yourself time and use the Yoast plugin to create a sitemap for you (both the free and paid version create one for you).

For those who aren’t working with WordPress, Screaming Frog provides a powerful sitemap builder as well.
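Whichever tool you use, the output is a plain XML file that lists your URLs. A minimal sitemap (with a hypothetical URL) looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/services</loc>
  </url>
</urlset>
```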


6. Launch the new site

After taking care of the above pre-launch items, you’ll be ready from an SEO perspective to launch the new site.

The last 3 items delve into the essential post-launch SEO to-dos.


7. Update robots.txt and meta robots

The single biggest SEO mistake that a developer can make during a site migration is forgetting to update indexability settings.

While working on the staging site, we told Google not to index your site. Now, we need to allow Google to index your site.

Forgetting this step can prevent the new site from showing up in search results. This will cause a massive drop in search rankings.

So how do we allow Google to index our site?

First, we update the robots.txt file to remove the line that states “Disallow: /”.

Second, we change the meta robots setting from “noindex, nofollow” to “index, follow” on (almost) every page of the site.

There are a few exceptions here. You’ll want to hide a few pages from search engines on the new site:

  • “Thank You” pages – pages that site visitors are sent to after filling out a contact form or downloading a piece of gated content.
  • Login pages – leaving login pages indexable can leave your site vulnerable to brute force login attempts.
  • Taxonomy pages (tags, categories, etc.) – this is heavily debated amongst SEOs. Our take is to evaluate search rankings of these types of pages first. Deindex any of these pages that lack valuable search rankings. Keep any of these pages indexed that do hold valuable search rankings.

8. Scan the new site for SEO errors

After updating meta robots and the robots.txt file, run a crawl of the new site to identify any outstanding SEO errors.

We recommend using SEMRush’s audit report to crawl the entire site for any outstanding errors.

For those who prefer to search manually, here are the most pressing post-migration errors to watch out for:

1. 4xx errors

Crawl the newly launched site using Screaming Frog. Identify any 404s or other 4xx errors. Redirect (301 or 302) or deindex (410) each remaining broken page.

2. Broken internal links & external links

Broken links become a serious issue whenever the URL structure of a site changes. Developers often forget to 301 redirect old pages to their new counterparts. You can identify broken links using the Check My Links Chrome extension, then remove or update them on the site.

3. Duplicate content

Duplicate content is an underrated SEO issue that can destroy a site’s organic traffic. Search engines can take 2-4 weeks to fully crawl a new site and won’t surface these issues until then. As a result, check for duplicate content 2-4 weeks after launching the new site.

Do a Google site search for your website. Note the number that appears below the search bar in the section labeled “About ____ results”.


Now scroll to the bottom of the search engine results page (SERP) and skip to the last page of results. Do you see a message stating that Google has omitted some entries very similar to the results already displayed?

If so, take the number of results that Google says it has already displayed. Subtract that from the “About ____ results” count in the original search. The difference is the number of duplicate content issues being flagged by Google.

Not sure how to fix this? Schedule a call and I’ll walk you through how to get this resolved.

4. WWW resolve issues

Type the www version of your domain into your browser. Now type the non-www version.

Does one version of the site redirect to the other? If not, update your .htaccess file so that one of the two versions redirects to the other.
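On an Apache server, forcing the non-www version to redirect to the www version looks roughly like this (example.com is a placeholder, and the reverse direction works just as well):

```apache
RewriteEngine On
# If the request came in without the www prefix...
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
# ...send a permanent redirect to the www version of the same path
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```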

9. Set up Search Console

Once you’ve resolved the above issues, the final step is to get the new version of the site indexed.

Go to Google Search Console (formerly Google Webmaster Tools) and verify ownership of your site. From there, our team handles a dozen valuable SEO-related items, including:

  • Setting the preferred version of the website
  • Checking for security issues
  • Evaluating mobile usability
  • And more

However, the only essential item to take care of is submitting your sitemap to Google.

You can do this by selecting “Crawl”, and then selecting “Sitemaps” from the dropdown list.


Select “ADD/TEST SITEMAP” in the top right corner and add the relative URL of your sitemap file(s).

And just like that, you’re good to go!

So to recap everything above, here are the biggest SEO to-dos for a dev team during a website migration:

  1. Scrape all existing site data
  2. Discourage search engines from crawling your staging site
  3. Compare the existing vs. new page hierarchy
  4. Set up redirects on the staging site
  5. Set up your sitemap
  6. Launch the new site
  7. Update robots.txt and meta robots
  8. Scan the newly launched site for SEO errors
  9. Set up Search Console

Still have questions or concerns about the SEO to-dos for a website migration? Comment below or reach out on Twitter and we’ll do our best to help.

In the meantime, sign up for our newsletter to be the first to read our insights.

