Site Migration Tool Set

The Site Migration Suite is a set of tools and best practices developed to make our consulting projects as successful as possible.  When a site is refreshed or migrated, there are four common reasons search performance fails after the update:

  • SEO best practices not integrated into the updated site
  • Content not migrated effectively
  • Redirects not identified and implemented correctly
  • New pages and content not yet reindexed by search engines

Site Migration & Refresh SEO Playbook

This is a comprehensive guide on where and how to integrate SEO into the site refresh and migration process.  If SEO best practices are integrated before and during the build, you are at least 90% guaranteed to have a well-performing site post-launch.  If they are not integrated, the risk of failure is nearly 100%.

Keep and Delete Tool

One of the biggest challenges of a site migration and rebuild is determining which content to migrate and which to remove.  Too often these decisions are made without any data on the impact of removing content.

For example, a Fortune 100 company was recently convinced by their development vendor that they should reduce the content on their site, on the theory that internet users prefer to “graze content” and do not like to read.  This meant that the dozens of pages the site used to inform and nurture visitors through a complex buying process were now considered extraneous.  We were able to use the Keep & Delete Tool to demonstrate how valuable these pages were.  Unfortunately, cool design won the argument and the client lost over 60% of their traffic.  With the Keep and Delete model we could immediately pinpoint the most critical losses and restore that content, recovering some of the traffic.

The Keep or Delete Tool takes input from a number of sources, processes it, and allows the user to weight each element based on their own preferences.  The key elements used are:

  • Current page views of the page
  • Current traffic to the page
  • Current organic search rankings
  • Quantity and quality of external links to each page

Each of these elements demonstrates the value of the page and its content.  Pages with high page views, high rankings, and many quality backlinks are critical to migrate to the new site.  Failure to identify and migrate these pages will result in significant traffic decreases.
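As an illustration, the weighting step described above can be sketched as a simple scoring model.  The metric names, weights, and threshold below are illustrative assumptions, not the tool's actual implementation:

```python
# Hypothetical sketch of the Keep & Delete weighting step.  Metric
# names ("page_views", "visits", "search_clicks", "link_score") and
# the weights/threshold are assumptions; metrics are taken to be
# normalized to a 0-1 scale before scoring.

def page_score(page, weights):
    """Weighted sum of a page's metrics."""
    return sum(weights[metric] * page.get(metric, 0) for metric in weights)

def keep_or_delete(pages, weights, threshold):
    """Split pages into keep/delete lists by weighted score."""
    keep, delete = [], []
    for page in pages:
        target = keep if page_score(page, weights) >= threshold else delete
        target.append(page["url"])
    return keep, delete

# Example: the user weights organic search and backlinks most heavily.
weights = {"page_views": 0.2, "visits": 0.2, "search_clicks": 0.3, "link_score": 0.3}
pages = [
    {"url": "/buyers-guide", "page_views": 0.9, "visits": 0.8,
     "search_clicks": 0.7, "link_score": 0.9},
    {"url": "/old-press-release", "page_views": 0.1, "visits": 0.0,
     "search_clicks": 0.0, "link_score": 0.1},
]
keep, delete = keep_or_delete(pages, weights, threshold=0.5)
# The nurturing page scores 0.82 and is kept; the stale page scores 0.05.
```

Letting the user adjust the weights is what makes the model useful in an argument like the one above: the client can see how the ranking changes under their own priorities.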

Redirect Monitoring and Validation

This tool functions as its name suggests, but far better than tools that check a single URL or merely flag a redirect during a crawl.  There are two key functional uses:

  • Redirect Confirmation for Site Refresh and Migrations
  • Multi-Hop Bulk Redirect Checking

Redirect Confirmation – during a site migration to HTTPS or a major refresh, the Search team will identify a list of URLs and their destinations and provide them to the Dev team for implementation.  In my experience this process fails 100% of the time, with one or, more typically, many redirects not working correctly.

The Redirect Confirmation functionality allows the user to import their redirect matrix via Excel or CSV.  This simple file is the same one given to the Dev team, containing the old URL, the desired header response, and of course the destination URL.

The tool requests each of the original URLs and records the header status.  If the header status does not match what was provided, a failure is recorded.  If the header is correct and was set to a 301 redirect, the tool captures the destination URL and confirms that it is the intended destination page.  If not, it records a failure; if the actual destination is itself a redirect, the tool follows the chain until it reaches a terminal status of 200, 400, or 500.  The user can also check for a 410 “Gone” status if they intend to remove the page.
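The check just described might be sketched as follows.  The fetch function is injected so the chain-following logic can be shown and exercised without live HTTP; the function and names are assumptions, not the tool's own code:

```python
# Hypothetical sketch of one redirect-matrix check.  fetch(url) must
# return (status_code, location_or_None) WITHOUT following redirects
# itself (e.g. a wrapper around requests.head(url, allow_redirects=False)).

REDIRECT_CODES = (301, 302, 307, 308)

def confirm_redirect(old_url, expected_status, expected_dest, fetch, max_hops=10):
    status, location = fetch(old_url)
    if status != expected_status:
        return ("FAIL", f"expected {expected_status}, got {status}")
    # Follow any chained redirects to a terminal status (200 / 4xx / 5xx).
    final_url, hops = location, 0
    while final_url is not None and hops < max_hops:
        status, next_loc = fetch(final_url)
        if status in REDIRECT_CODES and next_loc:
            final_url, hops = next_loc, hops + 1
        else:
            break
    if final_url == expected_dest:
        return ("PASS", final_url)
    return ("FAIL", f"chain ended at {final_url}")

# Offline example: a stubbed fetch standing in for live requests.
pages = {
    "http://old.example/a": (301, "https://new.example/a"),
    "https://new.example/a": (200, None),
}
result = confirm_redirect("http://old.example/a", 301,
                          "https://new.example/a", pages.get)
```

A 410 check falls out of the same logic: with no Location header to follow, the row passes only if the recorded status matches the expected 410.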

Bulk Redirect Checker – large and old sites have a bad habit of accumulating chained redirects from migrations, rebuilds, or acquisitions, as old redirects are themselves redirected.  Working with large companies, it is not uncommon to see 3 to 6 chained redirects.  Unfortunately, no commercial tool allows you to monitor this for a large number of URLs.

To use this tool, the user loads a list of URLs or an XML sitemap, and the header status is checked for each one.  The tool records the header status and the destination URL for every hop it encounters, following any chained redirects until it reaches a terminal status of 200, 400, or 500.

Once loaded, the user can rerun these reports to confirm that any errors were corrected, without having to reload the original source.
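The per-URL chain trace might look like this sketch, again with fetch injected so it can run offline; the names are illustrative assumptions:

```python
# Hypothetical sketch of tracing one redirect chain.  fetch(url)
# returns (status_code, location_or_None) without following
# redirects itself; tracing stops at a terminal status or max_hops.

REDIRECT_CODES = (301, 302, 307, 308)

def trace_chain(url, fetch, max_hops=10):
    """Follow a redirect chain, recording (url, status) for each hop."""
    hops = []
    while len(hops) <= max_hops:
        status, location = fetch(url)
        hops.append((url, status))
        if status in REDIRECT_CODES and location:
            url = location
        else:
            break
    return hops

# Example: a chain left over from an HTTPS migration and a rebuild.
site = {
    "http://example.com/p": (301, "https://example.com/p"),
    "https://example.com/p": (301, "https://example.com/products"),
    "https://example.com/products": (200, None),
}
chain = trace_chain("http://example.com/p", site.get)
# len(chain) - 1 gives the number of redirect hops to flag for cleanup.
```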

 

Search Engine Index Monitoring

Once the new site is live, Search Engine Index Monitor is used to monitor the inclusion of new and updated pages in Google.  A user can reference an XML sitemap or a list of URLs for the previous site and the new site, and the tool uses the info: syntax to check whether each URL has been included in Google’s index.  It also captures the cache date so you know when the pages were indexed.  The user can see both indexed and non-indexed URLs.  For any URL that is not indexed, they can use Fetch as Googlebot to request indexing.

By executing the crawl on a sample of pages daily, or the entire site weekly, you can understand the visit and refresh rate of your content in Google.  This can help predict when SEO changes to the pages might take effect.

Even if you are not refreshing or migrating a site, the tool is very effective at showing you which of the URLs in your XML sitemap are not indexed.  Currently Google tells you how many URLs are on the sitemap and how many it has indexed, but gives you no easy way to know which pages are missing.  If a page is not indexed it cannot rank and drive traffic.  Knowing which pages are not indexed allows you to take corrective action.
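The sitemap comparison can be sketched with the standard library.  Here the set of indexed URLs is supplied by hand, standing in for the tool's per-URL index lookups:

```python
# Hypothetical sketch: diff an XML sitemap against a set of URLs
# known to be indexed.  The indexed set is assumed input; in the
# tool it would come from the per-URL index checks.

import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Extract every <loc> URL from a sitemap document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

def missing_from_index(sitemap_xml, indexed_urls):
    """Return sitemap URLs absent from the indexed set, in sitemap order."""
    indexed = set(indexed_urls)
    return [url for url in sitemap_urls(sitemap_xml) if url not in indexed]

sitemap = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/pricing</loc></url>
</urlset>"""
missing = missing_from_index(sitemap, ["https://example.com/"])
```

Each URL in the resulting list is a candidate for corrective action such as a Fetch as Googlebot submission.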