
Migration Best Practices - SMX London 2018

My talk from SMX 2018 in London covering best practices on how to successfully navigate through the various types of migrations (protocol migrations, frontend migrations, etc.) from an SEO perspective – mainly focusing on all things tech.

Published in: Technology


  1. @basgr from @peakaceag #SMX #29A – Bastian Grimm, Peak Ace AG | @basgr – Migration Best Practices: successfully relaunching your website >> the tech edition
  2. Never change a running system!
  3. Every (big) change brings (loads of) opportunity! But always keep in mind: the price tag for failure is immense! Risk vs reward:
     - User acceptance vs a "once in a lifetime" opportunity
     - Project complexity vs a greenfield project: question everything
     - Resilience/freeze vs the best chance to really "get shit done"
     - Interruption vs the opportunity to eliminate "legacy problems"
     - Politics vs usually being more agile than in regular, daily business
     - Performance vs the chance to rethink RWD/dynamic serving, HTTPS, URL design, etc.
  4. Migration types and their potential impact on SEO. Often these types overlap, or multiple things are done at once. Inspired by @jonoalderson:
     - Hosting migrations: you're changing hosting or CDN provider(s); changing, adding, or removing server locations; changing your tech stack/caching/load balancers.
     - Software migrations: you're changing CMS (or its version/plugins, etc.); changing the language used to render the website; merging platforms, e.g., a blog which operated on a separate domain.
     - Domain migrations: you're changing the main domain of your website; buying/adding new domains/subdomains to your ecosystem; moving a website, or part of a website, between domains.
     - Template migrations: you're changing the layout/structure/navigation of important pages; adding or removing template components; changing elements in your code, like title, canonical, or hreflang tags.
     … and there are more: design migrations, strategy migrations, content migrations, protocol migrations, etc.
  5. Make it a granular, multi-step approach. Doing everything at once will make debugging & rolling back an almost impossible task! Source:
  6. What is your goal for the migration? The right mindset is super, super important! "I want to lose as little as possible!" isn't really the right goal for a migration!
  7. Be crazy about details: thorough documentation, in-depth definition of requirements and ongoing testing are essential!
  8. Ask for help… from someone who has successfully done this type of work before.
  9. Pre-migration site health check & clean-up. A properly optimised domain migrates more easily and efficiently; getting your house in order before the move minimises the risk of losing rankings.
     - Google Search Console: manual actions, server errors (DNS, 5XX response codes), mark-up validation errors (AMP, rich cards), robots.txt
     - Web crawl: internal redirects as well as redirect chains, broken URLs, and internal links
     - Log files: broken URLs, suspicious status codes, crawler traps
     - Algo issues? Relocating with Panda, Penguin & Co. makes very little sense
  10. Be careful with broken pages & broken internal linking
  11. You will need an arsenal of tools! A lot depends on personal taste, but you'll need at least one tool each for crawling, log file analysis and search intelligence – and yes, this costs money!
  12. Get access to server logs: figure out well in advance how to get access to all relevant server access logs, e.g., native access or using a SaaS solution.
  13. Gather all URLs including static assets #1. Must-haves: log files, XML sitemaps as well as a full website crawl. Extras: analytics (top ranking URLs and/or URLs generating the most traffic). In your crawler: Mode > List > Upload.
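Merging those URL sources can be scripted. A minimal sketch (hostname and log format are assumptions; real sitemaps and logs vary) that pulls URLs out of an XML sitemap and a combined-format access log, then deduplicates while keeping first-seen order:

```python
import re
import xml.etree.ElementTree as ET

def urls_from_sitemap(xml_text):
    """Extract all <loc> entries from an XML sitemap."""
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.findall(".//sm:loc", ns)]

def urls_from_log(log_lines, host="https://www.example.com"):
    """Extract requested paths from combined-format access log lines."""
    pattern = re.compile(r'"(?:GET|POST|HEAD) (\S+) HTTP/')
    return [host + m.group(1) for line in log_lines
            if (m := pattern.search(line))]

def merge_url_sets(*sources):
    """Deduplicate URLs from all sources, preserving first-seen order."""
    seen = {}
    for source in sources:
        for url in source:
            seen.setdefault(url, True)
    return list(seen)
```

Feed the merged list into your crawler's list mode as the master inventory of URLs to redirect and re-test.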
  14. Gather all URLs including static assets #2. Additionally: get the URLs that are strongly linked, bring a lot of traffic and/or have been shared the most, etc.
  15. Switch to monitoring your keyword rankings daily. Check critical keywords daily to ensure you are only working with the most recent data.
  16. Establish a status-quo performance benchmark. Tools such as Lighthouse (via Chrome DevTools) provide the relevant metrics. Important: don't just test the homepage, but also category/product pages.
  17. Prepare URL redirect mapping. As a rule of thumb: 1-to-1 redirects from old to new!
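A 1-to-1 mapping is easiest to maintain as a simple two-column CSV that the server config is generated from. A hedged sketch (the CSV layout and the use of Apache's `Redirect 301` directive are my assumptions, not the speaker's setup):

```python
import csv
import io

def build_redirect_rules(mapping_csv):
    """Turn an 'old-path,new-URL' CSV into Apache 'Redirect 301' lines.

    Expects one old path and one absolute new URL per row, mirroring
    the 1-to-1 old-to-new mapping recommended for migrations.
    """
    rules = []
    for old_path, new_url in csv.reader(io.StringIO(mapping_csv)):
        rules.append(f"Redirect 301 {old_path.strip()} {new_url.strip()}")
    return rules
```

Generating the rules from the mapping file keeps one source of truth: the same CSV can later drive the post-launch verification crawl.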
  18. Provide a staging/test server. Make sure the server is locked down properly to ensure your content doesn't get indexed in advance (i.a. duplicate content problems). Methodologies, with pros (+) and cons (-):
     - noindex (meta tag/header): + external tools can access without separate access rules; + URLs are definitely not indexed; - indexing rules cannot fully be tested (all noindex); - waste of crawl budget
     - robots.txt: + external tools can access without separate access rules; + no crawl budget is wasted; - indexing rules cannot fully be tested (only with robots.txt override); - if linked, test URLs may appear in the index (without title/metas)
     - password secured (.htaccess): + no crawl budget is wasted; + URLs are definitely not indexed; + everything can be tested properly; - external tools must be able to handle password authentication
     - IP-based access: + no crawl budget is wasted; + URLs are definitely not indexed; + everything can be tested properly; - external tools must be able to handle IP-based authentication
     - VPN: + absolutely safe; - so safe, only a few tools can handle it!
  19. Build your very own migration Q&A checklist. Depending on what type of changes you're undergoing, this needs to be adapted. Use Aleyda's template for more inspiration. Each item below carries the same metadata (Stage: Pre-migration, Who? Bastian, When? 23.05.18):
     - Canonical tags & other rel-alternate annotations: remember to annotate your dedicated mobile site and to adapt your RSS feeds
     - Multilingual setup: customise hreflang target URLs. Keep in mind: various locations can be affected (e.g. head section, server headers, XML sitemaps)
     - Update rel=next/prev pagination annotations: are you using Google's recommendation for pagination?
     - Update XML sitemaps: the sitemap index file needs to be changed as well if you reference it in robots.txt
     - Structured data update: update your mark-up references. Short annotations like "//" don't validate!
     - Update CDN settings and resource hints: update requests for assets to CDNs & any resource hints (preconnect, dns-prefetch)
     - Update HTTP headers & customise cookie settings: if applicable, customise X-Robots-Tag headers. Use Chrome DevTools!
  20. Update internal links. Simply relying on redirects is no migration strategy. In the HTML source code: links to other internal URLs, internal image files, internal video files, internal web fonts, internal JavaScript files, and internal CSS files.
  21. Update internal links (within JavaScript files). Simply relying on redirects is no migration strategy: links to other internal URLs, internal image files, and internal CSS files.
  22. Update internal links (within CSS files). Simply relying on redirects is no migration strategy: links to internal image files, internal web fonts, and any other internal links.
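Hunting down leftover references to the old hostname across HTML, JS, and CSS is a good candidate for a quick script. A minimal sketch (hostnames are placeholders; the regex is a pragmatic approximation, not a full URL parser):

```python
import re

def find_old_references(text, old_hosts):
    """Return every absolute URL in `text` (HTML, JS, or CSS) that
    still points at one of the old hostnames. Catches href/src
    attributes, string literals in JS, and url(...) values in CSS."""
    host_pattern = "|".join(re.escape(h) for h in old_hosts)
    return re.findall(rf'https?://(?:{host_pattern})[^\s"\'()<>]*', text)
```

Run it over every template, bundled JS file, and stylesheet on staging; any hit is an internal link that would otherwise lean on a redirect after launch.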
  23. Be careful with internal redirects! Avoid redirect chains: old URLs should lead directly to the corresponding new URL. Source: Redirect Chain Report via DeepCrawl
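Chain detection can be reduced to following `Location` headers hop by hop. A sketch with an injectable `fetch` callable (so the logic is testable without live HTTP; in real use, `fetch` would issue a request with redirects disabled and return the status code and `Location` header):

```python
def trace_redirects(url, fetch, max_hops=10):
    """Follow redirects hop by hop and return the full chain.

    `fetch` is any callable returning (status_code, location) for a
    URL, which keeps this logic independent of the HTTP client used.
    """
    chain = [url]
    while len(chain) <= max_hops:
        status, location = fetch(chain[-1])
        if status not in (301, 302, 307, 308) or not location:
            break
        chain.append(location)
    return chain

def is_chain(chain):
    """More than one hop between the old and the final URL is a chain."""
    return len(chain) > 2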
  24. Everything updated? Better safe than sorry: let's test crawl the staging server and run a comparison.
  25. Work your search console
  26. Prerequisites for a migration:
     - No manual actions
     - No significant amount of crawl errors (DNS, availability)
     - No problems with the XML sitemaps
     - Valid structured data/rich cards mark-up, no AMP or hreflang errors
     Create new properties & eventually merge them into a set. Properties for HTTP and HTTPS, with and without www = 4 domains; if you use separate mobile domains, there are 6 domains in total!
  27. Transfer the disavow links file in time. Especially for domains with a "questionable" link profile: GSC setup and disavow file transfer should be done approximately 48 hours before going live!
  28. Set your preferred domain and (if necessary) crawl rate. You don't need to set the crawl rate but, if you have done so previously, I would suggest transferring the original settings (for now).
  29. Configure geo-targeting for gTLDs (e.g. for ".com" gTLDs). Synchronise settings for handling URL parameters if necessary.
  30. 301 redirect all the things! After all this preparation, all you need to do now is to "just" redirect URLs.
  31. Even if, as some Googlers say, all redirects are equal… all we did for this client was change the (chained) 302/307 to 301 redirects!
  32. Use the GSC "site move" feature. Note: it only works for domain migrations!
  33. Before you send to the index: test GSC's fetch & render. Make sure Google includes and displays all requested components properly!
  34. Test and re-submit all XML sitemaps. Also: synchronise URL parameter settings if you were using them or if you need them for your site's functionality.
  35. List crawl of old URLs & manual SERP checks. Import old URLs, e.g., into ScreamingFrog (list mode); only 301s should appear here. Additionally, check indexed URLs manually, e.g., via the LinkClump add-on in Chrome.
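That list crawl boils down to one assertion per old URL: a single 301 pointing exactly at the mapped target. A sketch of the check, again with an injectable `fetch` (returning status code and `Location` header without following redirects) so it stays client-agnostic:

```python
def verify_redirect_mapping(mapping, fetch):
    """Check that every old URL answers with a single 301 directly to
    its mapped new URL. Returns a list of (old_url, problem) tuples;
    an empty list means the mapping is implemented correctly."""
    problems = []
    for old_url, expected in mapping.items():
        status, location = fetch(old_url)
        if status != 301:
            problems.append((old_url, f"expected 301, got {status}"))
        elif location != expected:
            problems.append((old_url, f"redirects to {location}, expected {expected}"))
    return problems
```

Run it against the same old-to-new mapping you prepared before launch, so the verification and the implementation share one source of truth.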
  36. Seriously: check your redirects. The most common problem with migrations gone wrong? Missing or wrong redirects!
  37. More post-migration to-dos
  38. Log file, GSC & GA error monitoring. Focus on 4XX and 5XX status codes first; tackle those "live"!
  39. Migration performance monitoring #1: HTTP error status code monitoring (e.g., 40X for Googlebot & Bingbot)
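If you don't have a log pipeline in place yet, a small script gets you the per-bot error counts. A minimal sketch assuming combined log format and naive user-agent substring matching (production monitoring should verify bots by reverse DNS, which this deliberately skips):

```python
import re
from collections import Counter

# Combined log format: "METHOD path HTTP/x" status bytes "referrer" "agent"
LOG_RE = re.compile(
    r'"(?:GET|POST|HEAD) (\S+) HTTP/[^"]*" (\d{3}) \S+ "[^"]*" "([^"]*)"')

def bot_error_counts(log_lines, bots=("Googlebot", "bingbot")):
    """Count 4XX/5XX responses per (bot, status) from access log lines."""
    counts = Counter()
    for line in log_lines:
        m = LOG_RE.search(line)
        if not m:
            continue
        status, agent = int(m.group(2)), m.group(3)
        for bot in bots:
            if bot.lower() in agent.lower() and status >= 400:
                counts[(bot, status)] += 1
    return counts
```

A daily run over the post-launch logs surfaces exactly the 40X-for-Googlebot spikes this slide is about.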
  40. Migration performance monitoring #2: HTTP redirects over time, split by user agents containing *bot*
  41. Migration performance monitoring #3: top crawled pages breakdown (daily & weekly)
  42. Prefer an easier way? ScreamingFrog Log File Analyzer
  43. Complete crawl of the new domain. Re-test: no broken URLs (4XX/5XX), tracking in place, correct metadata & other tags, indexing rules, etc.
  44. Test and apply "if… then…" logic. Based on your previously built Q&A checklist, make sure to double-check everything!
     - IF Google News listings are available, THEN test the Google News sitemap.
     - IF you work internationally, THEN test hreflang tags/sitemap annotations.
     - IF ratings (featured snippets) are available, THEN test mark-up.
     - IF PPC ads are shown, THEN update and test the landing page URLs.
     - IF SSL is also used on other subdomains, THEN test these as well.
     - etc.
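That if/then checklist maps naturally onto a small data-driven runner: conditions are facts about the site, and only the applicable tests are executed. A hypothetical sketch (the fact keys and test callables are placeholders, not part of the talk):

```python
def run_checklist(site_facts, checks):
    """Run only the checks whose condition applies to this site.

    `site_facts` maps condition names to booleans; `checks` maps the
    same names to zero-argument test callables returning True on pass.
    """
    results = {}
    for condition, test in checks.items():
        if site_facts.get(condition):
            results[condition] = test()
    return results
```

Keeping conditions as data means the same runner works for every migration; only the fact sheet and the tests change per project.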
  45. AMP, structured data, and rich cards error monitoring
  46. Keep an eye on the crawl frequency. Short-term peaks are completely normal; dramatic drops are not!
  47. Compare performance side by side. HTTPS is usually a little bit slower: compare your results. Clear goal: never slower than before (always use HTTP/2 with HTTPS). Try it out:
  48. Some tips to make your life easier
  49. Bulk testing all the things #1: mobile-friendliness at scale. Check it out:
  50. Bulk testing all the things #2: hreflang tags (in sitemaps) at scale. Check it out:
  51. Bulk testing all the things #3: redirects & other headers. HTTP status codes (errors, redirects, etc.) at scale. Check it out:
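Checking status codes at scale is mostly a matter of parallelising requests. A minimal sketch using a thread pool, with the actual HTTP call injected as a `fetch` callable (in real use it would issue a HEAD request per URL and return the status code):

```python
from concurrent.futures import ThreadPoolExecutor

def bulk_status(urls, fetch, workers=10):
    """Fetch status codes for many URLs concurrently.

    `fetch` is a callable returning the HTTP status code for one URL;
    keeping it injectable makes the concurrency logic trivially
    testable without network access.
    """
    with ThreadPoolExecutor(max_workers=workers) as pool:
        codes = list(pool.map(fetch, urls))
    return dict(zip(urls, codes))
```

Mind your own server: throttle `workers` so the bulk check doesn't look like an attack on the freshly migrated site.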
  52. Simulate Googlebot for smartphones with JS rendering. ScreamingFrog can do that easily at scale; pay close attention to the rendered output! Also pretty cool: Extract > XPath > //head/link[@rel="amphtml"]/@href
  53. Don't forget to redirect your images as well! When changing URLs/domains, make sure to implement redirect rules for images. Read the entire post:
  54. Move your robots.txt file. When changing domains, make sure to transfer (the contents of) robots.txt!
  55. HTTP 503 is your friend. Combine it with a "Retry-After" header to throttle crawling; never use "noindex"/4XX instead! "Webmasters should return a 503 HTTP header for all the URLs participating in the blackout […] Googlebot's crawling rate will drop when it sees a spike in 503 […] as Googlebot is currently configured, it will halt all crawling of the site if the site's robots.txt file returns a 503 status code for robots.txt." Source:
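During a planned blackout, every URL should answer 503 with a `Retry-After` header. A minimal WSGI sketch of such a maintenance responder (the retry window and body text are placeholders; in practice this would sit behind, or replace, your normal app for the blackout window):

```python
def maintenance_app(environ, start_response):
    """Minimal WSGI app answering every request with 503 plus a
    Retry-After header, signalling crawlers to back off and retry."""
    start_response("503 Service Unavailable", [
        ("Retry-After", "3600"),  # seconds until crawlers should retry
        ("Content-Type", "text/plain; charset=utf-8"),
    ])
    return [b"Down for scheduled maintenance, back shortly."]
```

It can be served with the standard library's `wsgiref.simple_server` for the duration of the blackout; as the quoted guidance notes, this should cover robots.txt too, since a 503 there halts crawling of the whole site.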
  56. GSC's DNS verification can be pretty helpful. No need to worry about missing meta tags; plus you can verify before deploying a site/frontend to a new domain – and it's faster as well! More:
  57. Fix those redirect chains, especially on legacy sites… as multiple requests waste valuable performance and crawl budget!
  58. Careful: JavaScript frameworks are still tricky. Check out Bartosz's massive research on crawlability and indexability! Read more:
  59. Always looking for talent! Bastian Grimm. Liked the deck? Here you go: