
SMX East 2017 - Max Prin - JavaScript & PWAs - What SEOs Need To Know


  1. JavaScript & Progressive Web Apps (PWAs): What SEOs Need To Know (#SMX #32A @maxxeight)
  2. (image-only slide)
  3. What’s a Web App?
  4. What’s a Web App? Traditional page lifecycle: an initial GET request returns HTML, and each POST request returns a new HTML page. Web application lifecycle: the initial GET request returns HTML (the app shell), and subsequent AJAX calls return JSON, HTML, etc.
  5. What’s a Progressive Web App? Native apps vs. web apps
  6. What’s a Progressive Web App? Native apps vs. web apps
  7. What’s a Progressive Web App? Reliable & fast: the app shell is cached locally on first load, so the app loads fast on subsequent visits even offline or on a slow connection; mobile-friendly (responsive); secure (HTTPS). Engaging: a bookmark (icon) on the device’s home screen; push notifications. (A service worker sketch follows the slide list.)
  8. What’s a Progressive Web App?
  9. WHAT ABOUT ACCESSIBILITY FOR SEARCH ENGINE BOTS?
  10. What’s a Progressive Web App? Native apps vs. web apps
  11. How Search Engines Typically Work (pipeline diagram, including a Render step)
  12. Web Apps (SPAs, PWAs). Issues for all crawlers: potentially a single URL for all content (or non-crawlable URLs), and a single HTML document (the “app shell”) with the same <head> section (title, meta and link tags, etc.) for every view. Issues for crawlers other than Google (and Baidu): client-side rendering of content (HTML source code vs. the DOM).
  13. Making Sure Search Engines Can Understand Your Pages. Crawling: 1 unique “clean” URL per piece of content (and vice-versa).
  14. Crawling: Provide “Clean”/Crawlable URLs (a pushState sketch follows the slide list).
     – Fragment identifier (example.com/#url): not supported; ignored, so the URL is treated as example.com.
     – Hashbang (example.com/#!url, the “pretty URL”): Google and Bing will request example.com/?_escaped_fragment_=url (the “ugly URL”), which should return an HTML snapshot.
     – Clean URL (example.com/url): leverages the pushState function from the History API; must return a 200 status code when loaded directly.
  15. Making Sure Search Engines Can Understand Your Pages. Crawling: 1 unique “clean” URL per piece of content (and vice-versa); onclick + window.location ≠ <a href="link.html"> (a crawlable-link sketch follows the slide list).
  16. Making Sure Search Engines Can Understand Your Pages. Crawling: 1 unique “clean” URL per piece of content (and vice-versa); onclick + window.location ≠ <a href="link.html">. Rendering: don’t block JavaScript resources via robots.txt.
  17. Making Sure Search Engines Can Understand Your Pages. Crawling: 1 unique “clean” URL per piece of content (and vice-versa); onclick + window.location ≠ <a href="link.html">. Rendering: don’t block JavaScript resources via robots.txt; load content automatically, not based on user interaction (click, mouseover, scroll).
  18. Rendering: Load Content Automatically (a fetch-on-load sketch follows the slide list).
  19. Making Sure Search Engines Can Understand Your Pages. Crawling: 1 unique “clean” URL per piece of content (and vice-versa); onclick + window.location ≠ <a href="link.html">. Rendering: don’t block JavaScript resources via robots.txt; load content automatically, not based on user interaction (click, mouseover, scroll); for Bing and other crawlers: HTML snapshots.
  20. Making Sure Search Engines Can Understand Your Pages. Crawling: 1 unique “clean” URL per piece of content (and vice-versa); onclick + window.location ≠ <a href="link.html">. Rendering: don’t block JavaScript resources via robots.txt; load content automatically, not based on user interaction (click, mouseover, scroll); for Bing and other crawlers: HTML snapshots. Indexing: avoid duplicate <head> section elements (title, meta description, etc.) (a head-update sketch follows the slide list).
  21. Main content gets rendered here; same title, description, canonical tag, etc. for every URL.
  22. Tools
  23. (image-only slide)
  24. (image-only slide)
  25. (image-only slide)
  26. SEO Crawlers Rendering Web Pages: Merkle’s proprietary crawler
  27. LEARN MORE: UPCOMING @SMX EVENTS. THANK YOU! SEE YOU AT THE NEXT #SMX
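
The app-shell caching behavior from slide 7 can be sketched with a service worker. A minimal example, assuming illustrative file names and an illustrative cache name; the real asset list depends on the app:

```javascript
// sw.js: cache the app shell on first load, serve it from cache afterwards
const CACHE_NAME = 'app-shell-v1';               // illustrative cache name
const APP_SHELL = ['/', '/app.css', '/app.js'];  // illustrative shell assets

// First load: cache the app shell locally
self.addEventListener('install', (event) => {
  event.waitUntil(
    caches.open(CACHE_NAME).then((cache) => cache.addAll(APP_SHELL))
  );
});

// Subsequent loads: serve cached responses first, fall back to the network
self.addEventListener('fetch', (event) => {
  event.respondWith(
    caches.match(event.request).then((cached) => cached || fetch(event.request))
  );
});
```

The page registers it once with navigator.serviceWorker.register('/sw.js').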
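Slide 14’s “clean URL” option relies on the History API. A minimal sketch, assuming a hypothetical loadContent helper, a #content container in the app shell, and a server that also answers each URL directly with a 200:

```javascript
// Navigate to a piece of content under a clean, crawlable URL
function navigateTo(url) {
  history.pushState({ url }, '', url); // update the address bar without a full reload
  loadContent(url);
}

// Re-render when the user hits back/forward
window.addEventListener('popstate', (event) => {
  if (event.state && event.state.url) loadContent(event.state.url);
});

// Hypothetical loader: the server must also return this URL directly with a 200
function loadContent(url) {
  fetch(url, { headers: { Accept: 'application/json' } })
    .then((res) => res.json())
    .then((data) => {
      // Illustrative rendering into the app shell's content container
      document.querySelector('#content').innerHTML = data.html;
    });
}
```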
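For slide 15’s point that onclick + window.location is not a crawlable link: the usual pattern is a real <a href> that search engines can follow, progressively enhanced with a click handler. A sketch, reusing the navigateTo function from the pushState example above (the js-link class is illustrative):

```javascript
// Not crawlable: JavaScript-only navigation, no href for bots to follow
//   <span onclick="window.location = '/products/42'">Product 42</span>
// Crawlable: a real anchor, progressively enhanced
//   <a href="/products/42" class="js-link">Product 42</a>
document.addEventListener('click', (event) => {
  const link = event.target.closest('a.js-link');
  if (!link) return;
  event.preventDefault();                 // skip the full page reload
  navigateTo(link.getAttribute('href'));  // client-side route via pushState (above)
});
```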
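Slide 18’s “load content automatically” advice, sketched: fetch the main content on page load rather than inside a click, mouseover, or scroll handler that a crawler will never trigger. The endpoint and response field are illustrative:

```javascript
// Fetch the main content automatically on load; content loaded only after a
// click, mouseover, or scroll is generally invisible to rendering crawlers.
document.addEventListener('DOMContentLoaded', () => {
  fetch('/api/article/42')  // illustrative endpoint
    .then((res) => res.json())
    .then((article) => {
      document.querySelector('#content').innerHTML = article.bodyHtml; // illustrative field
    });
});
```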
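For the duplicate <head> issue on slides 20-21, a sketch that updates the title, meta description, and canonical tag on each route change. It assumes those tags already exist in the app shell; the values shown are illustrative:

```javascript
// Keep <head> elements unique per URL instead of shipping the same
// app-shell title/description/canonical for every view
function updateHead({ title, description, url }) {
  document.title = title;
  document.querySelector('meta[name="description"]').setAttribute('content', description);
  document.querySelector('link[rel="canonical"]').setAttribute('href', url);
}

// Example call from the router, after the new content renders
updateHead({
  title: 'Product 42 | Example Store',               // illustrative values
  description: 'Details and specs for product 42.',
  url: 'https://example.com/products/42',
});
```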
