SEO Director at Pepperjam | eCommerce Organic Search Marketing Consultant at Pepperjam
Jul 19, 2016 • 0 likes • 758 views
SEO 101 - Google Search Console Explained
In our third SEO 101 lesson, we discussed how to use Google Search Console to earn more organic traffic. During this webinar we gave a high-level overview of each section, going into more depth on Search Analytics.
3. Quick Recap From Lesson 1 and Lesson 2
Win at SEO by focusing on your users and the action you want them to take.
If you build it they will come. If you build it slow, they won’t stay.
Watch Lesson 1: https://youtu.be/NjsRQwFOvlQ
Watch Lesson 2: https://youtu.be/l8lc2iPqeR4
4. Why Use Google Search Console?
1. Monitor your site's performance in Google Search results.
2. Ensure Google can access your content.
3. Submit new content for crawling and remove content.
4. Maintain your site with minimal disruption to search performance.
5. Monitor for errors and resolve malware or spam issues so your site stays clean.
6. Discover how Google Search (and the world) sees your site:
7. Learn which queries caused your site to appear in search.
8. Are your product prices, company contact info, or events highlighted in rich search results?
9. Which sites are linking to your website?
10. Is your mobile site performing well for visitors searching on mobile devices?
5. How To Access Google Search Console
Click Add property
Follow instructions for verification
Tip – Check out the Alternate Methods tab. If you have the new version of Google Analytics installed on your site, that's enough to verify your site.
6. The Side Navigation
For today we are going to discuss the Search Traffic and Crawl sections of Google Search Console.
Search Traffic – This is where you can spend time on meaningful reporting.
Crawl – Monitor and configure how Google will crawl your site.
8. Search Analytics
Get there: Search Traffic > Search Analytics
This is where you can view which search queries or landing pages are bringing you traffic and how you can optimize them for more. Using the filters is how you narrow in on the data that shows where you should optimize.
Clicks – Traffic metric for how many clicks you’re getting
from organic search.
Impressions – How many times did one of your organic
results display on someone’s screen?
CTR – Click-through rate is the click count divided by the impression count.
Position – The average position of the topmost result from your site. So, for example, if your site has three results at positions 2, 4, and 6, the position is reported as 2.
9. Using Search Analytics Filters
In this example Reef is the brand name, so I wanted to see which non-branded mobile search queries were bringing traffic to the pages where the URL contained /shop/women. With this data, I am able to fine-tune our title tag to improve our click-through rate (CTR) with the phrasing that makes the most sense.
The radio buttons select the dimension for the data you want to see.
Tip – You can still use the filters from other dimensions to really narrow in on the data you want to see.
10. Finding Pages With a Below-Average CTR
Get there: Search Traffic > Search Analytics
Select Clicks and CTR so you can sort by traffic.
1) Take note of your average CTR above the chart.
2) Identify pages that are below average.
3) Export the list.
4) Revisit your meta title and description for the page.
Bonus points for filtering by device type.
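The steps above can be sketched in a few lines, assuming you've exported the page-level report (the URLs and numbers here are hypothetical):

```python
# Sketch: flagging pages whose CTR falls below the site-wide average.
pages = [
    {"page": "/shop/women/sandals", "clicks": 50, "impressions": 1000},
    {"page": "/shop/women/boots",   "clicks": 10, "impressions": 1000},
    {"page": "/shop/women/hats",    "clicks": 30, "impressions": 1000},
]

# Site-wide average CTR, like the average CTR shown above the chart.
avg_ctr = sum(p["clicks"] for p in pages) / sum(p["impressions"] for p in pages)

# Pages below average are candidates for a meta title/description rewrite.
below_avg = [p["page"] for p in pages
             if p["clicks"] / p["impressions"] < avg_ctr]

print(below_avg)  # ['/shop/women/boots']
```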
11. What To Look For?
Once we've identified a page with a lower-than-average CTR, I'll switch the dimension to show me the queries.
From there I’ll learn exactly
how our audience is landing on
the page and work towards
optimizing the page (or creating
a separate one) to better align
with that message.
12. Deciding Which Phrasing to Use
Once you've narrowed in, you can get a feel for the queries that are bringing clicks to the page. This is the knowledge that will allow you to optimize your title tags for phrases that your audience would most likely click on.
Existing Title: Women's Surf-Inspired Sandals | Reef Women's
Better Title: Reef Women’s Flip Flops & Sandals
13. Links to Your Site
Get there: Search Traffic > Links to Your Site
Here you can view and download lists for which
domains are linking to you, the anchor text they’re
using, and which pages they’re linking to.
This data is valuable to monitor for abnormalities.
Each of the sections will let you narrow in to see more detail.
14. Internal Links
Get there: Search Traffic > Internal Links
The number of internal links pointing to a page is a
signal to search engines about the relative
importance of that page. If an important page does not appear in this list, or if a less important page has a relatively large number of internal links, you should consider reviewing your internal link structure.
Are your most important pages at the top of this list? This report will help you learn how Google sees your site's structure.
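As an illustration of the signal this report reflects, here is a sketch that counts internal links from a hypothetical list of (source, target) page pairs:

```python
from collections import Counter

# Hypothetical internal links crawled from your own site.
links = [
    ("/", "/shop"), ("/", "/about"),
    ("/shop", "/shop/women"), ("/blog", "/shop/women"),
    ("/about", "/shop"),
]

# Count how many internal links point at each page.
internal_link_counts = Counter(target for _, target in links)

# Sorted with the most-linked (and so, presumably, most important) pages first.
print(internal_link_counts.most_common())
# [('/shop', 2), ('/shop/women', 2), ('/about', 1)]
```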
16. International Targeting
Get there: Search Traffic > International Targeting
International Targeting will allow you to indicate to
Google which country you would like to target.
A tip for this section: if you have a subdirectory or section of your site that targets a specific country, you can set up a Google Search Console property for that section.
You can add up to 1,000 properties to your account, including both websites and mobile apps. This means you can set up and geotarget your subdirectories, like http://example.com/us/.
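If you geotarget subdirectories this way, a common companion is hreflang annotations in each page's head; a hypothetical sketch for the example.com case above:

```html
<!-- Hypothetical hreflang tags telling search engines which country/language
     version of a page lives at which URL. -->
<link rel="alternate" hreflang="en-us" href="http://example.com/us/" />
<link rel="alternate" hreflang="x-default" href="http://example.com/" />
```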
17. Mobile Usability
Get there: Search Traffic > Mobile Usability
The initial screen shows a count of pages exhibiting
specific mobile usability errors, grouped by type.
Click on an error type to see a list of pages affected by
the chosen error.
Click on a page URL to get a list of instructions on how
to fix the error.
Tip: Check out Google’s Web Fundamentals guide for
tips on how to remedy any of these issues.
19. Crawl Errors
Get there: Crawl > Crawl Errors
Server error – Googlebot couldn't access your URL, the request timed out, or your site was busy.
Soft 404 – A soft 404 occurs when your server returns a real page for a URL that doesn't actually exist on your site.
Not found – Googlebot attempts to visit a page that doesn't exist, either because you deleted or renamed it without redirecting the old URL to a new page, or because of a typo in a link. What if there are a lot? Check the top-ranking issues, fix those if possible, and then move on.
Access denied – In general, Google discovers content by following links from one page to another. To crawl a page, Googlebot must be able to access it. Unexpected Access denied errors often mean a page requires a login or your robots.txt file is blocking Googlebot.
22. Robots.txt Tester
This shows you whether your robots.txt file blocks Google crawlers from specific URLs on your site.
As of October 2014, Google warns that "disallowing crawling of JavaScript or CSS files in your site's robots.txt directly harms how well our algorithms render and index your content and can result in suboptimal rankings."
Get there: Crawl > Robots.txt tester
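You can mirror what the tester does locally with Python's standard-library robot parser; the rules below are hypothetical:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules to test against.
robots_txt = """\
User-agent: *
Disallow: /checkout/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Anything not under /checkout/ is crawlable; /checkout/ URLs are blocked.
print(parser.can_fetch("Googlebot", "http://example.com/shop/women"))    # True
print(parser.can_fetch("Googlebot", "http://example.com/checkout/cart")) # False
```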
23. Sitemaps
A sitemap is a file you create for web crawlers, such as
Googlebot, that gives them a list of web pages to crawl
on your site.
Although most web crawlers can explore and discover all the files on your site, the sitemap serves as a helpful guide for which pages to crawl (and how often).
In Google Search Console you can view, add, and test sitemaps using the Sitemaps report.
You want your sitemap to be as close to perfect as possible.
Get there: Crawl > Sitemaps
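A sitemap is just XML, so it is easy to generate; a minimal sketch with hypothetical URLs (the xmlns value is the standard sitemap namespace):

```python
import xml.etree.ElementTree as ET

# Build a minimal sitemap for two hypothetical pages.
urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for loc in ["http://example.com/", "http://example.com/shop/women"]:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc

sitemap_xml = ET.tostring(urlset, encoding="unicode")
print(sitemap_xml)
```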
24. URL Parameters
You can use the URL Parameters tool to indicate to Google the purpose of the parameters you use on your site.
For example, maybe all the URLs containing color=black are duplicate URLs. If so, then you can set preferences for how Google should crawl the URLs that contain those parameters.
This is a strong clue to Google. However, ultimately you will want your on-page directives to be correct.
Get there: Crawl > URL Parameters
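The duplicate-URL idea in the example can be sketched with the standard library; treating "color" as an ignorable parameter is a hypothetical choice:

```python
from urllib.parse import urlparse, parse_qsl, urlencode

# Parameters we've decided don't create distinct content (hypothetical).
IGNORED_PARAMS = {"color"}

def canonicalize(url):
    """Drop ignorable query parameters so duplicate URLs collapse to one."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in IGNORED_PARAMS]
    return parts._replace(query=urlencode(kept)).geturl()

print(canonicalize("http://example.com/shop/women?color=black&size=8"))
# http://example.com/shop/women?size=8
```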