Great SEO is increasingly dependent on having a website with a great user experience. To make your user experience great requires carefully tracking what people do so that you always know where to improve. But what do you track? In this presentation from MozCon 2017, learn three effective and advanced ways to use event tracking in Google Analytics to understand a website's UX.
7. #MozCon | @ElementiveSltns
“I don’t need UX, I just need rankings or traffic.”
“Another consultant is responsible for UX.”
8. #MozCon | @ElementiveSltns
“I don’t need UX, I just need rankings or traffic.”
“Another consultant is responsible for UX.”
“Hands off: your job stops at the SERP.”
70. #MozCon | @ElementiveSltns
CODE TO DOWNLOAD
Dynamic Click Track
http://jsfiddle.net/aschottmuller/38kqp83b/
Modifications To Dynamic Click Track
http://www.matthewedgar.net/easy-way-track-clicks/
UX Events (scroll patterns, time, navigation)
http://www.elementive.com/marketing-resources/user-experience-event-tracking-code/
Parsnip Scroll Depth
http://scrolldepth.parsnip.io/
Editor’s notes
UX matters a great deal for SEO. I think everybody in this room knows that and certainly we’ve seen Google give more priority to UX with a focus
on speed, mobile friendliness, and quality content.
But this isn’t really new because good user experience starts with making your website easy to find.
That means using the right content with the right keywords as well as giving people a compelling reason to visit your site in the title and description tag. It has always been important to make your site something useful for the person who clicks on the search result.
But, for many of the clients we work with, they don’t see that connection. I’ve been told that I don’t need UX – I just need rankings or traffic.
Or, I was told that some other consultant or agency was responsible for UX.
Just focus on SEO and keep your hands off user experience—your job stops at the SERP.
And even for those who do see the connection, they struggle with how to measure “good UX.”
Isn’t UX hard to quantify? It is a nebulous concept and seems very relative.
With SEO you can point to clear, obvious things to measure: higher rankings, more links, more traffic, or higher domain authority.
But where are the clear, obvious metrics for UX?
So I took this as a challenge: how do you show the impact of UX and its connection to SEO? How do you measure it? How can I prove this matters and is worth investing in?
Now, there is no shortage of methods out there to measure user experience. Usability tests and surveys are two common ones. While those are really useful and beneficial,
how do you connect those tests or surveys to your SEO? As well, tests and surveys are typically invested in during big redesign projects,
but like good SEO, good UX is a long-term and ongoing process.
There are heatmaps and recordings, which get closer to a way of showing this. You can segment heatmaps by source and look at what people from organic search are doing. And it is interesting to know people scrolled here or clicked there. But that alone doesn’t prove the worth of good content or good design. How does a heatmap prove to a client that SEO and UX go together?
As a better way to measure UX and its connection to SEO, I turned to event tracking in Google Analytics
There are lots of things you can measure, but there are three I want to share that I found the most valuable in quantifying the impact of good content and good design – a good user experience.
Also, I’ll have links to all the code on the slides on SlideShare for you to download.
So, let’s start with measuring How People Browse
One of the key parts to creating a good user experience is giving people the ability to find whatever it is they want to find as easily as possible.
We want to be able to track all the different ways people navigate our site. To start, we can track all the links that get clicked on.
There is a great script that does this for outgoing links, jump links, and downloads. This script binds click event tracking to all links on the site without having to manually tag them.
But with a quick tweak you can do this for all internal links as well. So, now, along with seeing the outgoing links, downloads, or jump links,
we can also see all the internal links people click on when visiting our website as an event category alongside the other types of links already tracked in this script.
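A minimal sketch of this kind of dynamic click tracking, assuming Universal Analytics’ global ga() function (the linked jsFiddle differs in its details). One delegated listener classifies every clicked link, so nothing needs manual tagging, and the classifier is extended to report internal links too:

```javascript
// Classify a clicked link so it can be sent as the event category.
// The file-extension list and category names are illustrative.
function classifyLink(href, currentHost, currentPath) {
  var url = new URL(href, 'https://' + currentHost + currentPath);
  if (/\.(pdf|docx?|xlsx?|zip)$/i.test(url.pathname)) return 'Download';
  if (url.host !== currentHost) return 'Outbound Link';
  if (url.pathname === currentPath && url.hash) return 'Jump Link';
  return 'Internal Link'; // the tweak: internal clicks become events too
}

// Browser-only wiring, guarded so the sketch also loads outside a page.
if (typeof document !== 'undefined') {
  document.addEventListener('click', function (e) {
    var link = e.target.closest('a[href]');
    if (!link) return;
    var category = classifyLink(link.href, location.host, location.pathname);
    if (typeof ga === 'function') {
      ga('send', 'event', category, 'click', link.href);
    }
  });
}
```

Because the listener is delegated to the document, links added to the page later are tracked automatically.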
But just because they click doesn’t tell us too much. To understand the experience people have, we need to know more.
Starting with: how long did it take people to find that link they clicked? To get this, we can tweak the click tracking script we were just looking at to add a timer.
We can pass the time people were on the page before they clicked to the event script -- relying on visibilitychange, we can measure the time people were actively on the page.
Now we know something far more interesting than whether they clicked – we know how long it took them to click.
And we can pull this into a table and group the information about each link. That way we can see the average minutes before people click a given link on our page. Maybe it is great people spent so much time before they clicked the third link on this page because that means they read our site and clicked the link at the end of an article. Or maybe it stinks that people spent so much time before clicking on the second link because it means that link is so buried people really struggled to locate it.
But now that we have the time, we can start asking whether it is too much, too little, or just the right amount. We have a way to quantify how people browse.
But we can get even more information here. We also want to know something about activity beyond the click.
For this, we could use sequences in Google Analytics.
But a faster way I’ve found is to set a cookie on click that collects information about how long people spent on the subsequent page they went to on our site.
The question is: did they stay on the other page or not? If they clicked and then clicked away really quickly, or left our site altogether, that might mean we need to work on the content or design of the page they clicked to, not just the landing page.
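One way the cookie hand-off could be sketched (the cookie name and label format are my assumptions, not the speaker’s actual code). The click sets a short-lived cookie naming the page the visitor came from; the next page reads it and reports how long the visitor stayed:

```javascript
// Build and parse the hand-off cookie (name "uxFrom" is an assumption).
function makeHandoff(fromPath) {
  return 'uxFrom=' + encodeURIComponent(fromPath) + '; path=/; max-age=300';
}
function readHandoff(cookieString) {
  var match = /(?:^|;\s*)uxFrom=([^;]+)/.exec(cookieString || '');
  return match ? decodeURIComponent(match[1]) : null;
}

// Browser-only wiring, guarded so the sketch also loads outside a page.
if (typeof document !== 'undefined') {
  // On click of a link, record which page the visitor clicked from.
  document.addEventListener('click', function (e) {
    if (e.target.closest('a[href]')) {
      document.cookie = makeHandoff(location.pathname);
    }
  });
  // On the next page, report time spent there when the visitor leaves.
  var from = readHandoff(document.cookie);
  var arrivedAt = Date.now();
  if (from) {
    window.addEventListener('pagehide', function () {
      var seconds = Math.round((Date.now() - arrivedAt) / 1000);
      if (typeof ga === 'function') {
        ga('send', 'event', 'Subsequent Page', 'time on page',
           from + ' > ' + location.pathname, seconds, { transport: 'beacon' });
      }
    });
  }
}
```

The event label pairs the referring page with the subsequent page, which is what lets you build the “home page > about page” comparisons described below.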
In this example , we can see that people who arrive on my site’s home page and click to another page on the site spend a little over a minute on the subsequent page.
But drilling into this a bit more, people who click from my home page to the about page spend less time on that subsequent page – so the landing page is okay, but we need to investigate the page they are clicking to. If people leave the site after a very short time on the page, there is a chance they went back to Google to find a different search result from a different website, which would likely impact rankings and performance. Not to mention it also impacts the bottom line, because you are losing potential customers.
So, the second set of code to share has to do with measuring the value of a blog post or the content you create.
Because, when you think about a good user experience, there is a lot more to it than just clicking. We have pages or posts without links to click – instead,
we want people to actually read and engage with the content on those pages. We want to know if people spent time with our content. If they scrolled through our content. If they saw everything we want them to see.
Parsnip has a great event tracking script that measures scroll activity. You can see how far people scrolled down the page . If most people only scroll 50%, that probably suggests people aren’t really reading or looking at everything this site has to offer. Instead, people are probably going back to Google to find something else.
Parsnip along with other scripts also let you track how many people saw a specific part of the page. For instance, we might want to know how many people scrolled to see a call to action on our landing page. You can tag that section with a unique ID. So, in this example, we can see all the people who scrolled to our lower call to action.
But beyond just knowing they scrolled to it, we also want to know how long it took people to get there. A modified scroll tracking script (separate from Parsnip) can track the time it takes to scroll to the element.
In this case, people scrolled to this lower call to action in about 39 seconds. But that isn’t the complete picture – we also want to know how long people spent looking at that part of our page.
So this alternative scroll tracking script also tracks the time people spent with this element. In this example, people stayed on our lower call to action about 36 seconds. If this area we’re looking at is a paragraph, you can decide if that is enough time for people to read what is there or not.
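A sketch of that per-element timing (the element id "lower-cta" and event names are assumptions). It reports the seconds until the element first scrolls into view and, separately, the total seconds it stayed in view before the visitor left:

```javascript
// An element is "in view" if any part of it overlaps the viewport.
function inViewport(rectTop, rectBottom, viewportHeight) {
  return rectTop < viewportHeight && rectBottom > 0;
}

// Browser-only wiring, guarded so the sketch also loads outside a page.
if (typeof document !== 'undefined') {
  var el = document.getElementById('lower-cta');
  var loadedAt = Date.now();
  var reachedAt = null;   // first time the element became visible
  var visibleMs = 0;      // accumulated time in view
  var visibleSince = null;

  window.addEventListener('scroll', function () {
    if (!el) return;
    var r = el.getBoundingClientRect();
    var visible = inViewport(r.top, r.bottom, window.innerHeight);
    if (visible && reachedAt === null) {
      reachedAt = Date.now();
      if (typeof ga === 'function') {
        // Event value = seconds it took to scroll to the element.
        ga('send', 'event', 'Scroll', 'reached', 'lower-cta',
           Math.round((reachedAt - loadedAt) / 1000));
      }
    }
    if (visible && visibleSince === null) visibleSince = Date.now();
    if (!visible && visibleSince !== null) {
      visibleMs += Date.now() - visibleSince;
      visibleSince = null;
    }
  });

  window.addEventListener('pagehide', function () {
    if (visibleSince !== null) visibleMs += Date.now() - visibleSince;
    if (typeof ga === 'function') {
      // Event value = total seconds the element was in view.
      ga('send', 'event', 'Scroll', 'time with element', 'lower-cta',
         Math.round(visibleMs / 1000), { transport: 'beacon' });
    }
  });
}
```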
But in our example, this is a call to action. Beyond the time spent, we also want to know if people clicked on this call to action. So, we can track clicks as well
and we can see that there were 11 clicks from the 23 people who scrolled to this part of the page – not a bad click-through rate, so clearly this lower call to action is relevant to the people reaching this part of the page.
While that is all very helpful, we also want to know the bigger picture. If you watch people scroll during a usability test, they don’t just scroll down.
They scroll down and then they scroll back up and down again—it isn’t just one direction. So, while this code is great,
there is a modification we make at Elementive that shows not just how far people reached but also the scroll point they were at when they left the site
What this tells us is that, in this case, people on average scroll about 70% of the way down the page at their deepest point, but
when they exit the site, they are only 40% of the way down the page. If you add the secondary dimension of browser size, you can tell exactly what sits at that 40% mark.
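A sketch of how max-scroll and exit-scroll could be tracked separately (the event names are assumptions; Elementive’s actual modification is at the link above). One handler keeps the deepest point reached; the other captures where the visitor is when they leave:

```javascript
// Percent of the page that has been seen: bottom edge of the viewport
// relative to the full document height, capped at 100.
function scrollPercent(scrollY, viewportHeight, docHeight) {
  return Math.min(100, Math.round(((scrollY + viewportHeight) / docHeight) * 100));
}

// Browser-only wiring, guarded so the sketch also loads outside a page.
if (typeof document !== 'undefined') {
  var maxPercent = 0;
  window.addEventListener('scroll', function () {
    var p = scrollPercent(window.pageYOffset, window.innerHeight,
                          document.documentElement.scrollHeight);
    if (p > maxPercent) maxPercent = p;
  });
  window.addEventListener('pagehide', function () {
    var exitPercent = scrollPercent(window.pageYOffset, window.innerHeight,
                                    document.documentElement.scrollHeight);
    if (typeof ga === 'function') {
      ga('send', 'event', 'Scroll', 'max depth', location.pathname,
         maxPercent, { transport: 'beacon' });
      ga('send', 'event', 'Scroll', 'exit depth', location.pathname,
         exitPercent, { transport: 'beacon' });
    }
  });
}
```

Comparing the two event values per page is what surfaces the 70% max versus 40% exit gap described above.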
So, if I’m thinking about ways to improve my user experience and keep people on my site, I want to focus on the part of the page people left from. Along with flagging content you might want to adjust, the part of the page people are at before they exit is probably a part they are interested in or have a question about. So the information there might be good to include in description tags or in links you build referencing this page. This tells you something about who your visitors are, the people your organization works with, and what they are interested in – which influences SEO, but also marketing, sales, product, and every other part of the organization.
Finally, let’s talk about a way to keep track of and identify errors.
Beyond knowing navigation and how people interact with content,
a big part of a good user experience is getting rid of errors – or, because some errors will occur, making sure people can recover from them.
There are technical errors where something breaks but we’re also talking about slips and mistakes.
One of the areas where you see technical errors, slips, and mistakes is on forms.
So, no matter what type of form we’re talking about, we want to know what errors are creating a bad user experience that prevents people from completing the form and converting.
The way this script works is that when an error is triggered for the visitor, we pass an identification number specific to that particular error as the event label. This way we can give technical errors, like international numbers failing to verify, one ID number and slips, like a phone number in the wrong format, another ID number.
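A sketch of that error-ID scheme (the error codes and validators here are illustrative, not the actual form logic). Each distinct validation failure maps to a stable ID that becomes the event label, keeping technical errors and slips separable in the reports:

```javascript
// Map of error ID -> predicate that returns true when that error applies.
// ID 4 is a slip (malformed email); ID 5 stands in for a technical error
// (international phone numbers failing verification). Both are assumptions.
var FORM_ERRORS = {
  4: function (fields) { return fields.email.indexOf('@') === -1; },
  5: function (fields) { return /^\+/.test(fields.phone); }
};

// Return the IDs of every error the submitted fields trigger.
function triggeredErrorIds(fields) {
  return Object.keys(FORM_ERRORS)
    .filter(function (id) { return FORM_ERRORS[id](fields); });
}

// Send one event per triggered error, with the ID as the event label.
function reportFormErrors(fields) {
  triggeredErrorIds(fields).forEach(function (id) {
    if (typeof ga === 'function') {
      ga('send', 'event', 'Form Error', 'contact-form', 'Error ID ' + id);
    }
  });
}
```

Calling reportFormErrors from the form’s validation handler is what produces the per-ID counts analyzed in the next slides.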
and then in Google Analytics we can see all the various combinations of errors people got.
So we can count these up in Excel, which lets us see which errors are the most frequent. This tells us what we need to fix on the form technically. It also tells us that we may need to adjust the fields, or the labels on the fields, to make the form easier to fill out.
But we also want to see the continuation beyond the error. Do people who see this error still convert? If so, that error, slip, or mistake isn’t such a big deal since it isn’t harming conversions or really preventing the visitor from doing something on our site.
So, to know this we can segment by converters of this form to see which errors people move beyond and add that information to our table.
In this case, we can see that 22 people saw the error we’ve labeled as ID #5 but all but 2 of those people converted during their session. Sure we could fix this error, but it isn’t preventing conversions so maybe it isn’t worth the time required.
But, 29 people got error ID 4 and only 7 filled out the form. Clearly, whatever error ID 4 is becomes a much bigger priority we need to fix to improve our site’s user experience.
Now, this really tells you something impactful about SEO and UX. Because with this you can show that you got traffic to the site from search, you even got them to the form, and you can show why people aren’t converting. It isn’t because of the wrong keywords targeted or because the SERP didn’t qualify traffic. Instead, it is because of this specific issue with a misleading form field or a confusing field label or a technical error.
But even more than telling us something about SEO and UX, what this really tells us is how we can tie UX to our client’s bottom line. In this client’s case, a bad user experience on this form, caused by confusing field labels, directly led to a reduction in leads and sales. By showing that to the client, you show why UX matters.
It isn’t just a nebulous concept—it is a concept that can be clearly measured and the impact of a good UX on your SEO and on the business more broadly can be clearly shown.