Search Engine Optimization for Business Success
Leticia Ferrer Mur
BA Project, Digital Concept Development
Team 11
29th of August 2016
KEA
This report is prepared as a bachelor project for Digital Concept Develop-
ment at KEA - Copenhagen School of Design and Technology.
My subject choices during the second semester were E-Communication, E-Commerce and E-Business Technology.
Throughout my second semester, and during the process of writing this final project, I have been doing my internship at a digital agency called AW Media. My position in the company is SEO (Search Engine Optimization) implementer, and I am working on one of the agency's lead sites in the United Kingdom. All along this period I have been attending weekly courses that taught me about Search Engines (SE), SEO and many techniques for succeeding at it. The internship consists of five intensive months that have allowed me to acquire a considerable amount of quality empirical knowledge and related personal reflections. Being part of the AW Media team, I feel included in the SEO world, and my constant aspiration to succeed at it fed my curiosity, the same curiosity that made me start this project. With this project I want to create a best practice guideline for SEO.
Table of Contents
1.1 Problem Area 1
1.2 Problem Formulation 1
1.3 Theory and Methodology 1
1.4 Delimitations 2
2. Situation 2
2.1 How Does Google Work? 2
2.1.1 Crawling and Indexing 3
2.1.2 Ranking 3
2.1.3 Content Evaluation 3
2.1.4 Google’s Algorithm and its Updates 3
2.2 What is Search Engine Optimization and How Can it Affect Search Engines?
2.2.1 Differences Between: On-site SEO and Off-site SEO. 5
2.2.2 Differences between: White-hat SEO and Black-hat SEO 6
2.3 Trends: Users Behaviours 6
2.3.1 How Do Users Behave on Google? 6
2.3.2 How Do Users Behave Inside a Site? 8
2.3.3 How Google Measures User Behaviour 8
3. Strategy: SEO Best Practices Guideline 10
3.1 On-Site SEO 10
3.1.1 Importance of Quality Content: 10
3.1.1.1 Keyword Strategy 10
3.1.1.2 How to Write a Content Page 11
3.1.1.3 Create a Blog 11
3.1.1.4 Having Google Panda Under Consideration 11
3.1.2 Web Design Optimization 12
3.1.2.1 Importance of Responsive Web Design (RWD) 13
3.2 Off-Site SEO 13
3.2.1 Linking Strategy 13
3.2.2 Significance of Social Media Presence 15
3.2.3 Competitor Analysis 16
3.3 Track Results 17
3.3.1 Digital Listening 17
4. Conclusions 17
5. Bibliography 18
6. Glossary of Terms 20
Communication, as a way of exchanging information and knowledge between people, has changed considerably with the passage of time. The oldest form of long-distance communication was smoke signals, used as early as 200 BC to send messages along the Great Wall of China. From that moment on, the evolution never stopped. In the 12th century, Sultan Nur al-Din sent carrier pigeons to bring messages to cities hundreds of kilometers away. Later, in 1440, Johannes Gutenberg developed printing in Europe, which opened a new era of access to knowledge. In 1844, Morse sent his first telegraph message from Washington D.C. to Baltimore, Maryland. By the 1950s, most houses had landlines, which allowed only one call at a time (1). In the 1980s, dial-up Internet access appeared as the technological base of the modern Internet. Finally, in the 1990s, the World Wide Web (WWW) emerged as a way of connecting computers around the world, enabling the creation of and access to knowledge and content on a scale never seen before. (2)
1.1 Problem Area
The new era of digital content is a scenario in which the huge amount of data is an obstacle both for users trying to access quality content and for communicators who want to share their knowledge with the proper target. With over 40% of the population having Internet access (over 3 billion users) and more than 1 billion websites online at this moment, the use of SEs is dominant, and more specifically the use of Google, with over 3.5 billion searches per day (3). But this technology is not enough. Even if communicators post quality content, the amount of existing content on the Internet interferes with their chance of being shown to the user. Therefore, what should an SEO or webmaster do in order to make his content more accessible and attractive than others'?
This project will analyze the current communication and content situation in order to offer a best practice guideline that helps webmasters connect with their audience by applying SEO techniques.
1.2 Problem Formulation
How can webmasters communicate with their audience efficiently by creating quality content and building trust utilizing SEO (Search Engine Optimization)?
- How does a SE work?
- How can SEO help webmasters?
- Does Social Media work hand in hand with SEO?
- How do users behave in this new era of digital content?
1.3 Theory and Methodology
In the second chapter I am going to carry out qualitative evaluative research. With this study I want to show clearly the current situation of SEs as tools for spreading content and communication.
Qualitative desktop research
The main research covered in Chapter 2 (the Situation part) will be desktop research. There are really good sources of SEO professionals sharing quality content on the Internet and in books. Nevertheless, some other methods will also be used during this first qualitative research.
In order to gain secondary empirical data, professional interviews (over email) have been carried out with two SEO professionals:
- Marco Antonio Varela: SEO professional in Spain.
- Jens Ulrik Lange: partner at Overskrift. Works with digital listening.
1 Samsung Galaxy. “From Smoke Signals to Smartphones: The Evolution of Communication.” Mashable. N.p., 05 Dec. 2014. Web. 10
Apr. 2016. <http://mashable.com/2014/12/05/evolution-of-communication-brandspeak/#UiFdD_XvHZql>.
2 “Dial-up Internet Access.” Wikipedia. Wikimedia Foundation, n.d. Web. 16 May 2016. <https://en.wikipedia.org/wiki/Dial-up_Inter-
3 “Internet Live Stats - Internet Usage & Social Media Statistics.” Internet Live Stats - Internet Usage & Social Media Statistics. N.p.,
n.d. Web. 24 May 2016. <http://www.internetlivestats.com/>.
Being a European webmaster in the age of algorithms requires wide and up-to-date knowledge about the most relevant SE of the moment, Google. When running a website, it is crucial to gain a good position in Google's SERPs in order to be easily found by customers and therefore reach the success that a digital presence offers.
2.1 How Does Google Work?
Google is the most popular SE among users in 2016. It was created in 1996 by Sergey Brin and Larry Page as an internal database browser for Stanford University, and was called "BackRub". However, in September 1997 they changed the name to "Google" and registered it as a domain; it then became a Web search engine. The reason for the new name was the mathematical term "googol", meaning the number one followed by one hundred zeros; the name reflected the wish of organizing an apparently infinite amount of information on the Web. (5)
The original concept of the SE mechanism was quite simple. They created an algorithm that recognized the sites receiving the biggest amount of external references from other sites, meaning that those sites were the most relevant for users. They explained it by saying: "We have built a large-scale search engine which addresses many of the problems of existing systems. It makes especially heavy use of the additional structure present in hypertext to provide much higher quality search results." (6)
Nevertheless, over time this concept has been evolving and changing steadily due to external factors.
Matt Cutts, former head of the web spam team at Google, worked with the search quality team on SEO issues. In his YouTube video "How does Google Search work?" (7) he gives a really good explanation of the system: Google crawls the web comprehensively and deeply; then it indexes all the pages, ranks them and returns them to the user, showing the most relevant first. A breakdown of this information follows in the subsections below.
In order to gain empirical evidence about user behaviour, observation research utilizing Google Analytics, Webmaster Tools and Ahrefs will take place throughout the entire process of writing this project, in order to confirm some of the findings. Moreover, personal empirical evidence acquired during my internship will be incorporated into the observation process.
All the information related to Google and its algorithm is extremely confidential, and only experienced professionals and empirical tests will give me the closest approach to this interesting data.
The huge amount of data saved on the Internet fosters the emergence of many different SEs. The most popular SEs worldwide among users in January 2016 were: Google with 88.36% of the market share, Bing with 4.75%, Yahoo with 3.3% and Baidu with 0.73% (4).
Therefore, this project will limit the study to the main SE, Google.
4 “Global Search Engine Market Share 2016 | Statistic.” Statista. N.p., n.d. Web. 10 Apr. 2016. <http://www.statista.com/statis-
5 “Our History in Depth – Company – Google.” Our History in Depth – Company – Google. N.p., n.d. Web. 22 May 2016. <https://
6 “The Anatomy of a Large-Scale Hypertextual Web Search Engine.” The Anatomy of a Search Engine. N.p., n.d. Web. 22 May 2016.
7 GoogleWebmasterHelp. “How Does Google Search Work?” YouTube. YouTube, 23 Apr. 2012. Web. 10 Apr. 2016. <https://www.
2.1.1 Crawling and Indexing
SE have as a main goal, to offer users the best result to their queries. Therefore,
the first thing a SE have to do, is finding all the public content available in the
World Wide Web. This is known as crawling the web. The SE starts visiting all the
sites with high quality content, and follows up visiting the links that finds inside
these main sites and therefore discover new pages.
Thanks to the links’ net created between all sites in the Web, the SE’s robots,
called crawlers or spiders, can find the huge trillions of existing sites.
This crawling procedure is really complex, so SEs can not crawl the entire Web
every day. Moreover, spiders choose not to crawl certain sites due to their irrele-
The first step of this great process is to create a huge index. This index is a big
database that includes every relevant term appearing in each of the crawled sites.
Therefore, when a user types a query (keyword that user writes in the search box
of the SE), the spider is able to find the best result inside of this massive directory.
This index also includes all the links that the crawled sites have within their pages.
The way this sites refer to other external sites, can be done in form of a URL link
or anchor text link (clickable text).
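As a toy illustration of the crawl-and-index idea described above (not Google's actual implementation), the following Python sketch crawls a simulated web of three pages, follows the links between them, and builds an inverted index of terms; the URLs, page texts and links are invented for the example.

```python
# A toy model of crawling and indexing. The "web" is simulated as a
# dictionary mapping each URL to its text and outgoing links; a real
# crawler would fetch pages over HTTP instead.
from collections import deque

WEB = {
    "a.com": {"text": "quality seo content guide", "links": ["b.com", "c.com"]},
    "b.com": {"text": "seo keyword strategy tips", "links": ["c.com"]},
    "c.com": {"text": "responsive web design basics", "links": []},
}

def crawl(seed):
    """Breadth-first crawl starting from a seed URL, following links."""
    seen, queue = set(), deque([seed])
    index = {}  # term -> set of URLs (the inverted index)
    while queue:
        url = queue.popleft()
        if url in seen or url not in WEB:
            continue
        seen.add(url)
        for term in WEB[url]["text"].split():
            index.setdefault(term, set()).add(url)
        queue.extend(WEB[url]["links"])  # discover new pages via links
    return index

index = crawl("a.com")
print(sorted(index["seo"]))  # pages containing the term "seo"
```

Note how "c.com" is only discovered through the links on the other two pages, mirroring how spiders find new sites through the net of links.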
2.1.2 Ranking
Ranking is the position where Google locates a site in the Search Engine Result Pages (SERPs) every time a user types a query. Once the user types a query in the Google search box, the SE performs two actions in less than a second. First, it finds in its index what it considers the most relevant pages related to the user's search. Then, it shows the user the different results, ranked depending on each site's relevance and importance.
The relevance depends on the site's content: the more related the content is to the user's query, the more relevant it seems to the spider. The importance of a site is measured depending on the amount of related sites linking to it. If other sites with related content link to our site, it tells the spider that these other sites value our content, since they refer to it. (8)
8 Enge, Eric, Stephan M. Spencer, and Jessie Stricchiola. “2. Search Engine Basics.” The Art of SEO: Mastering Search Engine
Optimization. 3rd ed. N.p.: O´Reilly, 2015. N. pag. Print.
One particular site can have different ranking positions depending on its different keywords. For example, an aesthetic clinic based in London can have different keywords such as "breast surgery", "lipo" or "aesthetic clinic in London". This same site can appear in the first SERP when a user types "breast surgery", but appear in the third SERP when a user types "aesthetic clinic in London". This difference between keyword rankings will vary depending on the competitors' SEO strategies.
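The relevance-plus-importance idea can be sketched in a few lines of Python. This is only a toy model: the pages, texts and inbound-link counts are invented, and real ranking combines hundreds of signals rather than a single product.

```python
# A toy ranking sketch combining the two factors described above:
# relevance (overlap between query terms and page content) and
# importance (amount of inbound links). All data is made up.
PAGES = {
    "clinic.example": {"text": "aesthetic clinic london breast surgery", "inbound": 12},
    "blog.example":   {"text": "breast surgery recovery tips", "inbound": 40},
    "shop.example":   {"text": "cheap phone cases", "inbound": 3},
}

def score(page, query):
    terms = set(query.split())
    content = set(PAGES[page]["text"].split())
    relevance = len(terms & content) / len(terms)  # fraction of query matched
    importance = PAGES[page]["inbound"]
    return relevance * importance

def rank(query):
    """Return pages ordered best-first for a given query."""
    return sorted(PAGES, key=lambda p: score(p, query), reverse=True)

print(rank("breast surgery"))
print(rank("aesthetic clinic london"))
```

Even in this toy model the same site ranks differently per keyword, as in the clinic example above: the heavily linked blog wins "breast surgery", while the clinic wins "aesthetic clinic london".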
2.1.3 Content Evaluation
Google wants to offer high-quality results to its users. Therefore, its sophisticated algorithm keeps evolving in order to evaluate the kind of content, the topic, the tone of voice, the keywords used and their synonyms in the content, among an untold and unknown number of other factors that webmasters try to guess in order to make the best website optimization.
In order to make this content evaluation, Google's spider will evaluate the relevance of the site depending on its content. When the spider crawls the web, it makes a deep analysis of words and sentences. The spider gathers all this information and makes a map of this data called a semantic map. When a user types a query, there needs to be a semantic match between the query and the content of the site. This is the reason why webmasters should find the dominant keywords to be used on the site, in order to help spiders identify the kind of content the site offers.
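A semantic match need not be a literal word match. The sketch below illustrates the idea with a tiny hand-made synonym table standing in for a real semantic map, which is of course far richer; the words and synonyms are invented for the example.

```python
# A toy "semantic match": expand the query with known synonyms, then
# count overlapping terms with the page. The synonym table is a
# stand-in for a real semantic map.
SYNONYMS = {"lipo": {"liposuction"}, "surgery": {"operation"}}

def expand(terms):
    """Expand query terms with their known synonyms."""
    out = set(terms)
    for t in terms:
        out |= SYNONYMS.get(t, set())
    return out

def semantic_match(query, page_text):
    q = expand(query.lower().split())
    return len(q & set(page_text.lower().split()))

# Matches twice even though neither query word appears literally:
print(semantic_match("lipo surgery", "liposuction operation clinic"))
```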
2.1.4 Google’s Algorithm and its updates
The web is full of information. When a user needs to find something and types it into the search bar, Google's algorithms work to offer the best results. An algorithm here is a piece of software that searches for clues that will help users find what they are looking for.
As mentioned earlier, the reason why Google started updating the algorithm was the emergence, in 2001, of webmasters willing to cheat the SE's ranking results. By updating the algorithm, the webmasters lost track of the way it worked and therefore could not develop a way of cheating it.
First appeared "Google Jump", which consisted of making Google's algorithm change massively once in a while, to make SEO manipulation more difficult.
The Pigeon update was launched in July 2014 to improve location search quality. Before this update, Google's web search results were different from the Google Maps results. After Google Pigeon was launched, the results were much more cohesive and accurate.
The reason this update was designed was to offer users real results about business locations, in order to stop an increasing cheating of the system. Some businesses had realized that putting the business address in a crowded area of the city would bring more traffic to the site, because people always searched for stores or businesses within the metropolitan area. The refined update makes location search reliable.
Hummingbird is more than an algorithm update; it is a remake of the whole algorithm. This new version appeared in 2013; it still uses some of the old updates, like Panda or Penguin, but stopped using others. (11)
The big difference between the old algorithm and this new remake is the way it interprets the user's query (what one types in the search box). Thus, instead of analyzing each word of the query, this remake analyzes the sentence as a whole and gives a proper interpretation of the user's need. It will also give an answer depending on the user's location, if the user shares it (by using Google Chrome, the user is sharing a big amount of information with Google).
Google launched the Penguin update in April 2012 in order to assure better quality results for users in its SE. This update particularly cares about the quality of the inbound links to a site. Since having a big amount of inbound links gave sites a better position in Google's SERPs, people found a way to cheat the algorithm by creating link farms: big nets of empty sites linking between themselves in a way that gave value to them, and that then delivered some of this value to the sites they offered linked references to in exchange for money. Many sites started ranking really fast, so Google noticed and launched Google Penguin, which penalized all these malicious pages and removed their value, or even removed them from the SERPs.
11 Sullivan, Danny. “FAQ: All About The New Google “Hummingbird” Algorithm.”Search Engine Land. N.p., 26 Sept. 2013. Web. 22 May
After that, Google Dance was introduced instead. As an alternative to making big changes once in a while, it made a lot of little changes every day.
Nowadays, Google uses a combination of both methods to offer quality ranking results: there are many little updates, and some big ones from time to time.
Google's algorithm converts questions into answers, and it is based on more than 200 unique signals or "clues" that enable it to guess what users are really looking for. These clues include factors such as the site's keywords, the freshness of the content, the geographic location of the user, and the user's last searches, among others. (9)
In 2016 Google operates with many different algorithm updates in order to provide users with good quality results and make it more difficult for webmasters to defraud the algorithm. These Google Updates are constant changes to the algorithms that Google keeps secret in order to avoid the possibility of malicious result manipulation.
Here is a list of the main updates, starting with the newest and finishing with the older ones that are still in use and getting updated:
In October 2015 appeared the RankBrain algorithm, a machine-learning artificial intelligence: a system that learns from humans, builds on what it already knows, and develops new connections. Thereby the algorithm is able to understand and interpret people's queries and return results that do not necessarily contain the keyword but are closely related.
According to Google, this algorithm is the third most important factor when showing results in SERPs. But this algorithm does not work completely by itself; it is part of the main search algorithm, called Hummingbird (10).
10 Sullivan, Danny. “FAQ: All About The New Google RankBrain Algorithm.” N.p., 27 Oct. 2015. Web. 20 Mar. 2016. <http%3A%-
The Panda update is a search filter introduced in February 2011, meant to assess the quality of sites' content. It determines the quality of the content depending on many factors, the most important being the content length, the relevance, the repetition, and the natural keyword density within the text (not too much keyword stuffing, and not always using the keyword literally but also combining it with some synonyms).
The need to launch this update was due to the high amount of low-quality content existing on the Web that was ranking really high in SERPs. People had started creating really long content pages with unnaturally high keyword density and with a lot of repeated content along the site.
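A keyword-density check in the spirit of the Panda criteria above can be written in a few lines. The 3% threshold below is an illustrative assumption, not a number published by Google, and the sample text is invented.

```python
# A small keyword-density check. The 3% threshold is an assumed
# illustrative cut-off, not an official Google figure.
def keyword_density(text, keyword):
    words = text.lower().split()
    hits = sum(1 for w in words if w == keyword.lower())
    return hits / len(words)

text = "seo tips for beginners: learn seo basics and avoid seo mistakes"
d = keyword_density(text, "seo")
print(f"{d:.1%}", "stuffing risk" if d > 0.03 else "looks natural")
```

On this sample the keyword "seo" makes up over a quarter of the words, exactly the kind of unnatural density the update was designed to catch.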
Those are only the main Google algorithm updates. As said before, there are continuous changes and updates to the core algorithm that affect the quality of the results in the SERPs, and webmasters try to keep track of every update for the sake of their sites' optimization.
In order to prevent SEO bad practices, Google never reveals in detail how these updates can affect sites; it only gives a brief update description.
2.2 What is Search Engine Optimization and How Can it Affect Search Engines?
When Google started as a Web SE in 1997, it was working perfectly: the goal of showing first the results with the biggest amount of references worked. But everything changed in 2001 with the appearance of SEO implementers (webmasters). Glen Michelsen, SEO professional at AW Media, explains that professional webmasters realised how to cheat the algorithm, and therefore some sites started being ranked better than others only by paying link farms and getting inbound links from unrelated websites. Therefore, Google started to build an algorithm to prevent this cheating, and launched updates that penalized sites using these bad practices.
Google is nowadays a really complex and enormous database that gathers all the information on the Web and gives it back to the user depending on her exact needs. All this procedure is done by spiders that are in charge of analyzing, indexing and making semantic matches with all the sites on the Internet, so it can give the right information back to users when they type a query. But since there are trillions of sites within the Web, making a site visible to the spiders requires some expert techniques. These experts are called SEOs or webmasters.
SEO is a kind of marketing that cares about the position a site has in the ranking of the SERPs, in order to increase traffic to that particular site. In other words, it makes the site visible to spiders and increases the organic traffic to the site.
2.2.1 Differences between: On-site SEO and Off-site SEO.
SEOs have to work in two different areas in order to keep a site visible for SE spiders. These areas are on-site SEO and off-site SEO, and they have a symbiotic relation because each has an important impact on the other.
On-site SEO refers to all the work an SEO can do inside a site in order to make it appealing to spiders and bring traffic. The main task is to create quality, optimized content. Besides this, setting up a proper site design that makes the user experience easy and intuitive on different devices is also part of the on-site SEO tasks.
Off-site SEO applies to the work an SEO does outside a site. Some of the main tasks are backlinking, inbound linking, social media presence, guest blogging and press releases.
SEO jobs are not isolated projects but ongoing processes. Both on-site and off-site techniques require a steady effort that cannot be stopped. The regular tasks done in both areas keep the ranking position in Google, which will immediately decrease if the work stops. (12)
12 “The Difference Between On-Site vs. Off-Site SEO -.” Inbound Now. N.p., 29 Oct. 2012. Web. 7 May 2016. <https://www.inbound-
2.2.2 Differences between: White-hat SEO and Black-hat SEO
So far, it is obvious that there are many factors affecting the ranking position of sites in the SERPs. Therefore, it is really important for SEOs to keep their sites optimized so they can rank high and get good traffic. Nevertheless, there are two types of SEO practices, called white-hat and black-hat SEO.
White-hat practices are the good practices that will bring a site to the top naturally, by providing great content for users.
Black-hat practices are the ones that consist in understanding the algorithm updates in order to cheat them and find a way to the SERPs' top in an unnatural way.
There is a third hat colour: grey. Grey-hat SEO practices combine both of the previous ones. Many webmasters agree on combining white-hat and grey-hat SEO practices in order to succeed. This combination prioritizes white-hat practices, while taking into consideration some disloyal techniques that won't harm the site at all and are done in order to be one step ahead of the competitors.
2.3 Trends: Users Behaviours
Like any other product, a SE needs to be constantly improved in order to adapt it to the latest needs and to counter possible external factors that could adversely affect the service offered. These negative external factors are basically the black-hat SEO practices that impair the proper use of the tool.
2.3.1 How Do Users Behave on Google?
Search engines, and Google in particular, have sophisticated algorithms that work minutely to offer the perfect results for users in SERPs. Therefore, the SERPs' interfaces are constantly changing to offer the user a perfect search experience. But with the change of the interface, user behaviour changes as well.
For comparison with the old user behaviour, an eye-tracking study made in 2006 by Nielsen Norman Group showed a pattern in user behaviour that was called the "F-pattern": the eye and mouse tended to be directed to the top left corner of the SERP and then continued going down, making an "F" shape (see Figure 1).
(Figure 1) "F-Pattern". Source: nngroup.com (13)
Nevertheless, a more recent study, reported in 2014 by Ayaz Nanji of MarketingProfs, shows that the pattern has changed and that users behave in a new way with the new Google SERP interface.
13 Jacob Nielsen. “F-Shaped Pattern For Reading Web Content.” Nielsen Norman Group. N.p., 17 Apr. 2006. Web. 23 May 2016.
The new behaviour described by Nanji: “the top organic result still captures about
the same amount of click activity (32.8%) as it did in 2005. However, with the
addition of new SERP elements, the top result is not viewed for as long, or by as
many people. Organic results that are positioned in the 2nd through 4th slots now
receive a significantly higher share of clicks than in 2005.” (14)
(Figure 2) New user behaviour in SERPs. Source: marketingprofs.com (14)
Comparing this to the latest study, made in 2016 by Alex Birkett from the ConversionXL Institute, about user behaviour in SEs, it can be seen that trends are changing. The study was an eye-tracking study with a sample of 61 Google users. As a result, the top organic result is still the one getting the biggest amount of clicks, with 20.69% of the clicks, followed by the 3rd organic result with 6.897% of the clicks, and then the 2nd organic result with 1.724% of the clicks. Here we can see that the progressive F-pattern has changed.
14 Ayaz Nanji. “Eye-Tracking Study: How Users View Google Search Result Pages.” MarketingProfs. N.p., 6 Oct. 2014. Web. 24
May 2016. <http://www.marketingprofs.com/charts/2014/26167/eye-tracking-study-how-users-view-goo-
(Figure 3) Evolution of users behaviour. Source: conversionxl.com
(Figure 4) New user behaviour in SERPs. Source: conversionxl.com (15)
Some other interesting results of the eye-tracking analysis made by Alex Birkett (Figures 3 and 4) show that users stayed around 7.8 seconds exploring the SERP above the fold (the part of the screen that the user sees before starting to scroll down) in Google, and that it took them an average of 7.1 seconds to get below the fold. Nevertheless, rich data (advertisements appearing in the top right corner of the SERP) receives 39% of the clicks, followed by the first organic result, which receives 20% of the clicks. (15)
2.3.2 How Do Users Behave Inside a Site?
When SEOs create content, the procedure needs to follow SE spiders' requirements and users' needs at the same time. The moment a user enters our site is crucial. The amount of time she spends on our site will be determined by the relevance of the content (whether it is what she was looking for), the functionality and ease of interaction of the site design, and the quality of the content.
If the user finds on our site what she was looking for, she will stay longer and go through a big part of our content. If not, she will go back to the SERP soon after arriving at our landing page, and that will increase the bounce rate of our site.
“Bounce rate is the percentage of single-page visits or visits in which the person
left your site from the entrance (landing) page” - Google Webmaster Help (16)
Thus, the spider will understand that the user did not find the site relevant to her search and therefore left the page. Increasing the bounce rate decreases the domain authority of the site, and makes our site take a worse position in the SERPs.
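The quoted definition can be turned into a small calculation. The session data below is invented for illustration: each session is the list of pages a visitor viewed, and a single-page session counts as a bounce.

```python
# Bounce rate as defined above: single-page visits / total visits.
# The session list is made up for illustration.
sessions = [
    ["/landing"],                       # bounce: left from the entrance page
    ["/landing", "/products", "/buy"],
    ["/blog"],                          # bounce
    ["/landing", "/contact"],
]

def bounce_rate(sessions):
    bounces = sum(1 for pages in sessions if len(pages) == 1)
    return bounces / len(sessions)

print(f"{bounce_rate(sessions):.0%}")  # → 50%
```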
2.3.3 How Google Measures User Behaviour
In order to have data about users' behaviour when they enter a site, Google has some metrics that gather information every time a user visits a site. Once all this data is collected, the algorithm uses it to place the site in the proper position in the SERPs.
Click Through Rate (CTR)
Click-through rate (CTR) is the main metric Google uses for measuring user behaviour, together with the bounce rate. The CTR metric counts the amount of users that click the title of a site in the SERP. By clicking this title, the user is indicating that this site has the most relevant content for her query. Therefore, this click counts like a vote. (17)
16 “Bounce Rate.” - Google Analytics Help. N.p., n.d. Web. 25 Aug. 2016. <https://support.google.com/analytics/answer/1009409?hl=en>.
17 Neil Patel. “How User Behavior Affects SEO.” Http://neilpatel.com. N.p., 18 Oct. 2015. Web. 6 Apr. 2016. <http://neilpatel.
15 Alex Birkett. “F-Patterns No More: How People View Google & Bing Search Results.” ConversionXL. N.p., 02 Mar. 2016. Web. 25
May 2016. <http://conversionxl.com/how-people-view-search-results/>.
Therefore, if users' CTR increases, it shows the spider that the site is relevant. Nevertheless, spiders will also consider the amount of time a user spends on the site. Thus, if a user finds the snippet related to her query but, once she is inside the site, the content is not what she expected, she will exit the site and come back to the SE, and the bounce rate will increase. This behaviour will be understood by the spider as a bad user experience, and will decrease the ranking of the site.
The CTR metric gives spiders quality data on user behaviour, but only if the user arrives at the SERP where the site appears. That is called an impression (when a user sees the snippet of a site and so has the chance to click it). Thus, if the site appears on page 10 and the user never gets to see it, there is no impression, and the spider cannot use the CTR metric to evaluate user behaviour.
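In its simplest form, CTR is clicks divided by impressions, and without impressions there is nothing to measure. The counts in this sketch are invented for illustration.

```python
# CTR as described above: clicks divided by impressions. With zero
# impressions (a site buried deep in the SERPs) the metric is undefined.
def ctr(clicks, impressions):
    if impressions == 0:
        return None  # no impressions: CTR cannot be evaluated
    return clicks / impressions

print(ctr(328, 1000))  # snippet seen 1000 times, clicked 328 times → 0.328
print(ctr(0, 0))       # snippet on page 10, never seen → None
```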
(Figure 5) Amount of CTR depending on the SERP number (18)
18 “Google Organic CTR History.” Https://www.advancedwebranking.com. Https://www.advancedwebranking.com/ctrstudy/, 5 June
2016. Web. 5 June 2016.
The previous chart shows the amount of user clicks (CTR) depending on the site's position within Google's SERP ranking. It can be seen that sites appearing on the first page (SERP) of Google results have an average of 30 clicks on both desktop and mobile devices. The amount of clicks on the second page decreases considerably, to approximately 15 clicks, and it continues to decline as the user moves on through the pages. That means a big share of users do not go further than page 3 in most cases. Therefore, having a site appear after page 3 will decrease considerably its chances of getting visits.
Another relevant metric that Google uses to track user behaviour, in order to offer better results, is path navigation. It follows the user from the landing page, through the route taken, to the exit page.
Depending on the keyword the user types in the search box, the results in the SERP will bring the user to the page related to that keyword. Therefore, the user can arrive at the site starting at the home page, a subpage or an article in the blog. The page where the user starts her visit is called the landing page.
While seeking what she needs, the user follows a route inside the site. At some point, she will find what she needs, or she will figure out that she cannot find what she was looking for, and she will exit. The page she is on when she decides to exit is called the exit page. If the exit page is the purchase confirmation in an online store, or a registration form in a portal, that means the user converted (found what she needed) and thus left. That would be a successful user experience. Contrarily, if the user exits in a short period of time without going through the site, that is a bad user experience, and the site's ranking position will decrease.
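The landing page, route and exit page of a visit can be summarized from a recorded session path. The paths and the designated "conversion page" below are invented for the example; a real analytics tool records such paths automatically.

```python
# A toy session-path summary in the spirit of the landing/exit page
# description above. Paths and the conversion page are made up.
sessions = [
    ["/blog/lipo-guide", "/prices", "/booking-confirmed"],
    ["/home", "/contact"],
    ["/blog/lipo-guide"],
]

CONVERSION_PAGES = {"/booking-confirmed"}

def summarize(path):
    """Landing page, exit page, and whether the visit converted."""
    return {
        "landing": path[0],
        "exit": path[-1],
        "converted": path[-1] in CONVERSION_PAGES,
    }

for s in sessions:
    print(summarize(s))
```

Only the first session exits on the confirmation page and counts as a conversion; the third, a single-page visit, is the bad-experience case described above.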
It is possible to do our own user behaviour tracking in Google Analytics. There we can find the percentage of visits that the different pages receive (the percentage of landing pages). Moreover, we can see the bounce rate, which lets us know how many users leave our site too soon.
The current European digital scenario is based on the strongest SE of the moment, Google. Therefore, for a website, having a proper presence in Google's first three SERPs makes the difference between a website that succeeds and one that barely exists. Moreover, the constant and unannounced Google Updates for improving user experience, and the steady evolution of user behaviour, challenge webmasters to offer the best quality websites for users.
3. Strategy: SEO Best Practices Guideline
The different tasks an SEO needs to accomplish to keep a site optimized are divided into two groups: on-site SEO and off-site SEO. Some optimizations are to be done inside the site, while other tasks are carried out outside it. Throughout this third chapter, the terms Domain Rating and Page Rating will be used to refer to the value Google might assign to a site or to a specific page within it. The names come from Ahrefs (an SEO reporting tool), which uses them as its own metrics for estimating how a domain is likely to rank on Google.
3.1 On-Site SEO
As said before, on-site SEO refers to all the techniques an SEO professional can carry out within a site. The main tasks are to create great content, to optimize the web design, and to implement responsive web design (RWD), which adapts the site to any kind of device screen.
3.1.1 Importance of Quality Content:
The content quality of a site is crucial for bringing natural traffic to it. Nevertheless, when starting a website it is important to gain a foothold among the huge number of sites with similar content that already exist on the Web. Creating a content marketing strategy is therefore a way to earn a proper ranking position for a site and build from there. The best way to start is by creating quality content, which will bring most of our traffic naturally, but there are some other things to keep in mind when creating the content of a site.
3.1.1.1 Keyword Strategy
The first thing to take into consideration when writing content for a website is to find the main keywords that describe the content of the site. These keywords are relevant because they need to match users' queries: when a user types a search query into Google, the words she uses in the search box should match the keywords on our site for the site to appear among the results.
To make sure we are using the right keywords (the ones people are most likely to type), there are many tools available (index 1) that show the current trends and give us a hand here.
The keyword can have many forms:
- Branded Searches: the user already knows our site and types the name of our root domain.
- Anchor Searches: the user is looking for a specific product and types its exact name.
- Keyword Specific Searches/Long Tail: multiple-keyword searches (many typed words) that cover a very specific need of a user.
- Keyword Specific Searches/Micro Searches: also very specific searches, but typed in a very short form, normally a couple of words.
- News and Trending Searches: any kind of new trend, industry news or other timely topic.
These are not the only existing keyword forms, but they are the most relevant and commonly used ones. (19)
Having said this, after choosing the keywords that are on trend (the ones users type most often) within our topic, we should take into consideration the importance of making them match not only users' needs but also our content. There are some useful tools for getting keyword suggestions, such as Übersuggest, Google Suggest, Google Keyword Planner (requires a login), SEMrush (freemium version) or Keyword Explorer. These tools are pretty easy to use: by typing in the topic of your site, the tool will suggest the related queries that users have lately been typing into Google when looking for something on that topic.
Since the keywords describe our content, we should use them properly and naturally throughout it. It is important to take into consideration that Google's algorithm is able to recognise synonyms of those keywords, meaning that we should definitely use synonyms, and not only the dominant keyword, in order to create natural content that users will like and Google will value.
19 Fishkin, Rand. “A Step-by-Step Process for Discovering and Prioritizing the Best Keywords - Whiteboard Friday.” Moz. N.p., 6 May
2016. Web. 25 May 2016. <https://moz.com/blog/discovering-prioritizing-best-keywords-whiteboard-friday>.
20 McDonald, Jason. SEO Fitness Workbook: The Seven Steps to Search Engine Optimization Success on Google. N.p.: n.p., n.d. Print. Page 80.
Once the keywords are meticulously chosen, it is time to make a site map. This map will give us the structure of the site, showing which are the main content pages (VIPs) and which are the Level 1 and Level 2 pages.
The main content pages are the ones covering the site's main topics, and each should be titled with one of the selected keywords: every VIP page carries one keyword in its title. A site should have one VIP content page per main keyword, and can then also have Level 1 or Level 2 content pages (pages belonging to a VIP that treat a related topic) that carry the keyword of their VIP. To clarify, an example: Surveybee.net offers paid online surveys, so one of its VIPs is "Paid Surveys", and the Level 1 content page belonging to "Paid Surveys" is "Paid Surveys Guide". Both the VIP and the Level 1 page contain the keyword in their titles, and the keyword appears within the content of each page at a density of 2 to 3 percent of the page's total length.
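The 2-to-3-percent density mentioned above can be checked mechanically. Below is a minimal sketch: the sample text and keyword are invented, and `keyword_density` is a hypothetical helper, not part of any tool named in this report.

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Share of the text's words taken up by occurrences of the keyword."""
    words = re.findall(r"[a-z']+", text.lower())
    kw_words = keyword.lower().split()
    n = len(kw_words)
    # Count every position where the full keyword phrase occurs.
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == kw_words)
    # Each occurrence accounts for n words of the total.
    return (hits * n) / len(words) if words else 0.0

sample = (
    "Paid surveys let you earn money online. Our paid surveys guide "
    "explains how to pick trustworthy paid surveys and avoid scams."
)
print(round(keyword_density(sample, "paid surveys") * 100, 1))  # 28.6 (percent)
```

On such a short sample the density lands far above the 2-3 percent target; on a full-length content page the same calculation shows whether the keyword is over- or under-used.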
Moreover, it is crucial to have a page that talks about the company/site, normally called the "About us" page. This is the place to include the branded-search keywords. The anchor-search keywords, in turn, will be included within the corresponding product pages.
3.1.1.2 How to Write a Content Page
Back in the day, it was normal to create long content pages stuffed with keywords, tricking the spiders into ranking the site on the first page. Nowadays, with the constant evolution of Google's algorithms, it makes no sense to try this anymore. Besides, if we manage to position our site on the first page but users enter, dislike the content and leave, our strategy has not succeeded. Therefore, it is really important to write mainly for users while keeping an eye on what the spiders will value.
Keeping the keyword in the title and alive throughout the content page is relevant, but it should not be done in an unnatural way. If the content page treats a specific topic, the topic (keyword) will appear naturally along the text. Moreover, it is not necessary to keep the keyword in its root form: spiders also recognise synonyms and variations of it.
Regarding the length of the text, Google appreciates sites that give good information to their visitors, so a long text is valuable to the spider. Nevertheless, the text should be interesting and relevant from top to bottom; otherwise users will exit the page and spiders will read it as irrelevant content.
3.1.1.3 Create a Blog
Having a blog is a way of creating fresh content weekly. It is important for Google's spiders to see that the site is alive and regularly brings users new and unique content. There are many dead sites on the Web, and Google wants to put them aside. An updated blog shows Google that we are alive and helps us earn a good ranking position. Furthermore, it adds extra value to the site by offering users regular information.
How to write a blog post
All the tips given above about creating a content page are fully applicable here. What changes when writing a blog is the main topic of the text. Content pages explain to users what we offer; a blog shows users that we are aware of the latest trends in our field and always have the best information to offer them. A blog can also add extra value to our product by giving users little tips related to our product or service. Within a blog post it is important to mention our keyword in the form of News and Trending Searches, Long Tail or Micro Searches keywords.
The proper way of writing blog posts is to use only one keyword per post. That way we can use a different keyword in each blog post, giving specific value to each of our keywords.
Something to bear in mind when creating both content pages and blog posts is the writing style. Unlike the academic style, which starts with the context, background and details and only then reaches the conclusion or main point, the journalistic style works the other way around: journalists first engage readers and then keep them reading until the end. Therefore, the main point goes at the top of the text, followed by the details. (21)
3.1.1.4 Having Google Panda Under Consideration
When the Google Panda update appeared in 2011, many websites were moved down in their ranking positions due to their low-quality content. In order to give some advice, but not the key to success, the Google Webmaster Central Blog offered a guide for building high-quality sites.
21 Zomick, Brad. “Everything You Need To Know About The Inverted Pyramid Writing Style.” Http://www.skilledup.com. N.p., 20 May
2013. Web. 6 May 2016. <http://www.skilledup.com/articles/about-the-inverted-pyramid-writing-style>.
Some of the main questions that Amit Singhal (Google Fellow) raises in the article are:
- Does the site have duplicate or redundant content covering the same topics with slightly different keywords?
- Does the content have spelling, grammatical, stylistic or factual mistakes?
- Are the topics of the articles chosen out of genuine interest for the user, or only to guess what might rank better in SEs?
- Does the site provide original content, original research, or original reporting?
- Does the information offered in the articles go beyond the obvious?
- Are the articles too short or unsubstantial?
- Do all the site's pages have the same quality?
As mentioned before, these tips are only a friendly approach to what the algorithm update may read into a site's content. Following this advice might help, but as Amit Singhal puts it in the article: "Rather than focusing on one particular algorithmic tweak, we encourage you to ask yourself the same sorts of questions we ask when looking at the big picture. This way your site will be more likely to rank well for the long-term."
3.1.2 Web Design Optimization
When optimizing a site, it is crucial to pay attention to the design. It is true that what brings users to a site is the product, service or content, and that is probably the main reason they stay longer. Nevertheless, if the web design is confusing, annoying to look at or slow to load, users will leave within a few seconds, increasing the bounce rate and therefore hurting the ranking position in the SERPs.
Thus, a good design is definitely relevant for a site. But it is not the beauty of the site that brings success; it is its usability. When a user arrives at one of the landing pages, she needs to understand immediately what is going on, and she should be easily guided through the next steps until she converts (accomplishes the task we want her to perform).
Some of the main factors to take into consideration when optimizing a website's design are:
- An almost nonexistent loading time.
- Avoiding side advertisement banners where possible.
- A clean design that is not confusing and guides the user through the site in a catchy, trustworthy and easy way.
- Readable typography that fits the site's style, preferably black on a white background (any colour that is nice to look at in contrast with the background will work as well).
Internal Linking
Internal links are those that go from one section of your own site to another. The reasons to link between our content pages and blog posts are the following:
- They help the user navigate the site.
- They help Google define the architecture and hierarchy of the pages on a site.
- They distribute page rating and ranking power throughout the whole site (if done properly).
Even though the sophisticated algorithm reveals no solid information about how internal link architecture affects a site's ranking position, Neil Patel explains the basis of the theory: "Internal linking strengthens the overall search-optimized value of a website. Inner linking does so by providing clear paths for spiders, prolonged sessions for users, and a tight-knit network of pages and posts." (24)
22 Singhal, Amit. "More Guidance on Building High-quality Sites." Official Google Webmaster Central Blog. N.p., 6 May 2011. Web.
22 May 2016. <https://webmasters.googleblog.com/2011/05/more-guidance-on-building-high-quality.html>.
23 Fishkin, Rand. “On-Page SEO in 2016: The 8 Principles for Success - Whiteboard Friday.” MOZ. N.p., 13 May 2016. Web. 20 May
24 Patel, Neil. “The Seven Commandments of Internal Linking That Will Improve Content Marketing SEO.” N.p., 14 June 2014. Web.
22 May 2016. <https://blog.kissmetrics.com/commandments-of-internal-linking/>.
In order to create the perfect internal link structure, it is necessary to take the following factors into consideration:
- Hierarchy: a site should have a few VIP pages, each covering one of the site's main topics. Since each of these pages treats one main topic, the keyword of that page should be the topic itself. Moreover, to add more value to the site, Level 1 pages can be created under the government of the VIPs: each VIP page, with its own topic, can have Level 1 pages beneath it that still contain the VIP keyword but treat a closely related topic.
- Anchor text: pages should link to each other through anchor text (clickable hypertext). This anchor text should be the keyword that governs the VIP page being linked to. That is, a Level 1 page should include, somewhere within its first 100 words, an anchor text with the VIP's keyword that redirects from the Level 1 page to the VIP. This structures the content by showing users and Google that all this content is related, and that the site therefore offers detailed information about the topic.
- Blog: if the site has a blog, the internal linking hierarchy works the same way as for Level 1 pages. The blog should link to the VIP content pages, but never the other way around.
- Amount of internal links: for articles between 500 and 1000 words, one or two internal links fit perfectly. That helps both users and spiders navigate through the content.
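These rules can be checked mechanically over a small model of a site. The sketch below is a simplified illustration: the page names, the `pages` structure and the `check_internal_links` helper are all invented for this example, not part of any tool mentioned in this report.

```python
# Hypothetical page records: hierarchy level and outgoing internal links.
pages = {
    "paid-surveys":       {"level": "VIP",  "links": []},
    "paid-surveys-guide": {"level": "L1",   "links": ["paid-surveys"], "words": 800},
    "blog/survey-trends": {"level": "blog", "links": ["paid-surveys"], "words": 600},
}

def check_internal_links(pages):
    """Flag pages that break the simple structure rules described above."""
    issues = []
    for name, page in pages.items():
        if page["level"] in ("L1", "blog"):
            # Level 1 pages and blog posts should link up to a VIP page.
            if not any(pages[t]["level"] == "VIP" for t in page["links"]):
                issues.append(name + ": no link to a VIP page")
            # Articles of 500-1000 words should carry one or two internal links.
            if 500 <= page.get("words", 0) <= 1000 and not 1 <= len(page["links"]) <= 2:
                issues.append(name + ": expected 1-2 internal links")
        elif page["level"] == "VIP":
            # VIP pages should never link down into the blog.
            if any(pages[t]["level"] == "blog" for t in page["links"]):
                issues.append(name + ": VIP page links into the blog")
    return issues

print(check_internal_links(pages))  # [] - the sample structure follows the rules
```

A real implementation would build the `pages` structure from a crawl of the site rather than by hand.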
3.1.3 Importance of Responsive Web Design (RWD)
In this new landscape, where 65% of digital media time is spent on mobile devices and the laptop has become a secondary element, it is crucial to take Responsive Web Design (RWD) into consideration. (25)
Implementing RWD means making websites adapt to any screen size and device environment in order to maximize usability. (26) Taking this into account will significantly increase the traffic to the site.
25 Sterling, Greg. “All Digital Growth Now Coming from Mobile Usage -- ComScore.” Marketing Land. N.p., 03 Apr. 2016.
Web. 2 May 2016. <http://marketingland.com/digital-growth-now-coming-mobile-usage-comscore-171505>.
26 Callan, Ylia. SEO 2016 - UX Design Guide. N.p.: n.p., 2016. Print.
3.2 Off-Site SEO
3.2.1 Linking Strategy
Getting external links means that other sites refer to our site: at some point in their content, they include a link to our site in the form of clickable text (anchor text) or a clickable image. These links coming from outside are called inbound links, and Google sees each of them as a vote from another website for ours, which improves our position in the ranking results.
On the other hand, we can also link to other websites in order to give more information to our users and thereby increase the value of our content. This kind of external link is called an outbound link.
a) Inbound Links
Getting this kind of external referral is one of the main tasks an SEO should take on once the on-site SEO is accomplished. Referral links from relevant sites give weight to our site, and thus improve our ranking position. There are many ways of getting these inbound links. The best way is to earn them naturally, meaning that other sites refer to us without our taking any part in the procedure, simply because our content is interesting enough to be worth referring to.
However, getting links the natural way is not as easy as it sounds, given the huge amount of content that already exists on the Web. Hence, it makes sense to find other ways of making other sites link to ours. (27)
27 Fishkin, Rand. "Targeted Link Building in 2016 - Whiteboard Friday." Moz. N.p., n.d. Web. 16 May 2016. <https://moz.
The most common way is one-to-one outreach via email marketing. The way to do it is to find other sites with related content, figure out why they would be interested in linking to our site, and then do whatever covers that need. Linking out is not as easy as it sounds, as we will see in the next section, so we should offer the other site a good reason to link to us. One such reason can be that we have good research that is missing from one of their posts, so a link to us would add extra value to it. Another reason could be that they lack some kind of content; in that case we could write a blog post for them, and they would credit us in the post as its creators.
Other reasons for them to link to us can be that they are currently linking to a site that is dead (broken link building) or that is losing its domain rating, in which case they might prefer to point that link at our site instead.
There are many ways to make a link to our site worthwhile, so we should find our own depending on the kind of content we are dealing with. But are all sites worth getting a link from?
These are the characteristics a targeted site should fulfil:
- Relevancy of the content between the targeted site and ours.
- The amount of inbound links the targeted site has (the higher, the better).
- The popularity of the specific page of the targeted site that we get the link from.
- The domain rating of the targeted site.
- The anchor text used for the link (it should be closely related to our content). (28)
Once we find the targeted sites and the reason why they should link to us, it is time to think about how to approach them. They can be approached through email, Twitter, Facebook Messenger or even a phone call. What matters is how we sell them the significance of having a link to our site.
The best way to succeed at this task is to execute, learn and iterate: the more we try, the more we learn. There is no single recipe, because there is no single type of business, content or person to reach out to. Therefore, we should not be scared of trying many forms of contact until we succeed.
28 “External Links.” - SEO Best Practices. MOZ, n.d. Web. 16 May 2016. <https://moz.com/learn/seo/external-link>.
The other interesting and smart way of finding websites that could be interested in linking to our site is competitor link building. It consists of looking at who links to our competitors, what our competitors offer them, and trying to improve the deal.
In order to know who is linking to our competitors, we should first find those competitors and then examine their inbound links. First, we should type one of our keywords into Google's search box and check who ranks above us in the SERPs. We should do this with each of our keywords, because our ranking position differs per keyword.
Once we know who ranks better than us, we can copy each competitor's URL, paste it into tools like Moz, Ahrefs or SEMrush, and find out who is linking to them.
Only once we have gathered all this information can we reach out to the sites linking to our competitors and offer them a reason to link to us.
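The competitor link-building steps above can be sketched in a few lines. This assumes the backlink lists have already been exported from a tool such as Ahrefs or SEMrush; all domain names below are invented.

```python
# Hypothetical backlink exports: which referring domains link to each site.
our_backlinks = {"blog-a.example", "news-b.example"}
competitor_backlinks = {
    "competitor-1.example": {"blog-a.example", "mag-c.example", "forum-d.example"},
    "competitor-2.example": {"mag-c.example", "news-b.example"},
}

# Domains that link to a competitor but not (yet) to us are outreach candidates.
candidates = {}
for competitor, refs in competitor_backlinks.items():
    for domain in refs - our_backlinks:
        candidates.setdefault(domain, []).append(competitor)

# Domains linking to several competitors are the most promising targets.
for domain, comps in sorted(candidates.items(), key=lambda kv: -len(kv[1])):
    print(domain, "is an outreach candidate; it links to:", comps)
```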
Importance of Deep Linking
A common SEO practice is to ask external sites to point their link at our home page, but this should not be overused.
Inbound links pointing only at the home page look spammy to the spiders and bring page rating to just one of our pages instead of the whole site. Moreover, when all links go to the home page we rank for only one of our keywords instead of all of them, losing better chances to rank and to improve the site's average position in the SERPs. Last but not least, when we receive links to our blog posts, it means that external sites are interested in our content even when it is old, which sends Google a good signal of quality content.
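As a quick illustration, the share of deep links can be measured from a list of inbound link targets (the targets below are invented):

```python
# Hypothetical inbound links: the page on our site each external link points to.
inbound_targets = ["/", "/", "/paid-surveys", "/blog/survey-tips", "/", "/paid-surveys-guide"]

home_links = sum(1 for t in inbound_targets if t == "/")
deep_share = 1 - home_links / len(inbound_targets)

print(f"{deep_share:.0%} of inbound links are deep links")  # 50% of inbound links are deep links
```

A low deep-link share would suggest the link profile is too concentrated on the home page.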
b) Outbound Links
These are the links going from our site to other websites. It is really important to pay attention to them and not link out lightly. The only safe way of linking out is to link to high-authority sites with really relevant content or statistics that we lack, thereby adding extra value to our own content.
A big mistake is to link out to a site that links to one of our competitors, which can easily happen since we all belong to the same field of knowledge. Therefore, linking only to governmental sites, statistics sites and high-authority sites (which are unlikely to link out to any of our direct competitors) is the safest practice. These links can increase our domain rating and also the quality of our content.
3.2.2 Significance of Social Media Presence
Content is king, but a king has no power if there is no one to follow him. So where can we find all these followers together on a single platform?
Social media platforms are the new era of communication: huge social networks where the majority of our audience is already engaged, and therefore the best place to promote our site. Some of these platforms already offer paid advertisement for business promotion. Moreover, it is really easy to find a specific target thanks to the different groups, categories and interests shown publicly on these platforms.
(Figure 6) World’s key statistical indicators (29)
29 Chaffey, Dave. “Global Social Media Statistics Summary 2016.” Smart Insights. N.p., 21 Apr. 2016. Web. 6 May 2016. <http://www.
Figure 6 shows the remarkable percentage of people using the Internet and, even more interesting, the large number of people active on social media platforms.
In order to make these platforms a useful tool for promoting a site, it is necessary to know which ones have the most users, which kind of target participates in them, and which kind of promotion we are able to launch there.
Right now, Facebook is the biggest social media platform in the world, with more than 1.59 billion active users and an estimated 1 million businesses using it for promotion. It is the best way to connect a site worldwide.
The second biggest network is Google Plus with 418 million active users; its fast growth, thanks to the Gmail connection, has made it a relevant platform to be present on.
LinkedIn follows with 400 million active users, being the biggest professional network worldwide. It is really useful for connecting with related professionals or finding employers or employees.
Then comes Twitter, with its 320 million active users who do not mind typing only 140 characters (recently with the option to share pictures and videos). It is the perfect tool for interacting with users.
As the biggest video social platform, YouTube has reached 1 billion visitors per month.
Not less important, the biggest visual social media platform is Instagram with 400 million active users, followed by Pinterest with 100 million. (30)
The social media platforms mentioned above are the most successful and popular ones, and also the most convenient for launching a site promotion. Nevertheless, many more social networks are growing steadily and becoming part of this big worldwide network called the Internet.
To conclude, the empirical opinion of a successful Spanish SEO professional confirms the importance of having a presence on social media platforms. Marco Antonio, a professional SEO at Altamira Web (Spain), says that Facebook, Twitter, LinkedIn and Google Plus are now a source of traffic that businesses cannot and should not waste, because they are places where many potential customers spend time reading content and interacting.
30 Maina, Antoni. “20 Popular Social Media Sites Right Now.” Http://smallbiztrends.com. N.p., 4 May 2016. Web. 6 May 2016. <http://
Marco Antonio also says that his own trials and a professional experiment (31) show that Google does not care exactly about a site's presence on a social network or the amount of interaction happening there, but it does care about the shareability of the web content and the traffic it brings to the site. He recommends using social media as a tool to reach more users easily and make the website's content more visible. If the content is of high quality, there are better chances of it going viral and reaching more people than would arrive directly from the SE.
Indeed, the significance of a website's presence on social media platforms is so abstract and yet so relevant that I found it interesting to ask for more empirical evidence from Jens Ulrik, partner at Overskrift, a digital company that offers monitoring of Danish websites. I thought his long and deep experience in the field would give a good perspective on the matter. To the question of to what extent he thinks it is relevant for a website to have a presence on social media platforms, he answers: "I don't see SERP ranking for a website, as a goal in itself. The reason for having an online presence, be it a website or social media engagement, will usually be to engage users, increase knowledge of your brand and eventually convert leads to paying customers. (...) The end goal is to drive engagement and lead conversion. One way is by writing attractive content on your website or any channel that fits your brand and target group, but the way we work with it at Overskrift, they are just channels - nothing more." He also adds that SEO efforts and social media sharing efforts should be equal and should support each other. (Appendix 2)
Therefore, a presence on social media platforms will connect our site with more users and increase their awareness of it; this will bring more traffic and thereby affect the SERP rankings indirectly.
A relevant factor to pay attention to is that Facebook is Google Plus's biggest competitor where social media is concerned. Therefore, Google will never let Facebook take such a big part in its World Wide Web's big game.
Moreover, if SEO techniques carried out on Facebook actually affected a site's Google ranking position, all SEOs would spend most of their efforts improving their Facebook position instead of continuing to work on the quality, trustworthy content that Google's algorithm is fighting so hard for.
31 Chuiso. “¿Las Redes Sociales Posicionan? – Experimento SEO .” Http://chuiso.com. N.p., 25 Apr. 2016. Web. 8 May 2016.
3.2.3 Competitor Analysis
As in any other business, monitoring what your closest competitors do can make the difference between appearing on Google's first SERP and on the third one instead. There are different ways and tools for looking into our competitors' websites and the reasons behind their success.
The first step is to know who our real competitors are. Here it is important to take into consideration that traffic and conversions are not the same: even if a competitor gets far more traffic, our site is still doing well as long as it keeps getting more conversions.
A great paid tool that helps with competitor analysis is SEMrush. Given a site's URL as input, it returns detailed information about competitors. The first step is to enter our own site's URL and obtain a list of our competitors. Each competitor's URL can then be entered into SEMrush, which returns really useful information about them, such as:
- The keywords used for their organic optimization
- The keywords used for their paid advertisement (in case they have it)
- Where their traffic comes from
- The percentage of organic traffic
- The percentage of paid traffic
- The amount of branded searches (how many users actually type their brand into the search box)
- The amount of inbound links and the quality of the anchor text used for each reference (specifying which of those are follow or nofollow links)
- Their main competitors and where the site is positioned among those competitors
- The exact Google AdWords paid advertising they are running at the moment
In order to reach our competitors' positions in the SERPs, or even overtake them, it is important to take their methods into consideration and decide whether to follow or improve on them.
3.3 Track Results
3.3.1 Digital Listening
It has been shown previously that a presence on social media platforms is beneficial for a site, but such platforms can be even more useful.
Jens Ulrik Lange from Overskrift presented his company's work in a really interesting session at school. What Overskrift does is monitor the Web and "transform huge chunks of data into information and knowledge" (32). They offer this service to websites that want to track closely everything happening on the Internet in relation to their business.
Jens Ulrik explained to us the importance of this interesting work, which they call "digital listening". It consists of keeping track of people's comments on platforms like Twitter, Facebook, Instagram, blogs, discussion forums, websites, Trustpilot and the like.
This monitoring allows a business to know what users are saying about it and the tone of voice they use, to see what competitors are doing and how users respond to it, to interact with users and thereby increase trust, to measure the results of different launched campaigns, to identify people's discourse and new trends and, most importantly, to find negative conversations that need damage control.
Digital listening requires meticulous and dedicated work in order to stay aware of every relevant affair going on on the Web.
32 “Social Listening På Dansk Overskrift - Monitorering Af Online Og Sociale Medier.” Overskrift. N.p., n.d. Web. 27 July 2016. <http://
4. Conclusion
Running a business nowadays requires an online presence, for both marketing and selling reasons. If the business is a physical store, it is smart to increase the product's sales by starting an online store, which requires a small investment and can bring large benefits. Moreover, the online presence of a business increases awareness and brings more customers without spending big amounts of money on advertisement. A good position in Google's SERPs brings organic traffic and becomes the better and cheaper way of being found.
Therefore, in this new digital scenario, where businesses are steadily gaining online presence and users are increasingly interested in making their purchases online, the chances of obtaining a relevant position within Google's SERPs, and therefore of being easily found by our target, are quickly decreasing. Hence, knowing the newest SEO techniques and user trends, in order to find our way to the top of Google's SERPs, is crucial for reaching success today.
Nevertheless, knowing the best SEO practices is not a matter of acquiring knowledge once, as in other professions. To succeed at this job it is necessary to know how Google works, but also to stay aware of every new user trend or Google update. Thus, this is a never-ending apprenticeship that should be combined with writing, marketing and user research skills in order to offer the best quality website both for Google's spiders and for users.
In making this SEO best-practices handbook, I considered it relevant to first
give a better understanding of how Google functions, and to continue by
explaining the incessant updates it makes in response to constantly changing
user behaviour. The fact that Google's algorithm is getting ever closer to
understanding human search intent, and to responding with the proper results,
makes this job a great and interesting challenge.
1 Samsung Galaxy. “From Smoke Signals to Smartphones: The Evolution of Communication.” Mashable. N.p., 05 Dec. 2014. Web. 10 Apr. 2016. <http://mashable.
2 “Dial-up Internet Access.” Wikipedia. Wikimedia Foundation, n.d. Web. 16 May
3 “Internet Live Stats - Internet Usage & Social Media Statistics.” Internet Live Stats. N.p., n.d. Web. 24 May 2016.
4 “Global Search Engine Market Share 2016 | Statistic.” Statista. N.p., n.d. Web. 10 Apr. 2016. <http://www.statista.com/statistics/216573/worldwide-market-sha-
5 “Our History in Depth – Company – Google.” Google. N.p., n.d. Web. 22 May 2016. <https://www.google.com/about/company/
6 “The Anatomy of a Large-Scale Hypertextual Web Search Engine.” The Anatomy of a Search Engine. N.p., n.d. Web. 22 May 2016. <http://infolab.stanford.
7 GoogleWebmasterHelp. “How Does Google Search Work?” YouTube. YouTube, 23 Apr. 2012. Web. 10 Apr. 2016. <https://www.youtube.com/watch?v=KyCY-
8 Enge, Eric, Stephan M. Spencer, and Jessie Stricchiola. “2. Search Engine Basics.” The Art of SEO: Mastering Search Engine Optimization. 3rd ed. N.p.: O’Reilly, 2015. N. pag. Print.
9 “Algorithms – Inside Search – Google.” Google. N.p., n.d. Web. 22 May 2016. <https://www.google.com/insidesearch/howsear-
10 Sullivan, Danny. “FAQ: All About The New Google RankBrain Algorithm.” Search Engine Land. N.p., 27 Oct. 2015. Web. 20 Mar. 2016. <http://searchengineland.com
11 Sullivan, Danny. “FAQ: All About The New Google ‘Hummingbird’ Algorithm.” Search Engine Land. N.p., 26 Sept. 2013. Web. 22 May 2016. <http://searchen-
12 “The Difference Between On-Site vs. Off-Site SEO.” Inbound Now. N.p., 29 Oct. 2012. Web. 7 May 2016. <https://www.inboundnow.com/on-site-vs-off-site-
13 Nielsen, Jakob. “F-Shaped Pattern For Reading Web Content.” Nielsen Norman Group. N.p., 17 Apr. 2006. Web. 23 May 2016. <https://www.nngroup.com/
14 Nanji, Ayaz. “Eye-Tracking Study: How Users View Google Search Result Pages.” MarketingProfs. N.p., 6 Oct. 2014. Web. 24 May 2016. <http://
15 Birkett, Alex. “F-Patterns No More: How People View Google & Bing Search Results.” ConversionXL. N.p., 02 Mar. 2016. Web. 25 May 2016. <http://conver-
16 “Bounce Rate.” Google Analytics Help. N.p., n.d. Web. 25 Aug. 2016. <ht-
17 Patel, Neil. “How User Behavior Affects SEO.” Neilpatel.com. N.p., 18 Oct. 2015. Web. 6 Apr. 2016. <http://neilpatel.com/2015/10/18/the-advanced-gui-
18 “Google Organic CTR History.” Advancedwebranking.com. N.p., 5 June 2016. Web. 5 June 2016. <https://www.advancedwebranking.com/ctrstudy/
19 Fishkin, Rand. “A Step-by-Step Process for Discovering and Prioritizing the Best Keywords - Whiteboard Friday.” Moz. N.p., 6 May 2016. Web. 25 May 2016.
20 McDonald, Jason. SEO Fitness Workbook: The Seven Steps to Search Engine Optimization Success on Google. N.p.: n.p., n.d. Page 80. Print.
21 Zomick, Brad. “Everything You Need To Know About The Inverted Pyramid Writing Style.” Skilledup.com. N.p., 20 May 2013. Web. 6 May 2016.
22 Singhal, Amit. “More Guidance on Building High-quality Sites.” Official Google Webmaster Central Blog. N.p., 6 May 2011. Web. 22 May 2016. <https://web-
23 Fishkin, Rand. “On-Page SEO in 2016: The 8 Principles for Success - Whiteboard Friday.” Moz. N.p., 13 May 2016. Web. 20 May 2016. <https://moz.com/
24 Patel, Neil. “The Seven Commandments of Internal Linking That Will Improve Content Marketing SEO.” N.p., 14 June 2014. Web. 22 May 2016. <https://blog.
25 Sterling, Greg. “All Digital Growth Now Coming from Mobile Usage -- ComScore.” Marketing Land. N.p., 03 Apr. 2016. Web. 2 May 2016. <http://marketingland.
26 Callan, Ylia. SEO 2016 - UX Design Guide. N.p.: n.p., 2016. Print.
27 Fishkin, Rand. “Targeted Link Building in 2016 - Whiteboard Friday.” Moz. N.p., n.d. Web. 16 May 2016. <https://moz.com/blog/targeted-link-buil-
28 “External Links - SEO Best Practices.” Moz. N.p., n.d. Web. 16 May 2016. <ht-
29 Chaffey, Dave. “Global Social Media Statistics Summary 2016.” Smart Insights. N.p., 21 Apr. 2016. Web. 6 May 2016. <http://www.smartinsights.com/so-
30 Maina, Antoni. “20 Popular Social Media Sites Right Now.” Smallbiztrends.com. N.p., 4 May 2016. Web. 6 May 2016. <http://smallbiztrends.com/2016/05/
31 Chuiso. “¿Las Redes Sociales Posicionan? – Experimento SEO” [“Do Social Networks Help Rankings? – An SEO Experiment”]. Chuiso.com. N.p., 25 Apr. 2016. Web. 8 May 2016. <http://chuiso.com/redes-so-
32 “Social Listening På Dansk Overskrift - Monitorering Af Online Og Sociale Medier” [“Social Listening in Danish - Monitoring of Online and Social Media”]. Overskrift. N.p., n.d. Web. 27 July 2016. <http://www.overskrift.dk/products/
6. Glossary of Terms
SE: Search Engine
SEO: Search Engine Optimization. Also used to refer to SEO professionals.
Webmaster: SEO professional.
SERP: Search Engine Result Page
CTR: Click-Through Rate
Spider: Google’s crawler
URL: Uniform Resource Locator. An address that refers to a resource on the Internet.
VIP Content Page: Main pages of a website, sitting directly under the “Home
page”. Normally categories; they enclose the main content of the site.
Level 1 Pages: Subcategory pages. Level 1 page belongs under a VIP content
page and follows its same topic.
Level 2 Pages: Belong under Level 1 pages and treat related topics.
Anchor Text: Clickable hypertext, used for linking.
Inbound Link: It is a link that brings users from an external website.
Outbound Link: It is a link redirecting users to an external website.
Organic Traffic: Users arriving at a website in a natural way (not from an advertisement).
Organic Results: The non-paid results that Google shows on its SERPs.
RWD: Responsive Web Design; a website that works properly on smaller
devices such as mobile phones and tablets.
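Two of the metrics defined above, CTR and bounce rate, are simple ratios. The short Python sketch below shows how each is computed; the figures are invented for illustration, while real values would come from tools such as Google Search Console (clicks and impressions) and Google Analytics (sessions):

```python
# Illustrative computation of two glossary metrics with invented numbers.

def ctr(clicks, impressions):
    """Click-Through Rate: share of impressions that led to a click."""
    return clicks / impressions

def bounce_rate(single_page_sessions, total_sessions):
    """Share of sessions that viewed only one page before leaving."""
    return single_page_sessions / total_sessions

print(f"CTR: {ctr(120, 3000):.1%}")                  # → CTR: 4.0%
print(f"Bounce rate: {bounce_rate(450, 1000):.1%}")  # → Bounce rate: 45.0%
```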