INTRO TO SEO IN 20 MINUTES
FOUNDER & CEO, INTELLIFLUENCE
WHO AM I?
Involved in search industry since
Founder & CEO of Intellifluence
Have managed thousands of
I’ve been fortunate enough to
play in all facets of online
marketing, providing a broad
The stated description of this talk is to “Learn everything
from keyword research to formatting content for featured
snippets so that you can learn how to optimize your website,
blog, and even your video streams on YouTube.”
I’ll try to save you some time with simple recommendations
on how best to accomplish this, but will focus more on what I
believe is important to understand at an SEO 101 level.
Styled after Johnny Carson’s rules for things to avoid for a
WHAT IS THE DESCRIPTION OF THIS TALK?
WHAT IS YOUR TAKEAWAY?
SEO can be perceived as overly complex because it is possible to
deep dive into various areas in ways that require a significant
amount of research and experience to fully appreciate. However, I
am a firm believer in focusing on a few core elements in order to
maximize your efforts in the event that you aren’t looking to
actually make a career out of SEO and instead want to use it as a
tool when running your business.
I bucket SEO into three areas: links, content, and user signals.
Positive SEO under this broader view would be any tactic
performed with the intent to positively impact rankings for
a URL and possibly its host domain by manipulating a
variable within the links, content, or user signals buckets.
As an inverse then, negative SEO would be any tactic
performed with the intent to negatively impact rankings
for a URL and possibly its host domain by manipulating
a variable within the links, content, or user signals buckets.
SO LET’S TALK NEGATIVE SEO
WHAT ARE THE THREE BUCKETS?
Remember: Links, Content, and User Signals:
WHAT MIGHT YOU NEED?
1. A browser with access to Google
and Bing [content]
2. Access to your raw weblogs
[content & user signals]
3. Google Analytics [content & user signals]
4. Google Search Console
[content, links, & user signals]
5. Bing Webmaster Tools [content,
links, & user signals]
6. Ahrefs [links]
7. Sitebulb [content & user signals]
8. Copyscape [content]
You can use alternative tools when
evaluating links, content, and an
approximation of user signals; this is just
a snapshot of the publicly available tools
currently on offer.
Every practitioner is also probably going
to have their own homegrown tools as
well, but it doesn't make sense to
mention those given their lack of broad
availability.
DISCLAIMER ON TOOLS
Bad news: there is no such thing as being
More bad news: there also is no such
thing as being negative SEO-proof.
All you can reasonably do is make the
appropriate efforts to lessen the
probability of becoming a victim by
reducing attack vectors, so that someone
seeking to do harm has to be more
sophisticated and put forth more effort
than they would against the average site.
HOW TO BE PROACTIVE AND PREVENT A
NEGATIVE SEO CAMPAIGN AGAINST YOU
Scrape scrape scrape
CONTENT (AND INFRASTRUCTURE) AS AN ATTACK VECTOR
What can your host do to keep you out of trouble? Quite a bit.
I debated placing hosting as a user signals vector given how important a proper setup is for
uptime and site speed, but there's another critical factor at play with this specific attack vector.
If you were to address 100% of the issues in this presentation, yet happen to be on a shared IP
with a dozen other domains which are flagged for distributing malware, are blocked by email
spam detection services, or are subject to manual link actions from Google, you're in for a bad time.
Bad neighborhoods are a thing.
You will at a minimum want to ensure you have a dedicated IP for the domain you care about,
but ideally have the site on its own dedicated server. The other advantage of not sharing a
server at your host is that it removes an attack vector: a bad actor can no longer gain access to
your hosting setup by compromising a less security-minded domain on the same server.
Not all content management systems (CMS) are made equal.
Some will auto-spawn tag pages, archive pages, and separate image pages when
you attempt to create a single page; some will automatically include a dofollow commenting
section on posts, which screams "spam me!" to all my favorite spam tools.
Since the majority of the world's websites run on WordPress, it is worth speaking to it with
specificity: by default, disable comments, noindex tag pages, noindex author archive pages,
and probably noindex category pages. Some will disagree, but in a Google that contains the
Panda algorithmic penalty, my focus is on attempting to index high value pages only, a hurdle
tag pages, archives, and category pages rarely clear.
Also with certain CMSes it is important to ensure proper canonicalization is used to keep
duplicated content from being indexed due to pagination and other query string nonsense.
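As a sketch, a canonical tag on a paginated or query-string variant might look like the following; example.com is a placeholder for your own domain:

```html
<!-- Placed in the <head> of /widgets/?page=2 and similar variants,
     pointing search engines at the preferred URL -->
<link rel="canonical" href="https://www.example.com/widgets/" />
```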
Robots.txt manipulation is a double-edged sword, and not just because it is very common to
find a mistake which results in an entire domain being deindexed.
When crawling rules are too tight, it is possible to pretend a bad content result exists in a
somewhat expected path and have Google rank that nonexistent page on the basis of a
domain's inherent authority and keywords used in the URL slug alone, since Google is prevented
from actually crawling the page and therefore has to trust that it "might" exist.
One of the biggest risk reductions comes in the form of disallowing search pages from
becoming crawled and indexed. Without knowing which CMS you use, here's some generic
advice for you to pick and choose from:
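As a sketch, generic robots.txt rules that keep internal search result pages from being crawled might look like this; the paths and query parameters are common defaults and may differ for your CMS:

```
# Keep internal search result pages out of the crawl
User-agent: *
Disallow: /search/
Disallow: /*?s=
Disallow: /*?q=
```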
Proper robots.txt setup isn't just for keeping poor quality pages out of the index.
When managing crawl budget, it can also be important to block preview pages and
permalink-fail pages so that Google doesn't waste time getting caught in a spider
trap. To do that in WordPress is relatively easy:
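A minimal WordPress-flavored robots.txt sketch, assuming the default search and preview query parameters:

```
User-agent: *
# Keep internal search results out of the crawl
Disallow: /?s=
Disallow: /search/
# Keep draft previews from wasting crawl budget
Disallow: /*?preview=true
```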
I'm not going to suggest that you take a stance on scraping content as a means to
You'll need to be proactive in using a content protection service to ensure that your
images and writing are not used elsewhere on the web without your authorization.
While Google is a bit better now at attributing source material, there still exist
issues with authoritative domains being used as parasitic hosts, where the attacker
will purposefully and continuously crawl a target domain by sniffing its sitemap,
posting the new content on the parasitic host within seconds of it going live on the
target.
Using a service to find these content thieves is a must. Depending on where the
content exists, you may need to issue a takedown request; I've compiled some of
the relevant addresses here: http://www.digitalheretix.com/blog/how-to-remove-
Outbound links via UGC
Outbound links injected
LINKS AS AN ATTACK VECTOR
As stated in the section on CMSes, I'm not a fan of open comments because they
seem to be more often abused than used correctly, but what about other sources of
UGC? If you add a community/forum section on your site for members to interact, I
recommend doing one of four things:
a. Nofollow all external links
b. Force all external links to redirect through an internal page to strip
outbound link equity
c. Noindex all threads
d. Pre-moderate all external links
Forums/communities that don't follow one of the above approaches likely appear
on several Xrumer and GSA lists.
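Option (a) above, as an HTML sketch; the URL is a placeholder, and Google also recognizes the more specific rel="ugc" value for user-generated links:

```html
<!-- A user-submitted external link marked so it passes no link equity -->
<a href="https://external.example.net/" rel="nofollow ugc">member link</a>
```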
OUTBOUND LINKS VIA UGC
This is a trickier issue to be proactive about,
because by definition you're being reactive.
However, monitoring Google Search Console
for outbound links and scoring those links
with a risk assessment tool of your choice
such as Link Research Tools is going to be the
best option here.
Another method involves a consistent
crawling script with multiple user agents
(Google and not Google) to determine if any
links or content exist that should not; this is
essentially handled by reverse engineering
cloaking software to attempt to decloak it.
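A minimal sketch of that idea in Python, assuming you compare responses fetched as Googlebot versus a normal browser. The user-agent strings and URL are illustrative; dynamic pages will produce false positives, so a real tool would diff extracted link sets rather than raw bytes:

```python
import hashlib
import urllib.request

GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")
BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"

def fetch(url, user_agent):
    """Fetch a URL while presenting a specific User-Agent header."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read()

def looks_cloaked(body_as_bot, body_as_browser):
    """Naive check: differing content for different user agents can
    signal cloaking or injected links (or simply a dynamic page)."""
    return (hashlib.sha256(body_as_bot).hexdigest()
            != hashlib.sha256(body_as_browser).hexdigest())

# Usage sketch (live network call, so left commented out):
# suspicious = looks_cloaked(fetch("https://www.example.com/", GOOGLEBOT_UA),
#                            fetch("https://www.example.com/", BROWSER_UA))
```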
OUTBOUND LINKS INJECTED
Inbound links are far more likely to be your problem. There are only a few things you
can reasonably expect to accomplish to try and protect yourself:
a. As an overall percentage of your incoming links, you want as many good
ones as possible, which means getting links for yourself is part of your
proactive strategy - Yes, I know, that's obvious; it's a lot like Google telling
you to just make great content. The truth isn't far from that though -- if you
are consistently focused on producing the best content assets as they pertain
to your niche and can do so in a way that your content answers a lot of
questions users of sites in your niche might have, you'll consistently earn links.
If you have only a few decent links and a bad actor decides to point a few
hundred thousand very bad links at you, Google will almost certainly treat you
unfavorably. The more uneconomical you can make that attack by increasing
your beneficial links, the better.
b. Watch your anchor text - One easy filter to trip is still the over-optimization of
anchor text, so even if you're attracting great links, be sure to do so in a manner
that isn't forcing your audience into a narrow set of phrases you wish to rank for. If
you do see your anchor text starting to get too concentrated, be on the specific
lookout for a bad actor that's trying to slowly trip the filter by ramping up a more
aggressive set of exact-match anchors.
c. Disavow - I've gone on record that I don't like that disavow even exists, as I feel
it is indicative of a guilty until proven innocent environment within Google, but
since it does exist, you'll want to proactively disavow based on your risk scoring
solution. Remember, it is not just the overseas counterfeit, porn, and gambling
emporium links that you'll need to address; also include those that appear to
be part of any more nuanced attack.
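A sketch of the disavow file format Google accepts via Search Console; the domains and URL below are placeholders:

```
# Spam network flagged by our risk scoring tool
domain:spam-example.net
domain:casino-example.org
# A single bad page rather than a whole domain
http://bad-example.info/links.html
```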
INBOUND LINKS (CONT.)
There's really only a couple factors that can come into play for you
to be aware of, and one of them there isn't much you can do
USER SIGNALS AS AN ATTACK VECTOR
CTR, time on site, and bounce metrics are consistently being folded in over time as
more trusted signals by Google.
Knowing your baseline stats in Google Search Console and Google Analytics is
important here because it is nominally easy to hire a botnet and a few thousand
micro workers to click a result, bounce within 1 second, and have a portion of them
file a suggestion that the domain they visited wasn't a quality site.
All you can really hope to do is notice strange trends and then attempt to
compensate; if it is an obvious botnet, block it at the server or CDN level.
If it is a bunch of incentivized users, however, all you can really hope to do is handle
the situation like you would your inbound links, by aiming to provide a satisfactory
experience and acquiring traffic that you know will offset the poor metrics.
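For the obvious-botnet case, blocking at the server level might look like this Apache 2.4 sketch; the IP range is a documentation placeholder, so substitute the ranges you see misbehaving in your own logs:

```
# .htaccess / vhost: deny a suspected botnet range outright
<RequireAll>
    Require all granted
    Require not ip 203.0.113.0/24
</RequireAll>
```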
Earlier I alluded to including server setup as a speed
consideration for negative SEO; want to prevent a
potentially slow site being used against you? Don't host it
on a shaky setup.
If possible, consider using a CDN to protect yourself
somewhat from DDoS attacks, and ensure that your own
server environment is up to date to prevent zero-day issues
and attacks such as UDP amplification and Slowloris
(RSnake's fault); those are particularly nasty.
Beyond that, you'll want to look into any way an individual
could leech bandwidth off of you: lock down inline linking
of your images at the server level, remove unused
CMS plugins, and establish proper caching.
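A sketch of image hotlink protection in Apache, assuming mod_rewrite is available; example.com is a placeholder for your own domain:

```
RewriteEngine On
# Allow empty referers (direct requests, some privacy tools)
RewriteCond %{HTTP_REFERER} !^$
# Allow requests referred from your own site
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
# Refuse image requests from everyone else
RewriteRule \.(jpe?g|png|gif|webp)$ - [F,NC]
```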
Malware as a user signal? Absolutely, though you could
argue this is more of a content issue, and I'll agree with you.
Nothing contributes to a poor experience quite like getting
served malware.
To prevent such situations, in addition to keeping up-to-
date on your hosting environment and CMS security
updates, it is healthy to periodically run a malware scanner
on your site's server to seek out and remove infections. The
sooner you can find problems, the better; thankfully, Google
is pretty forgiving about addressing known malware
compromises, but they don't catch everything and will fold
in poor user data as normal usage when they miss it.
Let’s quickly give you the answers to the topical guide.
Want content properly formatted for featured snippets? Purchase Yoast Premium or use Rank
Math and follow the plugin steps for simple content optimization. There’s no need to overthink
it, as the schema does change over time and you can let the plugin manage that.
Want to know what content to create in the first place? Perform a gap analysis in Excel by
collecting paid search data from SpyFu and link data from Ahrefs. By combining these two
sources you can get an understanding of which content is the most valuable and which content
is the least link supported – pick the low-hanging fruit. Coming soon: KWjuicer.com will do all of
this research for you, and is owned by CopyPress, so they can also create the content for you.
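The same gap analysis can be sketched in Python instead of Excel; the CSV columns, values, and thresholds below are hypothetical stand-ins for SpyFu and Ahrefs exports:

```python
import csv
from io import StringIO

# Hypothetical exports: paid keyword value (SpyFu-style) and
# referring link counts (Ahrefs-style).
paid_csv = """keyword,paid_value
best widgets,1200
widget reviews,800
cheap widgets,150
"""

links_csv = """keyword,referring_links
best widgets,40
widget reviews,2
cheap widgets,1
"""

def gap_analysis(paid_data, links_data, min_value=500, max_links=10):
    """Return keywords with high paid value but little link support
    (the low-hanging fruit)."""
    paid = {r["keyword"]: int(r["paid_value"])
            for r in csv.DictReader(StringIO(paid_data))}
    links = {r["keyword"]: int(r["referring_links"])
             for r in csv.DictReader(StringIO(links_data))}
    return [kw for kw, value in paid.items()
            if value >= min_value and links.get(kw, 0) <= max_links]

print(gap_analysis(paid_csv, links_csv))  # → ['widget reviews']
```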
Want to optimize your streams on YouTube? Make sure you provide a detailed title and
description w/ link to your site. Embed all your videos on your blog posts and get influencers to
share both the blog posts and your direct YouTube videos. I own the largest warm contact
influencer network in existence to help you. Enjoy a 60-day free trial on me by using the coupon
code AFFSUMMIT at Intellifluence.com.
OK, SO THAT WAS MAYBE MORE ABOUT
NEGATIVE SEO THAN AN INTRO TO SEO…
Remember, there are very few things you really need to think about in most
cases within SEO 101…
Only create content that uniquely satisfies a user’s query; don’t create
content for the sake of doing so. Make it the best possible answer for that
query.
Make sure that content is accessible to search engines and loads as quickly
as you can get it to load, with a very clear KPI of what you want the user to
do on that page.
Get traffic-bearing links to that content. Don’t obsess about nofollow;
obsess over whether that traffic source might convert on your stated page’s
KPI.
For SEO 101, anyone trying to force you into doing something further is
probably just trying to sell something.
Icons courtesy of flaticon.com (Freepik, Icon Monk, Icon Pond,
mynamepong, pongsakornRed, Roundicons, Smashicons, Nikita
Golubev, photo3idea_studio, Good Ware, phatplus, Prosymbols,
Gregor Cresnar and Dave Gandy). All product names, logos, and
brands are property of their respective owners.