December 19th, 2011                                                                                                Published by: jeffox4d




Ultimate Guide to
Google's Panda Update
Introduction to Google's Panda Update
First off, let me start by saying that because of Google, I HATE pandas. I don’t even eat at Panda Express anymore, for obvious reasons.
SEOs vs. Google’s Panda is a lot like snowmen vs. a volcano. But despite the lack of success stories, there is hope.
Since the first Panda update back in February 2011, there have been many articles written about Panda and what webmasters
should and shouldn’t do. Some have been very insightful and others pretty much worthless. After reading nearly every
article published about the Panda update, I would like to share with you the general consensus on Panda, as well as
some of my own personal findings. My goal here is to equip the webmasters and SEOs of the world with the right weapons
to slay the panda.
So let’s get right to it. Grab some coffee and take some notes because, ladies and gentlemen, it’s panda hunting season…

What is The Panda Update?
The name “Panda” comes from its creator, Navneet Panda, a Google software engineer. Simply put, the Panda algorithm attempts
to put every website into 1 of 2 categories: good or bad. Sounds easy, right? Not quite; there is a lot that goes into this seemingly simple task.
Essentially, here is how Google does it:

  1. Google starts with a sample of many different websites with various levels of quality.
  2. Google’s quality raters look at the websites individually and place them into 1 of 5 categories: Vital, Useful, Relevant,
     Slightly Relevant, and Off-Topic/Useless.
  3. Using machine learning, Google looks at what metrics the bad websites have in common.
  4. Google applies these metrics to all websites on the web. Those that share the same metrics as the bad websites get penalized.
     This usually happens once every 4-6 weeks.

Even though the Panda update increased the quality of Google’s search results, many quality websites were negatively affected.
Anytime a computer is given the task of thinking like a human, mistakes are going to be made.

Why Was Panda Created?
Panda was put in place to reduce spam in Google’s search results by specifically targeting “scraper” websites and other low
quality websites. A scraper website, sometimes known as an “autoblog,” is a blog which steals content from other websites and
publishes it automatically. Usually a scraper website will have ads or affiliate links in an attempt to make some quick cash.





These websites filled the web with thousands upon thousands of pages of duplicate content. Not only did this negatively impact
the search results, but occasionally the copied content would outrank the original source. As a result, Google received a lot
of pressure to clean up its search results, and the Panda update was born.


Why was my Site Penalized by Panda?
No one outside Google knows for sure what will get a website pandalized (panda + penalized). But using correlation
studies along with some tips from Google, we have a good idea of what the Panda is after. Below are possible factors that put
a website at risk of the Panda penalty.

   • Duplicate content between your website and other websites on the web
   • Duplicate content within a website
   • Poor visitor interaction

Duplicate Content Between Your Website and Others on the Web
In my personal experience, having duplicate content between your website and other websites is the quickest way to get
slapped with the Panda penalty. This has been a problem for nearly every Panda-affected website I have come across.
As mentioned above in the “Why Was Panda Created?” section, Panda was put in place to target websites that were stealing
content. So it is no surprise that having duplicate content will get your site penalized. Here are some examples of how this
duplicate content can come about:

   • You take content from other websites and post it on your own site to get some extra ad revenue. Panda comes along and
     sees hundreds or even thousands of pages of duplicate content. You get a much deserved panda penalty.
   • You are the ideal webmaster and spend hours writing all unique content. Other sites take your content and post it on their
     website. Google doesn’t know who the original source is and just sees that you have the same content as a bunch of other
     websites. You get penalized (yes this does happen) and punch a hole in the wall.
   • You have an eCommerce website with hundreds or thousands of products. To save time, you (along with all the other
     eCommerce sites) use the manufacturer’s provided product description. Google sees that you have hundreds or even
     thousands of pages of duplicate content. Your website gets penalized and your sales plummet.

Duplicate Content Within Your Website
Duplicate content within a website can also get your site penalized if you are not careful. This type of duplicate content usually
comes in two different forms:

  1. Multiple pages with identical, or nearly identical, content: All the content across your website should be unique. Copying and
     pasting content while only changing a few keywords can get you into trouble fast. To avoid a panda penalty, write unique
     content on all your pages.
  2. One page with multiple URLs: In Google’s eyes, every different URL is a different page. This means that to Google,
     www.opencart.com and www.opencart.com/index.php are two different pages with duplicate content. Make sure every
     page has only one URL.

Eliminating duplicate content within your website is good in general, not just for Panda. Search engines have trouble deciding
which page to index and rank when duplicate content is present. Having duplicate content caused by multiple URLs can divide
link juice between the different versions of the page and lower rankings.
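The usual fix for the multiple-URL problem is the rel=canonical tag, covered in detail later in this guide. As a minimal sketch, using the opencart.com example from above:

    <!-- Placed in the <head> of www.opencart.com/index.php -->
    <!-- Tells search engines that the root URL is the preferred version of this page -->
    <link rel="canonical" href="http://www.opencart.com/" />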

Poor Visitor Interaction
People usually don’t stay on a spammy website very long. In fact, I would bet that the majority of visitors to spammy websites
leave almost immediately. Why am I telling you this? Think about it: if all spammy websites share a high bounce rate, wouldn’t
Google want to factor this into its Panda algorithm? Yup! And that’s why many SEOs, including myself, believe that usage
metrics such as bounce rate can affect whether or not your website gets pandalized. But having a high bounce rate alone isn’t
what Google cares about. Let me explain with two examples where the user bounces from the website…

   • Good User Path: Joe wants to know when the next Olympics are. He goes to Google and searches “when is the next
     olympics”. He ends up at the Olympic Games page and quickly sees that the next Olympic Games are July 27 – August 12,
     2012. Joe is satisfied with his result and leaves the page.




   • Bad User Path: Mike wants to buy Call of Duty: Modern Warfare 3. He goes to Google and searches “buy call of duty
     modern warfare 3”. He ends up on a page at www.buycallofduty.info and realizes that this ad-filled page will not help him
     buy the game he wants. Mike goes back to the search results and clicks on a page from www.amazon.com. This time Mike
     is satisfied with his search result and goes on to buy the game.

The first example above shows that there are times when you can expect users to bounce from your website. These are
usually times when a user is looking for specific information such as a phone number for a business, the capital of a country, a food
recipe, etc.
On the other hand, the second example shows a user path that could indicate to Google that your website is crap. Sure, a few of
these types of visits won’t hurt you, but if this occurs on a regular basis you may be in some trouble.


How to Check if You Were Penalized
If you are already 100% positive you were mauled by the panda, you can skip this section and move on; otherwise, let me explain
how you can quickly figure out if you were affected. This step-by-step tutorial will tell you exactly how to diagnose your traffic
and figure out if you were bitten by the beast.

  1. Before we get started make sure you are using the new version of Google Analytics. If you are not sure what version you
     are using just refer to the following screenshots:



     Old Google Analytics




     New Google Analytics
  2. We need to set up an advanced segment to filter out all the irrelevant traffic, such as direct traffic, referral traffic, Bing
     traffic, etc. If you are new to analytics, don’t worry; this is easier than it sounds. Start off by opening the “Advanced
     Segments” drop-down menu at the top and selecting “+ New Custom Segment”.




  3. Once inside the “Advanced Segments” drop-down, we need to name the segment and start adding the necessary filters, aka
     “dimensions”. When all filters are properly added, your screen should look something like this:




     The first filter ensures we are only looking at traffic from Google, but since this could still contain Google AdWords traffic,
     we need to add another filter. The second filter makes sure all traffic is coming from organic search, aka not pay-
     per-click. Then we need to filter out any branded keywords; these are usually keywords containing your domain or brand
     name. Finally, we are ready to save our Advanced Segment and start looking at the traffic.
  4. Now we want to see all of the traffic history from just before the first Panda update up through yesterday. With the new
     version of Google Analytics, set the date range from 2/1/11 to yesterday’s date.







  5. Look for any noticeable drops in traffic and compare them to the dates of known panda updates. A typical drop in traffic
     from Panda is usually at least 25% and happens overnight.






Here is a list of all Panda updates thus far:

   • February 23, 2011
   • April 11, 2011
   • May 9, 2011
   • June 16, 2011
   • July 23, 2011
   • August 12, 2011
   • September 28, 2011
   • October 3, 2011
   • October 13, 2011
   • November 18, 2011
   • ~Early January, 2012

The example above shows a particular website that was hit by Panda on June 16, 2011. The sudden drop in traffic is typical of
a Panda penalty. If you have a significant traffic drop (>20%) on the same date as a Panda update, you were likely penalized.
But don’t worry! In the next section I will explain exactly what steps you can take to reduce and hopefully eliminate
the panda penalty.


Prevent and Reverse the Panda Penalty
Many times people think they got hit by the Panda update when they actually received a penalty for spammy links. Positive that
Google’s Panda mauled your traffic? Great! Let’s jump in and see what we can do to tame the beast.
Basically there are a few factors that result in a website getting slapped with the Panda penalty:

   • Duplicate content between your website and other websites on the web
   • Duplicate content within your website
   • Poor visitor interaction

Fixing each of these issues is easier said than done. But don’t worry, I’m going to hold your hand every step of the way and
together we can slay the Panda.

Remove Duplicate Content & Write Unique Content
First things first, make sure all the content on your website is unique. Copying and pasting content is the fastest way to get
pandalized. I hate using general tips like “write unique content” because it is so vague, but for many websites this is the only
option.




  1. Go to CopyScape.com and check for duplicate content – CopyScape is an amazing tool that searches the web for duplicate
     content. With the free version you can only check 1 URL at a time. If you have a larger site and a larger wallet, then I would
     recommend checking a batch of URLs with Copyscape Premium. Keep in mind that checking for duplicate content in
     batches will cost you $0.05 per URL.
  2. Noindex or rel=canonical all pages with duplicate content – Every page that CopyScape identifies as a duplicate of
     other pages on the web needs to go. There are many ways to do this, but the 2 easiest are noindexing the page or
     using the rel=canonical tag. The noindex code tells search engines not to index the page. The rel=canonical tag tells Google
     that the page is a duplicate of another page. A general rule of thumb is to canonicalize pages with backlinks and noindex the
     others. Why? Because the canonical tag passes link juice to whatever page you point it to. This may be a bit confusing,
     so the two examples below visualize the process.
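As a rough sketch of those two options (the example.com URL is a placeholder), both tags go in the page’s <head>:

    <!-- Option 1: noindex – asks search engines to drop this page from the index -->
    <meta name="robots" content="noindex" />

    <!-- Option 2: rel=canonical – points Google at the page you consider the original -->
    <link rel="canonical" href="http://www.example.com/original-article/" />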




Both the noindex and rel=canonical tags are placed in the <head> of your source code. Once these tags are added to your
duplicate pages, it will take anywhere from 1 day to 1 month before Google re-crawls the pages and removes them from the index.
The time it takes for Google to re-crawl these pages depends on the size of your website and the number of backlinks. Once this
is done, it’s time to sit back and wait for Google to re-run the Panda algorithm. This occurs about once every 4-8 weeks; the next
Panda update should occur sometime in early January 2012.

Implement the Rel=Author Tag on Your Articles
I would highly recommend implementing the rel=author tag if your articles are getting taken by scraper sites. The rel=author
tag is a fairly new piece of HTML that Google is now reading. It tells Google who the original author of an article is and
sometimes adds a portrait of the author to the search engine results page.




Example of Rel Author in Search Results
Matt Cutts has hinted that rel=author is, or could become, a ranking factor if you are the source of an article. And as of today,
Google announced that it will be providing author stats in Google Webmaster Tools. These are 2 big indicators that rel=author
is already being used as a signal by Google and could become even more of a ranking factor in the future.
In order to utilize the rel=author code you must have a Google+ account. Google uses your Google+ account to confirm that the
article is associated with the author and to pull the portrait thumbnail into the SERPs. Google has provided some easy-to-follow
steps on implementing rel=author, but essentially this is how it works:
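As a minimal sketch (the profile ID and author name are placeholders), authorship works as a two-way link between your article and your Google+ profile:

    <!-- On the article page: link to your Google+ profile with rel="author" -->
    <a href="https://plus.google.com/112345678901234567890" rel="author">by Jeff</a>

    <!-- Then, on the Google+ profile itself, add the website under the
         “Contributor to” section so the link points back and Google can verify both ends. -->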








Not only will rel=author tell Google that you are the original author of the content, but the profile picture in your search result
will increase the number of clicks to your website. You can’t go wrong!

Clean up Internal Duplicate Content
Now that we have addressed duplicate content between different websites, it’s time to focus on internal duplicate content issues.
The reason Panda cracked down on internal duplicate content is because many spammy websites were automatically creating
100′s or 1000′s of duplicate pages while only changing a few keywords. This would allow them to target many keywords with
very little effort (emphasis on would).
So what does this mean for you? It’s quite simple, any duplicate content within your website has gotta go. There are 2 main
reasons why a website would have internal duplicate content:

  1. You are automatically creating hundreds or thousands of duplicate pages to target more keywords
  2. You have pages with multiple URLs

If you belong to the first group and are automatically creating hundreds or thousands of pages with essentially duplicate content, it’s time
to stop. Unless you want to get hit by Panda, you need to either rewrite all of the content to be unique or noindex/canonical the duplicate
pages.
Having multiple URLs for a single page is a common issue among most websites. This can be a problem because in Google’s
eyes, every separate URL is a separate page. This means that to Google and other search engines, www.opencart.com and
www.opencart.com/index.php are different pages.
Here is a more visual example of a duplicate URL issue and what the canonical code would look like:




In this example, the mikesbikeshop.com/red-bikes page adds the ?sort=A-Z parameter whenever someone sorts the page
alphabetically. This creates multiple URLs for the same page, as well as duplicate content. Mike should put the canonical tag
not only on the red bikes page but on every page of his website, as an SEO best practice.
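A minimal sketch of what that canonical tag would look like, using Mike’s example from above:

    <!-- In the <head> of http://www.mikesbikeshop.com/red-bikes?sort=A-Z -->
    <!-- Points search engines back to the clean version of the page -->
    <link rel="canonical" href="http://www.mikesbikeshop.com/red-bikes" />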
If you are using WordPress, then I would recommend installing the All in One SEO Pack because it will do all of this for you;
otherwise, you will need to do this page by page. If you have a large website, hire a programmer to automate it.

Lower Your Bounce Rate
Google knows when someone searches a keyword, clicks on your site, and immediately leaves your website to click on another
search result. It is likely that these types of interactions can signal to Google that your website is low quality and may be a good
fit for the panda penalty. There are a few ways you can lower your bounce rate:

   • Have good and relevant content that the searcher is looking for.
   • Add a video. Videos can significantly lower the bounce rate of a page and increase the average time on site.
   • Don’t target irrelevant keywords just because they have a high number of monthly searches.
   • Add a clear call to action to get your visitors to go to another page.


These tips will help lower your bounce rate and reduce your risk of being penalized by Panda. But at the end of the day if users
are consistently bouncing from your pages it may be a sign of a deeper issue. Put yourself in the searcher’s shoes and figure out
what they would want to see for any given search query. If you can do that then you are one step closer to being a successful
internet marketer.

Conclusion
For better or worse, Google’s Panda update forever changed SEO. The update determines the quality level of any given
website by looking at duplicate content and usage metrics. Sometimes Google gets it right, other times not so much. Either way,
this update is just another way of Google pushing webmasters to provide higher-quality content.




