Was that a Hummingbird or a Penguin that just shat on my site?
Google’s announcement of their new search algorithm, Hummingbird, has certainly received a lot of attention, both in the SEO industry and in the general business press. Judging by Mozcast.com data and Google’s own statement that the rollout began roughly a month before the announcement, Hummingbird seems to have made its appearance around August 20, 2013.
In a nuts-and-bolts overview of Hummingbird, Danny Sullivan of Search Engine Land comments that Google’s search chief Amit Singhal said that Hummingbird represents the first time a Google algorithm has been so dramatically rewritten since 2001. Sullivan compares the new algorithm to a new engine in a car, while comparing other recent changes (such as Panda and Penguin) to new parts in an old engine.
Despite this major change, most SEO experts I’ve spoken to haven’t seen major shifts in rankings for their clients. The general idea is that Hummingbird is less about matching pages to a couple of searched-for keywords, and more about matching a sentence-like query to a page that appears to answer that question. In other words, it’s trying to do a better job at searches like “what’s the best time of year to go to Tahiti” vs. “Tahiti weather”. But honestly, as those happen to be terms whose rankings I’ve been tracking for quite some time, I’m not seeing much change in the results on page 1.
Nevertheless, there seem to be plenty of website owners out there who saw their traffic fall off dramatically in late summer this year, saw all the talk about Hummingbird, put 2 and 2 together, and concluded that they’d been bitten by the itty bitty beak of Hummingbird.
Meanwhile, back at the ranch (or server farms), Google has been continuing its battle against spam – particularly in the area of backlinks. Penguin might be old boring news, but it’s important to remember that Penguin, like Panda, and Google’s PageRank calculations in general, continue to be updated with new data.
And where does that data come from? Well, certainly Googlebot’s crawling of new and changed pages. But also, Google is receiving an enormous amount of data on spammy linking from the disavow tool, in which webmasters who are being penalized for naughty linking practices tell Google which backlinks to their site are ones they KNOW are bird poop – essentially asking Google to ignore those backlinks both from a PageRank perspective and from the penalty perspective.
This is giving Google a ton of helpful information: if 1000 webmasters all tell Google that links from (www.buy-cheap-links-youll-never-get-caught-trust-us.com) are spam, that’s a pretty good indication to Google that they shouldn’t be counting links from that site to any of the other 100,000 sites that have links from it. And perhaps any site with links from there might be due for a little manual inspection from the spam team at Google.
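To make that aggregation concrete, here’s a minimal Python sketch of the kind of cross-webmaster counting described above. The disavow file format (comment lines starting with “#”, “domain:” lines, and bare URLs) is real, but the `suspicious_domains` helper and its threshold are purely hypothetical illustrations: nobody outside Google knows how, or even whether, they mine this data.

```python
from collections import Counter
from urllib.parse import urlparse

def disavowed_domains(disavow_text):
    """Parse one webmaster's disavow file into a set of domains.

    Disavow files contain comment lines starting with '#',
    'domain:example.com' lines, and individual URLs.
    """
    domains = set()
    for line in disavow_text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        if line.startswith("domain:"):
            domains.add(line[len("domain:"):].strip())
        else:
            # A bare URL: keep just the host portion
            domains.add(urlparse(line).netloc)
    return domains

def suspicious_domains(disavow_files, threshold=1000):
    """Flag domains disavowed by at least `threshold` distinct webmasters.

    Hypothetical: a stand-in for whatever aggregation Google might run.
    """
    counts = Counter()
    for text in disavow_files:
        for domain in disavowed_domains(text):
            counts[domain] += 1
    return {d for d, n in counts.items() if n >= threshold}
```

If a thousand independent disavow files all name the same domain, that domain floats to the top of a list like this one, which is exactly the “pretty good indication” described above.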
Do we know that Google is using the data from the disavow files? Not for certain. But even if they aren’t yet using it today, surely it’s a gold mine that will be used in the near future.
In the meantime, Google continues to manually find big link networks and remove their influence on the results. And when this happens, all those links you had from sites in that network suddenly don’t count anymore. Your PageRank drops, and your rankings drop as well. That doesn’t necessarily mean you were penalized; you’re just no longer getting the benefit of link juice from a bunch of sites you really shouldn’t have benefited from in the past anyway.
Turning our attention to Penguin…
As of Penguin 2.0 (May 22, 2013), penalties may be assessed not just against a site’s home page, but against internal pages as well. And Penguin feeds on three kinds of fishy links: links from spammy places, too many links from one particular site, and too many links with non-brand, non-domain anchor text.
As time goes on, the data feeding into Penguin increases. Consider the webmasters who are busily building or buying craptastic links and haven’t YET exceeded whatever thresholds Penguin uses, whether that’s the number of links with the same non-brand anchor text, the percentage of links from PR0 (PageRank zero) domains, or some other metric. All of a sudden, the last 100 links they’ve placed push them over that threshold. And now, as Penguin gobbles another smelly data update, BOOM: rankings gone, traffic gone.
So, if the drop didn’t happen on the date of a known algorithmic update for Penguin, many webmasters will incorrectly dismiss Penguin as the cause. As an example, I had a client this week who thought Hummingbird had caused his site traffic to plummet, but the research showed it was actually Penguin catching up to his backlink profile and finally penalizing him.
So do you care whether it was Penguin, Panda, Hummingbird, or something else that caused your traffic to drop? Of course you do – without knowing the cause, you can’t know the cure. But with regular rolling data updates for both Panda and Penguin, you can’t always simply map the known Google algorithm change calendar against your analytics and spot the problem. This is definitely helpful information – especially if it DOES line up with the date of an algorithm change – but not conclusive either way.
Diagnose your SEO problem:
- Compare your traffic drop dates against the Google algorithm changes, either manually or using the very cool Chrome extension from Chartelligence. And when you’re in Analytics, make sure to remove the “noise” and drill down to JUST organic search traffic by selecting Traffic Sources -> Sources -> Search -> Organic first. Once you think you’ve spotted a problem, compare a month pre-drop to a month post-drop and see which terms in particular got hammered. (When the % change in the report shows the little infinity symbol, meaning the term had zero visits in one of the two periods, you might be on to something.)
- Start with Moz’s Open Site Explorer: go to the anchor text page, select Phrases and All pages on this root domain, and click Filter. Pay attention to the Linking Root Domains Containing Anchor Text column. If an anchor text phrase isn’t your brand, your URL, or something innocuous like “click here” or “(img)” (no anchor text), then you don’t want to see more than a few linking domains using it.
- Use LinkResearchTools’ Link Detox report (starts at $129/month). It does a great job of spotting links from questionable domains, based on some pretty smart analysis of those domains; a report showing a large proportion of toxic or suspicious domains is a red flag.
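If your backlink tool can export raw link data, you can run a rough version of the anchor-text check above yourself. Here’s a minimal Python sketch, assuming the export is a simple list of (anchor text, linking root domain) pairs; the `flag_anchor_text` helper, the brand-terms whitelist, and the `max_domains` threshold are my own illustrative inventions, not anything Moz or Google publishes.

```python
from collections import defaultdict

def flag_anchor_text(backlinks, brand_terms, max_domains=5):
    """Flag non-brand anchor phrases used by too many linking root domains.

    `backlinks` is an iterable of (anchor_text, linking_root_domain) pairs,
    e.g. exported from a backlink tool. `brand_terms` are phrases considered
    safe: your brand, your URL, and generic anchors like 'click here'.
    The max_domains threshold is an illustrative guess, not a known limit.
    """
    domains_per_anchor = defaultdict(set)
    for anchor, domain in backlinks:
        domains_per_anchor[anchor.lower()].add(domain.lower())

    flagged = {}
    for anchor, domains in domains_per_anchor.items():
        if anchor in brand_terms:
            continue  # brand/URL/generic anchors are expected to repeat
        if len(domains) > max_domains:
            flagged[anchor] = len(domains)
    return flagged
```

A commercial, keyword-rich phrase showing up across dozens of linking root domains is exactly the pattern Penguin feeds on; your brand name repeating is not.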
If you’ve seen a drop in traffic, it might be due to Hummingbird, but it’s also entirely possible that it’s the result of Penguin or Panda or some other part of Google’s algorithm, which is always dynamically changing. There’s no way to nail the cause with certainty. Your best course of action remains to create the quality content that people want to find; use metadata to help Google know exactly what that content is and help people find it; and keep watch on backlinks.
Michael Cottam is an SEO consultant and founder of Visual Itineraries, based in Portland, Oregon. He serves on the board of SEMpdx, and is a Moz Associate, answering questions in the Moz Q&A forum as well as occasionally posting to the Moz main blog.