Do you remember as a kid when you’d wake up after a nightmare, absolutely convinced that there was a monster in the closet, ready to eat you alive?
Well, SEOs the world over have been feeling like that a lot lately. My buddies and I half-jokingly refer to Google as “The Google Monster,” since you never know when it will pop out of the closet or from under your bed and eat your brains, and Matt Cutts isn’t popping in to say, “It’s OK, it was just a bad dream, there aren’t really monsters in the closet.”
But perhaps I’m getting ahead of myself. First, a little bit of background.
As an SEO, my job is essentially to learn to view the web through Google’s eyes. To see what it sees, to know what it wants to see…for all intents and purposes, my job is to reverse engineer Google’s search ranking algorithm. Easier said than done.
In 2010, Google made over 500 “useful” changes to its ranking algorithm (you can find some details here.) That comes out to 1-2 changes a day, and there were far, far more algorithm change tests than that. My friends, that is a TON of flux.
When you hear about Google algorithm updates, you probably think of Caffeine, Panda or Penguin. If you’ve been doing SEO for a long time, you might think of Boston or Florida (early named updates, circa 2002/2003.)
In fact, SEOmoz keeps a running tally of major named updates here.
And while updates like Panda and Penguin have had a pretty significant impact, there are many, many other updates that fly under the radar, some of which actually cause more fluctuations in the SERPs than Panda or Penguin updates.
At MozCon last week Dr. Pete Meyers, President of UserEffect and an SEOmoz associate, announced a new tool that he’d been working on for many months, designed to track Google’s algorithm changes in very near real-time! Enter MozCast:
While not a perfect tool, it does a fantastic job of identifying when there is significant movement across 1,000 different keywords. While 1,000 keywords is a pretty small sample size, it is enough to show when there is unusual movement above and beyond a “normal” day (normal being pretty subjective, since Google is testing things constantly.) And of course, while one data source is great, two is even better, and SERPmetrics has been kind enough to develop a tool of their own:
While MozCast looks only at Google, SERPmetrics Flux looks at Google, Yahoo and Bing (and 100,000 keywords instead of just 1,000.) And with two separate data sources, we can now watch for correlation in movement patterns to better spot algorithm updates!
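To make the idea of “flux” a little more concrete, here’s a rough sketch of what a daily flux score could look like. To be clear, this is not MozCast’s or SERPmetrics’ actual formula (Dr. Pete has his own “temperature” calculation); the data structure, the drop-out handling, and the keyword example are all just my own illustration of comparing yesterday’s top results to today’s and averaging the churn.

```python
# Illustrative sketch only: a naive daily "flux" score across tracked keywords.
# Assumes each SERP is stored as an ordered list of URLs per keyword.

def keyword_flux(yesterday, today):
    """Average absolute rank change for the URLs in yesterday's results.
    URLs that drop out of today's results are treated as falling one spot
    past the bottom of the tracked range."""
    drop_out_position = len(yesterday) + 1
    total_change = 0
    for old_rank, url in enumerate(yesterday, start=1):
        new_rank = today.index(url) + 1 if url in today else drop_out_position
        total_change += abs(new_rank - old_rank)
    return total_change / len(yesterday)

def daily_flux(serps_yesterday, serps_today):
    """Mean flux across every keyword tracked on both days
    (e.g., 1,000 keywords for MozCast, 100,000 for SERPmetrics)."""
    shared = [kw for kw in serps_yesterday if kw in serps_today]
    scores = [keyword_flux(serps_yesterday[kw], serps_today[kw]) for kw in shared]
    return sum(scores) / len(scores)

# Hypothetical example: two URLs swap places on one keyword.
yesterday = {"best running shoes": ["a.com", "b.com", "c.com"]}
today     = {"best running shoes": ["b.com", "a.com", "c.com"]}
print(round(daily_flux(yesterday, today), 2))  # 0.67
```

On a “normal” day that number hovers in some baseline range; on an update day it spikes. That’s the basic intuition behind watching a flux tracker, whatever formula it actually uses under the hood.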
All I can say is, WOOOOHOOOOOO! While this doesn’t directly tell us what in the algorithm is changing, by giving us greater insight into when those changes are occurring and what sites are moving, we’re now one step closer to being able to spot and decipher algorithm changes in real-time. Hell yes!
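And since each tool spits out a daily flux number, you can sanity-check a suspected update by seeing whether the two series spike together. Here’s a minimal sketch of that idea using a plain Pearson correlation; the numbers are completely made up and neither tool necessarily reports its scores on these scales, it’s just to show the cross-checking concept.

```python
# Hedged sketch: correlate two daily flux series (e.g., one from MozCast,
# one from SERPmetrics Flux) over a trailing window. The values below are
# hypothetical, purely for illustration.

from statistics import mean, pstdev

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)
    return cov / (pstdev(xs) * pstdev(ys))

mozcast_flux     = [62.0, 65.5, 61.0, 94.2, 70.1, 63.3, 60.8]   # hypothetical
serpmetrics_flux = [0.41, 0.43, 0.40, 0.78, 0.52, 0.44, 0.42]   # hypothetical

# A strong correlation plus a spike on the same day (index 3 here) is a good
# hint that Google actually pushed something, rather than one tool glitching.
print(round(pearson(mozcast_flux, serpmetrics_flux), 2))
```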
There are plenty of people out there who will tell you not to chase the algorithm, and I agree…to a certain point. It’s far better to work to stay at least a year or two ahead of Google, defensive instead of offensive, and that’s what I work to do. However, “chasing the algorithm” can provide incredible insight into what’s coming a few months to a few years down the road.
I started blogging about creating an organic link profile in late 2010, a year and a half before Penguin kicked the crap out of over-optimized anchor text. How did I know it was coming? Keyword-level penalties already existed, and it just made perfect sense that Google would lower the threshold as anchor text was being more and more abused.
So, the moral of the story is: Google is the boogeyman, and if you don’t pay close attention to what he’s eating now, you won’t be prepared when his appetite changes down the road 🙂
If you’re looking for a great post that dives deep into chasing the algorithm, Web Gnomes has a great one.