The Trouble With SERP Tracking

There’s an ongoing obsession with tracking search engine result pages (SERPs). Both seasoned search marketing specialists and know-enough-to-be-dangerous webmasters can’t get enough of them. So what’s so special about these stats, and why do people track them?


Why Track SERPs?

There are generally three reasons why people track SERPs: Research, Trends and Performance.

Research tracking lets an SEO specialist or webmaster know where a website ranks for a set of keywords. It’s one thing to know whether a website is in a search engine’s index, but it’s another to know whether it actually shows up in the SERPs. Most people want to know whether it appears in the top ten or twenty results, or whether it resides so deep in the SERPs that nobody will ever find it.

Trend tracking looks at the effect of an ongoing SEO campaign. That campaign can consist of simply updating a website’s HTML code or involve a sophisticated online link building campaign. Regardless, it’s important to know whether the campaign is affecting the website positively or negatively in the SERPs for the target keywords.

Performance tracking relates to the actual traffic that a site receives from search engines. Instead of focusing on where a website resides in the SERPs for targeted keywords, performance tracking focuses only on the keywords that actually send visitors to the website.


The Problem With SERP Tracking

There are major hurdles to SERP tracking. First off, search engines don’t like it. They see SERP tracking as an attempt to game their system, and they don’t like the burden it puts on their infrastructure (resources that, in their view, should be reserved for legitimate user queries). There have been attempts in the past to provide APIs that allow these types of queries, but search engines like Google quickly determined that their API wasn’t being used for the purposes they intended, and subsequently discontinued it. There is, of course, strong interest in a commercial search API, but Google has continually shown absolutely no interest in providing that service.

The next problem is accuracy versus overloading/triggering the beast. The most accurate results are those that would appear for most regular users — ten results per page. The problem is that if you want the first one hundred results at ten results per page, you have to hit the search engine ten times! The easiest way around that is to get the top one hundred results on a single page (a setting that can easily be made in the search preferences or in the query string of the URL). That way you only have to hit the search engine once — there’s less impact on the search engine’s resources and you have all of the data you need. Unfortunately, those results aren’t entirely accurate.
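To make the math concrete, here’s a minimal sketch in Python of the two approaches. It assumes Google’s classic q, start, and num query parameters (other engines name these differently): covering the top one hundred results takes ten requests at ten per page, but only one request at one hundred per page.

```python
from urllib.parse import urlencode

BASE = "https://www.google.com/search"  # illustrative only

def serp_urls(keyword, total=100, per_page=10):
    """Build the request URLs needed to cover the top `total` results.

    Assumes Google's classic `q` (query), `start` (offset), and `num`
    (results per page) parameters; treat the names as assumptions.
    """
    urls = []
    for start in range(0, total, per_page):
        params = {"q": keyword, "start": start, "num": per_page}
        urls.append(BASE + "?" + urlencode(params))
    return urls

# Accurate, ten-per-page coverage of the top 100 takes ten requests...
print(len(serp_urls("serp tracking", per_page=10)))   # -> 10
# ...while a single one-hundred-per-page request covers them all at once.
print(len(serp_urls("serp tracking", per_page=100)))  # -> 1
```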

Several years ago, search engines began to implement indented results — results from the same website that were bundled together, with the second result indented. This mainly occurs when a search engine’s algorithm decides that two pages on a website deserve to appear for the same keyword search. At first, this was rare, but as time passed it became more common. So common, in fact, that if you do a top one hundred search for a popular term, almost every result will have a secondary indented result. If you didn’t pick up on what I just said, let me be clearer: those top one hundred search results may only contain 50 websites!

Indented results occur in greater numbers on result pages that display more than ten results. For example, if you do a search that displays the first twenty results, and a website shows up as the sixth result and also the nineteenth (normally page one and page two), the nineteenth result will bubble up and indent itself underneath the sixth result — making the nineteenth result the seventh result and pushing everything else down. Now imagine that scenario on a one-hundred-result page where many websites appear more than once. The bubbling up of secondary results throws everything off compared to its ten-results-per-page counterpart.
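Here’s a small sketch, with hypothetical data, of how that bubbling re-orders a results list: repeat appearances of a domain jump up to sit directly beneath that domain’s first result, shifting everything below them down.

```python
def bubble_indented(results):
    """Re-order a flat (rank, domain) list so repeat appearances of a
    domain 'bubble up' beneath that domain's first result (a simplified
    model of indented results)."""
    order = []      # domains in first-seen order
    grouped = {}    # domain -> original ranks, in order of appearance
    for rank, domain in results:
        if domain not in grouped:
            grouped[domain] = []
            order.append(domain)
        grouped[domain].append(rank)
    return [(domain, rank) for domain in order for rank in grouped[domain]]

# Hypothetical twenty-result page: example.com ranks 6th and 19th,
# and every other slot is a unique site.
page = [(r, f"site{r}.com") for r in range(1, 21)]
page[5] = (6, "example.com")
page[18] = (19, "example.com")

# On a twenty-per-page SERP, the 19th result indents under the 6th,
# effectively becoming the 7th result and pushing everything else down.
for new_rank, (domain, old_rank) in enumerate(bubble_indented(page), start=1):
    if domain == "example.com":
        print(new_rank, domain, f"(was #{old_rank})")
# -> 6 example.com (was #6)
# -> 7 example.com (was #19)
```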

Finally, the elephant in the room is universal, subscription, local and custom search results (I guess that makes four elephants). All of these results can significantly change the search results — especially on the first page. Not only that, they can work in conjunction with each other and also occur randomly.


The Future of SERP Tracking Is Passive

It’s only a matter of time before even the typical one-through-ten search result page becomes a moving target. Once that occurs, it will be practically impossible to report results with any degree of accuracy. Given search engines’ distaste for people tracking their SERPs, it’s also only a matter of time until they implement measures that make it extremely difficult to track SERPs en masse, or at the very least, affordably. Of course, this could be years away, but I do believe it’s coming.

Ultimately, I believe that the future of SERP tracking will be passive. Passive SERP tracking uses the referral data from search engine traffic to extrapolate not only the search engine and keyword, but also what page and/or position the result was on. We recently created a free Pepper (stats add-on) for Mint that captures passive SERPs in order to showcase this method of SERP tracking. There are also companies, like Enquisite, that are currently using passive SERP tracking to create and enhance their own analytics.
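As a sketch of how passive tracking works under the hood, here’s a minimal Python example that pulls the keyword and SERP page out of a referrer URL. It assumes the classic Google referrer format, with q and start parameters in plain view (a format search engines can, and do, change at any time).

```python
from urllib.parse import urlparse, parse_qs

def parse_serp_referrer(referrer, per_page=10):
    """Extract the engine, keyword, and SERP page from a search referrer.

    Assumes the classic Google referrer format with `q` (keyword) and
    `start` (result offset) parameters; other engines differ.
    """
    parsed = urlparse(referrer)
    if "google." not in parsed.netloc:
        return None  # this sketch only handles Google
    params = parse_qs(parsed.query)
    if "q" not in params:
        return None  # no keyword in the referrer
    start = int(params.get("start", ["0"])[0])
    return {
        "engine": "google",
        "keyword": params["q"][0],
        "page": start // per_page + 1,  # SERP page the visitor clicked from
    }

print(parse_serp_referrer("http://www.google.com/search?q=serp+tracking&start=10"))
# -> {'engine': 'google', 'keyword': 'serp tracking', 'page': 2}
```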

The beauty of passive SERP tracking is that it doesn’t require using any search engine resources — meaning it’s search engine friendly. I recently had an email discussion with Matt Cutts about enhancing passive SERP tracking. He said:

We’ve talked about doing this, although we don’t have any plans right now; how would you propose that we pass along the rank information? I can see a lot of pros and cons to any particular approach.

We (Sitening) racked our brains on how to do this — it’s not as easy as one might think — and the best we could come up with was to either incorporate it into Google Analytics (which we believe Google could easily do) or to use Webmaster Tools by appending the links in search results. For example, if you could go to Webmaster Tools and turn on the ability to track the “rank” of a link by using an appended variable, that might work. So, instead of the SERPs returning http://domain.com/, they would return http://domain.com/?rank=4 (where ?rank=4 states the rank as 4). It would be an opt-in feature and, if turned on, would affect all links that appear in the SERPs for that domain. The user would also have the ability to specify the characteristics of the appended variable in order to make sure it worked correctly with whatever technology they were using. Of course, it’s never that simple. If a URL already has a query string that uses that variable (?x=#), then Google may need to append it with an ampersand instead — or they could just say too bad, don’t use crappy URLs if you’re going to opt in for the ranking service.
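As a sketch of that ampersand wrinkle (the rank parameter here is the hypothetical opt-in variable described above, not anything Google actually supports), here’s how the appending would need to behave for URLs with and without an existing query string:

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

def append_rank(url, rank, param="rank"):
    """Append the hypothetical rank variable to a result URL, using `&`
    when the URL already carries a query string and `?` otherwise."""
    parts = urlparse(url)
    query = parse_qsl(parts.query)
    query.append((param, str(rank)))
    return urlunparse(parts._replace(query=urlencode(query)))

print(append_rank("http://domain.com/", 4))
# -> http://domain.com/?rank=4
print(append_rank("http://domain.com/page?id=7", 4))
# -> http://domain.com/page?id=7&rank=4
```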


SERP Tracking Should Really Be About Performance

The only SERPs that really matter are the ones that bring traffic. Although it’s nice to know where a website resides in the SERPs, if you aren’t getting any traffic from it, it’s meaningless. Ideally, if you can connect passive SERP data with ongoing campaign data, the analytics become quite useful. For example, you can better correlate campaign efforts with increased traffic and conversions. You can also use that data to determine the effectiveness of different search marketing techniques.
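As a rough sketch of what that connection might look like (the log format, dates, and numbers here are entirely hypothetical), you could line up passively captured keyword visits against a campaign’s start date:

```python
from datetime import date

# Hypothetical passive-SERP log entries: (date, keyword, inferred page, visits)
passive_log = [
    (date(2008, 5, 1), "serp tracking", 3, 4),
    (date(2008, 6, 1), "serp tracking", 2, 11),
    (date(2008, 7, 1), "serp tracking", 1, 37),
]

# Hypothetical start date of a link building campaign
campaign_start = date(2008, 5, 15)

before = sum(visits for day, _, _, visits in passive_log if day < campaign_start)
after = sum(visits for day, _, _, visits in passive_log if day >= campaign_start)
print(f"Visits before campaign: {before}; after: {after}")
# -> Visits before campaign: 4; after: 48
```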

For Raven, we’re going to continue to use traditional SERP tracking for monitoring trends, but not necessarily for accuracy. Although we would love to capture ten results per page instead of one hundred, it’s cost-prohibitive and requires too many resources from the search engines. Coupled with the problems discussed in this article — including the four elephants — we believe it’s only a matter of time until traditional SERP tracking can no longer maintain any degree of accuracy.

Although traditional SERP tracking may always play a role in the Raven suite of SEO tools, we’re going to start focusing on how we can integrate SERP trends and performance with campaign data. We believe that’s where the most valuable information will ultimately be derived.
