Long Tail Blogging/HitTail Blog Response

A response to a HitTail blog post by Mike Levin:


You make a good point when you say that although Google now controls the distribution of information, it does so in a way that isn’t constrained by the “finite space” of a store shelf. The number of results returned is in proportion to the number of potential searches and unique searchers out there. Obviously, many people type the same thing daily, and those high-frequency search terms are tough to get recognition for. But the distribution is skewed.

Mike, here’s a thought: why not run some analysis and estimate the sort of distribution/frequency curve we’d see if we measured searches across the web? Of course the head would be tall and fat, but the tail would be very, very long. This would be a great illustration for those who see “the long tail” as techie gibberish.
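To make that suggestion concrete, here is a minimal sketch of such a curve, assuming a Zipf-style 1/rank model of query frequency. The million-term vocabulary and the rank cutoffs below are invented for illustration, not measured search data:

```python
# Toy model of search-query frequency as a Zipf (1/rank) distribution.
# The 1,000,000-term vocabulary and rank cutoffs are invented for
# illustration; real search logs would be needed for actual numbers.

def zipf_frequencies(n_terms):
    """Normalized frequencies proportional to 1/rank for ranks 1..n_terms."""
    raw = [1.0 / rank for rank in range(1, n_terms + 1)]
    total = sum(raw)
    return [f / total for f in raw]

freqs = zipf_frequencies(1_000_000)

head_share = sum(freqs[:100])      # the tall, fat "short head"
tail_share = sum(freqs[10_000:])   # everything past the top 1% of terms

print(f"top 100 terms:            {head_share:.1%} of all searches")
print(f"terms ranked past 10,000: {tail_share:.1%} of all searches")
```

Even in this toy model, the hundred most popular terms and the hundreds of thousands of obscure ones each capture roughly a third of total searches, which is the long-tail argument in one picture.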





Posted in Blogging, HitTail, Long Tail, Long Tail Blogging
3 comments on “Long Tail Blogging/HitTail Blog Response”
  1. Mike Levin says:

    Good thought, Jeff. Of course, Google, or people sitting on major chunks of the Internet backbone, are in the best position to run such analysis. HitTail COULD do it with the humble few who are participating in our product, but the results are as predictable as the 1/X distribution curve itself. While the total number of clicks in any given day is actually a finite resource over which we all compete, some words like “the” approach infinite searches along the y-axis, while the diversity of different words and combinations along the x-axis similarly approaches infinity.

    We ran a number of analyses early on to determine how much time HitTail was REALLY saving people by chopping off a portion of the infinitely long tail and the infinitely tall head. What is the “zone,” or sweet spot? What precise percentage of the overall keywords up for consideration on any given website should be zeroed in on, such that the traffic you’d pick up would be collectively worth your time in the short term?

    The answer is 5% of the terms found in the tail. And that’s after we’ve already filtered out 80% of “regular” pageview activity and kept only the 20% of traffic that carries “initial referrer” data. So it’s 5% of 20% of your traffic: about 1% of your overall traffic is worth even considering as long-tail keywords deserving of your immediate attention. And the portion of mainstream marketers who still don’t like the long tail should still be convincible to look at 1% of their data on the pragmatic argument that it couldn’t hurt.
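    The compounding of those two filters can be checked in a few lines. This is a sketch with an invented pageview count; note that 5% of 20% works out to 1% of overall traffic:

```python
# Sketch of the filtering steps described above; the pageview count is invented.
overall_pageviews = 100_000

# Step 1: only ~20% of traffic carries "initial referrer" (keyword) data.
with_referrer = 0.20 * overall_pageviews

# Step 2: only ~5% of that slice is worth immediate attention.
actionable = 0.05 * with_referrer

fraction = actionable / overall_pageviews
print(f"actionable slice: {fraction:.2%} of overall traffic")
```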


  2. Jeff James says:

    Very interesting, Mike…

    Question for you:

    With the advent of personalized search, do you feel that practicing traditional natural search engine optimization can still be of any value? If we’re all seeing different results two years out, is the name of the game “content and coverage,” or is it a crapshoot at this point, with any prediction highly subjective?


  3. Mike Levin says:

    Personal search, no matter what form it takes, will have to continually re-identify the user. We know this today as perpetually having to log back in to Gmail so that the overall system can maintain our identity. This is a particular issue on terminals with multiple users who are not forced to log in individually (the family computer). It will be less of an issue on personal mobile computing devices, where the user’s identity will be more reliable.

    But there will be so many circumstances where the device won’t be sure who is searching that it will default to its normal (non-personalized) mode. This is also the case when it is not “safe” to deliver personalized results, based on privacy settings in the profile. Over and over, we will find an overarching baseline result set, plus variance from that core set based on personalized settings.

    But since nearly every device will have to handle the condition where it does not know who the user is (or cannot make assumptions about who the user is), something very much like today’s search engine optimization will continue to exist. Of course it will evolve with changes in the industry, just like everything else.


