comScore / Nielsen NetRatings audit saga, pt.2

Two new articles just came out regarding the IAB’s pressuring of comScore / NetRatings to have their panel methodologies audited.

1)  “The Travails of Tracking Web Traffic” by Catherine Holahan (BusinessWeek, April 30, 2007)

2) “IAB Call for Audits: Transparency or Conspiracy?” by Kate Kaye (ClickZ News, April 20, 2007)

Both articles mention one of the more nimble competitors in this space: the up-and-coming Compete. Another company mentioned is Quantcast. Hopefully these smaller firms will have the opportunity to innovate while the whole cycle of disbelief-scrutiny-pressure-and-recalcitrance plays out between Nielsen/comScore and the IAB’s publisher constituents.

What follows in the next two sections of this post is a bit of a dissection of the above two articles.  Click below to read more, if you’re into web analytics (or if you work for one of the Big Four auditors)…

1)  “The Travails of Tracking Web Traffic” by Catherine Holahan (BusinessWeek, April 30, 2007)

This piece covers the issue nicely and is well organized, like many of BusinessWeek’s articles. It works well whether you are new to the subject or are just looking for an update and more complete perspective. However, it did fall into a couple of spin-doctor traps:

a) The article explains why user panels fail, sometimes by orders of magnitude, in predicting the total traffic to a given website. However, right after doing that, the article proceeds to mention that panels have a certain advantage over server logs in that, “Unlike server logs … panels can tell companies more about the audience’s demographic composition. This is possible as panel members share such information as age, gender, and income level.” Yes, that’s part of the whole panel pitch. But no, it’s not true.

See, by extrapolating their user panel results, these analytics firms have failed to guess even the order of magnitude of a given website’s traffic. The demographic composition of that panel, and the erroneous totals derived from it, are part of the same problem math. Therefore, the demographic profiles they report are just as likely to be grossly incorrect. There is no reason to trust those demographics if you can’t trust those totals… because the two measurements (quantity and quality) are tied together in the methodology itself. They can’t say, “well, we might be wrong on the scale of this, but qualitatively we’re still correct on the audience makeup.”

(The article later mentions the “niche demographic” aspect of online (non-mass) media, which is related, but it’s not exactly the same issue.)
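To see why the totals and the demographics stand or fall together, here’s a toy sketch with made-up numbers (mine, not comScore’s or Nielsen’s): if the panel’s recruitment skews toward one demographic, the same extrapolation weights that produce the wrong traffic total also produce the wrong audience makeup.

```python
# Toy illustration: a panel that over-represents one demographic skews
# BOTH the traffic total and the reported audience makeup, because both
# are derived from the same extrapolation.

# Hypothetical true site audience: 900k visitors aged 18-34, 100k aged 35+.
true_visits = {"18-34": 900_000, "35+": 100_000}

# Panel of 1,000 people, but recruitment skews older:
panel = {"18-34": 400, "35+": 600}

# Naive extrapolation: assume each panelist stands in for a fixed number
# of web users (say, 1 panelist = 500 users).
scale = 500
est_visits = {group: n * scale for group, n in panel.items()}

est_total = sum(est_visits.values())    # 500,000 -- wrong total
true_total = sum(true_visits.values())  # 1,000,000

est_share_young = est_visits["18-34"] / est_total      # 0.40 -- wrong makeup
true_share_young = true_visits["18-34"] / true_total   # 0.90

print(est_total, true_total)
print(est_share_young, true_share_young)
```

The point of the sketch: there is no way to fix the 500,000 figure without also admitting the 40/60 demographic split came from the same broken weights.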

b) The article correctly points out that traditional server logs have the weakness of not accounting for web surfers who switch computers (though I wonder if those panels are any different in this respect? Does your Nielsen NetRatings tracker follow you from home to your work computer? I don’t think so.) and that although server log and cookie-based tracking systems can give a more precise count of visits to a Web page from unique IP addresses, “... it can also be thrown off by users who periodically delete the bits of tracking code from their computers.” Absolutely true.

I would have been cool with that, and I would have been happy if the article had then quoted a reliable authority on the percentage of users who periodically delete their cookies. They did this, except… the “authority” in this case was comScore! Jeez, come on. comScore pegs the number of users who periodically delete their cookies at 30%. Ok, just because it’s from comScore, I’m gonna discount that estimate by half… no, to a quarter, and call it 7.5%. comScore’s CEO said, “These serial resetters have the potential to wildly inflate a site’s internal unique visitor tally, because just one set of eyeballs at the site may be counted as 10 or more unique visitors over the course of a month.” These serial resetters? It sounds scary.

In any case, there is probably a range of percentages on this cookie-resetting factor, depending on how “tech savvy” a given user group is. Whatever that percentage is, it would be an okay discount to apply to a given server log’s totals, and in terms of statistical deviation it would not be orders of magnitude off reality.
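A back-of-envelope sketch of that discount, using the illustrative 7.5% figure above and comScore’s “counted as 10” scenario (so each resetter leaves 9 extra cookies): the inflation is real, but if you know the reset rate, it’s a bounded, correctable error rather than an order-of-magnitude one.

```python
# Illustrative numbers only: how cookie resetting inflates a server
# log's unique-visitor count, and how a known reset rate lets a
# publisher discount the logged total back toward reality.

true_uniques = 100_000
reset_rate = 0.075               # the discounted 7.5% figure above
extra_cookies_per_resetter = 9   # one resetter shows up as 10 cookies, not 1

resetters = true_uniques * reset_rate                       # 7,500 people
logged_uniques = true_uniques + resetters * extra_cookies_per_resetter
# 100,000 + 7,500 * 9 = 167,500 logged "uniques"

# Applying the discount recovers the true count:
corrected = logged_uniques / (1 + reset_rate * extra_cookies_per_resetter)
print(logged_uniques, corrected)  # 167500.0 100000.0
```

Even in this worst-case-flavored sketch the log overcounts by about 1.7x, which is annoying but nothing like the 10x-and-worse misses the panel extrapolations have produced.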

2) “IAB Call for Audits: Transparency or Conspiracy?” by Kate Kaye (ClickZ News, April 20, 2007)

Unlike the BusinessWeek piece, this is the insider’s article on the issue.  You want to hear what people are saying on the sidelines?  It’s all there.  

I think all this scrutiny and debunking of comScore and Nielsen will accelerate the trend of more and more advertisers asking publishers for third-party audited numbers on their traffic… and of publishers having those audited numbers ready for their advertisers. I disagree with the fear mentioned by an anonymous media agency insider, who “thinks exposure of the NetRatings or ComScore innards may backfire. If advertiser trust in these two large measurement firms erodes as a result of audits, their trust in online advertising could weaken, dampening enthusiasm towards Internet spending.” In the short term, some of the agency’s large CPG (consumer packaged goods) clients may cite the article they saw about this issue in the Wall Street Journal and threaten to reduce their online ad budgets… but in the end, they’ll keep buying more, because those online ads bring measurable results (in terms of clicks and, where applicable, online revenues), and those measurable results usually outperform whatever they’re getting from the glossy magazine ads and T.V. spots that they’re still buying.

CONCLUSION: Clearly, the big winners who will come out of this will be the “Big Four” accounting firms. The owners of major content websites have already started shelling out big bucks to get a “Big-Four-certified” audit of what is really a fairly straightforward (and largely automated) data analysis. This is a consulting/accounting services firm’s dream: an easily scaled, commodity-type service that can be sold at a premium.

At the end of the article, Stephen DiMarco of Compete is quoted, “Is this going to have the impact of a [Sarbanes-Oxley]? That sounds like a stretch.” He’s right from the point of view of the auditors, but maybe not entirely from the point of view of major publishers. The outcome of the debunking of panel-based traffic estimates will add to the cost of doing business for major publishers, just as Sarbox has added over $2 billion in accounting costs to public companies. [Mind you, how much can Deloitte, Ernst & Young, KPMG and PwC charge for server log audits? It couldn’t be that much… could it? 😛 ] For auditors, however, this will finally be a case where the added billable work will not create a tremendous CPA labor crunch, the way all the FASB/SEC-related legislation did.


About danspira

My blog is at: My face in real life appears at a higher resolution, although I do feel pixelated sometimes.

Posted on April 30, 2007, in Accounting, Advertising, Analytics. Bookmark the permalink. Leave a comment.
