Recent Articles

Final TV Tally for 2010 Olympics: 190 Million Viewers in the U.S., 3.5 Billion Viewers Worldwide

Mike Reynolds’ article in Multichannel News reports 190 million US viewers for the full 2010 Winter Olympics in Vancouver. This is 3 million more than watched the Salt Lake City Olympics in 2002 and 6 million more than watched the Torino Olympics in 2006. This year’s Winter Olympics was second only to the 1994 Olympics in Lillehammer – which had 204 million US viewers, in part because of the attention drawn by the Nancy Kerrigan/Tonya Harding incident.

According to the Vancouver Olympics website, 3.5 billion viewers worldwide watched this year’s events. That’s billion, with a b.

This kind of viewership flies in the face of claims from emerging-media boosters that TV is growing irrelevant. A massive simultaneous experience like this is in the middle of TV’s wheelhouse, and the smaller screens play more of a supporting role. TV is best at the while-it’s-happening experience, where you want to see every detail and experience it all as if you were sitting in the crowd, but were somehow omniscient and capable of flying to whatever angle showed the action best. This strength is leveraged to best effect in a live broadcast, such as the final Sunday night when the US v. Canada hockey final was battled to an overtime Canada victory. People watched on their TVs, and used their computers and smartphones to tweet about it.

Did You Play With Your TV? (a shameless plug for my #1 client, Ensequence)
If you saw the Olympics via Dish Network or Verizon FiOS, then you could access weblike interactive content on your TV screen alongside your favorite Olympics events on MSNBC, CNBC, or USA. If you clicked and interacted, leave a comment and tell me about it. What did you like about it? What did you hate about it?

Cable Moving Steadily To Advanced Advertising (DBS Has A Healthy Lead In This Race)


Per this story by Steve Donahue in Light Reading Cable, Canoe is setting expectations for measured, steady progress in advanced advertising via cable. Interactivity is beginning to be rolled out now, but targeting at the individual household level is 4-5 years away.

Seth Haberman of Visible World is quoted in the article as estimating that 60-70 million households will be addressable and interactive within that 4-5 year timeframe.

Between now and then, the story will be all about EBIF deployment and a steady increase in the sophistication of interactive capabilities offered. EBIF deployment should reach upwards of 20 million households by the end of 2010. DBS operators Dish Network and DirecTV already offer substantial interactivity in programs and advertising to 29 million households. You might be wondering what the heck EBIF is. It stands for Enhanced TV Binary Interchange Format, but really all you need to know is that it is a set of standards that makes it possible to deploy the same interactive code across all platforms that have implemented the standard. It looks like that will eventually be most cable MSOs and IPTV providers.

What does this mean? Well, it means the long-awaited promise of TV interactivity is going to be gradually fulfilled. For programming, that means enhanced content and audience participation. For advertising, it means addressability, interactivity, and response built into ads. Finally, it means T-Commerce, which will make shopping on TV as easy and ubiquitous as shopping on the web, available in both programs and ads.

The question is this: Will the internet absorb the functionality of TV (“Over-The-Top” delivery of TV programming) before TV absorbs the functionality of the internet? We will have to wait and see. I think both will continue to exist, but will morph and mutate differently because they serve essentially different viewer purposes and usage occasions.

The winners will be marketers and advertisers who crack the code about the right division of labor between the Internet, television, and mobile, delivering brand experiences that take advantage of the unique strengths of each available channel.

If Congress Thinks Cookies Violate Your Privacy, Wait’ll They Hear About This!


If you go to the Scout Analytics website and dig into the info about their offerings, you’ll find that they tested their patent-pending technology for the last six months on hundreds of thousands of users (see the press release entitled “Scout Analytics(TM) Quantifies the Inaccuracy of Cookies as a Measure of Unique Users”). The two techniques they cite as the basis for this study are biometric signatures and device signatures. The release is more revealing about the biometric approach than it is about the device signatures. The biometric signature is essentially an identifiable pattern in a person’s typing style. The device signature is something they are vaguer about, saying only that it is based on “data elements collected from the browser to eliminate errors in device counting such as cleared cookies”. The test was meant to see not only how much overcounting of unique users there was, but also how many unlicensed users were sharing the same account to get at subscription content.
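
Scout’s release doesn’t spell out the algorithm, but the general idea behind keystroke biometrics is well documented: how long you hold each key (“dwell”) and the gaps between keystrokes (“flight”) form a timing fingerprint that is surprisingly stable per person. Here’s a minimal sketch of that general idea in Python – the features and the matching threshold are my own illustrative assumptions, not Scout’s:

```python
# Hypothetical sketch of keystroke-dynamics matching -- NOT Scout Analytics'
# actual (unpublished) method. A "sample" is a list of
# (key, press_time, release_time) tuples, with times in seconds.

def timing_profile(sample):
    """Reduce a typing sample to (average dwell time, average flight time)."""
    dwells = [release - press for _key, press, release in sample]
    flights = [sample[i + 1][1] - sample[i][2] for i in range(len(sample) - 1)]
    return sum(dwells) / len(dwells), sum(flights) / len(flights)

def same_typist(sample_a, sample_b, tolerance=0.025):
    """Crude match: profiles within 25 ms of each other (arbitrary threshold)."""
    dwell_a, flight_a = timing_profile(sample_a)
    dwell_b, flight_b = timing_profile(sample_b)
    return abs(dwell_a - dwell_b) < tolerance and abs(flight_a - flight_b) < tolerance

# Two samples of the same person typing "cat" should match:
a = [("c", 0.00, 0.09), ("a", 0.21, 0.30), ("t", 0.43, 0.52)]
b = [("c", 0.00, 0.10), ("a", 0.23, 0.31), ("t", 0.44, 0.54)]
print(same_typist(a, b))  # -> True
```
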
I wonder if they got the explicit permission of the subscribers to have their keystrokes and machines profiled. And if this kind of approach were to spread beyond detection of licensing violations, I wonder how much sympathy regulators and legislators would have for it.

Google Loses Italian Privacy Case

For most of us, the recent Amanda Knox murder case was our introduction to the Italian justice system. Well, according to today’s New York Times article, several Google executives have gotten acquainted with some further nuances. For example, if you host user-generated content, you can be convicted of violating someone’s privacy if an upload to your site violates it – even if you cooperate with Italian authorities in the removal of the objectionable content and identification of the culprits.

This is a serious threat to the open sharing of information that has driven the web’s rapid adoption and growth. To force sites like YouTube to do prior filtering and checking would impose a huge burden on such sites, and could undermine the viability of their business model. Worse, though, legislation purporting to protect the citizens of Italy could instead result in robbing them of free access to the web and all its unpredictable and messy usefulness. If the world ends up divided between net-freedom-haves and net-freedom-have-nots, Italy could end up on the same side of that line as China. That is not the side I’d choose to live on, no matter how good the wine and cheese are.

The Cookie vs. The LSO – Should I Care? Should I Worry?

Here’s a question that savvy web users were being asked by their parents 10 years ago:
What the heck is a cookie, and why do I have them on my computer? Do I need to delete them? How do I delete them?

Don’t be surprised if the question starts to come up again, in a new form:
What the heck is an LSO, and why do I have them on my computer? Do I need to delete them? How do I delete them?

The issue is emerging again because the people in the business of targeting ads or offers are trying to do their job better, and cookies are not doing the job advertisers want done. So, some web programmers are exploiting a feature of Flash to create “stealth cookies” called LSOs, in hopes that you won’t delete them because you probably don’t know how.

Remind me: What is a cookie again?
A cookie is a small text file that a website creates via your browser to keep track of session “state”, past entries, and site activity.

What is a cookie for?
The connectionless protocols used by the web do not automatically keep track of any history. If no state or history information is provided with a page request, the server has no idea who you are, even if you just entered that info on a different page of the same site.
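
Mechanically, it’s just a pair of HTTP headers: the server sends a Set-Cookie header, and your browser echoes the value back in a Cookie header on every subsequent request to that site. A tiny illustration using Python’s standard library (the session-ID name and value are made up for the example):

```python
from http.cookies import SimpleCookie

# Server side: issue a cookie on the first response.
cookie = SimpleCookie()
cookie["session_id"] = "abc123"          # hypothetical session identifier
cookie["session_id"]["max-age"] = 86400  # live for one day
print(cookie.output())                   # -> Set-Cookie: session_id=abc123; Max-Age=86400

# Browser side: every later request to the same site carries the value back,
# which is the only way a "connectionless" server knows it has seen you before.
incoming = SimpleCookie("session_id=abc123")
print(incoming["session_id"].value)      # -> abc123
```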

What’s so scary about that? Well, people just don’t like their activity being recorded without their permission or awareness. It annoys them. That said, there are useful things that this kind of snooping makes possible:

  • remembering your site settings and preferences
  • remembering and auto-entering your userid in the login screen
  • automatically logging you in when you arrive at a site
  • not showing you ads for things you don’t care about and would never buy
  • remembering the contents of your shopping cart from your last visit
  • remembering the contents of your wish list

At the same time, it makes possible:

  • targeting you for ads based on prior site searches
  • targeting you for ads based on prior site surfing
  • snooping and prying for evil reasons

Cookie Deletion
When many people figured all this out, it became a big kerfuffle, and this led to user behavior such that 23% of all cookies are deleted by the time they are one week old, and less than half of all cookies (43%) live to be more than eight weeks old (click here to see Microsoft research about cookie deletion). Users can use functionality in their browsers to delete cookies and to control cookie-related policies within the browser.

So who cares? What problems does cookie deletion cause?
If you are an internet advertiser, it adds one more layer of complexity to the already difficult problem of tracking internet ad campaigns. You’ll have tracking pixels in ads to capture views and clicks, but knowing how many times someone has seen an ad during a campaign (frequency) and how many distinct individuals have seen an ad (reach) is pretty critical to understanding what is going on in a campaign, especially as more brand advertising comes online. Measurement is made difficult in internet advertising by these factors:

1. The same person will browse from multiple computers
2. The same person will see the same campaign on screens other than computers (smartphones, etc.)
3. The same computer can be used by multiple people who may or may not have separate logins
4. Many machines have multiple browsers installed, and a person might not always use the same one – cookies belong to a specific browser
5. Some people severely restrict cookie functionality using browser security settings
6. Many people delete the cookies from their computers, with different people doing so at different intervals
Net/Net: Bad Measurements
On balance, these issues push the measurements in the direction of overcounting reach and undercounting frequency.
Some of the other deficiencies of cookies from an advertiser’s point of view are that cookies don’t store very much information (4KB), and there can only be so many cookies related to a given domain (20). Privacy considerations additionally limit how much cross-site behavior can be captured in cookies (and banner campaigns are mostly cross-site).
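
To see why the skew goes in those directions, here’s a toy simulation (my own illustration, not part of the Microsoft study): every deleted-and-reissued cookie shows up as a brand-new “unique user”, so measured reach climbs while measured frequency falls.

```python
import random

# Toy model: 1,000 real users, each visiting 10 times over a campaign.
# On each visit, a user has a 5% chance of having deleted their cookie
# since last time, which causes a fresh cookie ID to be issued.
random.seed(42)
next_id = 0
visits_per_cookie = {}

for user in range(1000):
    cookie_id = None
    for visit in range(10):
        if cookie_id is None or random.random() < 0.05:
            cookie_id = next_id  # deletion -> new cookie, looks like a new person
            next_id += 1
        visits_per_cookie[cookie_id] = visits_per_cookie.get(cookie_id, 0) + 1

print("true uniques:      1000")
print("measured uniques: ", len(visits_per_cookie))      # expect roughly 1,450: reach overcounted ~45%
print("measured avg freq:", round(10000 / len(visits_per_cookie), 1))  # ~7 vs. a true 10
```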

LSOs Address Some of These Shortcomings For Advertisers (Yay!), But Create New Ones for Users (Boo!)

An LSO (Local Shared Object) is a cookie-like file that Flash uses to store information for Flash applications. But clever web programmers use them for far more than that: some sites use them just like really big cookies (as much as 25 times bigger than a cookie) that you don’t know about and so won’t delete. In addition, the same LSOs are accessible from all browsers on the machine, and your browser security controls have little or no impact on these things.

You Might Want To Check Your Computer For LSOs Right Now

If you don’t believe me, go to the Macromedia page that lets you see what LSOs are on your machine (it also lets you delete them, enable/disable them, and control their behavior).
It is located here: http://www.macromedia.com/support/documentation/en/flashplayer/help/settings_manager07.html.
While you are there, delete the ones for sites you don’t want your boss to know about.
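
If you’d rather poke around the file system directly, Flash keeps LSOs as .sol files in a handful of per-OS locations. A quick sketch (the paths below are the commonly documented defaults; your machine may differ):

```python
import glob
import os

# Commonly documented default locations for Flash Local Shared Objects (.sol).
# These can vary by OS version and Flash Player version.
candidate_dirs = [
    "~/.macromedia/Flash_Player/#SharedObjects",                     # Linux
    "~/Library/Preferences/Macromedia/Flash Player/#SharedObjects",  # macOS
    os.path.join(os.environ.get("APPDATA", ""),
                 "Macromedia", "Flash Player", "#SharedObjects"),    # Windows
]

for d in candidate_dirs:
    pattern = os.path.join(os.path.expanduser(d), "**", "*.sol")
    for sol in glob.glob(pattern, recursive=True):
        print(sol)  # each file is one site's LSO -- a "stealth cookie"
```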

As for where this is all going: all privacy loopholes on the web are temporary. There are already browser add-ins that let you control and delete LSOs, and at some point the browsers will absorb that functionality to make it easy for you to use. If I were you, I’d worry more about the things you can’t see: the new keystroke-dynamics technique for identifying users announced by Scout Analytics (here) and backend ISP- and CDN-based tracking – all fodder for more paranoid posts in the future.

Opening Weekend for 2010 Winter Olympics: 117 Million US Viewers

Check out this article by Robert Seidman on TVbytheNumbers.com. In it, he cites Nielsen ratings indicating this year’s opening ceremonies beat the Torino Olympics’ opening ceremonies by 5 million viewers, and the average of 28.6 million viewers over the first weekend beat Torino’s first weekend by 25%.

This beats the 106.5 million viewers the previous weekend for the Super Bowl (see the prior post in this blog), but that audience was much more concentrated in time: the Super Bowl got a 68 share, while the Olympics’ first weekend got a 26.

The Olympics also did well on the smaller screens. Three Olympics apps are currently in the top 10 on iTunes, and NBCOlympics.com traffic is 250% higher than it was for Torino. It has only been a few days, but there have already been more unique viewers for NBCOlympics.com during the Vancouver Olympics than there were for the whole Torino Olympics.

Play With Your TV! (a shameless plug for my #1 client, Ensequence)
If you are watching the Olympics via Dish Network or Verizon FiOS, then you can access weblike interactive content right on your TV screen alongside your favorite Olympics events. Once you tune to MSNBC, CNBC, or USA, a prompt will pop up (nothing on NBC itself, as far as I know). Clicking the “Select” button on your remote starts an interactive experience that includes Top Stories, Medal Counts, Athlete Bios, and more. Real interactive TV in the wild. Check it out!

Nielsen Estimates 106.5 Million Viewers for Super Bowl XLIV (aka Beating the Pants Off Elvis)

In a story by David Bauder of the Associated Press, A.C. Nielsen went on record estimating that 106.5 million viewers watched Super Bowl XLIV (see it at WashingtonPost.com HERE). That is the most heavily viewed event in TV history, bigger than (according to Wikipedia):
– the final episode of M*A*S*H (105.97 million viewers)
– last year’s Super Bowl (98.7 million viewers)
– the Beatles’ first appearance on The Ed Sullivan Show (73 million viewers)
– Elvis’ first appearance on The Ed Sullivan Show (60 million viewers)

To be fair to the shows of yesteryear, the total US population (and the number of households with TVs) has continued to increase since those days. There are about 305 million people currently living in the US, so about 1/3 of the entire population watched the game.
It is remarkable in our modern splinter-group society that we could find something that such a huge group of people could watch together, especially when you could rule so many people out right at the starting gate: infants, toddlers, anyone in solitary confinement, anyone unconscious or too sick to care, anyone at work in a job where you can’t watch TV while you work, almost anyone who was in an airplane at the time, and most people who immigrated from countries where a “football” is something spherical.

I don’t know the full importance of this number, but it does suggest that:
1. TV has not been made irrelevant by the Internet, despite Internet entrepreneurs’ claims
2. People will still show up in giant hordes to watch a TV event en masse, if the product they are watching is enticing enough
3. Ed Sullivan really blew it by making them shoot Elvis from the waist up. The least remarkable half of him got 60 million viewers. Who knows what the full Elvis could have scored?

Alan Wurtzel’s Editorial in the Q3 Issue of the Journal of Advertising Research

More good metrics reading from the JAR: After my prior posting on metrics articles in the Q4 issue of the Journal of Advertising Research, it occurred to me that I did not mention the editorial that NBC’s Alan Wurtzel wrote in the Q3 issue…

“Now. Or Never – An Urgent Call to Action for Consensus on New Media Metrics”, by Alan Wurtzel, President of Research and Media Development at NBC Universal
In this editorial, Alan Wurtzel lays out what he believes is a critical juncture for the measurement of new media. He sums it up this way: “You can’t sell what you can’t measure, and, unfortunately, our measurement systems are not keeping up with either technology or consumer behavior.” The problem isn’t a lack of data – the samples are getting bigger and the precision is getting greater. The problem is that technical challenges make it hard to assess the validity of measurements. Without precise and transparent definitions of how data are gathered and how metrics are calculated from them, Wurtzel says, programmers cannot depend on the numbers as a basis for decision-making. Proprietary considerations are holding vendors back from providing this level of visibility into their processes.

Wurtzel cites a case – quoted by many sources last fall – where NBCU purchased and compared viewership data for the “Heroes” finale from several different set-top box (STB) vendors. The difference between the highest and lowest measurement of the show’s ratings was 6%, which translates into a $400,000 difference in revenue. While 6% sounds low, the “Heroes” finale had ratings high enough that its measurements should have shown relatively low variation, meaning that the variation in ratings for lower-rated shows would be much worse. And this is variation in purportedly directly-measured STB data, which should have had little variation at all.

According to Wurtzel, there are serious differences between vendors that cause this variation. For example, there is no standard way to determine from the STB data stream whether an STB-attached TV is on or off, so every data vendor has come up with its own algorithm for deciding when the TV is on or off, and they aren’t sharing those algorithms. There are other similar “edit rules” that each vendor carefully guards, and these create differences in the measurements generated. And when you think of the task as not just measuring TV, but building an integrated understanding of how a program works across three screens (TV, mobile, and Internet), you are looking at huge gaps in comparability and meaning of metrics from screen to screen.
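
Since the vendors won’t share their edit rules, here is a made-up example of the kind of heuristic they each have to invent, just to make the problem concrete. Two vendors with different cutoffs will credit different viewing hours from the same data stream:

```python
# Hypothetical STB "edit rule" -- an illustration of the kind of heuristic each
# vendor invents, NOT any vendor's actual algorithm: stop crediting a tuning
# session as viewing once no remote/tuning event has occurred for CUTOFF_HOURS.
CUTOFF_HOURS = 4.0

def credited_viewing_hours(event_times):
    """event_times: sorted timestamps (in hours) of remote/tuning activity."""
    credited = 0.0
    for start, end in zip(event_times, event_times[1:]):
        gap = end - start
        # Short gaps count as viewing; long gaps are capped, on the assumption
        # that the set was left on (or turned off) with nobody watching.
        credited += min(gap, CUTOFF_HOURS)
    return credited

# One box: events at 0.0h, 0.5h, 1.0h, then silence until 9.0h.
print(credited_viewing_hours([0.0, 0.5, 1.0, 9.0]))  # -> 5.0 (vs. 9.0 uncapped)
```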

This was written last fall. What grew out of this thinking was the CIMM, which I have discussed in prior posts. What is likely to happen in the long run is anyone’s guess, but Alan’s article reads like a set of product requirements for the ultimate three-screen audience metrics platform, so the best outcome would be for some smart entrepreneur to develop just such an offering. Hmmm… I’d say keep your eyes on the marketplace.

The December Issue of the Journal of Advertising Research (JAR) has Great Metrics Articles!

There are several useful articles this month in the Journal of Advertising Research. They are on a roll over at the JAR, driving some great discussion in the last few months about the measurement of marketing, digital and otherwise. Recommended reading in this month’s issue:

“Commentary: Who Owns Metrics? Building a Bill of Rights for Online Advertisers”, by Benjamin Edelman, Harvard Business School Assistant Professor in Negotiation, Organizations & Markets
Ben Edelman, who has written on the role of deception and overcharging in online media (among other topics), is right on target here: he argues that advertisers have a right to know where and when their ads are being shown, delivered in the form of meaningful, itemized billing. He also asserts advertisers’ ownership of the data that comes from their campaigns, and says they should (for example) be able to use data collected from their Google PPC campaigns to target campaigns on MS AdCenter or Yahoo! This is definitely a controversial area – certainly Google, along with cable and satellite TV operators, would disagree – so read it and let me know what you think.

“It’s Personal: Extracting Lifestyle Indicators in Digital Television Advertising”, by George Lekakos, Assistant Professor in e-Business at the University of the Aegean, Greece
In case you think my comment about TV distributors wanting to own audience data is irrelevant in the context of digital marketing, Lekakos lays out an approach by which TV set-top box data can be used to discover lifestyle segments and drive very accurate personalization and targeting of ads. But the question of whether the data belongs to the distributors, the programmers, or the advertisers is quite critical to whether this can be implemented, and I’d have to say that question is far from settled.

“Measuring Advertising Quality on Television: Deriving Meaningful Metrics from Audience Retention Data”, by Dan Zigmond, Sundar Dorai-Raj, Yannet Interian, and Igor Naverniouk
The authors explore the use of audience retention metrics captured via TV set-top boxes as a measure of ad quality. They use a “retention score” that purports to isolate the effect of the ad creative on audience retention, and they link it with future audience response and qualitative measures of ad quality. They assert its usefulness as a relevance measure that could be used to optimize TV ad targeting and placement. Again, the issue of data ownership needs to be dealt with if this approach is going to be applied widely.
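
The paper’s actual formula is more sophisticated, but the core idea can be sketched in a few lines: compare how much of the audience an airing actually kept against a baseline for what a typical ad in that slot keeps. This is my simplified rendering of the concept, not the authors’ model:

```python
def retention_score(audience_at_start, audience_at_end, expected_retention):
    """
    Rough sketch of an audience-retention metric for one ad airing.
    expected_retention: the fraction of the audience a typical ad in this slot
    keeps (a baseline that the real paper models far more carefully).
    Scores above 1.0 suggest the creative held viewers better than expected.
    """
    observed_retention = audience_at_end / audience_at_start
    return observed_retention / expected_retention

# An airing that keeps 92% of viewers in a slot where 95% is typical
# scores below 1.0 -- i.e., this creative drives tune-away.
print(round(retention_score(100_000, 92_000, 0.95), 3))  # -> 0.968
```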

“The Foundations of Quality (FoQ) Initiative: A Five-Part Immersion into the Quality of Online Research”, by Robert Walker, Raymond Petit, and Joel Rubinson
To address both the increasing importance of online research and questions about its validity, the FoQ Initiative was undertaken to measure the quality of online research. The Online Research Quality Council included large advertisers, ad agencies, academic researchers, and research suppliers in the process. Among the issues they addressed: accuracy, representativeness, and replicability of results; identification and handling of questionable survey-taking behaviors; and the suspicion that a small number of “heavy” online respondents are taking most online surveys.

Some of the interesting findings:

  • There is significant overlap in membership of various online research panels, but no evidence this causes data quality issues
  • Multiple panel membership actually lowers the odds of “bad” survey-taking behavior by 32%
  • You should keep surveys short – longer surveys increase the occurrence of “bad” survey-taking behavior by 6X
  • Age matters – younger respondents had 2X the occurrence of “bad” survey-taking behavior than older ones

Facebook Dominates Social Media Searches (Yet More Fun With Google Trends)

Playing with tools is fun – I did another Google Trends search, this time comparing “Facebook” to “MySpace”, “YouTube”, and “LinkedIn” as reference points. Wow – searches for “Facebook” have grown amazingly fast (see the first chart, below). I wish I had bought a piece of that company 2-3 years ago.

It occurred to me that there should be a corresponding trend in searches for “social networking” relative to other online marketing activities (e.g., email, search, display advertising). Searches for “social networking” have had a huge growth rate, but the absolute volume turns out to be really small compared to “email” and “search” – I guess there is still time to get on that bandwagon. The search volume for “Facebook” crushes those terms, but that is harder to interpret, because those searches are much more likely to come from ordinary users, not just marketing professionals.
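
For anyone who wants to reproduce this kind of comparison programmatically rather than eyeballing charts in the Google Trends UI, the unofficial third-party pytrends library (which didn’t exist when I ran these searches) can pull the same series. A sketch, assuming pytrends is installed:

```python
# Sketch using the unofficial, third-party pytrends library
# (pip install pytrends). The term list mirrors the comparison above.
from pytrends.request import TrendReq

pytrends = TrendReq(hl="en-US", tz=360)
pytrends.build_payload(["Facebook", "MySpace", "YouTube", "LinkedIn"],
                       timeframe="all")
trends = pytrends.interest_over_time()  # pandas DataFrame, one column per term
print(trends.tail())
```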