First up is this story, picked up by (among others) the BBC and consequently read by lots of people. Let's concentrate on the following statement, which was portrayed almost as the most interesting bit of the survey in the BBC's eyes (and a few others' too):
For instance, Nielsen said, iTunes users were 2.2 times more likely to own a Volkswagen than the average internet user. Audi and Subaru were also popular with regular users of the Apple store.
The research also revealed that the most popular alcohol drink was cider followed by imported beers. Top magazine among iTunes fans was hi-tech bible Wired.
This comes from the same Nielsen about whom I read the following this week:
The company's ratings are extrapolated from the viewing habits of 9,000 households, of which only 60 have DVR's.
I don't know about you, but reading that made me seriously question the validity of any of their research!
Back to the iTunes research... the key finding that struck me as bizarre is that cider is the favourite alcoholic drink, followed by imported beer. I'm sorry, but there is no way that this makes any sense! If the survey was done just in the US, I believe it even less (when I lived there, cider was a very, very infrequent drink, at least in alcoholic form, though apple juice was sold as cider). Even in the UK and Ireland, cider's market share is below 10% of "long alcoholic drinks". Now, if the question had been phrased along the lines of "what is your favourite alcoholic drink?":
a. Bud
b. Bud Light
c. Miller
d. Coors
e. Miller Light
f. Other American beer
g. Imported beer
h. Cider
then perhaps I could see how cider might come close to being dominant (but I'm still not sure). Otherwise, this result is completely bogus.
As for the Volkswagen finding, well, VW's market share in the US is what? 3%? 5%? So how many people do you need to survey to get something significant, like the fact that iTunes users are 2.2x as likely to drive VWs? And is it any surprise? I'm sure not so many drive Oldsmobiles or Cadillacs; VWs are often the cars of the young in the US, and of the more liberal (iPod-loving!) states anyway.
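Just to illustrate the sort of back-of-envelope check I mean, here is a rough sketch (in Python) of how many respondents per group a simple two-proportion test would need before a 2.2x lift on a small base rate is even detectable. The 4% VW base rate is my own guess, not a figure from Nielsen.

import math

# Rough sanity check: how many respondents per group would a simple
# two-proportion z-test need to detect a 2.2x lift in VW ownership?
# The base rate below is a guess for illustration, not Nielsen's figure.
p_base = 0.04              # assumed VW share among internet users generally
p_itunes = 2.2 * p_base    # the claimed rate among iTunes users (~8.8%)

def required_n_per_group(p1, p2):
    """Approximate n per group at 5% significance and 80% power."""
    z_alpha, z_beta = 1.96, 0.84
    p_bar = (p1 + p2) / 2
    top = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
           + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(top / (p1 - p2) ** 2)

print(required_n_per_group(p_base, p_itunes))   # roughly 400 per group

With these guessed rates you need a few hundred iTunes users and a few hundred non-users just to detect the difference at all, and that says nothing about whether either group is representative in the first place.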
No, this research is completely useless (at least in the way it has been portrayed). It is research that is done and publicised not for its value but purely to push the researcher's name out to the world.
Next up is this piece of garbage, reported by (among others) CNet:
Admittedly I'm not the first to pan this, though I can't find the links. But essentially they claim that iWork has now taken the number two spot in the office software market after Microsoft Office, with a 2.7% market share. Anyone with an inkling of knowledge about market shares knows this cannot be true. Firstly, iWork does not have a spreadsheet application, so personally I don't believe it should be categorised with these suites. But even if you count it, Apple's market share even in the US is, what, 5%? That implies at least half of all Apple users are using iWork instead of Office, and I do not believe for one minute that that is true. I know of few iWork users (I am one), and even many of them would still use Office, and particularly Excel, some of the time.
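To put rough numbers on that implication (the 2.7% figure is theirs; the ~5% installed-base share is my guess), a quick back-of-envelope in Python:

# If iWork really held 2.7% of the office suite market while Macs were
# only about 5% of the installed base (my assumption), what fraction of
# Mac users would have to be running iWork?
iwork_suite_share = 0.027    # the claimed market share
mac_installed_base = 0.05    # assumed Apple share of computers in use
print(f"{iwork_suite_share / mac_installed_base:.0%}")   # ~54% of Mac users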
I'm sure this story boils down more to bad research: not covering all the channels through which such software is sold (e.g. OEM and corporate channels), or not covering a long enough time window (e.g. software sales will surge after a new release and decline hugely before a new one is expected).
So, to all those quoting these bits of research, please do a bit more investigation into what is being researched, why it is being researched and promoted to you, and whether it really is valid.
Footnote: I should note my part-time involvement in a small internet-based market research company, http://www.tpoll.com. I would be horrified if they produced such garbage.
2 comments:
If it's got more than 2,000 people and the sample is valid for the survey, then it's representative of the population.
The proportion of the population who own VWs and who drink cider is known.
The proportion who visited iTunes and drink cider, or drive VWs, is known through this survey.
Hence - result. Quite a simple process, no?
Charles
Thanks for stopping by. I wasn't pleased with my posting - it was a bit rushed.
Of course you are right in theory, but if I can dwell on this point:
>and the sample is valid for the survey
That's the nub of my argument, and that's where, to me, most research falls down. I've given you a reason above why we should be sceptical of Nielsen, given the recent debacle on TV ratings based on a group almost none of which had a DVR! Getting a representative sample that isn't self-selecting is very hard; that's why things like epidemiological studies are so difficult to do. I would also hazard a guess that if you took a random sample of people of the same ages, locations, wealth, etc., but without the iPod bit, you might find the same results, meaning that the iPod-owning aspect is actually immaterial.
One of the things I've observed from my partial involvement with MR is that you must correct for all sorts of things for research to be significant. For instance, if you collect a sample of 2,000 and it is somewhat skewed in a certain demographic, you have to correct for that. Overall this gets very hard to do, especially if you start looking at differences of just a few people either way.
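For what it's worth, the kind of correction I mean is simple demographic weighting. A toy sketch in Python, with all the figures invented purely for illustration:

# Post-stratification: weight a skewed sample back to known population
# proportions. Every number here is made up for illustration.
population_share = {"18-34": 0.30, "35-54": 0.40, "55+": 0.30}
sample_counts    = {"18-34": 1200, "35-54": 600, "55+": 200}   # skewed young

total = sum(sample_counts.values())
weights = {group: population_share[group] / (sample_counts[group] / total)
           for group in sample_counts}

# Over-represented groups get a weight below 1, under-represented ones above.
print(weights)   # {'18-34': 0.5, '35-54': 1.33..., '55+': 3.0}

Even this toy version shows how much the answers from the 200 older respondents end up driving the result once they carry a weight of 3, which is exactly where differences of just a few people either way start to matter.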
Finally, there is the reporting, which can be skewed by culture. I had read "hard cider" in another report (I can't remember which), which implies it is alcoholic cider. As I pointed out, in my experience hard cider was quite rare in the US. It could be that it wasn't hard cider but cider (including apple juice) and gets reported here as cider. I don't know, as I don't have access to the full research (or, more importantly, the questions). But what I do know for sure is that cider is not the preferred long alcoholic drink of any set of Americans (except perhaps the Cider Drinking Association of America). If any Americans would care to debunk that and say that hard cider (in all its brands) is now as frequently consumed as even one of Miller, Bud, or Coors, I might start to believe it could be true. But otherwise, there is something in the questioning, the analysis, or the reporting that doesn't add up, and consequently an incorrect conclusion is drawn.
Anyway, I'm rushed today again, and still not making these points well. But when you say:
>Quite a simple process, no?
That's what gets me worried. Messing around in statistics is never a simple process, and we as a population are not questioning enough of the results we are presented with, which nearly always carry some agenda.