Audiences – the lifeblood of the media
Audience figures… the lifeblood of media organisations. The bigger your audience, the more you can charge advertisers for promoting their brand.
But how do you calculate an audience, and how do you know whether that’s a good or bad figure?
Here’s a story from 3FM this week about viewing figures for the first ever live broadcast from Tynwald.
MTTV used livestreaming app Periscope to show live video from the Tynwald chamber, and the audience figures reached around 375 for the stream (not including those who watched after the fact).
Using the population of the Island at around 88,500, that works out at roughly 0.4 per cent of people connected to the live stream.
Props to Paul Moulton who set the stream up, but he wasn’t impressed by “negative coverage”, criticising the fact the entire Island population was used to calculate a figure.
But even using the economically active population figure given by the government in its last unemployment data, it’s still a very small number of people.
1.3 per cent unemployment represents 579 people, which puts the economically active population at 44,538. That means fewer than one per cent of the workforce watched the stream – and adding pensioners and people too ill to work back into the total only makes the percentage smaller.
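For anyone who wants to check the arithmetic, a few lines of Python reproduce the figures (the inputs are the ones quoted above):

```python
# Sanity-checking the audience percentages quoted above.
STREAM_VIEWERS = 375
ISLAND_POPULATION = 88_500
UNEMPLOYED = 579
UNEMPLOYMENT_RATE = 0.013  # 1.3 per cent

# Share of the whole Island population that watched live.
share_of_island = STREAM_VIEWERS / ISLAND_POPULATION * 100
print(f"{share_of_island:.2f}% of the Island population")

# Economically active population implied by the unemployment figures.
economically_active = UNEMPLOYED / UNEMPLOYMENT_RATE
print(f"{economically_active:,.0f} economically active people")

# Share of the workforce that watched live.
share_of_workforce = STREAM_VIEWERS / economically_active * 100
print(f"{share_of_workforce:.2f}% of the workforce")
```

Whichever denominator you pick, the live audience comes out at well under one per cent.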
Why audience size matters
It’s no criticism of the attempts to televise Tynwald: any effort to give people access to parliament and their democracy is worthwhile.
But it does sharply focus the reasons most outlets don’t offer much live coverage of the court’s proceedings: the audience is simply too small to make it worthwhile.
Commercial considerations such as those are very important when it comes to most media businesses. They’re in the game to make money, of course.
And if you’re going to advertise your company, you’ll want to know how many people will see or hear your advert.
And if you’re about to spend your company’s money, you’ll want to know where those figures come from: if they’re plucked from the air, why should you trust them?
Under the RAJAR
Interestingly, of the three radio stations on the Island, only Energy FM doesn’t take part in RAJAR.
RAJAR is the industry-standard way of measuring radio audiences. It’s far from perfect: it relies on a sample of people filling in “listening diaries” and then scales those results up.
It explains, for example, why presenters constantly tell you which station you’re listening to: if you’re about to fill in your diary, they want you to write their station down.
But it is the industry standard, used by almost every radio station in Britain and widely recognised as the most accurate way of gauging audience levels.
A Facebook post from Energy FM a couple of weeks ago had this to say about its audience:
Wait. Tens of thousands? Is that per week or at any one time? What does that figure mean?
There’s a page on Energy’s website that deals with why you should advertise there.
There are even more questions raised here, and not just about the English (“most text number’s on the Island”?).
The Facebook post says the station gets tens of thousands of listeners. But on the web page, the station says it doesn’t use RAJAR. So where does that number come from?
There are lots of reasons given about why RAJAR is such a terrible thing. Let’s look at some of those.
“The Survey figures are released almost 3 months after the survey. It’s out of date the moment it’s released.
The survey covers a rolling period of 12 months at 3 month intervals – Most of it is so far out of date it’s irrelevant.”
If they were released 10 minutes after the survey, they’d be inaccurate. Some people would have tuned in, and others turned off. RAJAR gives a snapshot, not a live number. Does Energy have current, live listening figures? If so, how?
“The sample size for the Isle of Man is so small, in our opinion it is open to massive error.
The sample size is a total of 612 people in a whole year. (Source Rajar Jan17)
Less than 12 people a week are surveyed across all age ranges. Yes less than 12!”
Last things first: it isn’t 12 people a week (let alone fewer than 12). RAJAR isn’t sampling every week. If I asked 100 people tomorrow what they thought of something, that’s not the same as asking 2 people a week over a year. Dividing the RAJAR sample size by 52 is meaningless.
So is RAJAR’s 612 people too small a sample?
Let’s do some A-level (possibly even GCSE) statistics. Here’s a handy page with a minimum sample size calculator:
The confidence level is how sure you want to be that the true figure falls within your margin. The confidence interval is that margin for error, in percentage points either way. The population is the total possible audience.
Try popping these numbers in: 95% confidence of a 4% margin of error in a population of 88,500.
Minimum sample size needed? 596. That’s below RAJAR’s 612, so the sample is actually large enough to avoid massive errors, no matter what anyone’s opinion is.
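The calculation behind calculators like that one is straightforward. Here’s a minimal Python sketch using the standard formula for a proportion, with a finite population correction (the exact answer can differ by a person or two depending on rounding conventions):

```python
import math

def minimum_sample_size(population, confidence=0.95, margin=0.04, p=0.5):
    """Minimum sample size for estimating a proportion.

    Uses n0 = z^2 * p * (1 - p) / margin^2, then applies the
    finite population correction for a population of the given size.
    """
    # z-scores for common confidence levels; hard-coded to avoid dependencies.
    z_scores = {0.90: 1.645, 0.95: 1.96, 0.99: 2.576}
    z = z_scores[confidence]
    n0 = (z ** 2) * p * (1 - p) / margin ** 2      # infinite-population size
    n = n0 / (1 + (n0 - 1) / population)           # finite population correction
    return math.ceil(n)

print(minimum_sample_size(88_500))  # around 596, matching the calculator
```

Note that p=0.5 is the conservative, worst-case assumption: it maximises the required sample size, so any real survey needs no more than this.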
The War on Error
On top of that, consider this: if the sample size were so small that it created massive errors, why does every survey produce only very small swings in listenership figures? Surely you’d expect the numbers for the other two stations to vary wildly if the system were prone to inaccuracy?
Instead, as you’d expect, there are small fluctuations every three months – in keeping with a system that has only small margins of error.
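To put a number on “small margins of error”: at a 95 per cent confidence level, a sample of 612 drawn from a population of 88,500 implies a margin of roughly ±4 percentage points. A sketch of that calculation:

```python
import math

def margin_of_error(sample, population, z=1.96, p=0.5):
    """Margin of error for a proportion, with finite population correction.

    z=1.96 corresponds to a 95 per cent confidence level; p=0.5 is the
    conservative worst case.
    """
    standard_error = math.sqrt(p * (1 - p) / sample)
    fpc = math.sqrt((population - sample) / (population - 1))
    return z * standard_error * fpc

moe = margin_of_error(612, 88_500)
print(f"±{moe * 100:.1f} percentage points")  # roughly ±4
```

Fluctuations of a few percentage points between surveys are exactly what that margin predicts.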
“The figures are not an exact figure. Rajar themselves admit it only produces an ‘estimated weekly average reach and hours’ – in our view it’s like sticking your finger in the air.”
Not quite. RAJAR’s methodology uses mathematics rather than sticking fingers in the air. If you ask me, sticking fingers in the air is more likely to produce a figure along the lines of “tens of thousands”.
All of which raises the question: if you don’t believe in RAJAR’s methods of measurement, how exactly are you measuring your own audience more reliably and accurately? What’s your sample size? How are you collecting the data?
As Energy concludes: “As they say, there are lies, damn lies and there are statistics!”