Navigating the choppy waters of…

Annie goes with her convictions. She has strongly held beliefs about the world and holds on to evidence that backs them up. She’s not certain about everything, of course, but she gets most of her information from a limited range of newspapers. Annie believes data can make a difference, but only if it is analysed by the right people.

Beth, on the other hand, is too grown-up for any of that nonsense. She reads widely and knows that evidence can be twisted in different directions depending on who sets the agenda. She’s largely given up reading the news now, as she knows there’s no hope for the truth in a world of sound bites and political spin.

You may be completely an Annie or totally a Beth, but it’s more likely you’re somewhere in between.

As statisticians, we have confidence that data can assist and improve the quality of public decision-making. It’s happened before, when politicians have legislated to eliminate the harmful effects of leaded petrol or widespread smoking. But it’s often difficult to navigate the choppy waters of political claim and counter-claim, where the truth can be so hard to pin down.

Full Fact offers a way out for Annie and for Beth. It’s an independent, non-partisan factchecking charity that exists to promote accuracy in public debate. Its dedicated factcheckers investigate behind the scenes of bold political claims, publishing articles that help readers weigh up the evidence on either side of the argument and reach conclusions for themselves.

With some 30,000 Twitter followers and thousands of readers each week, Full Fact is a recognised source of impartial information that helps people of all political persuasions reach reasoned conclusions. The charity’s work has secured corrections from almost all of the UK’s major national newspapers, and its factcheckers have shared their expertise everywhere from the ‘Today’ programme and ‘Question Time’ on the BBC to Sky News and CNN.

During the 2015 general election, Full Fact ran an 18-hour-a-day, seven-day-a-week factchecking operation to help untangle tricky political claims.

Here are some of their insights into how data isn’t always what it seems.

Comparisons over time

Let’s say you want to know how sales of music have changed over the last fifty years. If you just counted the number of records sold, you’d miss the major changes that moved the market through cassette tapes and CDs to a world where almost all music is bought online. Fail to account for those changes and you fail to understand the true story of music sales over time.

Many political claims suffer from the same problem: comparing through time doesn’t always make sense when there are other changes afoot.

This year’s UK election campaign had its fair share of such claims. For example, Ed Miliband claimed that three times as many people are on zero hours contracts now as in 2010.

The statistics available do not accurately show a trend over time, as the Office for National Statistics (ONS) makes clear.

When the ONS measures the number of people on zero hours contracts in their main employment, it does so using an interview-based survey. But the results of this survey are only accurate if people know what their own contract is.

Zero hours contracts have attracted a great deal of attention over the past couple of years, and that rise in public awareness is likely to have increased the number of survey respondents reporting that they are on zero hours contracts, independent of any change in how many people actually work on them.

Nobody knows how much of the increase really is more people on zero hours contracts, and how much is just increased awareness.
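
To see how large this effect can be, here is a toy sketch in Python. Every number is invented for illustration rather than taken from the ONS: the true number of people on zero hours contracts is held constant, and only the share of them who recognise and report their contract type changes.

```python
# Toy sketch (all numbers invented, not ONS figures): a survey estimate can
# rise sharply even when the true number of people on zero hours contracts
# does not change, if more of them become aware of their contract type.

true_on_zero_hours = 1_400_000  # hypothetical true figure, held constant

# Hypothetical share of those people who recognise and report their
# contract correctly when surveyed, rising as media coverage grows.
reporting_rate = {2010: 0.20, 2012: 0.35, 2014: 0.60}

for year, rate in reporting_rate.items():
    measured = true_on_zero_hours * rate
    print(f"{year}: survey would estimate about {measured:,.0f} people")
```

In this stylised example the survey estimate trebles between 2010 and 2014 without any change in the number of people actually on such contracts, which is exactly the ambiguity the ONS warns about.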

So the next time you see a claim that something’s changed, ask yourself: what else could have changed in that time?

Just Desserts: Know your statistics

Dessert or desert? Words that sound the same can have very different meanings. Similarly, factual claims that sound the same aren’t always what they seem.

For example, during the 7-way leaders’ debate in April, David Cameron said: “We have created two million jobs”. The two million figure is correct, but there’s a difference between the number of jobs and the number of people in employment.

One person can have more than one job, and one job can be done by two or more people, for instance through job-sharing. These differences sound small, but multiply them over the workforce and they’re crucial: there are 31 million people in employment, but 33.5 million jobs.
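
The distinction is easier to see in miniature. Here is a toy example in Python with made-up records, not ONS data, counting the same small set of job records in the two different ways.

```python
# Toy illustration (made-up records, not ONS figures): counting jobs and
# counting people in employment are not the same thing.

# One row per job held: (person, job).
job_records = [
    ("Alice", "teacher"),
    ("Bob", "nurse"),
    ("Bob", "taxi driver"),  # Bob holds two jobs
    ("Cara", "shop assistant"),
]

people_in_employment = len({person for person, _ in job_records})  # distinct people
number_of_jobs = len(job_records)                                  # one per row

print(f"{people_in_employment} people in employment, {number_of_jobs} jobs")
# -> 3 people in employment, 4 jobs
```

Count distinct people and you get one total; count jobs and you get another, just as the national figures of 31 million and 33.5 million differ.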

Similarly, confusion was caused when Ed Miliband made a speech about how easy it was to get an appointment with a National Health Service doctor in England. These doctors are known as General Practitioners, or GPs. Miliband said that a quarter of people “can’t get an appointment with their GP within a week.”

The best figures on GP waiting times show that 11% of patients couldn’t get an appointment at all the last time they contacted their GP surgery, while another 14% saw or spoke to someone a week or more later.

So it’s fair to say that a quarter of patients (11% plus 14%) don’t get an appointment within a week. However, that doesn’t mean they can’t get one. Some people would have been happy to wait, for instance if they wanted a repeat prescription or if their need for an appointment wasn’t urgent.

So a simple change of the word “can’t” to “don’t” makes this discussion of the figures more accurate.

Spurious Certainty

In 2006, six men became critically ill during clinical trials of TGN1412, a new drug being tested in humans for the first time. While cases as serious as this are very unusual, they serve as a useful reminder of the importance of recognising what is not known. If the drug had simply been given to patients at that dose, without anyone being sure of its effects, the result would have been catastrophic.

Statisticians and scientists are typically wary of making claims without a solid foundation of evidence. Politics and the press do not generally show the same restraint. For decision-makers under pressure, waiting for more evidence isn’t always an option.

One example from the election campaign is the free schools programme. Both the Conservatives and Labour made claims about how well free schools are working—the Conservatives saying free schools are “delivering better education for the children who need it most” and Labour saying the initiative is “wasteful and poorly performing”.

Unfortunately for voters and decision-makers, it’s really just too early to say.

At the time of the election there were 254 free schools open, only a third of which had been judged by the schools inspectorate, Ofsted.

Exam results data tells us even less. Most free schools are so new that not many of their students are old enough to sit exams yet. In fact, headline exam results are available for just 14 primary free schools and 10 secondary free schools.

So it’s hard to say anything meaningful about their performance until there’s more evidence available. It sounds pedantic to say, “It’s too early to tell”. But accepting the limits of evidence in public debate might be better than pushing that evidence beyond its limits.
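
A rough back-of-the-envelope calculation shows just how little 14 schools can tell us. The figures below are hypothetical, not taken from Full Fact’s report or from official results data: suppose 8 of the 14 primary free schools with published results had beaten the national average, and ask what a 95% confidence interval for that proportion looks like.

```python
import math

# Toy calculation (hypothetical numbers): with only 14 schools reporting
# results, how precisely could we estimate the share that beat the
# national average? Suppose 8 of the 14 did.

def wilson_interval(successes, n, z=1.96):
    """95% Wilson score interval for a proportion."""
    p_hat = successes / n
    centre = (p_hat + z**2 / (2 * n)) / (1 + z**2 / n)
    half_width = (z * math.sqrt(p_hat * (1 - p_hat) / n + z**2 / (4 * n**2))
                  / (1 + z**2 / n))
    return centre - half_width, centre + half_width

low, high = wilson_interval(successes=8, n=14)
print(f"Observed 8/14 = {8/14:.0%}, 95% interval roughly {low:.0%} to {high:.0%}")
# -> Observed 8/14 = 57%, 95% interval roughly 33% to 79%
```

An interval stretching from about a third to nearly four fifths is consistent with free schools performing anything from below average to well above it, which is another way of saying it is too early to tell.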

Beware the baseline: records and rates

It would be no surprise to hear that there are more car crashes now than there were 100 years ago, simply because there are far more cars on the road. To tell this story fairly, it’s better to look at the rate, such as crashes per vehicle, rather than the raw count.

Similarly, Britain’s population is rising. Many claims about record numbers of people boil down to that simple fact.

A classic example comes from claims about records in England’s hospital Accident and Emergency (A&E) services. According to Labour, there were a record number of people waiting more than four hours in A&E; the Conservatives replied by saying there were a record number of people not waiting more than four hours in A&E.

Perhaps the most revealing figure is the baseline: there were 22.4 million A&E attendances in England last year—a record number. That was 1.3 million more than in 2010 and around 3.7 million more than in 2005.
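
Both records can be true at once simply because the baseline grew. The sketch below reuses the attendance totals quoted above, but the 92% share of patients seen within four hours is an invented figure held deliberately constant for illustration; it is not the real NHS performance data.

```python
# Toy sketch: how a rising baseline lets both sides claim a record.
# Attendance totals are those quoted above; the 92% share seen within
# four hours is hypothetical and held constant across years.

attendances_millions = {"2005": 18.7, "2010": 21.1, "last year": 22.4}
share_within_4_hours = 0.92  # invented for illustration

for year, total in attendances_millions.items():
    within = total * share_within_4_hours
    over = total - within
    print(f"{year}: {within:.1f}m seen within four hours, {over:.1f}m waiting longer")

# Even with performance unchanged, both counts rise with attendances, so a
# "record number waiting over four hours" and a "record number seen within
# four hours" can be true at the same time.
```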

So beware the simple headline: records are not always what they seem.

Conclusion

Full Fact exists to shed light on claims we hear from politicians and the media. But we as statisticians have a part to play in communicating evidence to the world. Doing statistics right isn’t just about calculating the right confidence intervals; increasingly, it is also about commanding the confidence of our readers.

This article is based on material from Full Fact’s election report (available at: https://fullfact.org/wp-content/uploads/2015/05/full_fact_general_election_factcheck_2015.pdf).

Views expressed here are the author’s alone.
