Director, Full Fact
On 31 March Ed Miliband declared there was an “explosion” of zero hours contracts. “There are now three times as many people on zero hours contracts as there were when this government came to power.”
Independent factcheckers Full Fact sprang into action. We knew — thanks to thorough preparation before the election — that the statistics cannot and do not show an ‘explosion’ because comparisons of zero hours contracts over time aren’t reliable (as the ONS has made clear).
We sent out a press release and soon journalists were tweeting their favourite facts, for example that 66% of people on zero hours contracts don’t want more hours.
We went on Sky News, BBC 5 Live and Good Morning Wales to set the record straight, and our research was quoted in the Mirror, The Sun, Daily Mail, Guardian and Independent.
Labour continued to make their case about employment insecurity, but stopped using the flawed claims about zero hours contracts. Our job was done: the political argument could still happen, but now it was rooted in reality.
The Election Centre
That was day three of our election project. Based in King’s College London’s Anatomy Theatre, we monitored claims from the parties in press releases, social media, broadcast, newspapers, interviews and speeches.
The project was conceived as a two-election venture: experimenting in 2015, and scaling up what works for the EU referendum and 2020 general election. Over the 18 months leading up to the election, a set of expert research partners came together including the Health Foundation, the Institute for Criminal Policy Research, the National Foundation for Educational Research, the Migration Observatory and NatCen Social Research. They reviewed and wrote briefings on topics ranging from immigrants and welfare benefits to job options for young people.
Expert organisations and volunteers from the Government Statistical Service, ONS and Ipsos MORI enabled us to respond to new claims faster. For example, during one leaders’ debate, a claim came up that we hadn’t seen before: “We need to build a house every 7 minutes just to cope with immigration into this country”. At 9pm we consulted the Migration Observatory at the University of Oxford. That night our response was published on Buzzfeed, reaching 28,000 people.
As well as connections to outside experts, live-factchecking relies on thorough preparation. One of our biggest challenges was keeping our database of factchecks up to date (and therefore usable). Previously, preparation for live-factchecking consisted of putting potential tweets and blog posts into one shared Google document – but these were often lengthy and it was difficult to find the relevant content.
For the last year we’ve been developing a new system: a database of claims and factchecks, and prepared tweets or posts that can be published straight away. As well as storing claims and factchecks, the current database also tracks claims: when they were first made, and where they’ve appeared over the years in House of Commons debates or on Twitter. There’s still a lot of work to do: we’d like the database to track when claims appear in the media, and automatically flag up ‘out of date’ content: sometimes inaccurate claims become accurate when the statistics show changes – for example claims about employment or the deficit.
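The claims database described above could be sketched roughly as follows. This is a hypothetical Python illustration only – Full Fact's actual system is not described in detail here, and all names and thresholds below are invented for the sketch:

```python
from dataclasses import dataclass, field
from datetime import date


@dataclass
class Appearance:
    """One sighting of a claim, e.g. in a Commons debate or on Twitter."""
    source: str
    seen_on: date


@dataclass
class Claim:
    """A claim, its prepared factcheck, and where it has appeared."""
    text: str
    factcheck: str              # prepared response, ready to publish straight away
    first_made: date
    stats_updated: date = None  # when the underlying statistics were last refreshed
    appearances: list = field(default_factory=list)

    def record_appearance(self, source: str, seen_on: date) -> None:
        self.appearances.append(Appearance(source, seen_on))

    def is_stale(self, today: date, max_age_days: int = 90) -> bool:
        # Flag 'out of date' content: a factcheck can stop being accurate
        # once new statistics are released (e.g. employment or deficit figures).
        if self.stats_updated is None:
            return True
        return (today - self.stats_updated).days > max_age_days
```

A monitoring workflow might then look up a newly heard claim, record where it appeared, and check `is_stale` before republishing the prepared response.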
How did it actually work?
The three main teams were Monitoring, Analysis and Communications. Monitoring produced the raw material for factchecking, trawling through the papers at 6am and listening to hours of radio shows. The election centre could not have functioned without our volunteers, who altogether donated 3,192 hours of their time.
Analysis was staffed by experienced factcheckers and volunteers from the Office for National Statistics, the Government Statistical Service and Ipsos MORI. We were able to lean across a desk while investigating a claim and say, “You helped compile these statistics, can you help me find this fact?”
Communications made sure our findings were on the right channels, so that they would be seen by large and varied audiences and help prevent inaccurate claims being repeated in future.
What difference did it make?
We have only just begun to take stock and will spend the next few months analysing the project and digesting the results of an independent evaluation being carried out by NatCen.
That said, we could see the results of the election centre in full swing: the Labour party changing the way they made their argument so it was based on the facts; the Trussell Trust altering their press release and agreeing to work with us in future on how they present their figures; hundreds of positive and thoughtful tweets from people following our live-factchecking of debates; and a correction live on Newsnight for a claim made at the start of the programme.
There is still a huge amount to learn about factchecking future elections in a way that makes an impact on the quality of debate. Despite an expanded team and 1,300 individual donations, we didn't have time to explore everything we wanted to.
But we can see a change in the wind. More and more it looks like live-factchecking is an expected part of serious political discussion. Whether on Twitter, live blogs, instant video clips, or eyeball to eyeball on the BBC, we made sure politicians knew that what they said would be checked. Those 1,300 backers show how much people care about whether they're fed truth or spin: the wind is in our sails.
Refer 31 (2) Summer 2015