Politifact.com is a fact-checking website founded in 2007 to evaluate the truthfulness of politicians’ claims. It now covers national, local, and world politics, and tracks whether politicians keep their campaign promises. Amy Hollyfield is a journalist for the Tampa Bay Times who directs Politifact.com.
MEGAN MCQUEEN (MM): I saw that Politifact was founded in 2007 as an offshoot of the Tampa Bay Times. What was the motivation behind its initial creation? I know fact checking was huge in the 2016 election, but in 2007 it was less of a movement.
AMY HOLLYFIELD (AH): It starts with our political coverage of presidential campaigns. Bill Adair was our Washington bureau chief at the time, and he was frustrated with covering campaigns. He covered one of the conventions in 2004, and while watching a speech he thought a lot of what the person was saying was not true. But he was so busy trying to get quotes and getting his story ready for a deadline that he couldn’t fact check it. And from that moment, he was like, “we’re going to do this differently the next election cycle.” He pitched to our editor the idea of a fact-checking website: instead of covering the campaign the way we always had—going out with the candidates, being at every speech—we would watch from afar and fact check what the candidates said.
MM: Were you with Politifact from the beginning?
AH: Almost. The website went live in, I believe, August 2007, and I came in November.
MM: How did you get involved with Politifact? Were you already working with the Times?
AH: Yes. I was a page designer at the time, if you can believe that. I had been involved in some of the conversations in the newsroom about the website because I was in charge of design. At first, it was a question of what the site would look like. But as I sat in those meetings I thought, “This is so cool. I want to do this reporting.” So I raised my hand and asked our editor, “Can I do it?” And he said, “Go try and see how it works out.” That’s the beginning of the story. I came over, started reporting, and quickly got involved in the claims about Barack Obama and his citizenship. That was a big story that election cycle. So I tracked down everything to do with it.
MM: So could you give me a brief explanation of the vetting process for claims you end up fact checking? For example, how you begin the process, who does what, etc.?
AH: In deciding what to fact check, it comes down to this: what would a reasonable person wonder is true? What is driving the news? What are people talking about? What are they wondering about? That’s our barometer. As far as how we find claims, we scour transcripts of speeches and events; we scour congressional records, social media, mailings—yesterday we sat and watched Donald Trump’s press conference and went through it point by point. We have a pretty small team—there are only eight of us—though it’s grown from the pretty ad hoc operation we were at the beginning. Within our team of eight, everyone has different jobs, but it begins with editors assigning fact checks. We have something called a buffet, which is a spreadsheet we use to keep track of what people have said, when they said it, and links to the video or whatever of their statement. Then, in deciding what’s important, sometimes the news just drives it. For example, there was no question we were going to spend time on that press conference. But today, we have the flexibility to wonder, “Oh, what’s going on today?” Maybe it’s something we picked up from before the press conference, or something new altogether.
Next, the assignment goes to a reporter. The reporter will do some extraordinarily in-depth reporting. It’s pretty different from a straight-up reporting assignment because we rely on original sources. We’re not just doing a Google search and calling it a day, or finding a reference in a New York Times story; we’re tracking down the people who were involved, the studies that were done, and the original government reports that were referenced. Sometimes we’re finding the professors who authored the studies so we can ask them how their studies were represented. We also always call a cross-section of experts in the field of whatever topic we’re covering. We’re careful not to be partisan in those selections. If we ever are, we’re up-front about the leanings of the people we talk to.
MM: Speaking of expert opinions, I’m reminded of the cases where people deny climate change by pointing to a select group of studies and rejecting ones done by so-called “establishment” researchers. In choosing your sources, you have to be careful about picking ones that have the trust of your readers. Many people on the far right—Trump supporters particularly—have communicated their distrust of the media and fact-checking sources. Have you been able to reach any of those people?
AH: I’m not sure we have. That’s something we’re talking about this year—how do we reach some of the people who seem turned off by the work that we do? Our audience is a pretty educated one. They care. We got into this as a public service, to help people sort out the truth in politics. So we figure we’re talking to a group of people who are pretty engaged, who are interested in what’s up and what’s down. We have to recognize we live in partisan times. And you can see, from cable news and how partisan it is, that some people like to hear what they like to hear. Since the 2016 campaign, we’ve thought a lot about how we can do a better job of reaching those other audiences.
MM: So what would that look like?
AH: I don’t know. But it’s on our minds. For us, it’s a public service, and we want to be widely read. We enjoy a terrific audience and great popularity, and that’s tremendous. I’d say that when we get criticism, it’s pretty equally from both sides.
MM: I have looked at some critiques of Politifact that accuse it of being both left-leaning and right-leaning. Do you think it is possible for a fact checker to be truly unbiased? A lot of the judgment calls seem like different interpretations of the truth, especially with things that might be partly true and partly false. How do you moderate this potential for bias?
AH: We like to say that we are biased toward the truth. If you’re going to call us biased, that’s our bias—toward truth. We’re all highly trained journalists who have been at this for a while. I’ll finish telling you about our system and how we rate because it’s a pretty thorough process. It has a lot of checks along the way so we can ensure we’re fair.
To go back to the reporter: now that they’ve done all of that great, original reporting, they’ve written up a story, and they’re going to turn it in to their editor. The editor will knock it around—make sure there are no holes and that they’ve proved their case for whatever ruling they’re suggesting. And, mind you, it’s just a suggestion from the reporter.
After the editor is finished editing it, we take it to something called the star chamber: a group of three editors—the original editor and two others. All three read the fact check, then have a conversation about what the truth rating should be. We have some questions we ask ourselves, we check our jurisprudence, and we ask: “Did we reach out to the person who made these comments?” That’s one of our tenets, to reach out.
MM: Do they normally respond?
AH: Sometimes they do, and sometimes they don’t. We always try to say in our articles whether somebody didn’t participate. But that’s one of our major things; we make sure we reach out to the speakers.
MM: What do you ask them?
AH: We ask them about the statement they made—for example, whether they have any particular evidence for it. It’s funny what you find out when you call. Sometimes they’ll be like, “Hey, I didn’t mean it that way” or “I was trying to say this.” But one of our other tenets is that words matter. So even if somebody goes on TV and makes a statement that is wholly wrong, and then tells us on the phone, “No, no, that’s not what I meant,” the audience doesn’t know that. They won’t get an individual phone call from [the speaker] like we did. We’ll include that information, but we won’t let it affect our ruling.
MM: Feel free to go on about the process once the piece has entered the star chamber.
AH: Well, we get in the chamber, and we have a conversation. It can either be a really easy conversation or a contentious one. There are six gradations on our scale, and someone who’s not familiar with it might think it’s hard to sort out. But we’ve been at this a long time—we’ve done over 13,000 fact checks—and having participated in thousands of those, we all have a pretty good understanding of where the lines are between “true” and “mostly true,” or “false” and “pants on fire.”
MM: I’m pretty familiar with the site, but if you could explain what the different ratings mean, that would be great.
AH: Well, “true” is obvious. That means there’s nothing to clarify. There’s no “Oh, but…” If you can use the word “but,” it’s “mostly true.” It’s a hair off. The numbers are close, but you have to make one clarification, something small. The definition says “largely accurate.”
“Half true” is our most vexing rating. If you were to ask the people who get fact checked, or any of us, I would say it’s the least satisfying rating that we have. It amounts to, “Eh, we don’t know.” It’s not entirely that, but it does provide some safe harbor because you’re not saying something is totally right or totally wrong. There are some things that come out that way. Maybe you’ve talked to all the experts, and they’re equally divided over the impact of something. One example is a lot of the gun claims we get that are based on data around twenty years old. You have to take that time into account. The numbers might be accurate as far as we know, but a lot of time has passed, and they might have changed. There’s just nothing proving otherwise.
Yesterday, we gave Donald Trump a “mostly false” when he said that the media was even less popular than Congress. The truth is that it’s the opposite—it’s flipped. Congress is the least popular in every poll we’ve found. But the media’s not far ahead, you know? His point, that the media is unpopular, was valid. So we gave him that one notch on the dial. There was a piece of what he said that was true.
“False” is simply inaccurate—it’s not true. The difference between that and “pants on fire” is that “pants on fire” is utterly, ridiculously false. It’s not just wrong; it’s egregiously wrong, purposely wrong. Sometimes people just make mistakes, but sometimes they’re just off the deep end. That’s where “pants on fire” comes in.
MM: I’ve noticed that one part of the project is keeping track of campaign promises by Obama and now Trump. Could you talk about how that got started, and how you keep track of all the promises on your lists?
AH: Absolutely, I’m glad you brought that up. Once Obama took office, as a fact-checking organization we said, “We should be tracking his campaign promises,” because people care about that. When you think about voters and what they’ll look back on when they think about reelection, they ask themselves: did the politician get done what he said he’d get done?
We engaged in a pretty massive effort to collect campaign promises. We ultimately tracked, over two elections, 533 campaign promises from Barack Obama. Just last month we published a project on The Obameter. The URL is tampabay.com/obameter. We compiled the record we had for Obama across a bunch of issues. He did okay; I think about 48% of his promises were kept.
Over the course of this campaign, we were compiling promises made by Hillary Clinton and Donald Trump so we would be ready to track their campaign promises too. We decided to be a little less granular this time, so we’re tracking 102 campaign promises from Trump. That’s our Trump-O-Meter. It’s live on our site, and we update it regularly. Recently he mentioned how well he’s doing on his campaign promises, and, yeah, he has kept a number of the promises he made.
MM: I’m looking at the Trump-O-Meter right now. Of the 102 promises you’re tracking, which one are you most interested in seeing pan out?
AH: If you go through the whole list, it’s pretty interesting. There are major ones—the ones you’d think of—like building the wall and repealing Obamacare, but there’s lots of other stuff in there about jobs and manufacturing. This one is interesting: Trump talks about clearing the internet of ISIS recruiting. Wow, that seems hard. I mean, it sounds like a good idea, but it’s one of those promises we wonder how we’ll measure.
There are also some relatively fun ones. It’s not all the super important stuff. Even for Obama, one of the campaign promises we tracked was that he would get a dog for his daughters.
MM: Did he get the dog?
AH: He did. That was one of his early promises kept.
There are some of those for Trump that are a little more fun. One of them is that he won’t say “Happy Holidays.” He said, “If I become president, we’re going to say Merry Christmas at every store. You can leave ‘happy holidays’ at the corner. Other religions can do what they want.”
MM: I’m sure you get this question a lot, but what was it like covering the 2016 election from the perspective of someone who cares about the facts? I can imagine it being incredibly frustrating. Did you feel that people were listening to your reporting?
AH: My attitude toward it was entirely optimistic. Fact checking has never been as important or as widespread as it was in 2016. We have partners in about fourteen states that are fact checking. We’ve helped inspire a fact-checking movement across the world, with people fact checking in different countries. From my perspective, there’s every bit as much readership and interest in what we do—and more. On the one hand, you’re like, “Wow, this is hard. It seems like some people might not care about the truth.” But that wasn’t the real impression we felt on a day-to-day basis, because: a) we’re fact checkers and b) we love what we do. We’re in it for the voters. That’s where the thrust of our attention goes—to our readers—who are nothing but happy. We did have a tremendous amount of traffic on our website last year.
MM: Out of curiosity, how much did your traffic go up in the past year?
AH: We set a goal of 100 million page views for the year, which is pretty ambitious, and we met that earlier than we thought we would. It was super exciting.
MM: So was Politifact one of the first fact-checking websites to appear on the internet?
AH: There’s one other website, FactCheck.org, that is our predecessor. We came after them, and the Washington Post’s Fact Checker came after us. We were the first website to be awarded a Pulitzer Prize. That’s our accomplishment.
MM: When I was doing some research on Politifact, I found a website dedicated to critiquing you guys. It’s PolitifactBias.com. What is your relationship with sources like these that accuse you of bias?
AH: We have no connection with that website, though we’re well aware of it. We take accuracy very seriously. Transparency is one of the key things we focus on, which is why we publish all the sources for our fact checks. We flag every correction and have a subject tag called “correction,” so you can see every fact check we’ve put a correction on. So when people raise questions about our work, we’ll explore it and correct any errors. But sometimes the criticism is over the Truth-O-Meter ratings and the judgments we make, and that’s nothing we’re going to apologize for, because that’s our judgment. When the conflict is over that—“I wanted the mostly true, not the half true,” for example—they’re not questioning our reporting or our facts, and we say that the Truth-O-Meter is not meant to be a scientific instrument. It’s not some exact thing used to draw broad conclusions, and it’s not something everyone will agree on. That’s why we publish everything so transparently, so people can make their own judgments.
Something I wanted to mention is a tab on our website labeled “Global News.” We have a partnership with a fact-checking group in Africa called Africa Check and a grant from the Gates Foundation. We’ve been fact-checking global health and world development for a little over a year now. There’s a lot in there about things like Zika last year, and Ebola before that.
MM: Wow, I wasn’t aware of the global development aspect of your work. In addition to initiatives like these, could you describe the general direction the project has been moving over the past ten years?
AH: We’ve gotten much bigger. We started out as just a few people fact-checking the presidential campaigns, and it’s grown well beyond that. We covered every debate and every convention—any sort of big issue, we’re fact checking it. We have the ambition to go further, too. We have this longstanding dream of fact checking the movies.
MM: You mean fact-checking historical movies?
AH: Yeah, mostly. Things that are meant to be true, like “Hidden Figures.” We’ve never gotten that off the ground, but our ambition is great. We’d like to be in all fifty states.
MM: How many [states] are you in right now?
AH: Fourteen. We’ve got room to grow. These are partnerships with news publications, radio stations—in California, there’s a radio station we’ve worked with. Politifact is a project of the Tampa Bay Times, which is a newspaper based out of St. Petersburg. We do a lot of our own work, but we’re all employees of the Times.
MM: Of course. Finally, I was wondering what the turnaround time is for a fact check. How long does it normally take to get something on the site? For example, fact checking a presidential debate would take how long?
AH: Debates are a little different because we’re sprinting. We also listen for things we already know, so we can repurpose what we’ve previously reported. Starting something entirely new can take about one to two days. We’re actually in a partnership with Facebook right now and have been fact checking fake news—they have an algorithm that shares with us the most disputed things people are posting, and we fact check them. Questionable items are flagged when users try to share them. Anyway, those fake news stories can sometimes be done in a couple of hours because they’re just ridiculous. But as for the rest—most of our reporters are generalists, so there’s always a little bit of time spent bringing them up to speed. I’d say one to two days is pretty normal.