Today, I met with Sarah Stuteville, a Lecturer in Communications at the University of Washington and Co-founder/Editor of the Common Language Project (CLP). The CLP is a nonprofit, multimedia journalism organization that focuses on international reporting, local reporting in the Puget Sound region, and journalism in education. I asked for Stuteville's advice about the direction of Malark-O-Meter because CLP has a strategic relationship with the University of Washington that I would like Malark-O-Meter (or whatever it becomes) to mirror.
So the questions I had for Sarah were understandably about how to build an organization that the UW would want to team up with. Based on Sarah's description of the history of CLP's relationship with UW, I gleaned the following valuable pieces of advice.
To attract university partnership, I need to develop credibility and get some independent funding. Makes sense. I also should be sure to have a set of deliverables that the UW would see as providing a return on its investment of resources and time. Also makes sense.
Furthermore, I should be sure that Malark-O-Meter's deliverables engage not only faculty, but also students. One of CLP's strengths is that its multimedia journalism activities are fully embedded in the curriculum of the Department of Communications. I explained to Sarah an idea I have that would engage students in fact checking activities, which would prime a crowd-sourced data collection instrument that I could use to comparatively assess the supposed biases of professional fact checkers relative to nonprofessionals.
Sarah also provided me with an awesome list of highly relevant contacts to reach in the meantime. I cannot wait to meet some of the people whom she mentioned.
Anyway, thanks to Sarah Stuteville for a helpful meeting.
This week, two political science blog posts about the difference between political engagement and factual understanding stood out to Malark-O-Meter. (Thanks to The Monkey Cage for tweeting their links.) First, there's Brendan Nyhan's article at YouGov about how political knowledge doesn't guard against belief in conspiracy theories. Second, there's voteview's article about partisan reactions to the Gallup poll on the June jobs report. (Side note: this could be the Golden Era of political science blogging.) These posts stand out both as cautionary tales about what it means to be politically engaged versus factual, and as promising clues about how to assess the potential biases of professional fact checkers in order to facilitate the creation of better factuality metrics (which is what Malark-O-Meter is all about).
Let's start with Nyhan's disturbing look at the interactive effect of partisan bias and political knowledge on belief in the conspiracy theory that the 2012 unemployment rate numbers were manipulated for political reasons. The following pair of plots (reproduced from the original article) pretty much says it all.
First, there's the comparison of Democratic, Independent, and Republican perceptions of whether unemployment statistics are accurate, grouped by party affiliation and by low, medium, and high scores on a ten-question political knowledge quiz.
Republicans and maybe Independents with greater political knowledge perceive the unemployment statistics to be less accurate.
Here's a similar plot showing the percent in each political knowledge and party affiliation group that believes the conspiracy theory about the September unemployment report.
Democrats appear less likely to believe the conspiracy theory the more knowledgeable they are. Republicans with greater political knowledge are more likely to believe the conspiracy theory. There's no clear effect among Independents. What's going on?
Perhaps the more knowledgeable individuals are also more politically motivated, and their reasoning is motivated along with them. It just so happens that, in this case, the knowledgeable Democrats' motivated reasoning probably lands on the side of the evidence.
Before discussing what this means for fact checkers and factuality metrics, let's look at what voteview writes about an aggregate answer to a different question, posed by Gallup (aka, the new whipping boy of the poll aggregators) about the June jobs report.
[Figure from the voteview post: partisan reactions to the June jobs report.]
In case you haven't figured it out, you're looking at yet another picture of motivated reasoning at work (or is it play?). Democrats were more likely than Republicans to see the jobs report as mixed or positive, whereas Republicans were more likely than Democrats to see it as negative. You might expect this effect to shrink among individuals who say they pay very close attention to news about the report because, you know, they're more knowledgeable and they really think about the issues and... NOPE!
[Figure from the voteview post: the same breakdown among respondents who paid very close attention to news about the report.]
The more people say they pay attention to the news, the more motivated their reasoning appears to be.
What's happening here? In Nyhan's study, are the more knowledgeable people trying to skew the results of the survey to make it seem like more people believe or don't believe in the conspiracy theory? In the Gallup poll, is "paid very close attention to news about the report" code for "watched a lot of MSNBC/Fox News"? Or is it an effect similar to what we see among educated people who tend to believe that vaccinations are (on net) bad for their children despite lots and lots of evidence to the contrary? That is, do knowledgeable people know enough to be dangerous(ly stupid)?
I honestly don't know what's happening, but I do have an idea about what this might mean for the measurement of potential fact checker bias to aid the creation of better factuality metrics and fact checking methods. I think we can all agree that fact checkers are knowledgeable people. The question is, does their political knowledge and engagement have the same effect on their fact checking as it does on the perceptions of educated non-fact-checkers? If so, is the effect as strong?
I've mentioned before that a step toward better fact checking is to measure the potential effect of political bias on both the perception of fact and the rulings of fact checkers. Basically, give individuals a questionnaire that assesses their political beliefs, then see how they judge the factuality of statements made by individuals of known party affiliation, ethnicity, et cetera. To see if fact checking improves upon the motivated reasoning of non-professionals, compare the strength of political biases on the fact checking of professionals versus non-professionals.
What these two blog posts tell me is that, when drawing such comparisons, I should take into account not only the political affiliation and the political knowledge of the non-professionals, but also the interaction of those two variables. Then, we can check which subgroup of non-professionals the professional fact checkers are most similar to, allowing us to make inferences about whether professional fact checkers suffer from the same affliction of motivated reasoning that the supposedly knowledgeable non-professionals do.
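To make that concrete, here's a minimal sketch of how such a comparison might look. Everything in it is my own illustrative assumption, not an existing Malark-O-Meter pipeline: the file name, the column names, and the model form are all hypothetical stand-ins for a dataset in which each row records one rater's ruling on one statement.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: one row per ruling.
#   rating        - harshness of the ruling on a 0-5 scale (0 = true, 5 = pants on fire)
#   party         - rater's affiliation from the political-beliefs questionnaire
#   knowledge     - rater's score on a ten-question political knowledge quiz
#   speaker_party - party of the person who made the statement
#   professional  - 1 if the rater is a professional fact checker, else 0
df = pd.read_csv("ratings.csv")  # hypothetical file

# Fit to non-professionals only, with the party x knowledge interaction
# that the Nyhan and voteview posts suggest we can't ignore.
nonpros = df[df["professional"] == 0]
model = smf.ols("rating ~ C(party) * knowledge * C(speaker_party)", data=nonpros).fit()

# For each non-professional subgroup, predict how that subgroup would have
# ruled on the statements the professionals actually rated, and measure the
# gap. The subgroup with the smallest gap is the one the pros most resemble.
pros = df[df["professional"] == 1]
for party in sorted(nonpros["party"].unique()):
    subgroup = nonpros[nonpros["party"] == party]
    counterfactual = pros.assign(party=party, knowledge=subgroup["knowledge"].mean())
    gap = (model.predict(counterfactual) - pros["rating"]).abs().mean()
    print(f"{party}: mean absolute gap = {gap:.2f}")
```

One could also bin the knowledge score into low, medium, and high groups, as Nyhan's plots do, and compare the professionals against each party-by-knowledge cell rather than against party averages.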
Recently, the Nieman Journalism Lab reported on OpenCaptions, the creation of Dan "Truth Goggles" Schultz. OpenCaptions prepares a live television transcript from closed captions, which can then be analyzed. I came across OpenCaptions back in October, when I learned about Schultz's work on Truth Goggles, which highlights web content that has been fact checked by PolitiFact. Reading about it this time reminded me of something I'd written in an earlier post.
At the end of that post, I commented on a suggestion made by Kathleen Hall Jamieson of the Annenberg Public Policy Center about how to measure the volume of factuality that a politician pumps into the mediasphere. Jamieson's suggestion was to weight the claims that a politician makes by the size of their audience. I pointed out some weaknesses of this factuality metric. I also recognized that it is still useful, and described the data infrastructure necessary to calculate the metric. Basically, you need to know the size of the audience of a political broadcast (say, a political advertisement), the content of the broadcast, and the soundness of the arguments made during the broadcast.
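As a back-of-the-envelope illustration of the metric itself (my own toy formalization, not Jamieson's), suppose each broadcast comes with an estimated audience size and a factuality score between 0 and 1, say the fraction of its checkable claims that held up. The audience-weighted factuality is then just a weighted mean:

```python
def audience_weighted_factuality(broadcasts):
    """broadcasts: iterable of (audience_size, factuality_score) pairs,
    where factuality_score is in [0, 1]."""
    broadcasts = list(broadcasts)
    total_audience = sum(audience for audience, _ in broadcasts)
    if total_audience == 0:
        return None  # nobody saw anything, so the metric is undefined
    return sum(a * f for a, f in broadcasts) / total_audience

# Example: a widely seen, mostly false ad drags the score down much harder
# than a niche, accurate one pulls it up.
print(audience_weighted_factuality([(2_000_000, 0.3), (50_000, 0.9)]))  # ~0.31
```

Weighting by audience means a single widely broadcast falsehood can outweigh many accurate but little-seen statements, which is exactly the property Jamieson's suggestion is after.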
OpenCaptions shows promise as a way to collect the content of political broadcasts and publish it to the web for shared analysis. Cheers to Dan Schultz for creating yet another application that will probably be part of the future of journalism...and fact checking...and factuality metrics.