Five years ago this month, CNN launched an initiative called iReport, through which ordinary people can generate news content. Though visitors to CNN.com can access these iReports via the iReport page, CNN also occasionally places iReport stories on its main homepage. In a recent story covering what the iReport news team consider to be the five stories that defined their "year one", many commenters attacked the initiative, saying that iReport stories were "non news-worthy" and that iReporters "lack the credibility and the training to hold such an important place on the website". Others posted support for iReports, saying that these "iReporters" put more effort into fact-checking and proofreading than regular CNN reporters. This notion of citizen-journalists is fascinating, and could be the starting point for many interesting debates - for another day, I suppose. What I'd really like to delve into today is this idea of credibility: what determines whether someone is viewed as credible, and what biases may influence that judgment?
Human beings are regularly called upon to process large amounts of information from the world around us. This information comes in through our five senses and must be interpreted by our brain in order for us to make decisions and navigate our environment. We're very good at processing this information - so good, in fact, that people are always looking for supernatural explanations for the skill. I know a lot of people who believe in ESP, but honestly, I think people are simply very good at picking up on cues in the environment (some better and faster than others), and that because of this skill and speed, it can appear they reacted before whatever they were reacting to even happened. Once again, another post, another day.
That's not to say that biases can't be introduced into our processing. We often see what we want to see and hear what we want to hear. It's true that our current experiences are often colored by our past experiences, which I think is just another example of how awesome our brain is; not only do we process what's directly in front of us, we also process past experiences in tandem, drawing connections between the two. This means, however, that we can make mistakes.
Want to learn more about some of these "mistakes"? Check out this list of cognitive biases.
Of course, just because we're capable of systematic processing doesn't mean we're always raring to think things through at that level. On the contrary, thinking through everything at this level would quickly overwhelm us. Therefore, we've developed mental shortcuts that help us navigate our environment while saving our cognitive resources for the things that really matter; Susan Fiske and Shelley Taylor captured this tendency by calling us "cognitive misers". It is important to note, however, that some people genuinely enjoy thinking and engage in it much more than the average person; we would say they have a high "need for cognition". Even these people occasionally use mental shortcuts; they just do so less frequently than the average person.
One such mental shortcut is the heuristic. Heuristics are quick rules or devices that allow someone to draw a conclusion or solve a problem rapidly and without a great deal of thought. Heuristics are not always wrong. For example, if you see someone standing on a busy street carrying a map, you can probably safely assume they are a tourist. True, you could cycle through all the possible reasons a person would be carrying a map, but the cognitive miser in you will probably just think, "Tourist", and move on (or, if you're feeling altruistic, offer to help said tourist find his or her next destination). The problem arises when we use these shortcuts in situations where more systematic thought is necessary.
For example, imagine you've been called for jury duty. You're presented with lots of competing evidence, and you have to use that information to determine whether the defendant is guilty or not guilty. Now would probably be a bad time to use a heuristic, but some research evidence suggests that people do use heuristics in these situations, especially when the evidence is confusing or points equally to guilt or innocence. One such heuristic is referred to as "what is beautiful is good": the belief that, if a defendant is physically attractive, he or she must be innocent.
Another well-known heuristic is the "availability heuristic". When we believe something is true - such as, "Women are bad at telling jokes" - we can easily think of many examples that support our conclusion, but we often have difficulty remembering examples that refute it - such as times when women showed excellent comedic skills or times when men showed poor skills. Based on having many available examples that support our conclusion, we become even more convinced that our initial conclusion is correct.
The heuristic that is probably operating in the iReport case deals with source credibility. When someone is presenting information, and especially when she is trying to convince you of something, you could 1) listen to her arguments and systematically think through them to determine whether her conclusions are valid, or 2) decide whether you agree with her based on her education and/or title. Research shows that arguments presented by someone with, for example, a PhD are judged to be more sound than the same arguments presented by, say, a high school student. True, when studies manipulate how good the arguments really are, people are generally able to differentiate good arguments from bad ones, but even then, the PhD often still has an advantage over Joe H.S. Student.
I don't often read the iReport stories, so I don't really have any conclusions about whether there is any truth to claims of inaccuracy. It stands to reason, however, that some of these commenters are reacting to job title rather than content. An iReporter could be anybody. CNN does provide information on their vetting process - how they determine whether an iReport is accurate and worthy of being called news - but someone who is applying a heuristic to determine whether to believe this particular story may not be very motivated to read about the vetting process and determine whether that raises their estimation of an iReporter's credibility.
That’s not to say that job title is the only marker of credibility people use. For instance, Elizabeth Smart, whose kidnapping case received widespread media attention, is now a contributor to ABC News on stories involving kidnappings. ABC News clearly feels this experience qualifies her to talk about kidnapping in general (and many viewers likely do, as well). Of course, her hiring did prompt the Daily Beast to ask, “Other than fame – as the victim of a horrifying crime – what exactly are her qualifications?” True, the qualifications they would like to see, such as a degree in psychology or years of experience studying kidnapping victims, could be considered a metric of knowledge – we assume someone with such training has a lot of knowledge on the topic - but this is not always the case; once again, this is just a heuristic at work.
Heuristics are not likely to go away. Despite their flaws, these shortcuts are necessary. Imagine if every decision you made (paper or plastic, boxers or briefs, soup or salad) involved the same careful thought you currently reserve for the important decisions (city or suburbs, car or SUV, Cubs or White Sox). So what is the best way to deal with this dilemma of credibility? Other than what CNN currently does – provide a document detailing the vetting process – what could they do to set your mind at ease?