Fake news has been able to propagate, not simply because of people who spread what they knew to be fake, but because many (likely well-meaning) people bought it and shared it.
Which is why Facebook's response to this issue is so ridiculous:
Last week, Facebook said its News Feed would prioritize links from publications its users deemed "trustworthy" in an upcoming survey. Turns out that survey isn't a particularly lengthy or nuanced one. In fact, it's just two questions. That's right, Facebook intends to protect people from fake news by asking the very people who helped spread that news which sources they find trustworthy. Do you see the problem with this scenario? Because the leadership at Facebook certainly doesn't.
Here is Facebook's survey — in its entirety:
Do you recognize the following websites?
How much do you trust each of these domains?
- A lot
- Not at all
A Facebook spokesperson confirmed this as the only version of the survey in use. They also confirmed that the questions were prepared by the company itself and not by an outside party.
Yesterday evening, I went to my first meeting of an advisory board for an applied psychology accelerated bachelor's program for adult learners. During that meeting, we were asked what skills and knowledge would be essential for someone coming out of such a program. One of my responses was that a skill set from my training I've had to use in every job, and in many of my volunteer experiences, is creating and fielding surveys. There is an art and a science to surveying people, and there are ways to write questions that will get useful data - and ways that will get you garbage. Facebook's survey is going to give them garbage.
Even if you forget about the countless people who, every day, mistake well-known satirical news sites (like the Onion) for genuine ones, not every site is clear on whether it is presenting itself as real news or simply entertainment - and let's be honest, where do you draw the line between informing and entertaining? How do you define something as trustworthy or not? And how might variation in how people define that term influence your data? Many years ago, when Jon Stewart was still on The Daily Show, I remember a commercial in which they shared that more Americans get their news from The Daily Show than anywhere else, to which Stewart replied, "Don't do that! We make stuff up!" Even though they were forthcoming about this, people still considered them trustworthy.
The real issue is when people can't tell the difference. So Facebook is trying to fix a problem caused by people being unable to tell the difference by asking those same people to tell the difference. At best, the survey will produce such inconsistent data that it won't have any influence on what links can and can't be shared. At worst, the same biases that caused fake news to be shared in the first place will be used to deem sites trustworthy or not. And having a Facebook stamp of trustworthiness could result in even more harm.
Honestly, information campaigns to make people more skeptical would be a much better use of Facebook's time and resources.