Summary: Nope. That’s way overstating the impact of social networking sites, and way understating the intelligence of the people who use them. The biggest problem: generating connections and empathy between disparate users.
We don’t live in a world where we are exposed solely to the information cocoon of our Facebook friends, unless you never leave your house, turn on the TV, or talk to other human beings. For instance, I doubt very much that people widely believed that Hillary Clinton was running a pedophile ring or that the Pope endorsed Donald Trump. If anything, so-called “fake news stories” (read “lies”) nudge us in the same fashion as the National Enquirer, a publication no one believes (except for Trump); if you don’t like the person to begin with, you maybe like him or her fractionally less after reading “fake news.”
“But,” you say, “the BuzzFeed story said that anywhere from 20 to 38 percent of news on Facebook was ‘fake’ in one way or another.” Agreed, but here’s a rebuttal: people get hundreds of emails a year advertising “Canadian Pharmaceuticals” and “Sexy Asian Singles In Your Area” and “Why Global Warming is Fake,” but I doubt many are seriously investigating these possibilities. Just because someone clicks on something doesn’t mean they accept it as the truth, and it doesn’t mean they’re a rube who can’t tell the difference between bullshit and reality.
Facebook has a couple of problems on its plate right now. Problem one goes like this: it’s showing us things that we want to see in order to get us to click on more shit, at the expense of showing us things we should probably know but maybe don’t want to see. That problem is bad, but not as bad as problem two: we aren’t learning very much about each other as a result of social networking sites, suggesting that these sites build little to no consensus or empathy. Rather than shit-kick FB for failing to manually disambiguate so-called “fake news” (we were all screaming for algorithms to control Trending Topics six months ago), we should think about the real problems posed by social networking sites, mainly that they aren’t doing a very good job of connecting us as a country.
The underlying technological issue behind problem one presents itself if, by chance or choice, you click on a “fake news” story (or any other link, for that matter). Upon returning to your feed, you will be presented with the lamentable “People Also Shared” option that force-feeds you more of the same (which you presumably click on). That leads to the inevitable worry: “If you see an argument enough, it starts to look true.” That’s a problem, but it’s a social problem, not one inherent to Facebook.
Rather than talk about an article on FB, we’re all more apt to fall into the “spiral of silence,” where we use the site to post news articles to an audience of our choosing (FB friends) that are representative of what we think, but we don’t particularly want to have a debate. Nor do most people actively seek out their ideological opposites to get their opinions. Instead, if something is possibly going to cause offense or even trigger a negative comment, we self-censor. Hence, we end up reading a lot of repetitive things that possibly influence our thinking, but we don’t share our opinions (e.g., the so-called “white silence”). In that way, problem one can cause a chilling effect, but I don’t see Facebook fixing that one by adding a self-flagellation icon for staying silent on a social issue, so I’ll defer that to the end of this post, where I talk about hard things to fix.
What about the “if you see it enough it becomes true” argument? The underlying problem is that many of us are untrained or under-equipped to critically evaluate information we encounter on the internet. When we search, we look at the first page of results (at best), and if they conform to our assumptions, we accept them, despite knowing that search results = algorithm + harvested personal data. When we look at product reviews, we go for the “most helpful,” ignoring potential manipulations from strategies like “sock puppeting” or “astroturfing” (phony reviews or comments planted by the person hawking the product). We miss a lot of information that could otherwise be very useful. Also, how hard we search = how much we think the answer is worth + how much time we have.
As a side note, I have observed one activity that transforms an average citizen into the grittiest of investigative reporters: saving ten bucks on a hotel. Maybe we should fine people ten bucks for re-posting a “fake news” article (sadly, it would put The Onion out of business, although by the above logic that may be a valuable service, especially for those confused over whether Kim Jong Un is the sexiest man alive).
I saw a suggestion that Facebook should discontinue the news feed and change the formatting on bogus stories, presumably to make them look “faker” than “real news” stories. I’ll be damned if I can tell the difference between the fake celebrity magazines and the real ones at the grocery store, but I just assume they’re all horseshit and proceed about my business.
No, it’s not Facebook’s job to search out “fake news” and reformat it. Nor is it their job to teach us to critically evaluate an article entitled “The Satanic Connection Hillary Clinton Doesn’t Want Anyone to Talk About” (actual website, by the way). It’s up to our educators to do a better job training the next generation to be skeptical about what they read, especially when it’s only what they want to hear. We’ve got a lot of work to do, but in my experience as an educator, young folks are doing a lot better than the OG in terms of filtering out the bullshit on the internet.
Problem two is the underlying social and technical problem that will consume the next decade: How do we get people to connect to people outside of their own social circle, and how do we teach people to voice their opinions even when they are unpopular? I’m not sure you can change a platform like Facebook to make that happen, but I have some suggestions:
- Optionally suggest a friend connection with one random person per month. You don’t need to force it on people, but put in a suggestion other than Mark Zuckerberg (I mean, Jesus, how many friends do you want, Mark?).
- Promote some random posts into the news feed. Maybe even make them bulletproof by stripping the name of the person posting. Just let people see what others are thinking.
- Instead of promoting narcissistic behaviors (selfies, FB Live, etc.) through the design of the platform, find a way to use it to build connections and empathy, maybe even hooking people up who want to debate issues and adding a moderation feature.
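The second suggestion above — surfacing a random post with the poster’s name stripped — is simple enough to sketch. This is purely a hypothetical illustration of the idea (the post structure and `promote_random_post` helper are my own invention, not any real Facebook API):

```python
import random

def promote_random_post(posts, rng=random):
    """Pick one post at random and drop the author's name, so readers
    see the opinion without the identity attached (hypothetical sketch)."""
    post = rng.choice(posts)
    # Return a copy with every field except the author
    return {k: v for k, v in post.items() if k != "author"}

posts = [
    {"author": "Alice", "text": "We should talk about local schools."},
    {"author": "Bob", "text": "Here is why I disagree with the zoning plan."},
]
promoted = promote_random_post(posts)
print(promoted)  # the post text survives; the 'author' key does not
```

The anonymization is the “bulletproofing”: people can react to what was said without piling onto who said it.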
No single social networking site can do everything, so you can’t just kluge together a bunch of other sites’ features, but you can promote more inclusive behaviors among members instead of endlessly remixing content inside their own personal information cocoons. Isolation leads to polarization, and polarization leads to a loss of empathy. A path to the dark side that is.
To conclude, I’m usually the first to kick the shit out of Facebook for everything, but let’s stop it with the whole “Facebook cost Hillary the election.” Like our misplaced faith in polling and statistics, and our inability to connect and empathize with our fellow citizens, Facebook is one problem among many.