WWS Reacts: How Facebook Influenced the US Presidential Election

Nov. 22, 2016
By B. Rose Kelly
Source: Woodrow Wilson School

Facebook — which has been raked over the coals for allowing the dissemination of fake news stories about the 2016 election — is now taking a hard and careful look at what role it may have played among American voters. Some say the election results illustrate a social media “echo chamber,” in which users gather news and information primarily from friends with shared interests.

What are the effects of these echo chambers and silos? Are there ways to spread fact-checked news stories? And what responsibility does Facebook have to its users?

Katherine Haenschen, a postdoctoral research associate at Princeton University’s Center for Information Technology Policy, answers these questions in the Q&A that follows. Haenschen’s work focuses on the intersection of digital media and political participation. Her past work has demonstrated the ability of Facebook status updates and emails to increase voter turnout.

Issues like these will be discussed at the Princeton-Fung Global Forum, “Society 3.0+: Can Liberty Survive the Digital Age?” in Berlin on March 20-21, 2017. Registration is now open to the public.

Q. What are the effects of the “echo chamber” on social media — or only seeing stories and information posted by your friends?

Haenschen: It is important to remember that while most personal networks tend to be like-minded, they're not completely politically homogeneous. People encounter divergent viewpoints and news articles on Facebook and in real life all the time, both intentionally and inadvertently. This kind of exposure is most likely to come from so-called "weak ties," or the people we're not as close to or in frequent contact with. Facebook actually helps preserve these ties, so, ironically, the platform offers people a good chance of being exposed to divergent views.

Generally speaking, research on networks shows that people who have more like-minded discussion partners are more likely to participate politically. So exposure to this kind of information environment isn't problematic in and of itself. The bigger question is what circulates in those networks: in particular, the rise of intentional misinformation designed either to generate revenue through clicks or to mislead the public.

Q. Facebook has been accused of spreading fake news through news sources that aren’t legitimate. Does Facebook have a moral responsibility to police this?

Haenschen: To be clear, Facebook itself is not initiating the sharing of fake news. If someone sees an intentionally misleading article in their newsfeed, it is because they either like the page that shared it or are friends with someone who shared it.

Much of the discussion about fake news takes human agency out of the equation. It's a two-way street: if people see someone sharing fake news, they should comment on it and post factual information. That may not influence the person who shared it, but it can have an impact on others who see it.

Q. Did Facebook have any positive effects on voter turnout among the country’s populations? What does the data show?

Haenschen: At this point it's too soon to know whether Facebook ran any internal experiments to boost turnout, or whether seeing friends post "I Voted!" pictures and status updates had an effect. In past election cycles, however, some of these mechanisms have been shown to increase turnout in specific contexts.

I encourage people to look beyond what Facebook itself is doing and think about it as a tool that we as citizens can harness to increase turnout in our own networks. Don't wait for Facebook to do something. Instead, use its tools to tag people in voting reminders and send messages urging people to vote.

Q. If you were Mark Zuckerberg, how would you describe the purpose of your company, and would you do more than he is currently doing to reverse this growing trend? If so, how would you change Facebook?

Haenschen: Mark Zuckerberg seems to view Facebook as a platform or conduit while overlooking its potential to harm our information environment. He may not want to be the "arbiter of truth," but choosing not to act is in itself a choice.

If I were Zuckerberg, I would think about the role I wanted my platform to play in providing Americans with information. An informed citizenry is a necessary but not sufficient component of a democratic society. Facebook can't ignore the impact and reach of the information shared by its users. For example, removing the human editors who selected trending news led to the proliferation of obviously false stories on its homepage for millions of users. That should be embarrassing for the company.

One challenge for Facebook is that its business model incentivizes web publishers to use inflammatory or misleading headlines, otherwise known as "clickbait," because they generate clicks and revenue for both the publisher and the platform. The "fake news" outfits took this structural problem to its logical extreme: they published knowingly false information to make money off credulous supporters of either candidate — with great success, I might add.

Ultimately, Facebook — like Twitter, Instagram and any other social platform — can set its own rules of the road regarding fake news, harassment or any other form of social harm. These companies need to look beyond quarterly earnings reports and think about what is in the best interest of our democratic society.