Mark Zuckerberg Is in Denial
Zeynep Tufekci NOV. 15, 2016
CHAPEL HILL, N.C. —
Donald J. Trump’s supporters were probably heartened in September, when,
according to an article shared nearly a million times on Facebook, the
candidate received an endorsement from Pope Francis. Their opinions on Hillary
Clinton may have soured even further after reading a Denver Guardian article
that also spread widely on Facebook, which reported days before the election
that an F.B.I. agent suspected of involvement in leaking Mrs. Clinton’s emails
was found dead in an apparent murder-suicide.
There is just one problem with these articles: They were
completely fake.
The pope, a vociferous advocate for refugees, never endorsed
anyone. The Denver Guardian doesn’t exist. Yet thanks to Facebook, both of
these articles were seen by potentially millions of people. Although
corrections also circulated on the social network, they barely registered
compared with the reach of the original fabrications.
This is not an anomaly: I encountered thousands of such fake
stories last year on social media — and so did American voters, 44 percent of
whom use Facebook to get news.
Mark Zuckerberg, Facebook’s chief, believes that it is “a
pretty crazy idea” that “fake news on Facebook, which is a very small amount of
content, influenced the election in any way.” In holding fast to the claim that
his company has little effect on how people make up their minds, Mr. Zuckerberg
is doing real damage to American democracy — and to the world.
He is also contradicting Facebook’s own research.
In 2010, researchers working with Facebook conducted an
experiment on 61 million users in the United States right before the midterm
elections. One group was shown a “go vote” message as a plain box, while
another group saw the same message with a tiny addition: thumbnail pictures of
their Facebook friends who had clicked on “I voted.” Using public voter rolls
to compare the groups after the election, the researchers concluded that the
second message had turned out hundreds of thousands of additional voters.
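To make the logic of that study concrete, here is a minimal sketch in Python. The group sizes and vote counts below are invented, not the study's actual figures; only the method mirrors the experiment: randomize who sees the social version of the message, match users against public voter rolls, and compare turnout rates between the groups.

```python
# Hypothetical illustration of the 2010 "go vote" experiment.
# All counts are made up for the sake of the example.

def turnout_rate(voted: int, group_size: int) -> float:
    """Share of a group found on the public voter rolls as having voted."""
    return voted / group_size

# Group shown the plain "go vote" box (hypothetical counts).
plain_size, plain_voted = 30_000_000, 11_760_000
# Group shown the same box plus thumbnails of friends who clicked "I voted".
social_size, social_voted = 30_000_000, 12_000_000

lift = turnout_rate(social_voted, social_size) - turnout_rate(plain_voted, plain_size)
print(f"Turnout lift from the social message: {lift:.2%}")
print(f"Implied additional voters in this group: {lift * social_size:,.0f}")
```

Even a lift of well under one percentage point, applied to tens of millions of users, translates into hundreds of thousands of votes.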
In 2012, Facebook researchers again secretly tweaked the
newsfeed for an experiment: Some people were shown slightly more positive
posts, while others were shown slightly more negative posts. Those shown more
upbeat posts in turn posted significantly more of their own upbeat posts; those
shown more downbeat posts responded in kind. Decades of other research concurs
that people are influenced by their peers and social networks.
All of this renders preposterous Mr. Zuckerberg’s claim that
Facebook, a major conduit for information in our society, has no influence.
The problem with Facebook’s influence on political discourse
is not limited to the dissemination of fake news. It’s also about echo
chambers. The company’s algorithm chooses which updates appear higher up in
users’ newsfeeds and which are buried. Humans already tend to cluster among
like-minded people and seek news that confirms their biases. Facebook’s
research shows that the company’s algorithm encourages this by somewhat
prioritizing updates that users find comforting.
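As a rough illustration of that echo-chamber mechanism (not Facebook's actual ranking code, which is private), the sketch below scores posts by predicted engagement, with agreement with the viewer's existing views weighted heavily. All of the weights and scores are invented; the point is only that a modest preference for agreeable content is enough to bury dissenting posts below the visible top of the feed.

```python
# Hypothetical feed-ranking sketch; weights and scores are invented,
# not Facebook's algorithm.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    agrees_with_viewer: float  # 0.0 (opposes viewer's views) to 1.0 (confirms them)
    base_interest: float       # how engaging the post is on its own, 0.0 to 1.0

def score(post: Post, agreement_weight: float = 0.7) -> float:
    """Predicted engagement: mostly agreement, partly intrinsic interest."""
    return (agreement_weight * post.agrees_with_viewer
            + (1 - agreement_weight) * post.base_interest)

feed = [
    Post("like-minded friend", agrees_with_viewer=0.9, base_interest=0.4),
    Post("news outlet", agrees_with_viewer=0.5, base_interest=0.8),
    Post("friend who voted the other way", agrees_with_viewer=0.1, base_interest=0.7),
]

# Only the top of the ranked feed is ever seen; dissenting posts sink.
for post in sorted(feed, key=score, reverse=True):
    print(f"{score(post):.2f}  {post.author}")
```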
I’ve seen this firsthand. While many of my Facebook friends
in the United States lean Democratic, I do have friends who voted for Mr.
Trump. But I had to go hunting for their posts because Facebook’s algorithm
almost never showed them to me; for whatever reason, the algorithm wrongly
assumed that I wasn't interested in their views.
Content geared toward these algorithmically fueled bubbles
is financially rewarding. That’s why YouTube has a similar feature in which it
recommends videos based on what a visitor has already watched.
It’s also why, according to a report in BuzzFeed News, a
bunch of young people in a town in Macedonia ran more than a hundred pro-Trump
websites full of fake news. Their fabricated article citing anonymous F.B.I. sources
claiming Hillary Clinton would be indicted, for example, got more than 140,000
shares on Facebook and may well have been viewed by millions of people, since
each share is potentially seen by hundreds of users. Even if each view
generates only a fraction of a penny, that adds up to serious money.
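A back-of-the-envelope calculation makes the incentive plain. The views-per-share and revenue-per-view figures below are assumptions for illustration, not numbers from the BuzzFeed report; only the share count comes from the article above.

```python
# Rough arithmetic behind the incentive; views_per_share and revenue_per_view
# are assumed for illustration, not reported figures.
shares = 140_000          # reported shares of the fabricated "indictment" story
views_per_share = 200     # assumed: each share is seen by a couple hundred users
revenue_per_view = 0.001  # assumed: a tenth of a cent in ad revenue per view

views = shares * views_per_share
revenue = views * revenue_per_view
print(f"{views:,} views -> roughly ${revenue:,.0f} from a single article")
```

Under those assumptions, one viral hoax is worth tens of thousands of dollars, and a town full of such sites multiplies that many times over.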
Of course, fake news alone doesn’t explain the outcome of
this election. People vote the way they do for a variety of reasons, but their
information diet is a crucial part of the picture.
After the election, Mr. Zuckerberg claimed that the fake
news was a problem on “both sides” of the race. There are, of course, viral
fake anti-Trump memes, but reporters have found that the spread of false news
is far more common on the right than it is on the left.
The Macedonian teenagers found this, too. They had
experimented with left-leaning or pro-Bernie Sanders content, but gave up when
they found it wasn’t as reliable a source of income as pro-Trump content. But
even if Mr. Zuckerberg were right and fake news were equally popular on both
sides, it would still be a profound problem.
Only Facebook has the data that can exactly reveal how fake
news, hoaxes and misinformation spread, how much there is of it, who creates
and who reads it, and how much influence it may have. Unfortunately, Facebook
exercises complete control over access to this data by independent researchers.
It’s as if tobacco companies controlled access to all medical and hospital
records.
These are not easy problems to solve, but there is a lot
Facebook could do. When the company decided it wanted to reduce spam, it
established a policy that limited its spread. If Facebook had the same kind of
zeal about fake news, it could minimize its spread, too.
If anything, Facebook has been moving in the wrong
direction. It recently fired its (already too few) editors responsible for
weeding out fake news from its trending topics section. Unsurprisingly, the
section was then flooded with even more spurious articles.
This June, just as the election season was gearing up,
Facebook tweaked its algorithm to play down posts from news outlets and to
increase updates shared by friends and family. The reasonable explanation is
that that’s what people want to see. Did this mean less reputable stories spread
quickly through social networks while real journalism got depressed? Only
Facebook knows. Worse, Facebook doesn’t flag or mark credible news websites:
The article from The Denver Guardian, a paper that doesn’t even exist, has the
same format on the platform as an article from The Denver Post, a real
newspaper.
In addition to doing more to weed out lies and false
propaganda, Facebook could tweak its algorithm so that it does less to
reinforce users’ existing beliefs, and more to present factual information.
This may seem difficult, but perhaps the Silicon Valley billionaires who helped
create this problem should take it on before setting out to colonize Mars.
Facebook should also allow truly independent researchers to
collaborate with its data team to understand and mitigate these problems. A
more balanced newsfeed might lead to less “engagement,” but Facebook, with a
market capitalization of more than $300 billion and no competitor in sight, can
afford this.
This should not be seen as a partisan issue. The spread of
false information online is corrosive for society at large. In a 2012 opinion
essay in The Times, I cited the Obama campaign’s successful social media and
data strategy to warn about the potential dangers of polarization and
distasteful political methods, like misinformation on social media.
And the dangers of Facebook’s current setup are not limited
to the United States. The effects can be even more calamitous in countries with
fewer checks and balances, and weaker institutions and independent media. In
Myanmar, for example, misinformation on Facebook has reportedly helped fuel
ethnic cleansing, creating an enormous refugee crisis.
Facebook may want to claim that it is remaining neutral, but
that is a false and dangerous stance. The company’s business model, algorithms
and policies entrench echo chambers and fuel the spread of misinformation.
Letting this stand is not neutrality; it amplifies the
dangerous currents roiling the world. When Facebook is discussed in tomorrow’s
history books, it will probably not be about its quarterly earnings reports and
stock options.