“What we’re talking about here is a means of mind control on
a massive scale that there is no precedent for in human
history.” That may sound hyperbolic, but Robert Epstein says
it’s not an exaggeration. Epstein, a research psychologist at
the American Institute for Behavioral Research in Vista,
California, has found that the higher a politician ranks on a
page of Internet search results, the more likely you are to
vote for them.
“I have a lot of faith in the methods they’ve used, and I
think it’s a very rigorously conducted study,” says Nicholas
Diakopoulos, a computer scientist at the University of
Maryland, College Park, who was not involved in the research.
“I don’t think that they’ve overstated their claims.”
In their first experiment, Epstein and colleagues recruited
three groups of 102 volunteers in San Diego, California, who
were generally representative of the U.S. voting population in
terms of age, race, political affiliation, and other traits.
The researchers wanted to know if they could influence who the
Californians would have voted for in the 2010 election … for
prime minister of Australia.
So they built a fake search engine called Kadoodle that
returned a list of 30 websites for the finalist candidates, 15
for Tony Abbott and 15 for Julia Gillard. Most of the
Californians knew little about either candidate before the
test began, so the experiment was their only real exposure to
Australian politics. What they didn’t know was that the search
engine had been rigged to display the results in an order
biased toward one candidate or the other. For example, in the
most extreme scenario, a subject would see 15 webpages with
information about Gillard’s platform and objectives followed
by 15 similar results for Abbott.
As predicted, subjects spent far more time reading Web pages
near the top of the list. But what surprised researchers was
the difference those rankings made: Biased search results
increased the number of undecided voters choosing the favored
candidate by 48% compared with a control group that saw an
equal mix of both candidates throughout the list. Very few
subjects noticed they were being manipulated, but those who
did were actually more likely to vote in line with
the biased results. “We expect the search engine to be making
wise choices,” Epstein says. “What they’re saying is, ‘Well
yes, I see the bias and that’s telling me … the search engine
is doing its job.’”
In a second experiment, the scientists repeated the first
test on 2100 participants recruited online through Amazon’s
labor crowdsourcing site Mechanical Turk. The subjects were
also chosen to be representative of the U.S. voting
population. The large sample size—and additional details
provided by users—allowed the researchers to pinpoint which
demographics were most vulnerable to search engine
manipulation: Divorcees, Republicans, and subjects who
reported low familiarity with the candidates were among the
easiest groups to influence, whereas participants who were
better informed, married, or reported an annual household
income between $40,000 and $50,000 were harder to sway.
Moderate Republicans were the most susceptible of any group:
The manipulated search results increased the number of
undecided voters who said they would choose the favored
candidate by 80%.
“In a two-person race, a candidate can only count on getting
half of the uncommitted votes, which is worthless. With the
help of biased search rankings, a candidate might be able to
get 90% of the uncommitted votes [in select demographics],” Epstein says.
In a third experiment, the team tested its hypothesis in a
real, ongoing election: the 2014 general election in India.
After recruiting a sample of 2150 undecided Indian voters, the
researchers repeated the original experiment, replacing the
Australian candidates with the three Indian politicians who
were actually running at the time. The results of the real
world trial were slightly less dramatic—an outcome that
researchers attribute to voters’ higher familiarity with the
candidates. But merely changing which candidate appeared
higher in the results still increased the number of undecided
Indian voters who would vote for that candidate by 12% or more
compared with controls. And once again, awareness of the
manipulation enhanced the effect.
A few percentage points here and there may seem meager, but
the authors point out that elections are often won by margins
smaller than 1%. If 80% of eligible voters have Internet
access and 10% of them are undecided, the
search engine effect could convince an additional 25% of
those undecided to vote for a target candidate, the team
reports online this week in the Proceedings of the
National Academy of Sciences. That type of swing would
determine the election outcome, as long as the expected win
margin was 2% or less. “This is a huge effect,” Epstein says.
“It’s so big that it’s quite dangerous.”
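The swing the team describes is a simple product of three rates. A minimal sketch of that back-of-envelope calculation, using only the illustrative figures quoted in the article (80% Internet access, 10% undecided, an extra 25% of the undecided swayed), is:

```python
# Back-of-envelope estimate of the search-engine swing described above.
# All three rates are the article's illustrative assumptions, not measured data.

internet_access = 0.80   # share of eligible voters with Internet access
undecided = 0.10         # share of those voters who are undecided
swayed = 0.25            # additional share of undecided voters convinced

# Swing toward the target candidate, as a share of all eligible voters
swing = internet_access * undecided * swayed
print(f"Swing toward target candidate: {swing:.1%}")  # prints "2.0%"
```

A one-sided gain of about 2% of the electorate is enough to decide any race whose expected win margin is 2% or less, which is the comparison the authors draw.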
But perhaps the most concerning aspect of the findings is
that a search engine doesn’t even have to intentionally
manipulate the order of results for this effect to manifest.
Organic search algorithms already in place naturally rank one
candidate’s name higher on the list than others, based on
factors like “relevance” and “credibility” whose precise
workings are closely guarded by developers at Google and other
major search engines. So the public is already being influenced by the
search engine manipulation effect, Epstein says. “Without any
intervention by anyone working at Google, it means that
Google’s algorithm has been determining the outcome of close
elections around the world.”
Presumably Google isn’t intentionally tweaking its algorithms
to favor certain presidential candidates, but Epstein says it
would be extremely difficult to tell if it were. He also points
out that the Internet giant stands to benefit more from certain
election outcomes than from others.
And according to Epstein, Google is well aware both of the
power it wields and of the research his team is doing:
When the team recruited volunteers from the Internet in the
second experiment, two of the IP addresses came from Google’s
head office, he says.
“It’s easy to point the finger at the algorithm because it’s
this supposedly inert thing, but there are a lot of people
behind the algorithm,” Diakopoulos says. “I think that it does
pose a threat to the legitimacy of the democracy that we have.
We desperately need to have a public conversation about the
role of these systems in the democratic processes.”