Could Google pick the next president?
Google’s motto is “Don’t be evil.” But what would it mean for democracy if it were?
That’s the question psychologist Robert Epstein has been asking in a series of experiments testing the impact of a fictitious search engine – he called it “Kadoodle” – that manipulated search rankings, giving an edge to a favored political candidate by pushing up flattering links and pushing down unflattering ones.
Not only could Kadoodle sway the outcome of close elections, he says, it could do so in a way most voters would never notice.
Epstein, who had a public spat with Google last year, offers no evidence of actual evil acts by the company. Yet his exploration of Kadoodle – think of it as the equivalent of Evil Spock, complete with goatee – not only illuminates how search engines shape individual choices but also asks whether the government should have a role in keeping this power in check.
“They have a tool far more powerful than an endorsement or a donation to affect the outcome,” Epstein said. “You have a tool for shaping government. . . . It’s a huge effect that’s basically undetectable.”
There is no reason to believe that Google would manipulate politically sensitive search results. The company depends on its reputation for presenting fair, useful links, and though that image has taken some hits in recent years with high-profile investigations in the United States and Europe, it would be far worse to get caught trying to distort search results for political ends.
Yet Epstein’s core finding – that a dominant search engine could alter perceptions of candidates in close elections – has substantial support. Given the wealth of information available about Internet users, a search engine could even tailor results for certain groups, based on location, age, income level, past searches, Web browsing history or other factors.
The voters least tuned in to other sources of information, such as news reports or campaign advertisements, would be most vulnerable. These are the same people who often end up in the crucial middle of American politics as coveted swing voters.
“Elections are won among low-information voters,” said Eli Pariser, former president of MoveOn.org and the author of “The Filter Bubble: What the Internet Is Hiding From You.” “The ability to raise a negative story about a candidate to a voter . . . could be quite powerful.”
Even efforts to refine search algorithms, he said, can unintentionally affect what voters see on their results pages. A search engine that favors certain news sources – based, for example, on the sophistication of the writing as measured by vocabulary or sentence length – might push to prominence links preferred by highly educated readers, helping the political party and ideas they support.
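The inadvertent skew Pariser describes can be illustrated with a toy example. The Python sketch below scores articles by a crude writing-sophistication measure (average word length plus average sentence length); the article texts, labels and scoring formula are invented for illustration and are not drawn from any real ranking system.

```python
# Toy illustration: ranking articles by a crude "sophistication" score.
# All texts, labels and the formula itself are invented for this example.

def sophistication(text: str) -> float:
    """Average word length plus average sentence length, a crude proxy."""
    sentences = [s for s in text.split(".") if s.strip()]
    words = text.split()
    avg_word_len = sum(len(w.strip(".,")) for w in words) / len(words)
    avg_sent_len = len(words) / len(sentences)
    return avg_word_len + avg_sent_len

articles = {
    "tabloid": "Candidate wins big. Crowd goes wild. Huge night.",
    "broadsheet": ("The candidate consolidated support among undecided "
                   "constituencies, a development analysts attribute to "
                   "sustained advertising expenditures."),
}

# Sorting by the score pushes denser prose to the top, with no
# deliberate political intent anywhere in the code.
ranked = sorted(articles, key=lambda k: sophistication(articles[k]),
                reverse=True)
```

Ranking by such a score would surface the denser prose first, favoring the outlets, and by extension the readerships, that produce it.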
Epstein’s research is slated to be presented in Washington this spring at the annual meeting of the Association for Psychological Science. The Washington Post shared an advance copy of a five-page research summary with officials at Google. “Providing relevant answers has been the cornerstone of Google’s approach to search from the very beginning,” the company said in a statement. “It would undermine people’s trust in our results and company if we were to change course.”
It certainly is clear that outside groups seek to manipulate Google’s results. The consequences of such tactics in the consumer world are well-known, with companies spending vast sums trying to goose search rankings for their products in make-or-break bids for profit.
In the political realm, the creators of “Google bombs” managed to link the name of then-Sen. John Kerry, the Democratic presidential nominee in 2004, with the word “waffles” in search results. President George W. Bush had his name linked, through similar tactics, to the words “miserable failure.” In 2010, a conservative group used a collection of linked Twitter accounts to affect search rankings about the Massachusetts special election that brought Scott Brown to the Senate, according to research by two computer science professors at Wellesley College.
Google has resisted such tactics, and its vulnerability to manipulation from outside was limited in the 2012 election cycle, according to researchers, political professionals and search experts.
Though search results on Google are generated by a complex and ever-changing algorithm – weighing, for example, links to other sites, content quality and the time spent on sites when people click through – the key factors emphasize relevance to users. The company works to spot and defeat those who seek to alter results unfairly, and it sometimes punishes those who do by demoting their search rankings.
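The blending of ranking signals described above can be sketched as a weighted sum. In the Python sketch below, the signal names, weights and page scores are hypothetical stand-ins; Google’s actual algorithm is proprietary and constantly tuned.

```python
# Hypothetical sketch of blending ranking signals into one relevance
# score. The signal names and weights are invented for illustration.

WEIGHTS = {"inbound_links": 0.5, "content_quality": 0.3, "dwell_time": 0.2}

def relevance(signals: dict) -> float:
    """Weighted sum of signals, each assumed normalized to 0..1."""
    return sum(WEIGHTS[name] * signals.get(name, 0.0) for name in WEIGHTS)

page_a = {"inbound_links": 0.9, "content_quality": 0.6, "dwell_time": 0.4}
page_b = {"inbound_links": 0.3, "content_quality": 0.9, "dwell_time": 0.8}

# Whoever sets the weights decides which page ranks first.
results = sorted([("a", page_a), ("b", page_b)],
                 key=lambda item: relevance(item[1]), reverse=True)
```

The point of the sketch is that the ordering follows entirely from the chosen weights: shifting them, deliberately or not, reshuffles what users see first.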
But Epstein’s argument is based on a different scenario: What if manipulation came from within?
Even those who harbor no doubts about Google’s intentions generally agree that internal manipulation would be potent and, at least initially, hard to spot.
“They could do something manually with these results, but I can’t see why they would do that,” said Mike Grehan, publisher of Search Engine Watch and a commentator whose views often are in line with Google’s.
Yet Epstein and some others say the company’s power alone – whether or not it uses it – calls out for legal safeguards. Though Microsoft, Yahoo and Facebook also operate search engines, Google has about two-thirds of the U.S. market.
Even if Google has no plan to skew search rankings today, what if conditions – or its corporate leadership – changed over time?
“There is a history of powerful communications companies directly meddling in elections. I don’t think Google has an incentive to do this, but a future Google could,” said Tim Wu, a Columbia University law professor and the author of “The Master Switch: The Rise and Fall of Information Empires.” “The question of free speech in America is controlled by a few powerful gatekeepers who could subtly shape things.”
In the 1800s, Wu noted, Western Union employees often read telegrams from Democrats and shared their contents with Republicans – their political allies – or didn’t deliver them. This stopped, Wu said, only with the arrival of forceful federal regulation.
Epstein, a Harvard-trained psychologist and former editor in chief of Psychology Today, turned his attention to Google after the company flagged search results for a Web site that he ran, warning that it was infected with malicious programs that could harm visitors.
Epstein complained publicly about the move and the lack of responsiveness from Google, e-mailing senior company officials. He later acknowledged that his site had been infiltrated by hackers, but the experience left him aghast at what he considered Google’s unchecked power. He wrote blog posts calling for greater regulatory oversight of the company.
For his experiment, conducted with colleague Ronald E. Robertson at the American Institute for Behavioral Research and Technology, Epstein attempted to shape the perceptions of a random sampling of potential voters in California. The test involved an election most of the subjects knew little about: a close-fought campaign for prime minister of Australia in 2010. The researchers secretly altered the rankings of search results to help favored candidates.
After 15 minutes of searching and reading linked articles, it was clear that the manipulation had worked: About 65 percent of subjects favored the candidate whose rankings had been elevated, compared with 50 percent among a control group that saw impartial search results, according to Epstein. Three out of four subjects, meanwhile, reported no awareness that the search rankings had been altered.
The lack of prior knowledge about the race or alternative sources of information accentuated the effects of the search rankings, Epstein acknowledged. But he said the experiment made clear that manipulation is possible, powerful and hard to detect.
However, the sheer volume of other information available to voters would make such manipulation hard to execute, said David Vladeck, a Georgetown University law professor and the former head of consumer protection at the Federal Trade Commission. Traditional news organizations, he said, probably have more power over the views of voters.
“It is not clear to me that, even if Google tried to, it could exercise the same power over the American public as Fox News or MSNBC,” Vladeck said. “The claim is such a difficult one to sustain that I find it hard to take it seriously.”
Federal regulations have in some circumstances limited what news organizations can do. The Fairness Doctrine once required broadcasters to present both sides of controversial issues, and media cross–ownership rules can still limit the ability of newspapers, for example, to own radio or television stations in the same metropolitan area.
Some legal scholars contend that search engine rankings are covered under the First Amendment’s free speech protections. Yet, even those who think that search engines can have potent effects on elections differ on what kind of regulation, if any, would be sensible and effective. And it’s not even clear what federal agency would have the authority to investigate allegations of abuse.
The key lesson may be that search engines are not mere machines spitting out perfectly impartial results. They are driven by decisions, made by people who have biases. This does not necessarily make them evil – merely human.
“The more trust we give to these kinds of tools, the more likely we can be manipulated down the road,” said Panagiotis T. Metaxas, one of the computer science professors at Wellesley College who studied the Massachusetts election. “We need to understand, as people, as citizens, why we believe what we believe.”