Silicon Valley is giving white supremacy groups the boot
Silicon Valley significantly escalated its war on white supremacy this week, choking off the ability of hate groups to raise money online, removing them from Internet search engines, and preventing some sites from registering at all.
The new moves go beyond censoring individual stories or posts. Tech companies such as Google, GoDaddy and PayPal are now reversing their hands-off approach about content supported by their services and making it much more difficult for "alt-right" organizations to reach mass audiences.
But the actions are also heightening concerns over how tech companies are becoming the arbiters of free speech in America. And in response, right-wing technologists are building parallel digital services that cater to their own movement.
Gab.ai, a social network for promoting free speech, was founded in August 2016 by Silicon Valley engineers alienated by the region's liberalism. Other conservatives have founded Infogalactic, a Wikipedia for the alt-right, as well as crowdfunding tools Hatreon and WeSearchr. The latter was used to raise money for James Damore, a white engineer who was fired after criticizing Google's diversity policy.
"If there needs to be two versions of the Internet so be it," Gab.ai tweeted Wednesday morning. The company's spokesman, Utsav Sanduja, later warned of a "revolt" in Silicon Valley against the way tech companies are trying to control the national debate.
"There will be another type of Internet that is run by people who are politically incorrect, populist, and conservative," Sanduja said.
Some adherents to the alt-right - a fractious coalition of neo-Nazis, white supremacists, and those opposed to feminism - said in interviews they will press for the federal government to step in and regulate Facebook and Google, an unexpected stance for a movement that is skeptical of government meddling.
"Doofuses in the conservative movement say it's only censorship if the government does it," said Richard Spencer, an influential white nationalist. "YouTube and Twitter and Facebook have more power than the government. If you can't host a website or tweet, then you effectively don't have a right to free speech."
He added, "Social networks need to be regulated in the way the broadcast networks are. I believe one has a right to a Google profile, a Twitter profile, an accurate search . . . We should start conceiving of these things as utilities and not in terms of private companies."
The censorship of hate speech by companies passes constitutional muster, according to First Amendment experts. But they said there is a downside to thrusting corporations into that role.
Silicon Valley firms may be ill-prepared to manage such a large societal role, they added. The companies have limited experience handling these issues. They must answer to shareholders and demonstrate growth in users or profits - weighing in on free speech matters risks alienating large groups of customers across the political spectrum.
These platforms are also so massive - Facebook, for example, counts a third of the world's population in its monthly user base; GoDaddy hosts and registers 71 million websites - that it may be impossible for them to enforce their policies consistently.
Still, tech companies are forging ahead. On Wednesday, Facebook said it canceled the page of white nationalist Christopher Cantwell, who was connected to the Charlottesville rally. The company has shut down eight other pages in recent days, citing violations of the company's hate speech policies. Twitter has suspended several extremist accounts, including @Millennial_Matt, a Nazi-obsessed social media personality.
On Monday, GoDaddy delisted the Daily Stormer, a prominent neo-Nazi site, after its founder celebrated the death of a woman killed in Charlottesville, Virginia. The Daily Stormer then transferred its registration to Google, which also cut off the site. The site has since retreated to the "dark Web," making it inaccessible to most Internet users.
PayPal late Tuesday said it would bar nearly three dozen users from accepting donations on its online payment platform following revelations that the company played a key role in raising money for the white supremacist rally.
In a lengthy blog post, PayPal outlined its long-standing policy of not allowing its services to be used to accept payments or donations to organizations that advocate racist views. The payment processor singled out the KKK, white supremacist groups and Nazi groups - all three of which were involved in organizing last weekend's rally.
The Southern Poverty Law Center, a left-leaning nonprofit anti-hate group, said until now, PayPal had ignored its complaints that the company was processing donations and payments to dozens of racist and white supremacist groups. The center said PayPal also allowed at least eight groups and individuals openly espousing racist views to raise money that was integral to orchestrating the Charlottesville rally.
"For the longest time, PayPal has essentially been the banking system for white nationalism," Keegan Hankes, analyst for the Southern Poverty Law Center, told The Washington Post. "It's a shame it took Charlottesville for them to take it seriously."
PayPal has agreed to remove at least 34 organizations, including Richard Spencer's National Policy Institute, two companies that sell gun accessories explicitly for killing Muslims, as well as all accounts associated with Jason Kessler, the white nationalist blogger who organized the Charlottesville march, according to a list provided to The Post by Color of Change, a racial justice organization seeking to influence corporate decision-makers.
Spencer, whose site was blocked by major advertisers earlier this year and who previously told the Post "it would have no effect on my life whatsoever," said the PayPal move was more damaging. "I am getting this treatment because of things I say and not things I do," Spencer said. "I've never hurt anyone and I'm not going to."
Other payment systems have made similar moves. Apple on Wednesday dropped payment processing for hate groups. GoFundMe, one of the largest crowdfunding sites, shut down several campaigns to raise money for the Nazi sympathizer who allegedly crashed his car into a crowd of activists protesting the hate rally, killing one woman and injuring dozens.
Patreon, a crowdfunding platform, recently canceled the accounts of some "alt-right" figures. That inspired a new crowdfunding site, Hatreon, which markets itself as a company that does not police speech.
Technology companies have long relied on a 20-year-old law that shields them from responsibility for illegal content hosted on their platforms. The more they get into the business of policing speech - making subjective decisions about what is offensive and what isn't - the more they are susceptible to undermining their own immunity and opening themselves to regulation, said Susan Benesch, director of the Dangerous Speech Project, a nonprofit group that researches the intersection of harmful online content and free speech.
Lee Rowland, senior staff attorney with the American Civil Liberties Union's Speech, Privacy & Technology Project, cautioned consumers against being so quick to condemn companies that host even the "most vile white supremacist speech we have seen on display this week."
"We rely on the Internet to hear each other," Rowland said. "We should all be very thoughtful before we demand that platforms for hateful speech disappear because it does impoverish our conversation and harm our ability to point to evidence for white supremacy and to counter it."
Author Information: Elizabeth Dwoskin is The Washington Post's Silicon Valley correspondent. The Washington Post's Avi Selk contributed to this report.