Facebook Is Taking Down Racist Hate Groups' Pages After Charlottesville
Social networking sites have long provided a platform for white supremacists and other hate groups to create online communities, exchange extremist ideas, and plan meetings and rallies. Following Saturday's deadly attack at a "Unite the Right" rally in Charlottesville, VA, many tech companies are now attempting to make it a little more difficult for these communities to congregate online. As part of that effort, Facebook removed a number of hate group pages from its platform on Tuesday, many of which promoted terror attacks and hate crimes.
According to BuzzFeed, Facebook confirmed that it has deleted the pages of at least eight hate groups since the events in Charlottesville, including White Nationalists United, Genuine Donald Trump, and Vanguard America. A Facebook spokesperson explained the company's decision in a statement:
"Our hearts go out to the people affected by the tragic events in Charlottesville. Facebook does not allow hate speech or praise of terrorist acts or hate crimes, and we are actively removing any posts that glorify the horrendous act committed in Charlottesville."
While Facebook already prohibits "organized hate groups" and has vowed to continue removing groups that promote violence and hate speech, the social networking site does not plan to update its terms of service or user guidelines in the wake of the attack, according to Recode.
Reddit made a similar decision this week, removing the Physical Removal subreddit after community members called on the site to shut it down for sharing racist ideology and promoting violence.
"We are very clear in our site terms of service that posting content that incites violence will get users banned from Reddit," a Reddit spokesperson told CNET about its decision.
Facebook and Reddit join a host of other tech companies that are taking a stand against white supremacists and making it harder for them to use their services. The domain registrar GoDaddy recently revoked the domain of a neo-Nazi group, and Google Domains promptly canceled that same group's registration as well. And over the weekend, Airbnb deactivated the accounts of neo-Nazis who were attempting to find housing so they could attend the "Unite the Right" rally in Charlottesville.
Taken together, these moves suggest the tech industry is doing some serious soul-searching about how to stop the spread of extremist messages on its platforms. But given the enormous privilege and responsibility these companies hold in connecting people across the globe, they still have a long way to go in figuring out how to help ensure that an event like Charlottesville never happens again.