This week, Germany's new Network Enforcement Act went into effect, requiring large Internet platforms to set up systems to report and, in some cases, remove certain types of content, including threats of violence, slander, hate speech, fake news and other illegal material. According to The Local, Facebook has already hired several hundred people in Germany to begin scouring the site for potential violations.
With "fake news" among the most-used terms of 2017, and the effects of that fake news on the 2016 United States presidential election still not fully understood, the initial reaction may be to applaud this law as one that could help prevent dangerous information silos in the future. But not so fast. While keeping hate speech from gaining undue influence is a worthy goal, at least here in the U.S. it cannot come at the expense of free speech. The U.S. Supreme Court has ruled many times that the views of white supremacists and the like, however unpopular, cannot be banned by the government.
Yet don't take our support for the tenets of the First Amendment as evidence that we oppose limiting hate speech on social media sites. There are other ways to skin this cat that don't butt up against free speech protections. For one, Section 230 of Title 47 of the U.S. Code, enacted as part of the Communications Decency Act, states:
"No provider or user of an interactive computer service shall be held liable on account of—any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected."
In other words, social media platforms cannot be held liable should they decide to ban things like hate speech. Furthermore, laws against defamation and the making of false statements give these companies even more cover to police their platforms diligently.
Facebook and Reddit both began banning hate groups in the wake of the white supremacist rallies in Charlottesville, Va. last year. Today, Facebook's Community Standards page states:
"We also remove content that expresses support for groups that are involved in the violent or criminal behavior mentioned above. Supporting or praising leaders of those same organizations, or condoning their violent activities, is not allowed." It took the death of a bystander in Charlottesville for that action to happen.
While the German standards rub against our desire for free speech, without pressure, governmental or public, large social media companies simply will not act. You don't see social media sites hiring hundreds of employees to enforce standards of decency domestically. It shouldn't take further harm for social media sites to act. Companies can be compelled to behave appropriately through the sheer will of their users, especially when the health of our democratic elections hangs in the balance. What they're doing now is not enough.
Germany has a different set of standards and laws, and while we applaud its government for taking action against fake news and hate speech, legislation isn't the only way to achieve the desired result.