Security is tight at this brick building on the western edge of Berlin. Inside, a sign warns: "Everybody without a badge is a potential spy!"
Spread over five floors, hundreds of men and women sit in rows of six scanning their computer screens. All have signed nondisclosure agreements. Four trauma specialists are at their disposal seven days a week.
They are the agents of Facebook.
And they have the power to decide what is free speech and what is hate speech.
This is a deletion center, one of Facebook's largest, with more than 1,200 content moderators. They are cleaning up content — from terrorist propaganda to Nazi symbols to child abuse — that violates the law or the company's community standards.
Germany, home to a tough new online hate speech law, has become a laboratory for one of the most pressing issues for governments today: How and whether to regulate the world's biggest social network.
Around the world, Facebook and other social networking platforms are facing a backlash over their failures to safeguard privacy, curb disinformation campaigns and limit the digital reach of hate groups.
In India, seven people were beaten to death after a false viral message spread on the Facebook subsidiary WhatsApp. In Myanmar, violence against the Rohingya minority was fueled, in part, by misinformation spread on Facebook.
In the United States, Congress called Mark Zuckerberg, Facebook's chief executive, to testify about the company's inability to protect its users' privacy.
As the world confronts these rising forces, Europe, and Germany in particular, has emerged as the de facto regulator of the industry, exerting influence beyond its own borders. Berlin's digital crackdown on hate speech, which took effect on January 1, is being closely watched by other countries. And German officials are playing a major role behind one of Europe's most aggressive moves to rein in technology companies: strict data privacy rules that take effect across the European Union on May 25 and are prompting global changes.
"For them, data is the raw material that makes them money," said Gerd Billen, secretary of state in Germany's Ministry of Justice and Consumer Protection. "For us, data protection is a fundamental right that underpins our democratic institutions."
Germany's troubled history has placed it on the front line of a modern tug-of-war between democracies and digital platforms.
In the country of the Holocaust, the commitment against hate speech is as fierce as the commitment to free speech. Hitler's "Mein Kampf" is only available in an annotated version. Swastikas are illegal. Inciting hatred is punishable by up to five years in jail.
But banned posts, pictures and videos have routinely lingered on Facebook and other social media platforms.
The deletion center predates the legislation, but its efforts have taken on new urgency. Every day content moderators in Berlin, hired by a third-party firm and working exclusively on Facebook, pore over thousands of posts flagged by users as upsetting or potentially illegal and make a judgment: Ignore, delete or, in particularly tricky cases, "escalate" to a global team of Facebook lawyers with expertise in German regulation.

Some decisions to delete are easy. Posts about Holocaust denial and genocidal rants against particular groups like refugees are obvious candidates for removal.
Others are less so. On December 31, the day before the new law took effect, a far-right lawmaker reacted to an Arabic New Year's tweet from the Cologne police, accusing them of appeasing "barbaric, Muslim, gang-raping groups of men."
The request to block a screenshot of the lawmaker's post wound up in the queue of Nils, a 35-year-old agent in the Berlin deletion center. His judgment was to let it stand. A colleague thought it should come down. Ultimately, the post was sent to lawyers in Dublin, London, Silicon Valley and Hamburg. By the afternoon it had been deleted, prompting a storm of criticism about the new legislation, known here as the "Facebook Act."

"A lot of stuff is clear-cut," Nils said. Facebook, citing his safety, did not allow him to give his surname. "But then there is the borderline stuff."
Complicated cases have raised concerns that the threat of the new rules' steep fines and 24-hour window for making decisions encourage "over-blocking" by companies, a sort of defensive censorship of content that is not actually illegal.
The far-right Alternative for Germany party, a noisy and prolific user of social media, has been quick to proclaim "the end of free speech." Human rights organizations have warned that the legislation was inspiring authoritarian governments to copy it.
Others argue that the law simply gives a private company too much authority to decide what constitutes illegal hate speech in a democracy, an argument that Facebook, which favored voluntary guidelines, made against the law.
"It is perfectly appropriate for the German government to set standards," said Elliot Schrage, Facebook's vice president of communications and public policy. "But we think it's a bad idea for the German government to outsource the decision of what is lawful and what is not."
Richard Allan, Facebook's vice president for public policy in Europe and the leader of the company's lobbying effort against the German legislation, put it more simply: "We don't want to be the arbiters of free speech."
German officials counter that social media platforms are the arbiters anyway.
It all boils down to one question, said Mr. Billen, who helped draw up the new legislation: "Who is sovereign? Parliament or Facebook?"
Learning From (German) History
When Nils applied for a job at the deletion center, the first question the recruiter asked him was: "Do you know what you will see here?"
Nils has seen it all. Child torture. Mutilations. Suicides. Even murder: He once saw a video of a man cutting a heart out of a living human being.
And then there is hate.
"You see all the ugliness of the world here," Nils said. "Everyone is against everyone else. Everyone is complaining about that other group. And everyone is saying the same horrible things."
The issue is deeply personal for Nils. He has a 4-year-old daughter. "I'm also doing this for her," he said.
The center here is run by Arvato, a German service provider owned by the conglomerate Bertelsmann. The agents have a broad purview, reviewing content from a half-dozen countries. Those with a focus on Germany must know Facebook's community standards and, as of January, the basics of German hate speech and defamation law.
"Two agents looking at the same post should come up with the same decision," said Karsten König, who manages Arvato's partnership with Facebook.
The Berlin center opened with 200 employees in 2015, as Germany was opening its doors to hundreds of thousands of migrants.
That year a selfie went viral.
Anas Modamani, a Syrian refugee, posed with Chancellor Angela Merkel and posted the image on Facebook.
It instantly became a symbol of her decision to allow in hundreds of thousands of migrants.
Soon it also became a symbol of the backlash.
The image showed up in false reports linking Mr. Modamani to terrorist attacks in Brussels and on a Christmas market in Berlin. He sought an injunction against Facebook to stop such posts from being shared but eventually lost.
The arrival of nearly 1.4 million migrants in Germany has tested the country's resolve to keep a tight lid on hate speech. Laws against illegal speech were long established, but enforcement in the digital realm was scattershot before the new legislation.
Posts calling refugees rapists, Neanderthals and scum survived for weeks, according to jugendschutz.net, a publicly funded internet safety organization. Many were never taken down. Researchers at jugendschutz.net reported a tripling in observed hate speech in the second half of 2015.
Mr. Billen, the secretary of state in charge of the new law, was alarmed. In September 2015, he convened executives from Facebook and other social media sites at the justice ministry, a building that was once the epicenter of state propaganda for the Communist East. A task force for fighting hate speech was created. A couple of months later, Facebook and other companies signed a joint declaration, promising to "examine flagged content and block or delete the majority of illegal posts within 24 hours."
But the problem did not go away. Over the 15 months that followed, independent researchers, hired by the government, twice posed as ordinary users and flagged illegal hate speech. During the tests, they found that Facebook had deleted 46 percent and 39 percent of the flagged posts, respectively.
"They knew that they were a platform for criminal behavior and for calls to commit criminal acts, but they presented themselves to us as a wolf in sheep's clothing," said Mr. Billen, a poker-faced civil servant with stern black frames on his glasses.
By March 2017, the German government had lost patience and started drafting legislation. The Network Enforcement Law was born, setting out 21 types of content that are "manifestly illegal" and requiring social media platforms to act quickly.
Officials say early indications suggest the rules have served their purpose. Facebook's performance on removing illegal hate speech in Germany rose to 100 percent over the past year, according to the latest spot check by the European Union.
Platforms must publish biannual reports on their efforts. The first is expected in July.
At Facebook's Berlin offices, Mr. Allan acknowledged that under the earlier voluntary agreement, the company had not acted decisively enough at first.
"It was too little and it was too slow," he said. But, he added, "that has changed."
He cited another independent report for the European Commission from last summer that showed Facebook was by then removing 80 percent of hate speech posts in Germany.
The reason for the improvement was not German legislation, he said, but a voluntary code of conduct with the European Union. Facebook's results have improved in all European countries, not just in Germany, Mr. Allan said.
"There was no need for legislation," he said.
Mr. Billen disagrees.
"They could have prevented the law," he said. YouTube scored 90 percent in last year's monitoring exercise. If other platforms had done the same, there would be no law today, he said.
A Regulatory Dilemma
Germany's hard-line approach to hate speech and data privacy once made it an outlier in Europe. The country's stance is now more mainstream, an evolution embodied by the justice commissioner in Brussels.
Vera Jourova, the justice commissioner, deleted her Facebook account in 2015 because she could not stand the hate anymore.
"It felt good," she said about pressing the button. She added: "It felt like taking back control."
But Ms. Jourova, who grew up behind the Iron Curtain in what is now the Czech Republic, had long been skeptical about governments legislating any aspect of free speech, including hate speech. Her father lost his job after making a disparaging comment about the Soviet invasion in 1968, which barred her from going to university until she married and took her husband's name.
"I lived half my life in the atmosphere driven by Soviet propaganda," she said. "The golden principle was: If you repeat a lie a hundred times it becomes the truth."
When Germany started considering a law, she instead preferred a voluntary code of conduct. In 2016, platforms like Facebook promised European users easy reporting tools and committed to removing most illegal posts brought to their attention within 24 hours.
The approach worked well enough, Ms. Jourova said. It was also the quickest way to act because the 28 member states in the European Union differed so much about whether and how to legislate.
But the stance of many governments toward Facebook has hardened since it emerged that the consulting firm Cambridge Analytica had harvested the personal data of up to 87 million users. Representatives of the European Parliament have asked Mr. Zuckerberg to come to Brussels to "clarify issues related to the use of personal data," and he has agreed to come as soon as next week.
Ms. Jourova, whose job is to protect the data of over 500 million Europeans, has hardened her stance as well.
"Our current system relies on trust and this did nothing to improve trust," she said. "The question now is how do we continue?"
The European Commission is considering German-style legislation for online content related to terrorism, violent extremism and child pornography, including a provision that would impose fines on platforms that did not remove illegal content within an hour of being alerted to it.
Several countries, among them France, Israel, Italy and Canada, have sent queries to the German government about the impact of the new hate speech law.
And Germany's influence is evident in Europe's new privacy regulation, known as the General Data Protection Regulation, or G.D.P.R. The rules give people control over how their information is collected and used.
Inspired in part by German data protection laws written in the 1980s, the regulation has been shaped by a number of prominent Germans. Ms. Jourova's chief of staff, Renate Nikolay, is German, as is her predecessor's chief of staff, Martin Selmayr, now the European Commission's secretary general. The lawmaker in charge of the regulation in the European Parliament is German, too.
"We have built on the German tradition of data protection as a constitutional right and created the most modern piece of regulation of the digital economy," Ms. Nikolay said.
"To succeed in the long term, companies need the trust of customers," she said. "At the latest since Cambridge Analytica, it has become clear that data protection is not just some nutty European idea, but a matter of competitiveness."
On March 26, Ms. Jourova wrote a letter (by post, not email) to Sheryl Sandberg, Facebook's chief operating officer.
"Is there a need for stricter rules for platforms like those that exist for traditional media?" she asked.
"Is the data of Europeans affected by the current scandal?" she added, referring to the Cambridge Analytica episode. And, if so, "How do you plan to inform the user about this?"
She demanded a reply within two weeks, and she got one. Some 2.7 million Europeans were affected, Ms. Sandberg wrote.
But she never answered Ms. Jourova's question on regulation.
"There is now a sense of urgency and the conviction that we are dealing with something very dangerous that may threaten the development of free democracies," said Ms. Jourova, who is also trying to find ways to clamp down on fake news and disinformation campaigns.
"We want the tech giants to respect and follow our legislation," she added. "We want them to show social responsibility both on data protection and on hate speech."
So do many Facebook employees, Mr. Allan, the company executive, said.
"We employ very thoughtful and principled people," he said. "They work here because they want to make the world a better place, so when an assumption is made that the product they work on is harming people it is impactful."
"People have felt this criticism very deeply," he said.
A Visual Onslaught
Nils works eight-hour shifts. On busy days, 1,500 user reports are in his queue. Other days, there are only 300. Some of his colleagues have nightmares about what they see.
Every so often someone breaks down. A mother recently left her desk in tears after watching a video of a child being sexually abused. A young man felt physically sick after seeing a video of a dog being tortured. The agents watch teenagers self-mutilating and girls recounting rape.
They have weekly group sessions with a psychologist and the trauma specialists on standby. In more serious cases, the center teams up with clinics in Berlin.
In the office, which is adorned with Facebook logos, fresh fruit is at the agents' disposal in a small room where subdued colors and decorative moss growing on the walls are meant to calm fraying nerves.
To decompress, the agents sometimes report each other's posts, not because they are controversial, but "just for a laugh," said another agent, the son of a Lebanese refugee and an Arabic speaker who has had to deal with content related to terrorism generally and the Islamic State specifically. By now, he said, images of "weird skin diseases" affected him more than those of a beheading. Nils finds sports injuries like broken bones particularly disturbing.
There is a camaraderie in the office and a real sense of mission: Nils said the agents were proud to "help clean up the hate."
The definition of hate is constantly evolving.
The agents, who initially take a three-week training course, get frequent refreshers. Their guidelines are revised to reflect the evolving culture of hate speech. Events change the meaning of words. New hashtags and online trends must be put in context.
"Slurs can become socialized," Mr. Allan of Facebook said.
"Refugee" became a category protected under the broad hate speech rules only in 2015. "Nafri" was a term used by the German police that year to describe North Africans who sexually harassed hundreds of women, attacking and, in some cases, raping them. Since then, "Nafri" has become a popular insult among the far right.
Nils and his colleagues must determine whether hateful content is singling out an ethnic group or individuals.
That was the challenge with a message on Twitter that was later posted to Facebook as a screenshot by Beatrix von Storch, deputy floor leader of the far-right party AfD.
"What the hell is wrong with this country?" Ms. von Storch wrote on Dec. 31. "Why is an official police account tweeting in Arabic?"
"Do you think that will appease the barbaric murdering Muslim group-raping gangs of men?" she continued.
A user reported the post as a violation of German law, and it landed in Nils's queue. He initially decided to ignore the request because he felt Ms. von Storch was directing her insults at the men who had sexually assaulted women two years earlier.
Separately, a user reported the post as a violation of community standards. Another agent leaned toward deleting it, taking it as directed at Muslims in general.
They conferred with their "subject matter expert," who escalated it to a team in Dublin.
For 24 hours, the post kept Facebook lawyers from Silicon Valley to Hamburg busy. The Dublin team decided that the post did not violate community standards but sent it on for legal assessment by outside lawyers hired by Facebook.

Within hours of news that the German police were opening a criminal investigation into Ms. von Storch over her comments, Facebook restricted access to the post. The user who reported the content was notified that it had been blocked for a violation of section 130 of the German criminal code, incitement to hatred. Ms. von Storch was notified, too.
In the first few days of the year, it looked like the platforms were erring on the side of censorship. On Jan. 2, a day after Ms. von Storch's post was deleted, the satirical magazine Titanic quipped that she would be its new guest tweeter. Two of the magazine's subsequent Twitter posts mocking her were deleted. When Titanic published them again, its account was temporarily suspended.
Since then, things have calmed down. And even Mr. Allan conceded: "The law has not materially changed the amount of content that is deleted."
©2018 The New York Times News Service