Big Tech companies have urged the EU not to hold them legally liable for all the content on their platforms but accepted that their efforts to remove illegal and harmful activity might require oversight from a new bloc-level watchdog. Lobbyists representing the likes of Google, Facebook and Twitter have written to the European Commission as it draws up a Digital Services Act to set out rules for the technology sector.
There has been growing pressure for Silicon Valley executives to be made personally accountable for illegal material on their platforms and for tech groups to face greater scrutiny over how they police content. Until now, the EU has allowed platforms to regulate themselves on illegal material in all areas except terrorist content, and has exempted them from legal liability for illegal content — such as hate speech or videos of child sex abuse — of which they are unaware.
The letter, from Edima, the trade association that represents those companies in Brussels, argued that both self-regulation and limited liability should continue but that “a new approach might require some form of oversight to ensure it is effective”. The lobbyists warned that making companies liable for all content on their platforms would lead to punishments for companies that tried, proactively, to uncover illegal material.
Such rules would create “a perverse incentive whereby companies are discouraged from taking action before being made aware of the existence of illegal content, for fear of incurring additional liability”, said Siada El Ramly, director-general of Edima.
Thierry Breton, the European commissioner charged with overseeing the digital economy, has already said that the EU will not seek to remove or water down limited liability for tech companies. “We will not touch it,” he said at his confirmation hearings before the European Parliament. But senior officials have privately warned since then that the DSA is still being debated, that it is too soon to rule out any outcomes and that the process is bound to be “messy”.
In October, an EU official called the DSA “a bulldozer which will take five years” to agree. In 2013, Brussels looked to introduce new rules on how platforms should process complaints from users on illegal material but ditched plans after internal disagreement and political pressure. Two years later it launched a consultation on limited liabilities for platforms but that did not lead to a change in the law.
In September, Werner Stengg, former head of the European Commission’s ecommerce and platforms unit, said officials and companies were still unclear about the rules designed to secure the removal of illegal online content. “Illegal content is not sufficiently addressed. No one knows what is hosted on online platforms, what content is not removed; we don’t know the scale of the problem,” he said.