EU proposes fining tech platforms for failing to remove extremist content

The supranational organization is likely to publish a new draft regulation next month stipulating that tech platforms must remove extremist content within one hour of receiving notification. It comes months after the EU criticized Facebook and Twitter for not doing enough about the proliferation of illegal content on their respective platforms.
by Jason Smith on 20th August 2018

The European Union is considering fining major tech companies like YouTube and Facebook if they fail to remove extremist content from their platforms, according to a report by the Financial Times.

The new draft regulation, which is expected to be published next month, would require tech platforms to remove extremist content within one hour of being notified of its existence.

EU Security Commissioner Julian King told the Financial Times that too little progress has been made under the existing arrangement, whereby tech platforms institute and police their own content removal efforts.

He also said the draft regulation will help create “legal certainty” for publishers. He provided no further details on what provisions any new regulation may contain, or the potential size of fines that could be issued.

While the report in the Financial Times identifies content including “terrorist propaganda” and “extreme violence,” the full scope of content the new draft regulation will target, as well as its definition of what constitutes prohibited content, remains to be seen.

The revelation comes months after the EU criticized Facebook and Twitter for their lack of integration with the “notice and action procedure” utilized by consumer protection authorities to notify organizations about illegal content.

In a press release highlighting the existing state of affairs, the European Commission stated, “the Commission expects online platforms to swiftly and proactively detect, remove and prevent the re-appearance of illegal content online”.

Věra Jourová, the European Commissioner for Justice, Consumers and Gender Equality, also stated:

“…it is unacceptable that this is still not complete and it is taking so much time… EU consumer rules should be respected and if companies don’t comply, they should face sanctions”.

In February of this year, YouTube also announced that half of all videos promoting violent extremism were removed from its platform before they received 10 views. It claimed the result was made possible by advances in machine learning.

Part of the problem is scale: over 1 billion hours of YouTube video are watched every day, while as many as 330 million posts are published to Facebook every 60 seconds.

Unfortunately, given numbers of this magnitude, and consumers’ apparent appetite for the mass publishing and consumption opportunities offered by social networks, it is difficult to envisage effective measures being deployed that consist of anything other than automation.
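To see why manual review alone is implausible at this scale, a rough back-of-envelope calculation helps. The sketch below uses the Facebook posting rate quoted above; the per-post review time and reviewer shift length are purely illustrative assumptions, not figures from any platform.

```python
# Back-of-envelope sketch (illustrative only): how many full-time human reviewers
# would be needed to manually screen every Facebook post at the volumes cited above?

POSTS_PER_MINUTE = 330_000_000          # figure quoted in the article
SECONDS_TO_REVIEW_ONE_POST = 5          # assumption: a very quick manual check
WORKING_SECONDS_PER_SHIFT = 8 * 60 * 60 # assumption: one 8-hour shift per reviewer per day

posts_per_day = POSTS_PER_MINUTE * 60 * 24
review_seconds_per_day = posts_per_day * SECONDS_TO_REVIEW_ONE_POST
reviewers_needed = review_seconds_per_day / WORKING_SECONDS_PER_SHIFT

print(f"Posts per day: {posts_per_day:,.0f}")
print(f"Full-time reviewers needed: {reviewers_needed:,.0f}")
```

Even with these generous assumptions, the arithmetic yields a reviewer headcount in the tens of millions, which is why automated detection is treated as the only workable approach.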