The European Commission is seeking to offer reassurance after 61 organisations urged EU lawmakers in an open letter to reject the proposed EU regulation on preventing the dissemination of terrorist content online, which is due to be voted on in Parliament on 28 April. EURACTIV France reports.
“We urge the European Parliament to reject this proposal, as it poses serious threats to fundamental rights and freedoms, in particular freedom of expression and opinion, freedom of access to information, the right to privacy and the rule of law,” several associations, including Amnesty International, the Quadrature du Net, the Human Rights League of France and Reporters Without Borders said in an open letter published last week.
The signatories are particularly concerned about the regulation’s proposed Article 4, which currently states that “hosting service providers shall remove or block access to content of a terrorist nature within one hour of receiving the removal order”.
“These rules are necessary to combat online content disseminated by terrorists in order to spread their message, radicalise and recruit followers, as well as facilitate and direct terrorist activities,” a European Commission representative told EURACTIV France.
While the intention is laudable, the organisations fear that the “short deadline imposed” will “strongly encourage platforms to deploy automated content moderation tools.”
However, “because it is impossible for automated tools to consistently differentiate activism, counter-speech, and satire about terrorism from content considered terrorist itself, increased automation will ultimately lead to the removal of legal content such as journalistic content, discriminatory treatment of minorities and certain underrepresented groups,” the letter reads.
The automated handling of content moderation thus raises fears of a kind of algorithm-led censorship, according to the associations, who anticipate that hosting companies will be over-zealous in complying with the legislation.
The Commission has defended the measure by stressing that “given the considerable volume of content disseminated on many platforms, automated tools are needed to detect potential terrorist content.”
“Purely human intervention would not be fast enough to deal with aggressive terrorist tactics,” said the Commission, adding that “content detected by automated tools will not necessarily be removed automatically; removal decisions will be subject to human oversight and verification.”
Contrary to the associations’ interpretation of the text, the Commission even went on to say that “automated tools are not automatic upload filters.”
A lack of ‘safeguards’
The text in its current form also provides for such content removal orders to be issued by “competent authorities”, which would not need to be newly or specifically created for this purpose but could be designated at each EU country’s discretion.
According to the letter’s signatories, however, “only courts or independent administrative authorities subject to judicial review should have the power to issue removal orders.” Yet what worries the associations most is the lack of safeguards.
“The measures introduced by the proposed regulation will be easily manipulated for political censorship by unscrupulous governments. The first victims of these abuses will be journalists, artists, whistleblowers, political opponents, and marginalised religious communities, especially Muslims,” Chloé Berthélémy, policy advisor at European Digital Rights, told EURACTIV France.
“Each of the measures is accompanied by strong and effective safeguards, including complaints procedures and judicial remedies. Without this legislation on terrorist content online, companies would continue to apply their own measures, but only on a voluntary basis and without any of the safeguards in place,” the Commission replied.
The text also introduces a number of obligations for online platforms to comply with, particularly in terms of transparency.
For example, Article 8 stipulates that hosting service providers must set out their policy to prevent the dissemination of terrorist content in their terms and conditions, and that they must “publish annual transparency reports on action taken against the dissemination of terrorist content.”