Tech firms have been urged to show that they are removing extremist content more rapidly or face legislation forcing them to do so, new EU guidelines reveal.
Google, YouTube, Facebook, Twitter and others have been given three months to clean up their acts and tackle terrorist material published on their sites.
The guidance, which is not legally binding, includes a call to remove such material within an hour of being notified of its existence.
Failure to comply could result in new laws being brought in to make it a mandatory requirement.
Pictured: masked figures marching in a recruitment video for the banned far-Right group National Action, which has since been removed from YouTube.
In its strongest call yet, the European Commission, based in Brussels, today recommended a range of new measures that online platforms should take to stop the proliferation of extremist content.
European governments have said that extremist content on the web has influenced lone-wolf attackers who have killed people in several European cities after being radicalised.
Several governments have increased pressure on social media companies to do more to remove illegal content.
This includes material related to groups like Islamic State, as well as individual incitements to commit atrocities.
'While several platforms have been removing more illegal content than ever before, we still need to react faster against terrorist propaganda and other illegal content which is a serious threat to our citizens' security, safety and fundamental rights,' digital commissioner Andrus Ansip said in a written statement.
The recommendation, which is non-binding but could be taken into account by European courts, sets guidelines on how companies should remove illegal content generally.
Such content ranges from copyright infringement to hate speech, and the guidance advises a quicker reaction to extremist material.
The Commission said it would assess the need for legislation within three months for what it described as 'terrorist content', given the urgency of the issue.
The EU has unveiled a recommendation that sets out measures to ensure faster detection and removal of illegal content online. They include:
Clearer 'notice and action' procedures: Companies should set out easy and transparent rules for notifying illegal content, including fast-track procedures for 'trusted flaggers'.
More efficient tools and proactive technologies: Companies