News organisations such as MailOnline should be given a 'positive exemption' from the provisions of the new Online Safety Bill, an industry representative said today.
Peter Wright, editor emeritus of DMG Media, said social media platforms should not moderate journalistic content when it is produced by 'recognised news publishers'.
He told a Parliamentary committee that he was 'hugely sceptical' about Facebook's teams of fact checkers with some appearing to be 'single-issue lobby groups'.
Speaking to the Joint Committee on the Draft Online Safety Bill, he also pointed out that Google and Facebook's algorithms and artificial intelligence are 'very poor'.
Mr Wright added that DMG's titles, including MailOnline, were already 'fully subject to the law', and that a title could find itself following the Independent Press Standards Organisation (Ipso) code of conduct while breaching a US tech firm's terms of service.
He wants a positive exemption for journalism from the provisions of the bill, which aims to establish a new regulatory framework to tackle harmful content online.
Facebook and Google will fall within the scope of the new bill, which will give UK regulator Ofcom the power to hand out multi-million pound fines to tech companies.
Mr Wright was asked today whether the protection the bill gives journalistic organisations from its duty of care principles was sufficient.
He said: 'My reading of the bill is that we're protected in the first case by the fact that the duty of care does not apply to our content, both on our own websites but also when it's distributed on search and social media, so there's no obligation on the platforms to censor our content.
'However, the problem comes because there is also no compulsion on them not to, and clearly the authors of the bill envisage that they will block and take down items of content because the journalistic protections are there.
'And the journalistic protections specifically apply to news publisher content, so we have to look at the journalistic protections and ask how effective they are, and in my view they're not effective.'
He added that the moderation of content by social media firms will be done by algorithm, and the bill puts them under threat of possible criminal penalties.
Mr Wright continued: 'And their inevitable response to that will be to set the parameters of any moderation they do as widely as possible. It's human nature – they'll want to protect themselves. So they'll be using a very blunt instrument.
'I saw in Google's submission, they say their algorithms are very poor at understanding context, they're going to find moderating journalism particularly difficult.
'And we also know from articles in the Wall Street Journal over the weekend that Facebook's artificial intelligence is very poor at actually moderating this type of content. But what does the bill demand of them? That they take freedom of expression into account?
'Well, that can mean almost anything. It's left to the platforms to determine how they do this, what rules they set.'
The Wall Street Journal claimed that documents revealed Facebook's artificial intelligence cannot consistently identify first-person shooting videos, racist rants and even the difference between cockfighting and car crashes.
Mr Wright continued: 'From what I've seen of how Facebook have been trying to moderate journalism in the USA – where they're doing it for completely different reasons, they had an advertiser boycott last year, which has prompted them to do this – it's arbitrary, it often fails to understand the nature of the content, it's imposed without any sort of process, it is not in line with