Facebook missing far-right hate speech in Ireland 

Hateful content on Facebook within Ireland is often not removed due to moderators "having no local context" regarding what they’re seeing, it has emerged.

The revelation emerged from a first-of-its-kind meeting between Facebook Ireland’s public policy and misinformation teams and the civil society groups Uplift and the Far Right Observatory held on Wednesday to discuss the twin issues of hate speech and misinformation on the platform.

Facebook’s representatives at the meeting reiterated the company’s policy emphasis on the individual reporting of harmful content.

“The civil society groupings focused on the issue of content which has been flagged previously but which nevertheless remains on the platform,” a source with knowledge of the meeting told the Irish Examiner.

“That gave rise to the unprompted acknowledgement that lots of hateful content doesn’t get reviewed because the reviewers have no local context,” they said.

“Automated moderators likewise have no context – they work off keywords.”

It is understood one Facebook account was raised at the meeting: that of an individual purportedly removed from the site after making persistent threats against healthcare workers. However, the account remains active, and the individual had posted to it immediately prior to the meeting.

While no fixed date for a further meeting emerged from the engagement, the source stressed Facebook “look like they’re going to think about it”.

“Convincing arguments were made, Facebook listened keenly. There was no refuting the content which had been tracked and described at the meeting, and the company’s representatives present voiced that they don’t want to see that content on Facebook either,” they said.

“We regularly engage with external stakeholders in Ireland including civil society groups and safety organisations,” a Facebook spokesperson said.

“We met with representatives from Uplift and the Far Right Observatory to listen to their concerns and to explain the policies and procedures we have in place to address content such as hate speech and misinformation.”

They said Facebook was “continually updating our policies to tackle harmful content on our platforms and we’ll continue to engage with these organisations for their valuable insight”.

A spokesperson for the Far Right Observatory, meanwhile, said they “welcomed” the engagement with Facebook.

“Many of the affected communities and groups we work closely with have long felt the impacts of hate organising on the platform,” they said, adding they had expressed their experience that Facebook’s current reporting system “doesn’t work”, while the company’s public statements “very often don’t match up with genuine meaningful action”.

The spokesperson said the Facebook representatives present had “recognised” that the emphasis on reporting individual pieces of content is “small-scale, yet seems to be the main solution on offer to what we and many others across the globe see as a systemic problem”.

They added that a particular account of concern at present, one with links to international far-right networks listed on Facebook’s own dangerous individuals and organisations list, has yet to be examined by the company.

“We will have to wait and see what happens there,” they said.

