Facebook: We’re not here to censor content

Facebook’s head of safety has told a cyberbullying conference that the social network’s anti-cyberbullying standards have “come a long way” in the past four years — but “we are not here to censor content”.

Patricia Cartes underlined that all new developments on the site were governed by the company’s working mission of “giving people the power to share and make the world more open”, but said “one of the first things that we examine on a new project is how this development could be abused”.

She said she “would not support” comments made by Ask.fm founder Mark Terebin that cyberbullying was more rife in Ireland and the UK than anywhere else.

And when deciding whether a comment should be removed from the site, she said the company’s user operations team did “draw a line” between a comment or image “causing offence or causing harm”.

Ms Cartes repeatedly stressed the importance of users becoming more aware of the ways of reporting harmful content and of Facebook’s help centre. She said the company does not read, and does not want to read, every message posted on Facebook by its 1bn users, and can only respond to reported complaints.

Up to 150 parents and teachers attended the conference organised by MEP Seán Kelly at Nemo Rangers in Cork yesterday.

Discussing community standards, Ms Cartes said that if suicidal content is reported, Facebook contacts the Samaritans so they can provide help.

And defending the company’s safety policy, she also said any questionable link can be reported to Facebook confidentially. She also said families of deceased users can request that their pages, or tribute pages, be removed. “We also have a new channel which allows a person to report harmful content to an authority figure, for instance another adult or a teacher.

“By supplying an email address if they don’t have a Facebook account, we can highlight the content. With this, we will increase face-to-face resolution.”

Ms Cartes said Facebook was not operating without regard to national governments, but was governed by the legal frameworks of the various countries in which it operates.

“For instance, Holocaust denial is illegal in Germany and any such content will be removed in Germany — we won’t remove it in Sweden where such pronouncements are not forbidden.”

Under attack from parents and teachers, she said the company took a “tough stance with fake accounts”.

“Fake accounts are a violation of our terms and we take preventative action such as blocking common fake usernames such as something to do with Jedward.

“We also have technology that can see if a particular user has an enormous amount of friends, more than is normal. We have technology that will allow us to block this user until they provide ID proving that they are who they really are. If they don’t their account can be removed.”

Earlier, Anthony Whelan, head of cabinet for EU commissioner Neelie Kroes, criticised the social media companies for failing to voluntarily agree to provide fuller information to users about how complaints were processed. “Is it enough that we know how a complaint is typically handled? Do we need individual feedback as to how it is handled?” he argued.

Ms Cartes countered this by saying if a Facebook user reports content, “they can go to the report dashboard and [that] will tell you the policy behind the decision”.
