Instagram brings enhanced self-harm content detection tools to the UK
The new moderation tools are able to more proactively spot self-harm content and automatically make it less visible in the app
Instagram is introducing new technology to its app in Europe that can better identify suicide and self-harm content that breaks the app’s rules.
The tools can more proactively spot self-harm content and automatically make it less visible in the app; in some cases, they can remove it completely after 24 hours if the machine-learning system is confident it breaks the site’s rules.
