Instagram lets users filter out abusive messages
Instagram has announced the launch of a tool to enable users to automatically filter out abusive messages from those they do not follow on the platform.
It follows a number of footballers speaking out about experiencing racist, sexist and other abuse on Instagram.
Direct messages (DMs) containing words or emojis deemed offensive will be removed from view.
The tool will be available in the UK, France, Ireland, Germany, Australia, New Zealand and Canada within weeks.
More countries will then receive the relevant update in the coming months.
"Because DMs are private conversations, we don't proactively look for hate speech or bullying the same way we do elsewhere," Instagram said in a blog post.
The tool focused on message requests from people users did not already follow "because this is where people usually receive abusive messages", it added.
Monkey emojis
Instagram consulted with anti-discrimination and anti-bullying groups to curate a list of terms, phrases and emojis deemed offensive.
For example, Liverpool Football Club criticised the platform after some of its players were sent racist monkey emojis.
But users can also add their own terms to this list, through the Hidden Words section of the app's privacy settings.
This feature already exists to filter out abuse in comments on Instagram posts.