Instagram has announced the launch of a tool that lets users automatically filter abusive messages from accounts they do not follow on the platform.
It follows a variety of footballers speaking out about experiencing racism, sexism, and other abuse on Instagram.
Direct messages (DMs) containing words or emojis deemed offensive will be hidden from view.
The tool will be available in the UK, France, Ireland, Germany, Australia, New Zealand, and Canada within weeks, with more countries receiving the update in the coming months.
“Because DMs are private conversations, we don’t proactively look for hate speech or bullying the same way we do elsewhere,” Instagram said in a blog post.
The tool focuses on message requests from people users don’t already follow “because this is where people usually receive abusive messages”, it added.
Instagram consulted anti-discrimination and anti-bullying groups to curate a list of terms, phrases, and emojis deemed offensive.
For example, Liverpool Football Club criticized the platform after a number of its players were sent racist monkey emojis.
But users can also add their own terms to this list, through the Hidden Words section of the app’s privacy settings.