Instagram has introduced a new set of policies meant to limit interactions between adults and teenagers, part of an effort to make the platform safer for its young users. The site will now bar adults from sending direct messages to teenagers who don’t follow them. The change arrives alongside new “safety prompts” shown to teenage users when they DM adults who have been “exhibiting potentially suspicious behaviour.”
Young users will be given the option to report or block adults who message them. The prompts will also remind them that they shouldn’t feel pressured to respond to DMs, and that they should be “careful sharing photos, videos, or information with someone you don’t know.”
Young users will receive these notices whenever Instagram detects suspicious behaviour from adult accounts. It’s not clear how the moderation systems work, though the company says such behaviour could include sending “a large amount of friend or message requests to people under 18.”
The social network also revealed that it is working on machine learning technology to estimate a user’s age when they create an account, though it has not explained how that system will operate.
New teenage users who sign up to the site will also be encouraged to make their profiles private, via notifications “highlighting the benefits of a private account and reminding them to check their settings.”