Instagram, which is owned by Facebook, is making changes to make the service safer for teenagers. Anyone under the age of 16 (or 18 in some countries) who signs up for the service will now have their account set to private by default, though the option to switch to public will still be available. Anyone under these ages who already has a public account will receive a message advising them to change it to private.
Instagram has been pushing young people toward private accounts for some time. In March it began showing a message touting the merits of a private account to young people signing up for the service. Now it is making private the default.
Facebook is also changing how advertisers can target users under the age of 18 across Facebook, Instagram, and Messenger. Previously, any user could be targeted based on their interests and behavior, which Facebook gathers from across the internet, not just its own properties, by analyzing people’s web browsing history, app usage, and other data. Advertisers can now target users under 18 only by age, gender, and location.
Instagram also says it is doing more to limit how problematic individuals interact with users under 16 on the platform. According to the company, it can detect “possibly suspect conduct,” such as an account that was recently blocked or reported by a younger person. These suspicious users will be effectively walled off from users under 16: they will not see under-16 accounts in Explore, Reels, or Accounts Suggested For You, will not see under-16 comments on others’ posts, and will not be able to comment on content created by users under 16.
“We’re trying to find out if an adult is demonstrating questionable behavior,” Instagram’s head of public policy, Karina Newton, told NBC News. “The adult may not have broken the rules yet, but he or she may be doing things that cause us to reconsider them.”
Instagram has already used its ability to recognize suspicious accounts to alert teenagers when they receive a direct message from one of these individuals, and it prevents adults from messaging teenagers who do not already follow them.
While Facebook works to make Instagram safer and more private for teenagers, it is still building an app for children under the age of 13 (the current minimum age to sign up for Instagram). BuzzFeed News first reported the plans in March, and they drew widespread criticism and concern.
Instagram’s under-13 app is still in development, according to Newton, who said the company is in “deep collaboration with experts in child development and privacy advocates” to meet the “needs of families and adolescents.”
“We’re attempting to design something that’s appealing for tweens and works for parents,” Newton explained.