According to an internal post obtained by BuzzFeed News and a statement from Mark Zuckerberg given at a Congressional hearing earlier in March, Facebook is currently prioritizing the development of an Instagram for Kids, specifically targeting audiences below the age of 13.
Instagram’s vice president of product, Vishal Shah, wrote to employees on the company message board in early March, stating: “I’m excited to announce that going forward, we have identified youth work as a priority for Instagram and have added it to our H1 priority list.”
As of right now, Instagram does not allow individuals under the age of 13 to use the service. However, the protections put in place to prevent younger teens and children from joining the app are not very strong.
Some speculate that the decision to create a new app specifically for younger audiences comes in part from an interest in protecting existing younger users from the negative influences of the social media app, ranging from increased potential for self-harm, body dysmorphia, and cyberbullying, to exposure to conspiracy content and extremism.
Overview
Instagram for Kids Is on the Way
One way or the other, Facebook has confirmed that Instagram for Kids (not the official name) is in the works. But it has also stated that development is still in its early stages, meaning we know little about what to expect from Facebook in terms of additional protections, privacy, service costs, monetization techniques, and advertising.
What we do know is that the internal memo identifies two specific priority missions of the company: “(a) accelerating our integrity and privacy work to ensure the safest possible experience for teens and (b) building a version of Instagram that allows people under the age of 13 to safely use Instagram for the first time.”
According to Facebook CEO Mark Zuckerberg, the company’s intentions for an Instagram for Kids center around working out issues that the current app has with underage audiences.
“There were clearly issues that need to be thought through and worked out, including how parents can control the experience of kids, especially kids under the age of 13,” Zuckerberg explained at the hearing. “And we haven’t worked through all of that yet.”
The US government is concerned about Facebook’s apparent lack of transparency on crucial questions, such as how many of Instagram’s users are estimated to be under the age of 13, especially given the company’s track record on privacy, conspiratorial content, and cyberbullying.
During her opening statement in the virtual meeting, Republican Congresswoman Cathy McMorris Rodgers asked: “What will it take for your business models to stop harming children?” Perhaps this app is one of Facebook’s answers to that question.
Justified Worries and Concerns
An estimated 95 percent of teenagers have access to a smartphone, with the overwhelming majority of them logging the most hours on Snapchat, Instagram, and TikTok. In 2018, nearly half of all surveyed teens stated that they were online “almost constantly”.
That number has assuredly risen since the pandemic began, at a time when teens have been all but encouraged to lead their social lives almost wholly through digital means. But all that connectivity comes at a steep price, according to various studies.
While only a quarter of teens surveyed in 2018 said that social media’s impact on their lives was mostly negative, another Pew survey from the same year found that 59 percent of teens had experienced at least one of the following forms of abusive behavior on social media:
- Offensive name-calling
- False rumors
- Receiving unsolicited explicit images
- Being constantly asked where they are, what they’re doing, or who they’re with by someone other than a parent
- Physical threats
- Having explicit images of themselves shared without consent
Social media is a relatively young and volatile technology, and negative content is to be expected, especially given the sheer volume of data being produced and consumed on the Internet today. Unfortunately, teens and children with access to these apps and websites (often in violation of the companies’ own Terms of Service) are most at risk of experiencing the negative impact of this content.
There’s more than just name-calling and unwanted messages on platforms like Facebook and Instagram. Other concerns include the rampant spread of misinformation and conspiracies, and these platforms’ potential for radicalization.
In part, the decision to create a separate app for children may help protect some of them from the less savory elements of social media without attempting the impossible: cutting them off from a growing online social ecosystem altogether.
Of course, there’s a clear incentive for the company here. An Instagram for Kids would help stave off some of the criticism directed at Facebook over the negative influence its apps may be having on kids in particular. Furthermore, it may give the company an opportunity to funnel young users into its other networks once they come of age.
What This Might Mean for Parents
Now more than ever, being a parent and trying to protect one’s child from negative interactions in the outside world is incredibly difficult. Parents will continue to have to supervise and monitor how their children interact with the Internet and use social media, something Mr. Zuckerberg claims he does at home as well.
Facebook has also suggested that a potential Instagram for Kids might give parents more options for monitoring their child’s use of social media, further empowering them.
How Will Facebook Monetize an Instagram for Kids?
But perhaps the most crucial question is: what about the money? Facebook and most other free social media platforms monetize through data aggregation and advertising, letting users access their services for free while selling advertisers the ability to target and market to hundreds of millions of people.
But if Facebook is planning to create a platform that specifically allows advertisers to target children on the Internet, it may be entering a regulatory maze. So far, the company has been unclear about how it plans to monetize an Instagram for Kids, or to what degree advertisers will be able to work with Facebook on such a platform.
The comparable YouTube Kids has been heavily changed to comply with the FTC’s children’s privacy law (COPPA), with the onus largely falling on content creators to correctly label and monetize their content according to whether or not it targets kids.
In many cases, YouTube will designate content as “for kids” based on viewership and the nature of the content, and will strictly limit engagement and monetization on such videos, even if the creator hasn’t flagged the content that way.
Many videos with subject matter usually associated with children (including popular video games and comic book characters) may be intended for mature audiences. Furthermore, terms like “kid attractive” and “kid directed” are not clearly defined by the FTC. So far, YouTube is relying on machine learning to try to better distinguish between content intended for younger audiences and content meant for more mature audiences.
What the content on an Instagram for Kids would look like, and how advertisers would be allowed on the platform, are unanswered questions for the near future. Keep an eye on this space.