1. Why do people need to be 13+ y.o. to use social media?
2. Why do companies create products for kids?
3. So is technology good or bad for kids?
Why do kids need to be 13+ y.o. to use social media?
Many internet services we use daily become available at 13. The reason is a US law called COPPA. At the dawn of e-commerce in 1998, US Congress introduced COPPA to regulate what personal information websites can collect about users under 13. COPPA is interesting for two reasons: 1) it's somewhat outdated for regulating the fast-paced tech industry; 2) few people understand where this age comes from. From WSJ:
Parents might think of the age-13 requirement as a PG-13 movie rating: Kids might encounter a bit more violence and foul language but nothing that will scar them for life. But this isn't an age restriction based on content. Tech companies are just abiding by a 1998 law called the Children's Online Privacy Protection Act (COPPA)… But it has inadvertently caused 13 to become imprinted on many parents' psyches as an acceptable age of internet adulthood.
I bet you know kids younger than 13 who use social media. Companies may know about it too. For instance, TikTok knew it had underage users and was fined $5.7 million. My assumption: companies weigh the risk of fines against user growth. An extra age-verification step would catch underage users, but it would also filter out less engaged ones. New user signups are a critical metric for a product, so the fine may be worth it.
Age verification is interesting on its own. You'd think that social media algorithms that know the color of the fork you want to buy would identify underage users quickly. Nope. Algorithms perform well only once a person does something on a platform (likes, personal info, etc.). Recently, Facebook started analyzing user activity to identify underage users.
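As a toy illustration (not Facebook's or anyone's actual method), here is a minimal sketch of the idea: score activity signals that only exist after a user starts interacting, and flag high-scoring accounts for review. All signal names, weights, and the threshold are made up for this example.

```python
# Toy sketch: flag possibly-underage accounts from activity signals.
# Signal names and weights are hypothetical; real platforms use far
# richer models than a hand-tuned weighted sum.

UNDERAGE_HINT_WEIGHTS = {
    "follows_mostly_teen_creators": 0.4,
    "bio_mentions_school_grade": 0.5,
    "birthday_wishes_mention_young_age": 0.6,
}

def underage_score(signals: dict) -> float:
    """Sum the weights of the signals that fired, capped at 1.0."""
    score = sum(w for name, w in UNDERAGE_HINT_WEIGHTS.items()
                if signals.get(name))
    return min(score, 1.0)

def needs_review(signals: dict, threshold: float = 0.8) -> bool:
    """Route the account to human review when the score passes a threshold."""
    return underage_score(signals) >= threshold

# A brand-new account has no activity, so nothing fires -- which is
# exactly why age inference only kicks in after a user starts engaging.
print(needs_review({}))  # False
print(needs_review({"bio_mentions_school_grade": True,
                    "birthday_wishes_mention_young_age": True}))  # True
```

The empty-dict case is the point of the sketch: at signup there are no behavioral signals, so a purely activity-based model is blind, and platforms have to wait for likes, bios, and comments to accumulate.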
Why do companies create products for kids?
Kids are a new audience for companies that hit a user growth plateau. Products want to become a one-stop shop for all family communication or entertainment. Here is an excerpt from the YouTube Kids announcement:
This is the first step toward reimagining YouTube for families, but with your help, the app will continue to get better over time...
Most tech companies started as adult-only services. Eventually, they look at the kids' market to capture future users: capture attention early → get kids used to a product → when they grow up, the product becomes their default choice. That's one of the reasons Instagram is concerned about TikTok, which captures the younger generation's attention.
This playbook is similar to how companies create products for students. Students are more likely to continue using a familiar product when they start working. The earlier people use a product, the better.
Let's also talk about product marketing. Companies rely on ads to promote products - but how do ads reach users who can't be targeted directly? I asked my friend Denys Shamatazhy, who is responsible for Ads at VK. While you can't select an age below 13 (or 18 in some countries), it's relatively easy to run parent-centric and kid-centric ads.
With parent-centric ads, you can use the 'parental status' feature. First, though, users need to provide their 'parental status.' For kid-centric ads, you can target based on interests. For example, show a game advertisement to an audience that enjoys gaming, tech gadgets, etc. You don't target underage users directly, but they see the ad if they use the platform and share that interest. That's most likely why and how YouTube Kids injects promos based on video categories but restricts further purchase flows.
So is technology good or bad for kids?
It's ironic that adults can't figure out whether social media is good or bad. We know it connects people, but it polarizes us. We want to stay updated with news, but this creates a stream of misinformation. On top of this, we (adults) need to decide if social media is suitable for kids.
Prohibiting kids from using technology can backfire and prompt them to use tech secretly. Instead, we need to ensure that companies that create products for kids use a business model that incentivises them to keep kids and their data safe. We've already seen cases suggesting that ad-based businesses and engagement-driven platforms aren't the best solution. The US Senate is now introducing an updated act, which I find interesting because it calls out specifics of how platforms work (likes, push notifications, etc.).
I have more questions than answers myself, but I find those questions intriguing since they help me see the full picture of the challenge. What's your take?
Digital Hygiene Recommends
When companies identify a child account, some personalise the experience to keep the child safe on the platform. For instance, they keep content private by default and turn off auto-play by default. Here is more from YouTube and a parents' guide from TikTok.
Protocol has an article (it feels a little like an Apple promo) on how hardware companies may be better equipped to collect and verify user age. They can then prevent access to certain websites at the device level. For example, don't even show apps in the store that don't fit a user's age.
Alex Stamos (former security officer at Facebook) proposed similar ideas and shared how his family handles tech for kids.
Last but not least, I want to congratulate Kabosu and express my appreciation for being a go-to meme reply for almost all my chats. Happy birthday, Kabosu 🥳