Protecting Kids Online: Requiring Age Verification Through App Stores
A bipartisan group of senators is about to take Big Tech CEOs to task on Jan. 31, 2024, by having them publicly address their failures to protect kids online. And the CEOs need to! The harms social media poses to children are well documented and, at this point, indisputable, even by the companies themselves.
YouTube admits that it hosts harmful content for children and even calls for legislation to address the problems it helps create. YouTube's CEO indicated as much when he published his "principled approach for children and teenagers."
In it, he writes that YouTube "can't do it alone … [and] support[s] policymakers, families, researchers, companies, and experts coming together to define a set of consistent standards for companies serving young people online." He then advocates for policies that provide more parental rights, age-appropriate content for minors, and "appropriate safeguards."
Even with this call for legislation, policymakers are finding it hard to impose basic measures, such as age restriction requirements to protect kids using these services.
Age restrictions on a deliberately addictive product that targets children should be a no-brainer. Various levels of government impose age restrictions on these types of offerings all the time. For example, kids can't get tattoos or piercings without parental consent. Or attend movies that are rated R or PG-13. Or purchase video games rated "M." But somehow, social media services are treated as distinct, even though they trigger addictive responses similar to nicotine withdrawal and gambling.
So, why havenโt we done it?
State legislatures are making unforced errors by not considering the full internet stack (e.g., the operating system, the app store, or the app) when imposing such measures. Arkansas' social media law is a prime example. The law would have required some, but not all, social media companies to verify the age of their users; it didn't apply to YouTube, for example.
Failing to take a holistic approach when crafting legislation for the social media market unnecessarily opened the state up to a First Amendment challenge. To require such a measure, the state would have to show that the requirement is both "narrowly tailored" so as not to burden adult users' speech and necessary to address the alleged harms.
According to federal District Court Judge Timothy Brooks, the state failed because excluding some social media companies made little sense if the law's stated goal was to protect kids from the harms social media imparts. Brooks pointed out that even though "YouTube is not regulated by [Arkansas social media law]," the state oddly cited YouTube as being particularly harmful to children and "[a]mong all types of online platforms, YouTube was the most widely used by children."
So, if Arkansas' law doesn't apply to YouTube, how can the state justify an imposition on adult speech on services that aren't even favored by most children?
But it's not all bad news. Brooks may have provided legislatures a path forward to get age verification without implicating the First Amendment: by going through the app stores.
Brooks seemed to be more upset that Arkansas' law imposed duties on social media companies rather than on Google and Apple, the companies that run two of the biggest app stores on the internet. Throughout his opinion, he noted that Apple and Google provide parents several tools to shield kids from certain apps and wondered why the state needed to go through social media platforms to do what the app stores ought to.
He's got a point. Social media apps rely on users to self-certify their age when they create an account, but they can't fully confirm that the user is being truthful, whereas Apple and Google know the precise age of the owner of the device. How? Well, their software is integrated at every level of the mobile device. They own the two dominant mobile operating systems (i.e., iOS and Android, respectively), app stores (i.e., the App Store and Play Store), and browsers (i.e., Safari and Chrome).
Candidly, excluding Apple and Google from age verification legislation makes little sense, because including them would almost certainly resolve the First Amendment concern.
Why? Because all the law would have to require is for Apple and Google to give a social media app a thumbs up or thumbs down when it asks to verify whether the device is owned by an adult or a child. The social media company would no longer have to guess how old the user is, which limits, or even eliminates, the risk of denying an adult the ability to engage on American social media platforms.
What's more, going through the app stores makes parental consent more doable because, as the court noted, parents have more control over the device than over individual apps, and the app stores are already required to obtain express parental consent for in-app purchases under consent decrees with the Federal Trade Commission.
If the app store on a child's device prohibits the child from downloading apps without a parent's credentials (e.g., entering a password, providing a fingerprint, or using facial recognition), then parents immediately gain more agency over the apps their children can access.
Better yet, it wouldn't require providing any more information about the user to social media companies. If social media companies could simply send a request to the device (leveraging the data already available from the device's operating system) to confirm, almost instantaneously, that a person is over the age of 18, then two things occur: 1) the user provides no data beyond what he has already provided to Apple or Google; and 2) the social media company gets only a thumbs up or a thumbs down in response to its inquiry, which limits the amount of new information the company receives.
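To make that exchange concrete, here is a minimal sketch, written in Swift, of what a thumbs-up/thumbs-down age check could look like. The names used (AgeSignal, DeviceAgeAuthority, deviceOwnerAgeSignal) are hypothetical; neither Apple nor Google exposes such an interface today, and the sketch only illustrates the data-minimizing design described above.

```swift
import Foundation

// Hypothetical yes/no signal the operating system could expose. The OS already
// holds the account holder's birth date, so it can answer "adult or minor?"
// without sharing the birth date itself.
enum AgeSignal {
    case adult   // device owner is 18 or older
    case minor   // device owner is under 18
}

// Hypothetical OS-level interface; an assumption for illustration,
// not an existing Apple or Google API.
protocol DeviceAgeAuthority {
    func deviceOwnerAgeSignal() -> AgeSignal
}

// What a social media app would do with the answer: gate sign-up on the
// thumbs up/down alone, collecting no birth date or ID document.
func canCreateAccount(using authority: DeviceAgeAuthority) -> Bool {
    switch authority.deviceOwnerAgeSignal() {
    case .adult:
        return true
    case .minor:
        return false   // or route the user into a parental-consent flow
    }
}

// Stand-in authority for demonstration, since the real signal would come from the OS.
struct StubAuthority: DeviceAgeAuthority {
    func deviceOwnerAgeSignal() -> AgeSignal { .adult }
}

print(canCreateAccount(using: StubAuthority()))   // prints "true"
```

The point of the design is that the only thing crossing the boundary between the operating system and the app is a single yes-or-no answer.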
State legislators should learn from these mistakes in order to effectively protect kids online. The biggest lesson so far is to look to both the apps and the app stores when it comes to age restrictions.