Published December 12, 2023
In the last year, Utah, Arkansas, Louisiana, and Texas passed landmark laws mandating age verification and parental consent for minors to access social media. We have worked with many state legislators to draft principles for this legislation. With America’s youth caught in social media addiction, and with internet-mediated experiences shaping every part of their lives, parents need tools to control who is talking to and influencing their kids. In response, some Big Tech companies have either lobbied against the bills or tried to carve out exceptions for themselves.
Because of the constitutional, privacy, and practical concerns raised by having tech platforms themselves verify users’ ages, we also have argued that legislatures should ensure that “age verification…be both effective and capable of preserving user privacy.” Requiring users to give social media companies more personal information is akin to putting the wolves in charge of the chicken coop. It would also threaten online anonymity. This is why we have recommended that verification be conducted by a third party in a two-step authentication process, or, even more securely, by a third party using cryptographic methods like zero-knowledge proofs.
Another simple and straightforward solution that legislators should consider in their efforts to protect kids online is requiring age verification at the device level (as one of us has elaborated on in a recent policy brief from the Ethics and Public Policy Center and Institute for Family Studies). Smartphones serve as children’s main portals to the internet and social media platforms, and yet they have been unaddressed by any laws to date. According to a 2022 Pew Research study, 95 percent of teenagers have access to smartphones.
What’s more, the social networks most popular with America’s youth—TikTok, Instagram, Snapchat, and YouTube Shorts—are all designed primarily as smartphone apps, all of which are distributed through Apple’s App Store or the Google Play store. Until recently Snapchat was only available as an app on smartphones and tablets, and some other social networks still are. States wanting to go one step further could also tie device verification to additional device requirements, like automatically enabling filters for obscene web content.
Device verification could also make for a more seamless user experience for adults and reduce the potential burden of age verification on adult speech. Currently Apple (iPhone/App Store) and Google (Android/Google Play store) form a duopoly in the smartphone and app store markets, together making up over 99 percent of the smartphone market. Most people are using one of these two companies’ devices to access social media. Rather than having users verify their age separately for every social media platform they want to access, Apple and Google could verify a user’s age once on the device, and that device-level verification could then be integrated with social media platforms and other apps or websites with age thresholds, with the added benefit of increased protection of user privacy.
Google and Apple already have the capabilities to conduct such age verification. Both typically require credit cards, a form of age verification, to set up an app store account. Google already requires age verification when a change to the original Google ID birth date would affect the adult status of the user, and Apple’s application process for its own credit card conducts age verification on the device. But these companies will not voluntarily conduct device verification and integrate it with other sites and social media platforms, because they would then be helping to shoulder the responsibilities of age verification for their competitors, like Meta. They will have to be required to do so by law.
Requiring age verification at the device level, in addition to requirements for social media platforms and adult websites (since smartphones are not the only way that these sites are accessed), would help prevent two enormous companies from avoiding responsibility for their harms to kids. Google managed to exempt YouTube—the most popular app among teenagers—from the Arkansas social media law. Such carveouts should not be tolerated. Some age verification laws will apply to relatively small apps, so it is both fairer and more effective for the smartphone duopoly to shoulder its fair share of the burden and be held accountable for its negative effects on kids.
Meta itself, attempting to hold its competitors accountable, recently launched a campaign advocating for app store age verification and parental approval for app downloads for users aged 16 and under. While Meta may not be wrong that protections for children are needed on devices and in app stores, any device verification should be a complement to, not a substitute for, verification and consent at the platform level as well.
Device-level age verification, like all verification, would need to be done in a manner that prevents Google and Apple from retaining or sharing user data. Once Google or Apple verifies age, the company should be required by law to delete any copies of documents, such as a driver’s license or credit card, used in the verification process. The device can instead save the user’s birth date as part of its own ID and generate a “cookie” or “token” from the verification process, which it can subsequently use to communicate to age-gated apps, websites, and platforms that the user satisfies the required age threshold.
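To make the token idea concrete, here is a minimal sketch in Python of how a device might attest only that a user meets an age threshold, without disclosing the birth date or any verification documents to the app. This is a hypothetical illustration, not Apple’s or Google’s actual mechanism: the device secret, function names, and token format are all assumptions, and a real deployment would use asymmetric cryptography (or zero-knowledge proofs, as discussed above) so that apps could verify tokens without being able to forge them.

```python
# Hypothetical sketch of device-level age attestation.
# Assumption: after one-time verification, the OS retains only the birth
# date and a device-local secret; the verification documents are deleted.
import hmac
import hashlib
from datetime import date

DEVICE_SECRET = b"device-local-secret-key"  # illustrative placeholder only

def issue_age_token(birth_date: date, threshold_years: int, today: date):
    """Return a signed token asserting only 'user meets threshold', or None."""
    # Compute age in whole years, accounting for whether the birthday
    # has occurred yet this year.
    age = today.year - birth_date.year - (
        (today.month, today.day) < (birth_date.month, birth_date.day)
    )
    if age < threshold_years:
        return None  # threshold not met; nothing about the user is disclosed
    claim = f"age>={threshold_years}"
    sig = hmac.new(DEVICE_SECRET, claim.encode(), hashlib.sha256).hexdigest()
    return f"{claim}.{sig}"

def verify_age_token(token: str) -> bool:
    """An age-gated app checks the signature; it learns only the claim."""
    claim, _, sig = token.partition(".")
    expected = hmac.new(DEVICE_SECRET, claim.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected)

# An adult user receives a valid token; a minor receives none.
adult_token = issue_age_token(date(1990, 5, 1), 18, date(2023, 12, 12))
minor_token = issue_age_token(date(2010, 5, 1), 18, date(2023, 12, 12))
```

The key design point the sketch illustrates is data minimization: the app never sees a birth date or document, only a signed yes/no claim about a threshold.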
For too long, parents, churches, and schools have been fighting for tech platforms to be held accountable for their actions towards our children. Finally, lawmakers are making tools available to help parents protect their kids online. The state laws passed so far requiring parental consent for social media are a big step in the right direction. But legislatures must be smart and impose regulation in the right places, on the right parties, and in a fair way. It is time to open up another front to address the threats to child safety online. The smartphone manufacturers and app distributors that design and sell the main portals by which children access social media and the internet today must also be held responsible.
Clare Morell is a fellow at the Ethics and Public Policy Center, where she directs EPPC’s Technology and Human Flourishing Project. Prior to joining EPPC, Ms. Morell worked in both the White House Counsel’s Office and the Department of Justice, as well as in the private and non-profit sectors.