Published September 21, 2023
Future generations will wonder why, when parents and legislators tried to use the democratic process to end Big Tech’s massive experiments on America’s children, the judicial system said “no.”
The experiment's results are in — they are not good. In a recent advisory, the Surgeon General has declared a “national youth mental health crisis,” as teens of both sexes report unprecedented levels of depression, loneliness, and anxiety — not to mention large spikes in rates of self-harm and suicide.
Eminent researchers, such as Jean Twenge and Jonathan Haidt, place much of the blame squarely on social media.
Parents have learned the hard way that private parental-control software — and the meager collection of tools that Big Tech companies halfheartedly toss at parents — are not equal to the problem. The former cannot even monitor certain apps like TikTok and Snapchat or certain features like direct messaging on Instagram, where minors spend most of their time.
Aiming to help parents regain control over what kids see and who talks to them online, many states, including Arkansas, have passed laws requiring social media platforms to get parents’ approval before kids enter into contractual account-holder relationships. These laws, following uncontroversial legal precedents such as requiring parental consent for tattoos or liability waivers, restore to parents what the Supreme Court has recognized as fundamental to our democracy: the power to control, even online, who educates our kids.
But rather than reaffirm parents’ rights, at least one federal district court seeks to condemn other people’s children to social media companies’ horrible experiments.
Judge Timothy Brooks of the Western District of Arkansas recently enjoined the Arkansas social media law on First Amendment grounds. His opinion displays a willful blindness towards the injuries these platforms cause.
Even worse, the Arkansas opinion rests on a naïve analysis of how social media operates today — the precedents it relies on most heavily reflect outdated understandings of the internet. Properly understood, however, the state age-verification laws he blocked from taking effect can withstand such constitutional challenges.
To begin, Brooks’ opinion treats social media platforms as unsullied domains of personal growth. The opinion’s tone echoes that of Mark Zuckerberg himself as Brooks approvingly quotes Big Tech lobbyists who claim social media enables minors to “showcase their creative talents” and “raise awareness about social causes.”
But Brooks ignores social media’s more salient negative effects. For instance, in 2021 a whistleblower released thousands of internal Facebook documents to The Wall Street Journal showing that the company knew much of its content was toxic to mental health and female body image, preferential to certain causes, designed to attract underage users, and organized by opaque algorithms that acted invisibly and independently of users’ and communities’ intentions.
Brooks’ reliance on Brown v. Entertainment Merchants Association, the case that struck down a California law barring the sale of violent video games directly to minors, would have made much more sense in 2011, when it was perhaps reasonable to conclude, as the Supreme Court did, that the evidence connecting violent video games to negative behavior was “not compelling.”
But it is anachronistic when applied to social media in 2023, the year U.S. Surgeon General Vivek Murthy issued an advisory finding that these platforms have caused an “urgent public health issue” for young Americans. Contra Brooks, Murthy asserts that the indicators that social media presents “a profound risk of harm to the mental health and well-being of children and adolescents” are “ample.”
Even more depressing, Brooks elevates the questionable and limited free speech rights of minors over parents’ right to raise their children, a right recognized for a century in cases such as Pierce v. Society of Sisters. In fact, Justice Clarence Thomas wrote in his dissent in the Brown case that “the ‘freedom of speech,’ as originally understood, does not include a right to speak to minors (or a right of minors to access speech) without going through the minors’ parents or guardians.”
There is certainly no First Amendment right for companies to contract with children in order to speak to them over their parents’ objections, but this is precisely what social media companies do.
Neither does the law “burden” adult speech. Raising the bogeyman of forcing Arkansans to “submit to biometric scans,” i.e., scans of one’s face, eye, or fingerprint, Judge Brooks ruled age verification a burden on adult speech. But, as we have previously argued, biometrics can be avoided altogether in favor of more secure methods.
Having a third party conduct verification using a two-step process, or, better yet, deploying more advanced cryptographic techniques like zero-knowledge proofs, allows for anonymous verification that poses no burden on user privacy or adult speech whatsoever. These processes can take less than 60 seconds. It is hard to see how a one-time, minute-long process could be said to inhibit adult speech.
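To make the mechanics concrete, here is a minimal, hypothetical sketch in Python of how such an anonymous, third-party token flow could work. The function names and the use of a simple HMAC signature are illustrative assumptions only; a production system would use asymmetric signatures or true zero-knowledge proofs, and nothing below describes any particular vendor’s implementation. The point is that the platform can learn a yes-or-no answer without ever receiving a name, ID number, or biometric data.

```python
# Illustrative sketch only: a hypothetical third-party age-verification flow.
# The verifier checks the user's age once, then issues an anonymous signed token.
# HMAC stands in for what a real system would do with asymmetric signatures or
# zero-knowledge proofs; it is used here purely to keep the example self-contained.
import hmac
import hashlib
import json
import secrets

# Key held by the third-party verifier (shared with the platform in this
# simplified symmetric sketch; a real deployment would publish a public key).
VERIFIER_KEY = secrets.token_bytes(32)


def issue_age_token(is_over_threshold: bool) -> dict:
    """The verifier returns a token containing only the yes/no result and a
    random nonce -- no name, ID number, or biometric data."""
    claim = {"age_over_threshold": is_over_threshold,
             "nonce": secrets.token_hex(16)}
    payload = json.dumps(claim, sort_keys=True).encode()
    tag = hmac.new(VERIFIER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claim": claim, "tag": tag}


def platform_accepts(token: dict, verification_key: bytes) -> bool:
    """The platform checks the signature and learns only that the check passed."""
    payload = json.dumps(token["claim"], sort_keys=True).encode()
    expected = hmac.new(verification_key, payload, hashlib.sha256).hexdigest()
    return (hmac.compare_digest(expected, token["tag"])
            and token["claim"]["age_over_threshold"])


# One-time flow: verify once, then present the anonymous token to the platform.
token = issue_age_token(is_over_threshold=True)
print(platform_accepts(token, VERIFIER_KEY))  # True
```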
By treating the Arkansas social media law as a content-based restriction — a restriction on speech — rather than as a form of contract law governing the ability of minors to enter into online contracts and agreements, Judge Brooks’ court made a key category error. The process of opening an account to be able to participate in a social media platform involves agreeing to a whole host of terms and conditions, akin to forming a contract. And, as the logic of the law goes, parents have a right to determine who their child contracts with online.
Nothing in the Arkansas law places any restrictions on speech. This key distinction must not be lost on the court.
Similarly, the court erred in claiming the law’s definition of social media was “vague” when nearly identical definitions have been upheld by the U.S. Courts of Appeals for the Eleventh and Fifth Circuits.
The law is not perfect. Judge Brooks raises important questions about its numerous carveouts, exempting platforms like Google’s YouTube, Snapchat, and WhatsApp. This was a mistake by the Arkansas legislature, and other states should take heed not to repeat it. Other parental consent laws, like Utah’s, do not contain these problematic exemptions and offer a stronger model for other states to follow.
For nearly 30 years, legislators and regulators have been trying to govern the internet without success, largely because of First Amendment jurisprudence. Given that the factual predicates and dated assumptions that undergirded these rulings have been proven spectacularly wrong, it is time for a fundamental reconsideration. The world that Big Tech promised is not the world it delivered.
At the end of the day, age verification is a small price for adults to pay to protect American children. Parents need help, and legislators are trying to give it to them. Now, we need courts to see the matter clearly — and come to their aid.
Clare Morell is a fellow at the Ethics and Public Policy Center, where she directs EPPC’s Technology and Human Flourishing Project. Prior to joining EPPC, Ms. Morell worked in both the White House Counsel’s Office and the Department of Justice, as well as in the private and non-profit sectors.