Published August 28, 2022
Around 2012, something began to go wrong in the lives of teens. Depression, self-harm, suicide attempts and suicide all increased sharply among U.S. adolescents between 2011 and 2019, with similar trends worldwide. The increase occurred at the same time social media use moved from optional to virtually mandatory among teens, making social media a prime suspect for the sudden rise in youth mental health issues.
However, social media use remains virtually unregulated among minors. Given the federal government’s failure to rein in Big Tech’s influence on our children, it falls to the states to pass laws to protect our kids from the emotional and social fallout of unrestricted access to social media.
The failure stems mainly from U.S. Supreme Court decisions that limited Congress’ power to regulate the internet to protect children. In addition, the laws Congress has managed to pass — such as the Children’s Online Privacy Protection Act, which was supposed to allow parents to control the interaction between websites and children — have failed to exert any meaningful influence on children’s technology use.
In a new report, “Protecting Teens from Big Tech,” we detail six policies that state legislatures should pursue if they are serious about ending the epidemics of suicide, anxiety and depression ushered in by adolescents’ unfettered access to social media.
These suggestions may be controversial, but the problem of teen mental health has become so concerning that bold measures are needed.
1. Enact age-verification laws
States could pass an age-verification law requiring social media platforms to verify the age of all users in that state, so that no minors under the age of 13 could create social media accounts. Under current federal law, the Children’s Online Privacy Protection Act prohibits internet platforms from collecting personally identifying information from children under age 13, making 13 the de facto minimum age for social media. Increasingly, however, children younger than 13 are gaining access, and these younger children are more vulnerable to social media’s harmful mental health effects. Age verification would help ensure the current age limit is effectively enforced.
2. Require parental consent for minors to open a social media account
States dissatisfied with the current de facto age of 13 for social media could take a further step: prohibiting a social media company or website from offering any account, subscription service or contractual agreement to a minor under 18 years old absent parental consent. When individuals join social media websites or use most commercial websites, they agree to terms of service, which are binding contracts, so requiring parental consent before a minor can enter such an agreement is a reasonable regulation.
3. Mandate full parental access to minors’ social media accounts
States could also pass laws requiring social media platforms to give parents or guardians full access to all social media accounts created by minors between the ages of 13 and 17. Full access would ensure that parents have control of their minor child’s account settings so they can restrict its privacy, review friend requests and know exactly what their child is doing online.
While parents can currently purchase various parental control apps, certain platforms, like TikTok, cannot be fully covered by them, and parents are often unable to monitor all aspects of an account. Government intervention is needed to provide full access and to empower all parents, not just those able to afford a private option.
4. Enact a complete shutdown of social media platforms at night for kids
States could also pass a law requiring social media companies to shut down access to their platforms for all 13- to 17-year-olds’ accounts in that state during bedtime hours. Minors would not be able to access social media from, for example, 10:30 p.m. to 6:30 a.m., to align with usual nighttime sleep hours and eliminate teens’ temptation to stay up late on social media. This is an important step to take because technologically induced lack of sleep is a primary driver of depression among teens.
5. Create causes of action for parents to seek legal remedies with presumed damages
Any law that a state passes to protect kids online should include a private cause of action enabling parents to bring lawsuits on behalf of their children for any violation of the law, with damages presumed. These companies aim to maximize profit, so the legal threat must be sizable enough to compel them to change their behavior.
6. Enact a complete ban on social media for those under age 18
Many states already place age restrictions on numerous behaviors known to be dangerous or inappropriate for children, such as driving, smoking, drinking, getting a tattoo and enlisting in the military. Similarly, a state could recognize social media as a prohibited activity for minors.
Social media use by even a few children in a school or organization creates a “network effect”: even those who do not use social media are affected by how it changes the entire social environment. A collective solution is needed. An across-the-board age ban would place the burden where it belongs: back on the social media companies that designed their platforms to be addictive, especially to their most vulnerable users, children.
The federal government has not moved clearly and forcefully to address the harms Big Tech poses to American teens. From surging rates of depression to rising suicides, American adolescents and their families are paying a heavy mental and emotional price for their use of social media. Thus, it falls to the states to step into the breach.
One day, we will look back at social media companies like ByteDance (TikTok) and Meta (Facebook and Instagram) and compare them to tobacco companies like Philip Morris (Marlboro) and R.J. Reynolds (Camel).
For a time, Big Tobacco enjoyed immense profits and popularity. But eventually, the companies were held accountable. We are living at a moment when we are just learning of the social and psychological harms of social media. It now falls to a few pioneering states to inaugurate a new era of regulatory reform for Big Tech.
Clare Morell is a Policy Analyst at the Ethics and Public Policy Center, where she works on EPPC’s Technology and Human Flourishing Project. Prior to joining EPPC, Ms. Morell worked in both the White House Counsel’s Office and the Department of Justice, as well as in the private and non-profit sectors.
Jean M. Twenge is a professor of psychology at San Diego State University and the author of “iGen: Why Today’s Super-Connected Kids Are Growing Up Less Rebellious, More Tolerant, Less Happy — and Completely Unprepared for Adulthood.” Brad Wilcox is director of the National Marriage Project at the University of Virginia and The Future of Freedom fellow at the Institute for Family Studies.