Social Media and Harm to Children


Published August 31, 2023

EPPC

1. Severity of the Problem

We have a severe public health crisis on our hands: America’s children and teenagers are literally dying from social media. They are more depressed and anxious than ever before. New data from the CDC shows that nearly 3 in 5 teen girls felt persistent sadness in 2021 . . . and 1 in 3 girls seriously considered attempting suicide. The Wall Street Journal recently reported that, for the first time in 15 years, the mortality rate for 0–19-year-olds increased two years in a row. For decades, advances in healthcare and safety drove down death rates among American children. In an alarming reversal, rates have now risen to the highest level in nearly 15 years, driven in large part by suicides and drug overdoses. Researchers say, “Social media has helped fuel this by replacing successful relationships with a craving for online social attention that leaves young people unfulfilled, and exposes them to sites that glamorize unhealthy behaviors such as eating disorders and cutting themselves.”

We now live in a country where 11-year-olds are committing suicide. How can a child feel so hopeless at the age of 11—with their whole life in front of them—that they think life isn’t worth living?

Many of these problems trace back to the root design of Big Tech platforms. We have to understand that Big Tech is a predatory industry, like casinos, alcohol, and tobacco. Its products are designed to addict and exploit our children and their brains’ vulnerabilities. These companies want their users to be addicted. They are not looking out for users’ well-being; rather, they prey on human vulnerabilities, especially those of children, in order to maximize their profits. They do this by seeking to extract as much time, attention, and data as possible and selling it to advertisers. They design their “free” products to be maximally addictive. As a result, studies show that kids’ brains are literally being rewired by social media and its dopamine effects. Children’s attention spans and ability to focus for longer periods of time are declining, and ADHD and autism diagnoses are on the rise. Dr. Victoria Dunckley, who wrote the book “Reset Your Child’s Brain,” required that patients undergo a complete digital detox of 4–6 weeks before she would begin treating them for ADHD or autism. She found that a majority of symptoms would resolve on their own once the detox was complete; in some cases, the symptoms resolved entirely, because they were not symptoms of underlying ADHD or autism at all, but rather screen-induced symptoms that mimicked those conditions.

Beyond the physical and mental health crisis social media is creating among our youth, both the content available on social media platforms and the bad actors present on them are very dangerous to children. TikTok and Instagram have been shown to send teens down rabbit holes of eating-disorder content and sexual content. Pornography is everywhere, and it’s not just on Pornhub and other adult sites—it’s also on YouTube, Spotify, Pinterest, Instagram, Twitter, and Snapchat.

Predators are all over social media, trying to befriend and groom young girls. A private study found that nearly 1 in 3 teen girls have been approached on social media by adults asking for nudes. Teens’ presence on social media is thus not only a mental health concern but also a safety concern.

The problem is severe on several levels, from the design of the apps and rewiring of kids’ brains, to the severe mental health effects, the presence of pornographic and inappropriate content, and the predators and cyberbullies eager to exploit vulnerable children.

All of this leads to a larger problem: social media and smartphones have put our country on a trajectory toward civilizational crisis. We are allowing an entire generation to grow up online. They have become dopamine robots. They don’t know how to form real-life relationships. The ever-present distraction and escape mean they don’t have to confront real-life disappointments and emotions. They aren’t building the skill of resiliency. They are never bored and can always be entertained, neutering their natural abilities to be imaginative and creative. We are losing what it means to be human. And what does this mean for the future of marriage and family formation, the building blocks of civilization? They are becoming utterly destabilized.

2. Failure of Current Federal Law

Sadly, current federal law has been unable to address the myriad dangers that social media presents to our children.

The Children’s Online Privacy Protection Act (COPPA) of 1998 was supposed to allow parents to control the interaction between websites and children, but due to several loopholes it has been largely ineffective. It set the de facto age for social media at 13, which is much too young. And because of its actual-knowledge standard, it has been very difficult to enforce against social media companies that allow even children under that low age of 13 onto their platforms. We know 9–12-year-olds are all over these apps without consequence.

Section 230 was meant to be not only a shield from liability for internet companies but also a sword against illicit content, empowering platforms to remove content like pornography to protect children. However, bad court rulings have unreasonably expanded Section 230 to protect social media companies from liability even when they know of unlawful content their users are distributing and fail to take it down. The companies are protected when they remove unlawful content, but there is no penalty when they don’t. Section 230 is all carrot and no stick when it comes to preventing harm to children.

3. A Collective (Government) Solution Needed

The current state of affairs thus leaves the burden solely on parents to try to protect their children online. While some think this is enough, the reality is that parents increasingly have limited control over, and insight into, what their kids see and do on social media. Even the best private parental control software can’t give parents access to everything. For example, TikTok and Snapchat don’t give third-party parental control apps access, and Instagram doesn’t allow access to direct messages, where a lot of dangerous activity happens.

Furthermore, the problem of social media is not a private one. It has been shown that social media use by even a few children in a school or organization creates a “network effect,” so that even those who do not use social media are affected by how it changes the entire social environment. For example, if all the teens in a class are interacting and socializing through Instagram and a few in the class are not on Instagram, those few can still experience the negative effects of social media indirectly: feelings of loneliness and isolation, depression, and anxiety. The harms of social media need not flow through individual users; they can affect children by changing their peer social dynamics. Even if parents choose to fight the difficult individual battles to keep their children off these platforms, that is not enough to shield them from all of social media’s effects.

Finally, not every child comes from a good home with loving, involved parents who are trying to shield them from the harms of social media. There is, in fact, a screen-time disparity: a 2019 survey found that kids from lower-income homes (less than $35,000) spend an average of 8 hours and 32 minutes a day on screens, about 2 hours more than kids from high-income families (over $100,000), who average roughly 6.5 hours a day.

There are times when society recognizes that something is so harmful and dangerous to children and society itself, that it shouldn’t be left to the sole discretion of individual parents. Presented with dangers like alcohol, smoking, gambling, or driving cars, we rightly set age limits and put meaningful protections in place. For the sake of both children and the common good, the nature and severity of the dangers posed by social media require such a collective solution.

4. State Legislation

Recently, states have begun to step up. Utah, Arkansas, Louisiana, and Texas have all passed robust social media bills into law this past year. All of these bills draw on contract law to require parental consent before minors under 18 in their states can form social media accounts, since creating a social media account and agreeing to terms of service is akin to entering a contract. Utah also requires full parental access to minors’ accounts for effective oversight, as well as an overnight shutdown of social media from 10:30 PM to 6:30 AM. In addition, Utah requires companies to treat minor accounts differently than adult accounts by limiting their appearance in search results, disabling direct messaging with accounts that aren’t “friends,” preventing the collection of minors’ data, and prohibiting targeted advertising and the targeting or suggesting of groups, products, services, posts, or accounts to minors. The law also creates a private right of action for parents to bring suits for violations and for harms caused to their children by social media. These are strong steps, but states can only go so far on their own without the help of Congress.

5. Solutions for Congress

There are some protections Congress alone can enact. I will briefly outline a few of these solutions:

  • Raise the age for social media to 16—or even better, 18. This could be done by updating COPPA or by employing a separate vehicle. An across-the-board, enforceable age limit would place the burden where it belongs: on the social media companies themselves. Age limits have a long precedent, and one here would make social media for young teens a non-issue, freeing parents from the pressure to give consent for kids to form an account.
    Along with raising the age, we need a federal age-verification solution. The success of any of these changes will hinge on an effective and secure way to verify age.
    The age limit also needs to be made more enforceable by changing the knowledge standard from actual to constructive. Enforcement should not rest entirely with the Federal Trade Commission (FTC): Congress should empower state attorneys general and/or create a private right of action for parents.
  • Require parental consent and mandate complete administrator-level parental access for any minor’s social media account or other online account. Parents should have full access to minors’ communication channels and the right to consent before their children form them. This would also help eliminate the problem of secret or second accounts that parents don’t know about.
  • Give parents an online right to protect their children by mandating that online companies interoperate with third-party software, such as parental control apps, and give those tools access to their platforms. This would allow for effective private-sector solutions.
  • Reform Section 230 to add a “Bad Samaritan” carveout. Attorney General Barr recommended this to Congress in the Department of Justice’s 2020 Section 230 proposal, so that companies could no longer receive Section 230 immunity if they knowingly distribute criminal content.
  • Require age verification for pornography websites. Last Congress, Senator Lee introduced a bill that would require major porn sites, which are often linked to through social media, to verify and ensure that a user is over 18. This should be reintroduced. It would directly challenge the Ashcroft v. ACLU precedent, but 20 years later, with the nature of the internet so changed, that precedent is worth challenging.
  • Regulate the online porn industry. Again, last Congress, Senator Lee introduced the PROTECT Act, which would require porn sites to verify the age, identity, and explicit consent (via a signed consent form) of every individual appearing in uploaded material. This would protect against online child sexual exploitation, prevent traffickers from profiting from non-consensual abuse, and protect victims of revenge porn. This approach goes after the supply side.
    Any bill should include a strong enforcement mechanism that gives parents a private right of action to bring lawsuits on behalf of their children against tech companies for any violation of the law. It should also empower state attorneys general to bring lawsuits, so that enforcement does not fall entirely to the FTC. These companies aim to maximize profit, so there must be a sizeable enough threat to their profits for them to correct their behavior and follow the law.

Lastly, a bold idea:

  • Regulate smartphones. Increasingly, in my research, smartphones appear at the root of many of the problems posed by social media and online porn because of the constant access they provide to these platforms and websites. Social media would not be nearly as dangerous if it were only accessed at certain, limited times from a computer. A recent study by Sapien Labs found that adults who acquired their first smartphone at younger ages now score worse on measures of mental well-being than those who acquired smartphones at a later age. Smartphones are clearly a technology that requires maturity and training to use safely.
    We have precedents for this with other technologies, such as the car. Recognizing that children were not mature enough to operate automobiles, we imposed an age requirement for driving and a licensing system that requires proper training and skills before operating a car. Now that we see the dangers smartphones pose to children, the government could similarly regulate their use and ownership by minors, such as by creating an age limit for purchase and ownership and requiring a class or certification in order to own one. At the very least, the government could incentivize strict limits on phones during the school day by tying public school funding to a requirement that no phones be allowed during the school day or on school grounds.
    Furthermore, regulations could be imposed on smartphone manufacturers and app store operators to make these devices and their apps safer for children. There are regulations to safeguard children from harmful toys, food, playgrounds, medications, furniture, and clothing, but there is a concerning lack of regulation for devices on which children spend hours every day. Simple rules, like imposing requirements for app store age-rating systems, requiring app risk disclosures to be accurate and visible for parents, and prohibiting in-app advertisements from promoting mature content to children, would all make a difference.

6. Current Bills

  • Protecting Kids on Social Media Act, introduced by Senator Tom Cotton (R-AR), Senator Brian Schatz (D-HI), Senator Chris Murphy (D-CT), and Senator Katie Britt (R-AL), would ban social media for all children under the age of 13, require companies to conduct meaningful age verification, require parental consent for any minor under the age of 18 to form an account (consent that can be revoked at any time), and ban the use of algorithms to recommend content to accounts of users under 18. This bill would be a very strong step in the right direction toward giving parents final authority over their children’s social media use. Another critical strength of the Schatz–Cotton bill is its age-verification requirement: the bill would create a government pilot program to develop a secure method of age verification via a new “digital identification credential,” designed to attest to a user’s age without requiring that he or she provide Big Tech companies with any underlying identifying information or government-issued ID (a sketch of how such a credential might work appears after this list). The bill stipulates, however, that Big Tech’s participation in this program is voluntary. Finally, the bill provides several avenues for enforcement, which would make it truly effective in ways COPPA has not been: it empowers not only the FTC but also state attorneys general to ensure Big Tech’s compliance on behalf of their residents. State attorneys general are often nimbler and can act faster than a large government agency such as the FTC.
  • Kids Online Safety Act (KOSA), by Senator Blumenthal (D-CT) and Senator Blackburn (R-TN), aims to make social media platforms design their products with children’s safety in mind. If passed, the bill would require social media platforms to provide minors with options to protect their information, disable addictive product features (like autoplay and notifications), and opt out of algorithmic recommendations, and it would require platforms to enable the strongest settings by default. It would also give parents new controls, enabled by default, to help support their children and identify harmful behaviors, and it would provide parents, schools, and children with a dedicated channel to report harms to kids on the platform. The bill also creates a responsibility for platforms to prevent and mitigate a list of specific harms to minors in the design and operation of their products, services, and features, such as the promotion of suicide, eating disorders, substance abuse, sexual exploitation, and products unlawful for minors (e.g., gambling and alcohol). KOSA is a strong bipartisan solution; it passed out of the Commerce Committee unanimously and now awaits a floor vote.
  • COPPA 2.0, by Senator Markey (D-MA) and Senator Cassidy (R-LA), would update COPPA, originally passed in 1998, in the following ways: it would prohibit internet companies from collecting personal information from users who are 13 to 16 years old without their consent; ban targeted advertising to children and teens; revise COPPA’s “actual knowledge” standard to cover platforms that are “reasonably likely to be used” by children and to protect users who are “reasonably likely to be” children or minors; create an “Eraser Button” by requiring companies to permit parents and kids to delete a child’s or teen’s personal information when technologically feasible; establish a “Digital Marketing Bill of Rights for Teens” that limits the collection of teens’ personal information; and establish a Youth Marketing and Privacy Division at the FTC. Changing the knowledge standard would make it easier to bring cases against tech companies for violations. And while updating the age is good, this bill would unfortunately treat the new 13-to-16-year-old group differently: parental consent is not required for this group as it is for children under 13; instead, these teens can give their own consent. As a result, the changes would not push social media companies to raise the age for social media, since parental consent remains tied only to users under 13. COPPA 2.0 also passed out of the Commerce Committee and awaits a floor vote.
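
To make concrete how a “digital identification credential” could attest to a user’s age without handing Big Tech any identifying information, here is a minimal illustrative sketch in Python. It is a toy built on stated assumptions, not the bill’s design: the bill does not specify a mechanism, and the field names, token format, and shared-key signing below are all invented for illustration. A production system would more likely use public-key signatures or zero-knowledge proofs from a trusted issuer.

    import hashlib
    import hmac
    import json
    import time

    # Illustrative only: a real issuer would use public-key signatures or
    # zero-knowledge proofs rather than a secret shared with platforms.
    ISSUER_KEY = b"demo-secret-held-by-the-credential-issuer"

    def issue_age_token(over_13: bool, over_18: bool) -> dict:
        """Issuer attests only to age thresholds: no name, birthdate, or ID."""
        claims = {"over_13": over_13, "over_18": over_18, "issued": int(time.time())}
        payload = json.dumps(claims, sort_keys=True).encode()
        signature = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
        return {"claims": claims, "sig": signature}

    def platform_verify(token: dict) -> bool:
        """Platform checks the issuer's signature; it never sees identity data."""
        payload = json.dumps(token["claims"], sort_keys=True).encode()
        expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
        return hmac.compare_digest(expected, token["sig"])

    token = issue_age_token(over_13=True, over_18=False)
    print(platform_verify(token))   # True: the attestation is valid
    print(token["claims"])          # only threshold flags, nothing identifying

The property the sketch tries to capture is that the platform learns only a signed yes/no answer about an age threshold, never a birthdate, name, or government ID.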

Clare Morell is a Senior Policy Analyst at the Ethics and Public Policy Center, where she directs EPPC’s Technology and Human Flourishing Project. Prior to joining EPPC, Ms. Morell worked in both the White House Counsel’s Office and the Department of Justice, as well as in the private and non-profit sectors.
