Will Congress Seize an Opportunity to Protect Children?


Published January 25, 2022

National Review Online

Will 2022 be a “do or die” moment for Congress to pass Big Tech legislation before the midterm elections? The prospects look dim. Divisions within and between the parties on such matters as antitrust and Section 230 seem, at the moment, to be prevailing over whatever common ground is forming in those areas. With action on either front looking highly unlikely, will Congress pass any legislation at all this coming year to counter Big Tech and its harms?

One cause for optimism is the growing possibility of amending the Children’s Online Privacy Protection Act (COPPA). In light of growing bipartisan agreement on how social-media platforms are harming our children, a legislative path forward in 2022 would be for Congress to amend this act.

COPPA, passed in 1998, purports to regulate the Internet platforms’ power to obtain, without parental permission, personal information about our children. But as written, it does little to protect children’s privacy. The statute in fact gives the Big Tech platforms carte blanche to collect personal information about children. With the power to collect limitless data on children, the platforms are better able to tailor content to children’s interests and preferences, further addicting them to screens. COPPA has failed utterly to empower parents to break this cycle. Giving parents control over their children’s online habits should be a central goal of technology policy, particularly as both parties increasingly acknowledge the harms Big Tech inflicts on our children.

Overall, COPPA is limited in its scope and lacks serious enforcement, rendering it useless in protecting children. To give it teeth, Congress should amend COPPA in the following five ways: (1) change the definition of a minor from under 13 years old to under 18 years old; (2) lower the liability standard from “actual knowledge” to “constructive knowledge”; (3) expand “personally identifying information” to include any data obtained by tracking children’s Internet use; (4) eliminate COPPA’s preemption of state tort claims, allowing private causes of action under state law; and (5) provide a federal private right of action.

First, in its original form, COPPA was intended to protect children under the age of 16, but at the last minute, lobbying interests pressured legislators to change the effective age to 13. Given the well-documented harms that social media cause minors, including increased depression, suicide, self-harm, mental illness, and loneliness, the platforms should not be able to collect information from minors without parental consent. Raising COPPA’s age threshold to 18 would allow minors onto social-media platforms only with verifiable parental consent, returning to parents control over their children’s well-being and mental health.

As Jonathan Haidt has pointed out, parents have trouble controlling kids’ screen use because of social-media platforms’ “network effect.” It is difficult for parents to bar kids from platforms that all their friends are using, or indeed, as with Facebook, from platforms their schools and teachers are using for educational purposes. Requiring parental consent before a minor under 18 can create an account with any online platform that collects information from minors would help parents regain control over their children’s social-media use.

Second, COPPA currently covers only platforms that have “actual knowledge” that one of their users is underage. This is the highest liability standard in the law: it requires a plaintiff to show that the platform’s corporate organization as a whole had specific and certain knowledge that unauthorized, underage individuals were using its platform. Because that standard is nearly impossible to prove in court, COPPA has proved largely toothless. The Federal Trade Commission (FTC) made much-needed updates to the COPPA rule in 2013, but they have had little effect because the “actual knowledge” standard persists. Changing the standard to “constructive knowledge” would have real impact, making platforms responsible for what they “reasonably should know,” given the nature of their business and the information they already collect from their users.

Third, COPPA was passed in 1998, when platforms could not track online behavior to the degree they now do. As a result, its definition of personally identifying information is quite limited, covering only typical categories such as name, address, and email. Given that social media and other websites now obtain the most intimate information about children by tracking their Internet viewing habits, COPPA’s definition of personally identifying information should be expanded to include data derived from tracking and analyzing children’s Internet usage.

Fourth, Congress should amend the statute to eliminate COPPA’s preemption provision. Under the current version of the statute, COPPA preempts state tort claims alleging invasion of privacy. By displacing state privacy actions in favor of its toothless protections, COPPA arguably has been worse than nothing.

Fifth, there is no private cause of action in COPPA. The FTC can bring a deceptive-trade-practice enforcement action, but a private cause of action, perhaps with the express possibility of class actions, would wield a larger, sharper sword in defense of children’s privacy. It would give parents the power to take these companies to court and hold them accountable if they have not effectively kept minors off their platforms.

The legislative path forward for Congress in 2022 is COPPA. Senators Ed Markey (D., Mass.) and Bill Cassidy (R., La.) have already introduced one bipartisan proposal for amending COPPA, which incorporates some of these necessary reforms. But it could be stronger — much more could be done. Republicans and Democrats should come together to amend this law in the ways we have outlined above and in doing so make a real difference in protecting our children from the harms of Big Tech.

Clare Morell is a policy analyst at the Ethics and Public Policy Center, where she works on EPPC’s Big Tech Project. Prior to joining EPPC, she worked in both the White House Counsel’s Office and the Department of Justice, as well as in the private and nonprofit sectors. 

Adam Candeub is Professor of Law at Michigan State University and Senior Fellow at the Center for Renewing America. He was previously Acting Assistant Secretary of Commerce for Communications and Information. 

