
California lawmakers continue push to regulate social media despite legal hurdles

Legislation being considered in California would prohibit social media companies from sending children notifications at night and during school hours.
(Matt Cardy / Getty Images)

California lawmakers are pursuing legislation aimed at protecting children from the dangers of social media, one of many efforts around the country to confront what U.S. Surgeon General Vivek Murthy and other public health experts say is a mental health emergency among young people.

But California’s efforts, like those in other states, will probably face the same legal challenges that have thwarted previous legislative attempts to regulate social media. The tech industry has successfully argued that rules governing how social media platforms operate, and how people can use the online services, violate the free speech rights of the companies and their customers.

A previous effort at confronting the issue, the California Age-Appropriate Design Code Act in 2022, now rests with the U.S. Court of Appeals for the 9th Circuit. A tech trade association sued to block the law and won an injunction from a lower court, largely on 1st Amendment grounds. The appeals court heard oral arguments in the case on Wednesday.


“At the end of the day, unconstitutional law protects zero children,” said Carl Szabo, vice president and general counsel for NetChoice, which argued for the tech giants before the federal appellate court.

Like the design code act, the two proposals now working their way through the California Legislature would reshape the way social media users under 18 interact with the services.

The first bill, by state Sen. Nancy Skinner (D-Berkeley), would prohibit platforms from sending push notifications to children at night and during school hours. Skinner’s measure would also require parental permission before platforms can serve children content through algorithms designed to keep people looking at their phones.


The second measure, by Assemblymember Buffy Wicks (D-Oakland), would prohibit businesses from collecting, using, selling or sharing data on minors without their informed consent — or, for those under 13, without their parents’ approval.

Both bills have bipartisan support and are backed by state Atty. Gen. Rob Bonta. “We need to act now to protect our children,” Bonta said earlier this year, by “strengthening data privacy protections for minors and safeguarding youth against social media addiction.”

California Gov. Gavin Newsom, a Democrat, has been vocal about youth and social media and recently called for a statewide ban on cellphones in schools. He hasn’t publicly taken a position on the social media bills.


California’s efforts are especially significant because its influence as the most populous state often results in standards that are adopted by other states. Also, some of the big tech companies that would be most affected by the laws, including Meta, Apple, Snap and Alphabet, the parent company of Google, are headquartered in the state.

“Parents are demanding this. That’s why you see Democrats and Republicans working together,” said Wicks, who, with a Republican colleague, co-wrote the design code act that is tied up in litigation. “Regulation is coming, and we won’t stop until we can keep our kids safe online.”


The fate of the design code act stands as a cautionary tale. Passed without a dissenting vote, the law would set strict limits on data collection from minors and order privacy settings for children to default to their highest levels.

NetChoice, which immediately sued to block the law, has prevailed in similar cases in Ohio, Arkansas and Mississippi. It is challenging legislation in Utah that was rewritten after NetChoice sued over the original version. And NetChoice’s lawyers argued before the U.S. Supreme Court that efforts in Texas and Florida to regulate social media content were unconstitutional. Those cases were remanded to lower courts for further review.

Though the particulars differ in each state, the bottom line is the same: Each of the laws has been stifled by an injunction, and none has taken effect.

“When you look at these sweeping laws like the California laws, they’re ambitious and I applaud them,” said Nancy Costello, a clinical law professor at Michigan State University and the director of the school’s First Amendment Clinic. “But the bigger and broader the law is, the greater chance that there will be a First Amendment violation found by the courts.”


The harmful effects of social media on children are well established. An advisory from Surgeon General Murthy last year warned of a “profound risk of harm” to young people, noting that a study of adolescents ages 12 to 15 found that those who spent more than three hours a day on social media faced twice the risk of depression and anxiety symptoms compared with nonusers. A Gallup survey in 2023 found that U.S. teenagers spent nearly five hours a day on social media.

In June, Murthy called for warnings on social media platforms like those on tobacco products. Later that month came Newsom’s call to severely restrict the use of smartphones during the school day in California. Legislation to codify Newsom’s proposal is working its way through the Assembly.

Meta Chief Executive Mark Zuckerberg apologized to those attending a U.S. Senate hearing in January for “the types of things that your families have had to suffer” because of social media harms.
(Jose Luis Magana / Associated Press)

Federal legislation has been slow to materialize. A bipartisan bill to limit algorithm-derived feeds and keep children under 13 off social media was introduced in May, but Congress has done little to meaningfully rein in tech platforms — despite Meta’s chief executive, Mark Zuckerberg, apologizing in a U.S. Senate hearing in January for “the types of things that your families have had to suffer” because of social media harms.

It remains unclear what kinds of regulation the courts will permit. NetChoice has argued that many proposed social media regulations amount to the government dictating how privately owned firms set their editorial rules, in violation of the 1st Amendment. The industry also leans on a federal law, Section 230 of the Communications Decency Act, which shields tech companies from liability for harmful content produced by a third party.

“We’re hoping lawmakers will realize that as much as you may want to, you can’t end-around the Constitution,” said Szabo, the NetChoice attorney. “The government is not a substitute for parents.”


Skinner tried and failed last year to pass legislation holding tech companies accountable for targeting children with harmful content. This year’s measure, which was overwhelmingly passed by the state Senate and is pending in the Assembly, would bar tech companies from sending social media notifications to children between midnight and 6 a.m. every day, and between 8 a.m. and 3 p.m. on school days. Senate Bill 976 would also require minors to obtain parental consent to use platforms’ core offerings and would, by default, limit their use to between one hour and 90 minutes a day.


“If the private sector is not willing to modify their product in a way that makes it safe for Californians, then we have to require them to,” Skinner said, adding that parts of her proposal are standard practice in the European Union.

“Social media has already accommodated users in many parts of the world, but not the U.S.,” she said. “They can do it. They’ve chosen not to.”

Wicks, meanwhile, said she considers her data bill to be about consumer protection, not speech. Assembly Bill 1949 would close a loophole in the California Electronic Communications Privacy Act to prevent social media platforms from collecting and sharing information on anyone under 18 unless they opt in. The Assembly approved Wicks’ measure without dissent, sending it to the state Senate for consideration.

Costello suggested that focusing the proposals more narrowly might give them a better chance of surviving court challenges. She is part of an effort coordinated by Harvard’s T.H. Chan School of Public Health to write model legislation that would require third-party assessments of the risks posed by the algorithms used by social media apps.

“It means that we’re not restricting content, we’re measuring harms,” Costello said. Once the harms are documented, the results would be publicly available and could lead state attorneys general to take legal action. Government agencies adopted a similar approach against tobacco companies in the 1990s, suing for deceptive advertising or business practices.


Szabo said NetChoice has worked with states to enact what he called “constitutional and commonsense laws,” citing measures in Virginia and Florida that would mandate digital education in school. “There is a role for government,” Szabo said. (The Florida measure failed.)

But with little momentum on actual regulation at the national level, state legislators continue to try to fill the vacuum. New York recently passed legislation similar to Skinner’s, which the state senator said was an encouraging sign.

Will NetChoice race for an injunction in New York? “We are having lots of conversations about it,” Szabo said.

This article was produced by KFF Health News, a national newsroom that produces in-depth journalism about health issues.
