Bill would require tech companies to design social media platforms to protect youths

PROVIDENCE – Senate Majority Leader Valarie J. Lawson, D-East Providence, and Rep. Megan L. Cotter, D-Exeter, have introduced legislation to require social media platforms, apps, and other online services and products that are likely to be used by children “to design their products to protect them,” according to a legislative news release.

Roughly 97% of teenagers use the internet every day, according to a 2022 Pew Research Center study on teens, social media and technology that is cited in the release.

The release also cites the National Center for Biotechnology Information, a division of the National Library of Medicine at the National Institutes of Health, which found that nearly 95% of teenagers reported using social media platforms and that more than one-third said they use social media “almost constantly.”

A third reference listed in the release, an April 2024 article from the American Psychological Association, drew on a Gallup survey conducted from June 26 to July 17, 2023, of 6,643 parents living with children ages 3 to 19 and 1,591 teens in those households. The survey found that 41% of teen respondents with the highest social media use rated their mental health as “poor or very poor,” compared with 23% of those with the lowest use.

The bill’s provisions would apply to technology companies operating in Rhode Island that collect and control personal data and either gross more than $25 million annually; receive, buy or sell data from at least 50,000 individuals, households or devices annually; or derive at least 50% of their annual revenue from selling individuals’ personal data.

The bill would require big-tech companies to use design features that protect children from unfair or deceptive treatment, unlawful disparate impact, financial or reputational injury, discrimination, and offensive intrusions into their private affairs. It would also discourage features designed to increase, sustain or extend children’s use of a product in ways that could result in certain harms.

“Right now, the burden is on parents to protect kids from online harms. This bill shifts some of that responsibility to the platforms that profit from children’s engagement,” Cotter said. “Unless we pass laws requiring change, big tech will continue to put profits over protections, leaving children vulnerable to harmful content, data exploitation and addictive design practices.”

Lawson said that while the bill “is not a panacea,” it establishes “basic guardrails to protect kids from the most egregious threats to their safety, such as location tracking, having their personal information sold, or being profiled for commercial purposes.”

Companies subject to the requirements would have to determine whether they comply with the requirements to protect children from heightened risks “and create plans to ensure compliance.”

“The goal is to prevent tech companies from exploiting children’s vulnerability for profit or enabling others to do so,” Lawson said.

Violations could result in fines of up to $7,500 per affected child. If approved, the legislation would take effect on Jan. 1, 2026.

Christopher Allen is a PBN staff writer. You may contact him at Allen@PBN.com.