Lawmakers aren’t wrong to worry about kids and social media — they’re just dead wrong about how to fix it.
The latest example is Connecticut, which is poised to enact a law allegedly aimed at protecting minors on social media, joining states like Ohio, Arkansas, and Utah in the push for sweeping online regulations. If enacted, though, the law risks landing in federal court over constitutional concerns.
On May 14, the Connecticut House passed H.B. 6857 — a bill that would make it illegal for social media platforms to display algorithmic content to minors unless they first verify the user’s age and obtain “verifiable parental consent.” In other words, any app using personalized feeds — like infinite scroll, suggested videos, or targeted posts — would be forced to screen every user or risk violating state law.
The bill also defaults minors to one-hour daily limits, blocks notifications after 9 p.m., and restricts interactions so that only approved contacts can view or respond to their content. Platforms must allow parents to override these limits. The House vote was 121–26, and the bill now awaits action in the Senate.
Supporters, including Connecticut Attorney General William Tong, say it’s necessary to fight youth “addiction” to platforms like TikTok, Meta, and Snapchat. Tong has even likened the proposal to past fights against Big Tobacco and the opioid epidemic, suggesting that endless scrolling is the next fentanyl. But setting aside the melodrama, there’s a glaring problem with this policy, and it’s called the Constitution.
Indeed, federal courts are already swatting down similar laws.
In April, a U.S. district judge in Ohio blocked the state’s Social Media Parental Notification Act, ruling that it infringed both parental rights and minors’ First Amendment freedoms.
The decision came just weeks after another federal judge in Arkansas permanently struck down that state’s Social Media Safety Act — which required age verification and parental consent — on the grounds that it was not narrowly tailored and lacked a compelling government interest.
The ruling was simple: “[T]here is no evidence that the Act will be effective in achieving [its] goal.”
Utah also tried a similar approach. In 2023, the state passed a pair of laws requiring social media platforms to verify users’ ages, obtain parental consent for minors, and impose time-based restrictions on access.
But after tech industry group NetChoice filed suit, a federal judge blocked the laws, finding they likely violated the First Amendment. Despite a legislative rewrite in 2024, Utah’s revised laws were again put on hold. U.S. District Judge Robert Shelby ruled in September that the regulations were not narrowly tailored and imposed content-based restrictions on speech — a constitutional red flag.
The message from the courts is clear: states can’t violate civil liberties in the name of protecting children, no matter how well-intentioned the law may be.
That should matter to legislators. But in Connecticut — as in many states — the urge to “do something” often overrides constitutional caution. Ironically, this legislation is being pushed in a state that already passed a sweeping data privacy law just last year. That law, which is still in effect, bans targeted ads to minors, prohibits the sale of their data without consent, limits geolocation tracking, and even requires platforms to exercise “reasonable care” to shield young users from harm.
Now lawmakers want to layer on another set of vague, legally risky mandates. If passed, the Connecticut bill would apply to any platform with users in the state, meaning national companies could face costly compliance requirements or lawsuits under the state’s Unfair Trade Practices Act, just for showing kids a recommended post.
There is also the question of resources. Connecticut’s own attorney general has publicly said his office is under-resourced. So who exactly is going to monitor compliance, enforce reporting requirements, and litigate the inevitable First Amendment challenges? Apparently, the same office that admits it’s already stretched thin.
There are better tools for families, ones that don’t require deputizing the government as everyone’s digital babysitter. App-store-level parental controls already exist. Devices can be set to limit access, and many platforms offer family management features. Empowering parents doesn’t require disempowering everyone else.
Bills like this aren’t about safety. They’re about signaling. Lawmakers want to appear “tough on tech” — even if it means trampling constitutional rights and bogging down courts in litigation that will almost certainly end in defeat.
For now, the bill sits on the Connecticut Senate calendar, waiting for a vote. If passed, it won’t just regulate platforms. It will test the boundaries of what the government can dictate in the name of protecting children.
Let’s hope the courts keep doing what the legislature refuses to: draw a line.
Meghan Portfolio is the Manager of Research and Analysis at Yankee Institute.