Peering into the Future: Kids, Games, and Regulation

April 10, 2026

By: Steven Hess

This March, a California jury found Google and Meta (which operates Facebook, Instagram, and other websites) liable for negligently causing a child user’s depression and anxiety. Driven by a desire to protect children from perceived vices,[1] this verdict is likely to be the first in a wave of new lawsuits and legislation aimed at regulating online spaces. Although limited to protecting children for now, these efforts show a societal desire for online platforms to affirmatively protect consumers from real or perceived dangers.

Online Platforms under Fire

Across the world, there is increasing pressure on internet platforms perceived as harmful. The European Union has already fined Google billions for its advertising practices. But a growing trend in the global response to online regulation has been to emphasize the impact on children.

At the end of 2025, for example, Australia banned social media for children under the age of 16.  More than a dozen other countries are following suit.  Similarly, in the United States, there have been significant efforts to ban smartphones in schools, a broader precaution fueled largely by the distracting lure of social media.

The lawsuit against Meta and Google similarly relied on arguments that children deserve special protection in online spaces. The crux of the plaintiff’s argument was that Meta and Google “know children are in a developmental stage that leaves them particularly vulnerable to the addictive effects of these features” and that these companies designed the platforms to “target [children] anyway, in pursuit of additional profit.”[2]

The $6 million verdict is ultimately a drop in the bucket for Google and Meta.[3] But this case opens the door to arguments that companies must answer for the fundamental design of their online platforms, even if the content itself remains protected by Section 230 immunity. This duty is more pronounced when the website involves potential issues of addiction and use by children.

From Social Media to Video Games: Regulators Expand their Reach

The State of New York is among the most aggressive in regulating online spaces for children.  Since 2023, the State has been involved in its own lawsuit, claiming that Meta uses unfair and deceptive practices to take advantage of young people.[4]

New York Governor Hochul has announced that she intends to aggressively step up the State’s protection of children online. In addition to criticizing AI “companions” and social media companies, she also took aim at online gaming, lamenting that children “are being coerced to gamble on virtual junk.” New York Attorney General Letitia James likewise criticized “online platforms like Roblox.” Roblox, a massive online gaming platform specifically aimed at children, is facing over a hundred lawsuits, and the company’s CEO recently discussed adding a prediction market feature, similar to Polymarket, to the platform.

One month after these speeches, the State of New York filed a complaint against Valve, a computer-game producer and distributor. Just as in its complaint against Meta, New York argues that Valve takes advantage of children and adolescents. The State contends that the “loot boxes” in Valve’s games “pose the same risks as traditional casino-style gambling.”[5] Although Valve sells loot boxes to players of all ages, New York emphasized that the games are advertised to children and that Valve does not adequately enforce its age-verification requirements. Days after New York filed this lawsuit, a class-action complaint was filed against Valve making virtually identical allegations.

Taken collectively, these cases signal a litigation movement to protect consumers playing online games. As with social media companies, regulators may look to require online gaming companies to take more affirmative steps to protect their consumers, and politicians may supplant regulation-by-litigation with new, proscriptive legislation.

A Glimpse at What’s Coming

On March 18, U.S. Senator Blackburn unveiled her legislative framework for regulating AI.  Although the proposed legislation focuses on artificial intelligence, it also includes a fascinating glimpse into the possible future of online regulation.

Senator Blackburn’s “Kids Online Safety” provision would impose new regulations on all gaming companies whose games involve “microtransactions.”[6] That term is defined as “a purchase made in an online video game,” including purchases made with a virtual currency or a subscription.[7] Microtransactions are thus defined quite broadly, ranging from in-game outfits and cosmetics to experience boosters and potentially even in-game currencies.

Under the bill, any online video game that includes microtransactions would have an affirmative duty of care to protect children on the platform.[8] That duty would require operators to design their platforms to protect children from a wide range of potential harms, including eating disorders, depression, and “[p]atterns of use that indicate compulsive usage.”[9]

If passed into law, this bill would impose enormous regulatory requirements. Companies of all sizes would have to consider whether “any design feature” could foreseeably cause a broad array of harms. Moreover, nothing in the bill appears to limit the design features at issue to those tied to the game’s microtransactions. A developer could therefore owe a duty to consider how its chat feature impacts the sale of narcotics to children, solely because that developer sells in-game cosmetic upgrades.

Senator Blackburn’s bill is not likely to pass. However, it shows a growing appetite to regulate gaming companies and parallels the verdict against Google and Meta, where the court permitted claims challenging the design of online platforms and whether they unfairly target children and vulnerable groups. The Kids Online Safety provisions would similarly require many video game operators to actively consider and modify their platforms to limit “compulsive” behavior.

Conclusion

The internet is fully integrated into daily life. Unsurprisingly, there is now a movement to regulate online spaces to draw starker lines between what is considered effective marketing and what is considered dangerous. For now, Section 230 of the Communications Decency Act protects most online platforms from liability for material considered to be “content.” But, increasingly, the features of online spaces are themselves apt to become the target of regulation.

While some of those measures, such as the Kids Online Safety provisions, are unlikely to become law soon, they nevertheless foretell a future where online platforms have an affirmative duty to protect children. Only time will tell whether such duties will be extended to all users, children and adults alike.

[1] My colleague Lauren Scribner has written about a growing view that social media is like a vice.

[2] K.G.M. v. Meta Platforms, Inc., No. 23SMCV03371, 2023 WL 12184500, ¶ 2.

[3] In a similar lawsuit, a New Mexico jury found Meta had misled consumers regarding the safety of its platforms and had endangered children.

[4] See generally In re Social Media Adolescent Addiction/Personal Injury Products Liability Litig., No. 4:23-cv-05448, 2023 WL 7002550.

[5] State v. Valve Corp., Complaint at ¶ 120.

[6] Kids Online Safety, § 411(7), available at https://www.blackburn.senate.gov/services/files/15AAEA28-5403-480D-8720-5E4C2D6F2A9A.

[7] Id. at (7)(A).  Notably, the bill excludes from the definition of microtransactions purchases made with virtual currency “earned through gameplay and [that] is not otherwise purchasable or redeemable using cash or credit or included as part of a paid subscription.”  Id. at (C)(i).

[8] Id. at (11)(B); § 412.

[9] Id. at § 412(a).

Steven Hess

Steven Hess brings a unique blend of economic insight and legal expertise to his practice, providing him with a keen understanding of not just the legal frameworks that govern markets, but also the economic forces that shape them.
