Meta and YouTube found negligent in landmark social media addiction case (nytimes.com)

by mrjaeger 523 comments 501 points

[−] strongpigeon 52d ago
The article is fairly light on details about the case. This NPR article [0] has a bit more, but it's still fairly sparse. Though it's interesting how Zuckerberg thought it was a good idea to say: "If people feel like they're not having a good experience, why would they keep using the product?".

Given that this is a case about addiction, that feels like a shockingly bad thing to say in defense of your product. Can you imagine saying the same thing about oxycodone or cigarettes?

[0] https://www.npr.org/2026/03/25/nx-s1-5746125/meta-youtube-so...

[−] twoodfin 51d ago
As someone who values a liberal society, I hope we’d be exceedingly careful in what we label “addictive” in the same bucket as oxy or nicotine.

I also hope the reasons are obvious.

[−] nkrisc 51d ago
Keep in mind that this case is about a minor, not an adult. I don't think it's fair to ask children to resist social media through sheer willpower when there are legions of highly educated adults on the other side trying to increase engagement.

It should be no surprise that children can be manipulated by highly intelligent adults.

[−] Barbing 51d ago

>[There are] legions of highly educated adults [at Meta] trying to increase [child] engagement

Why is this not only OK but the best way for Mark to spend every waking moment of his life?

Money thing? But how often would he think about his bank account versus his products? Maybe it's pure drive?

[−] randycupertino 51d ago
I just wish for once one of these egomaniacal billionaires would actually put all their efforts and resources into solving climate change or ending world hunger.

Even his medical initiative, the Chan-Zuckerberg Biohub, is a self-congratulatory shell game. I worked in the same building as them for years; literally all they did was have parties, conferences, networking events and schmooze things, and they never prioritized actual lab research or clinical advancements.

[−] nradov 51d ago
Be careful what you wish for. Serious, persistent world hunger in certain countries primarily exists not due to climate change or lack of food or even lack of money but because of local violence and corruption. For example, the notorious 1980s famine in Ethiopia attracted much attention in developed countries and many people (including billionaires) donated money to help. There was a drought which made farming difficult but the main problem was a violent civil war. Armed groups used food as a weapon against the civilian population by destroying crops and stealing foreign aid.

So, if you want "these egomaniacal billionaires" to end world hunger then you're effectively asking them to form private militias and impose peace by force in the developing world. The new colonialism. Is that what you want?

[−] hansvm 51d ago
Incentives drive outcomes, do they not? It's too easy to become a billionaire as a charlatan and too hard as somebody able to make a difference. Rather than granting Zuck nearly unlimited power and politely asking him to use it wisely if he doesn't overly mind the inconvenience, why not create a world where monsters can't have that much power in the first place?
[−] jarjoura 51d ago
I just find it ridiculous that we as a society have allowed CEOs to become that wealthy. It's one thing to make your money from lucky investments and become a billionaire. It's another to get there simply by running a corporation.
[−] keybored 51d ago

> I just wish for once one of these egomaniacal billionaires would actually put all their efforts and resources into solving climate change or ending world hunger.

I don’t. That presupposes that they have anything to contribute to begin with.

Their wealth beyond some millions (edit: being generous) is built on exploitation. That's not necessarily a transferable... skill.

[−] lokar 51d ago
And not just a minor, AIUI it's important that at the start, she was under 16
[−] timmg 51d ago

> Keep in mind that this case is about about a minor, not an adult.

This obviously means that tech is going to have no choice but to do "age verification". And I don't think there's much of a way to do that that wouldn't be uncomfortable for a lot of us.

[−] hnlmorg 51d ago
We already have a distinction because it’s been known for decades already that some things are addictive purely through reinforcement psychology and some things lock people into a chemical dependence.

For example see the glossary in https://en.wikipedia.org/wiki/Substance_dependence

[−] janalsncm 51d ago

> I also hope the reasons are obvious.

Based on the fact that many people here disagree about fundamental things, as well as the fact that “liberal” is a highly overloaded term, I think it should be obvious that it’s not obvious what you mean.

[−] anon84873628 51d ago
I don't think the reasons are obvious. Where do you put gambling on the spectrum?
[−] Zigurd 51d ago
Dark patterns are real. Deceptive advertising is real. So-called prediction markets amount to unregulated gambling on any proposition. Many online businesses are whale hunts and the whales are often addicts.
[−] array_key_first 51d ago
Specifically when it comes to children, we need to show more restraint in giving them the liberty to partake in potentially addictive substances.

It's one thing if an adult smokes and gambles, it's another thing if a child does. It seems to me that stuff you do in youth tends to stick around for life.

[−] Sir_Twist 51d ago
I feel like people use the word “addiction” to refer to both chemical addiction and behavioral addiction, and that people understand that the latter is (usually) far less serious than the former.
[−] bsder 51d ago

> As someone who values a liberal society, I hope we’d be exceedingly careful in what we label “addictive” in the same bucket as oxy or nicotine.

The problem is that this runs directly into the evidence that is mounting from GLP-1 agonists.

A lot more things are tied to the pathways we associate with "addiction" than we thought.

[−] joecool1029 51d ago

> I hope we’d be exceedingly careful in what we label “addictive” in the same bucket as oxy or nicotine.

Not careful enough apparently: Nicotine isn't that addictive on its own, tobacco is.

[−] zombar 51d ago
What's obvious to me likely isn't obvious to you or anyone else, therefore nothing is obvious.

I wish we'd delete that word from the English language.

[−] JKCalhoun 51d ago
"I hope we’d be exceedingly careful in what we label “addictive”…"

To be sure. But still an obviously dumb thing for a CEO to say.

[−] Valakas_ 47d ago
As someone who values a conservative society, I hope we'd be exceedingly careful before releasing products to consumers before knowing whether they're addictive or not.
[−] recursive 51d ago
Prohibition? Despite your hopes I'm not sure I got your intent.
[−] cyanydeez 51d ago
Social media is addictive the same way anorexia is. If you think anorexia isn't a form of addiction, then sure, you got your 'safety'.
[−] hsuduebc2 51d ago
What wording would you use then, if the definition fits? You can call it a minor addiction or a severe addiction, but it's still an addiction.
[−] keybored 51d ago
Why is it that these philosophical ideas about supposed personal freedom again and again make an appearance when it's about the freedom of corporations? It's always that. Either that or the Free User is pushed in front of them like a shield.
[−] btmiller 51d ago
There’s a big distance between libertarian and liberal societies. The libertarian tendencies of corporations are what tend to cause more harm.
[−] yabutlivnWoods 51d ago
Mmhmm, those are words. Words that are hand-wavy pretexts for conservatism rather than liberalism; as a lover of liberal society, you hope it acts conservatively!

This just comes off as poorly obfuscated self selection. You own a bunch of Meta, Alphabet and other media stocks?

[−] hash872 51d ago
At least even money that an appellate court throws this verdict out entirely. Reminder that the US is the only developed country that uses juries for civil trials- everywhere else, complex issues of business litigation are generally left to a panel of judges. It's not that hard to rile up a bunch of randomly impaneled jurors against Big Bad Corporation. The US is kind of infamous for its very large, very unpredictable civil verdicts. There's an incredibly long history of juries racking up shockingly large verdicts against companies, only for an appellate court to throw the whole case out as unreasonable. Not even close to the final word in the American judicial system.

Edit to include: I mean, this is coming the same day as the Supreme Court throwing out the piracy case against Cox Communications 9-0. Remember that this case originated with a $1 billion jury verdict against them! It was reversed by an appeals court 5 years later and completely invalidated today. Juries should not handle complex civil litigation, I'm sorry.

[−] bogdanoff_2 51d ago
The solution to this would be a law forcing these sites to allow third-party suggestion algorithms, so that you can choose who suggests content to you and how.

It could perhaps be as simple as allowing third-party websites and apps for watching YouTube on your phone. And it's okay if this were a premium paid feature, so there's no counterargument that "it costs them money to host videos".

This is not an entirely new idea either. Before Spotify became popular, people would integrate Last.FM into their media players to get music recommendation based on their listening history, and you could listen to music via YouTube directly on the last.fm website.
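A minimal sketch of what the "pluggable recommender" idea above could look like: the platform supplies a candidate pool plus the user's own history, and a user-chosen third party ranks it. All names and data shapes here are illustrative assumptions, not any real YouTube or Last.FM API.

```python
# Hypothetical sketch: a third-party recommender is just a ranking
# function the user opts into; the platform only hosts the content.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Video:
    id: str
    title: str
    tags: list[str]

# Signature a platform could expose to third-party algorithms:
# (candidate pool, user's watch history) -> ordered feed.
Recommender = Callable[[list[Video], list[str]], list[Video]]

def chronological(candidates: list[Video], watch_history: list[str]) -> list[Video]:
    """A trivial third-party algorithm: no engagement optimization,
    just drop already-watched videos and keep the platform's order."""
    return [v for v in candidates if v.id not in watch_history]

def build_feed(candidates: list[Video], watch_history: list[str],
               recommender: Recommender) -> list[Video]:
    # The user picks which recommender ranks their feed.
    return recommender(candidates, watch_history)
```

The point of the sketch is the interface, not the ranking: any function with the `Recommender` shape could be swapped in, which is what would let users route around engagement-optimized defaults.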

[−] fraywing 52d ago
I'd hope the next iteration of social media tools humanity builds are less about reinforcing the individual ego and more about collective improvement, learning, and supporting the health of our species.

Anecdote, but it does seem like a lot of younger folks I speak with are exhausted by the dark patterns and dopamine extraction that top-k social media platforms create.

If agents/AI/bots inadvertently destroy the current incarnation of social media through noise, I think we'll be better for it.

[−] onlyrealcuzzo 51d ago
How is any app/website that 1) appeals to kids, 2) sells attention, 3) does A/B testing and/or has a self-learning distribution algorithm NOT guilty of this?
[−] pow_ext 51d ago
Apps like instagram and YouTube should be required at least to give an option to disable reels and shorts
[−] xvxvx 51d ago
In before someone says ‘blame the parents’ and not the multi-billion dollar companies who’ve spent decades targeting children for lifelong addiction, ignoring the negative effects on their mental health.
[−] dzink 51d ago
Read the book “Careless People” if you have a chance - according to the book, social media companies figured out they have real leverage with politicians since they can influence elections. As a result they are actively pushing for far right candidates to reduce their own taxation and regulation.
[−] ApolloFortyNine 51d ago
This just seems ripe for selective enforcement if not codified in law. I agree the algorithm they use can be addicting, but it's because it's simply good at providing content the user wants to consume.

Besides a general 'don't be too good' I'm really not sure what companies should do about it. It just seems like it'll lead to some judges allowing rulings against companies they don't like.

Television's goal was always viewer retention as well, they were just never able to target as well as you can on the internet.

[−] jimmyjazz14 51d ago
Coming from someone who hates social media (and has kids): this might seem like a good thing on the surface, but I worry it will be another case used to allow the government to limit speech on the internet for adults.
[−] jmyeet 51d ago
I believe social media is on a collision course with an iceberg called Section 230.

Broadly speaking, Section 230 differentiates between publishers and platforms. A platform is like Geocities (back in the day), where the platform provider isn't liable for the content as long as they satisfy certain requirements about having processes for taking down content when required. A bit like the Cox decision today, you're broadly not responsible for the actions of people using your service unless your service is explicitly designed for such things.

A publisher (in the Section 230 sense) is like any media outlet. The publisher is liable for their content but they can say what they want, basically. It's why publishers tend to have strict processes around not making defamatory or false statements, etc.

I believe that any site that uses an algorithmic news feed is, legally speaking, a publisher acting like a platform.

Example: let's just say that you, as Twitter, FB, IG or Youtube were suddenly pro-Russian in the Ukraine conflict. You change your algorithm to surface and distribute pro-Russian content and suppress pro-Ukraine content. Or you're pro-Ukrainian and you do the reverse.

How is this different from being a publisher? IMHO it isn't. You've designed your algorithm knowingly to produce a certain result.

I believe that all these platforms will end up being treated like publishers for this reason.

So, with today's ruling about platforms creating addiction, (IMHO) it's no different to surfacing content. You are choosing content to produce a certain outcome. Intentionally getting someone addicted is functionally no different to changing their views on something.

I actually blame Google for all this because they very successfully sold the idea that "the algorithm" ranks search results like it's some neutral black box but every behavior by an algorithm represents a choice made by humans who created that algorithm.

[−] absoluteunit1 51d ago
Oh man if they think YouTube and Instagram are addicting they should see what Roblox does lol
[−] paulkon 51d ago
Just needs a health warning label, like on alcohol or cigarettes. Then on to the high-sugar products, and a quarter of the grocery store.
[−] ChrisArchitect 52d ago
Notably a different case from the other one in New Mexico:

Jury finds Meta liable in case over child sexual exploitation on its platforms

https://news.ycombinator.com/item?id=47509984

[−] ramesh31 51d ago
I've heard about "landmark" cases against these companies over and over again for the last decade. There seems to be at least one every couple of years. And yet literally nothing has ever happened or changed.
[−] woah 51d ago
Are there any takeaways here for builders of social media applications who are not Facebook or Google? Is this a warning to not make your newsfeed algorithm "too engaging" or is it only really relevant for big companies?
[−] mikece 52d ago
A good time to (re-)recommend the movie "The Social Dilemma".
[−] freshtake 51d ago
Short form video is a different beast altogether, and much more concerning. The fact that these platforms don't offer a way to avoid short form altogether is a big issue.

YouTube allows you to "show fewer shorts" but what if you don't want them popping up at all?

AI Slop is the best thing to happen to these platforms - because it will lower trust and engagement as people (hopefully) become tired of inauthenticity. Rage bait is potent when the event in the video _actually_ happened, but when you realize it was AI generated, the manipulation feels even more obvious (though it was always there).

These platforms should also allow users to understand how the algorithm has categorized them, and be able to configure it. YouTube, Instagram, et al. would be safer places for viewers if they allowed users to tell them what they want to be exposed to, and what they don't. Big tech is dodgy about this currently, because the more control the user has the lower the engagement (good for the user, bad for profit).

[−] nottorp 51d ago
Just kids? Not adults?
[−] robinanil 51d ago
I have a somewhat unusual vantage point on this.

I'm a former Google engineer, now running a children's mental health startup (Emora Health), and my toddler is already on YouTube Kids.

So this verdict hits on every axis for me. I wrote up my full take here [1], but the short version: I don't think the "Big Tobacco moment" framing that the NYT is pushing actually holds up.

Litigation is negative reinforcement, and if you've ever tried telling a toddler "no" you know how well that works long-term. The families in this case absolutely deserve to be heard. The harm is real. But courts can only punish; they can't redesign a recommendation algorithm.

The change has to come from people who understand these systems building better ones.

Haidt has been saying for years what this verdict just confirmed. The evidence was never the bottleneck. The will to design differently was.

I'll give you a simple experiment: try blocking Blippi from YouTube Kids. It's crazy. Even if you block the main Blippi and Moonbug channels, hundreds of channels have Blippi content cross-posted, and it keeps popping up. I know it's easy to build a Blippi block feature using AI that blocks across channels.

That's the kind of solution we need. We have the tools; we just need intent and purpose.

[1] https://www.emorahealth.com/clinical-insights/social-media-v...
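The cross-channel blocking the comment describes doesn't even need AI for a first pass; a keyword filter over video metadata already catches cross-posted content. A minimal sketch, assuming hypothetical video dicts with `title`/`description`/`channel` fields (not a real YouTube Kids API):

```python
# Hypothetical cross-channel block filter: match on content keywords
# rather than on the posting channel, so reuploads are caught too.
BLOCKED_TERMS = {"blippi"}

def is_blocked(video: dict) -> bool:
    """True if any blocked term appears in the title, description,
    or channel name, regardless of which channel posted it."""
    haystack = " ".join(
        str(video.get(field, "")) for field in ("title", "description", "channel")
    ).lower()
    return any(term in haystack for term in BLOCKED_TERMS)

def filter_feed(videos: list[dict]) -> list[dict]:
    # Keep only videos that trip no blocked term.
    return [v for v in videos if not is_blocked(v)]
```

Per-channel blocking, which is what the app actually offers, is the special case where the haystack is only the channel name; that's exactly why reuploads slip through.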

[−] kogasa240p 51d ago
Great news, but this will probably be the catalyst for more "age verification" nonsense. These algorithms are bad for everyone, not just kids.
[−] pseingatl 51d ago
I think all important public policy decisions should be left to random personal injury courtrooms, as long as the PI lawyers collect their customary fee. It's silly to let a regulator or legislative body butt in and so prevent the PI lawyers from collecting their cash. So what if you have no say in the result?
[−] prepend 51d ago
I think this is going to end up with a huge chunk paid out to state health departments for the foreseeable future.

Kind of like how tobacco companies now pay out billions every year and its a major source of funding for states.

Hopefully this means more health services available, but it may just serve as an ongoing tax.

[−] _slih 51d ago
two verdicts in two days, $375m in new mexico and $6m in LA. meta's insurance company already got cleared of covering these claims. if even ten more states follow, meta is paying out of pocket at a scale that actually shows up on the balance sheet.
[−] dlcarrier 51d ago
This is the kind of stuff that is causing them to push for mandatory identity verification laws. If they're being held liable for the desires of their users, they're being forced to micromanage the affairs of their customers, which precludes anonymous usage.
[−] curious1008 51d ago
This is real. No matter how much I configure content controls on YouTube for my daughter, she scrolls past everything and ends up on brainrot videos — and then she can't stop. I've felt for a long time that this is by design.
[−] jraby3 51d ago
When I was a kid, tv commercials were heavily censored and the tv channel could and would be fined immediately if something inappropriate was shown.

How is it that these days social media can circumvent all these safeguards and then somehow blame the parents if a kid is watching something inappropriate on an app designed for kids (like YouTube kids)?

The issue is that politicians are beholden to social media companies because they can literally get them or their opponent elected. After reading Careless People, I was amazed at how leaders of so many countries wanted to meet Zuck because he wields so much power.

I really hope this ruling is the beginning of the end of the free rein they've had.

[−] dmix 51d ago

> During his first-ever appearance before a jury in February, Meta's chairman and chief executive, Mark Zuckerberg, relied on his company's longstanding policy of not allowing users under the age of 13 on any of its platforms.

> When presented with internal research and documents showing that Meta knew young children were in fact using its platforms, Zuckerberg said he "always wished" for faster progress to identify users under 13. He insisted the company had reached the "right place over time".

Soon there will be government IDs required to use social media sites because parents can't take phones away from their kids.

[−] cat-turner 51d ago
I actually quit Instagram because I found it so addictive. Wild that there's a case. Parents need to just take away phones from children. Simple as that.
[−] aprilthird2021 51d ago
I can't help but feel these are "revenge" verdicts. Public perception of these companies is dirt low, and there are so few levers the average person has to change what they feel is an increase in atomization, loneliness, breakdown of civic discourse, Cambridge Analytica level political targeting, misinformation, etc.

Maybe the social media companies could do more to combat all these. They certainly have a level of profit compared to what they provide to the average person that makes people squirm.

But does anyone believe for a second that YouTube is responsible for a person's internet / video watching addiction? It's like saying cable television is responsible for people who binge watch TV.

It's hard to square this circle while sports gambling apps and Polymarket / Kalshi are tearing through the landscape right now with no real pushback

[−] eagsalazar2 51d ago
So... should we all sue Youtube and Meta now? This is a semi-serious, follow this precedent to its logical conclusion, question.
[−] nickvec 51d ago
Wow, so does this pave the way for massive class action lawsuits? Not familiar with how precedents like this play out long term.
[−] GardenLetter27 51d ago
Mandatory age verification is coming.
[−] homeonthemtn 51d ago
Just give people the option to turn off algos. "I do not consent to suggested content"
[−] rankdiff 51d ago
Why this site doesn't let me enter? Why temporarily restricted?

This stop-bot thing can be annoying at times.

[−] Ohkay 51d ago
How about optimize for engagement with people you know irl and not influencers and media?
[−] 2OEH8eoCRo0 51d ago
Huge if upheld. This was the bellwether case for thousands of other similar cases.
[−] akabul0us 51d ago

> which could expose the internet giants to further financial damages and force changes to their products

This is so wildly untrue, it's either downright deception on the part of the NYT or they simply don't know how to do math. In this case the judgment was for $3M. To keep the numbers simple for the sake of this comparison, let's not get into the 70/30 split of this amount and just imagine that YouTube (Google) ((Alphabet)) had to pay the entire amount. Their revenue (again, keeping it simple, don't @ me) for 2025 was $350B.

Humans don't typically conceptualize the difference between millions and billions very easily, so let's knock everything down a few orders of magnitude. At that rate they take in around a billion dollars a day. So imagine this as a person who takes in about a thousand dollars a day and has a yearly salary of $350,000 (quite comfortable to live on, while not being obscene). Now apply the same scaling to the amount they have to pay, and what do you get? A grand total of $3.00.

I make considerably less than $350K/yr, and I can still confidently say that if I had to pay a fine of $3, that wouldn't just make me not take the fine and the court that issued it seriously, it would have the opposite of its intended effect. I would now see that it costs me next to nothing to keep doing things the way I always have, whereas changing something like the very nature of my business to avoid the fine could have a serious effect. What the court has done is demand that when Google and Meta spit on their customers, they also throw a few pennies in their change cup to show how sorry they are, while changing nothing and going full speed ahead.
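The scaling argument above is easy to check numerically. A quick sketch, using the commenter's round figures ($350B annual revenue, $3M judgment, and a stand-in person earning $350K/yr), which are illustrative rather than audited numbers:

```python
# Scale the company's fine down by the same factor that maps its
# annual revenue onto a comfortable personal salary.
revenue_per_year = 350e9   # ~$350B annual revenue (round figure from the comment)
fine = 3e6                 # $3M judgment in this case
salary = 350_000           # stand-in personal salary, USD/year

scale = salary / revenue_per_year   # 1e-6: millions become dollars
scaled_fine = fine * scale          # the fine at personal scale

print(f"scale factor: {scale:.0e}")              # 1e-06
print(f"fine at personal scale: ${scaled_fine:.2f}")  # $3.00
```

At that scale the judgment lands at three dollars, roughly 0.0009% of annual revenue, which is the commenter's point about deterrence.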

[−] Handy-Man 51d ago
IMO, parents share just as much blame here, if not more. Giving your kids independence doesn't mean being oblivious to what they're doing online. Too many parents confuse hands-off parenting with not parenting at all.
[−] p5v 51d ago
I’ve been thinking about this a lot while building Murmel (https://murmel.social). One thing we wanted to avoid from day one was the “infinite engagement machine” model, so instead of pushing algorithmic slop, we just surface links that are already being shared by people you follow on Bluesky and Mastodon.

It ends up feeling much closer to “what’s interesting in my corner of the web right now?” and much less like a system trying to keep you trapped inside it.

Small scope, obviously, but I think more social tools should feel like utilities, not casinos.

[−] yacin 51d ago
this has to be the first of many right? fingers crossed this leads to some meaningful change.
[−] ferguess_k 51d ago
What a surprise! I guess they didn't pay enough protection money. Still, better than nothing.
[−] parsimo2010 51d ago
I don’t feel good about this case- on the one hand, I’m all for sticking it to big corporations. On the other hand, nobody has claimed that Meta and YouTube were doing anything illegal, so this case is different from civil suits brought after a criminal case finds someone guilty. This is a case where the jury decided they don’t like how two corporations acted, and are just giving money to one person. Why does this plaintiff in particular deserve this money?

I’ve argued in the past that the right way to create the change we want in corporations is to change the laws, and people have made valid points that Congress has basically given up on doing that. But even so, civil cases with fines don’t seem like the way to make lasting change. In the tobacco analogue, there are LAWS that regulate tobacco company behavior as a result. The civil case here isn’t going to result in any law.

So what are companies supposed to do? Tiptoe around some ill-defined social boundary and hope they don’t get sued? Because apparently the defense of “no, I didn’t target that person and I didn’t break any laws” is still going to get you fined. What happens when a company from a conservative location gets sued in a liberal location for causing a social ill? Oh, we’re cool with that. But what if a company from a liberal location gets sued in a conservative location for the same thing? Oh, maybe we don’t like that as much.

I’m taking the libertarian side here. I know plenty of people who don’t watch TV or use Facebook, and plenty who recognized they were spending too much time on digital platforms and decided to quit or cut back. So a healthy person can self-regulate on these apps; I’ve seen it and done it. I’m just not sure how much responsibility Meta and YouTube bear in my mind. If they’re getting fined $3M plus some TBD punitive amount, are we saying this 20-year-old person lost out on that much lifetime earnings, or would need to spend $3M on therapy, because of Meta or YouTube? It feels a little steep of a fine for one person.

If Meta and YouTube really were/are making addictive products, wouldn’t a lot more people be harmed? Shouldn’t this be a class action suit where anyone with mental trauma or depression be included?

I don’t know the details of the case, but I highly doubt that this one plaintiff was targeted specifically, and I doubt their case is that unique. I read tons of news articles about cyber bullying, depression, suicide attempts, and tech addiction. Does every one get to sue Meta and YouTube for $3M now?

[−] nlarion 51d ago
There is no personal responsibility left in America. I have a child. It's my job to teach him and watch what he watches and does. I guess I am the only one who thinks this way. Good luck having the parental government raise your child. Parody: I let my child have cocaine and now they're addicted!!!!! Hilarious.