Jury finds Meta liable in case over child sexual exploitation on its platforms (cnn.com)

by billfor 530 comments 490 points

[−] Aurornis 52d ago
Many will cheer for any case that hurts Meta without reading the details, but we should be aware that these cases are among the key reasons why companies are backtracking on features like end-to-end encryption:

> The New Mexico case also raised concerns that allowing teens to use end-to-end encryption on Instagram chats — a privacy measure that blocks anyone other than sender and receiver from viewing a conversation — could make it harder for law enforcement to catch predators. Midway through trial, Meta said it would stop supporting end-to-end-encrypted messaging on Instagram later this year.

The New York case has explicitly targeted their support of end-to-end encryption: https://www.reuters.com/legal/government/meta-executive-warn...

[−] dwedge 52d ago
Maybe I'm just getting old and cynical but, while I think current social media is bad for children, I'm very suspicious of the current international agreement that it's time to take action, especially with all the ID verification coming from multiple avenues.
[−] sharkjacobs 52d ago

> The New Mexico attorney general’s office created multiple fake Facebook and Instagram profiles posing as children as part of its investigation into Meta. Those test accounts encountered sexually suggestive content and requests to share pornographic content, the suit alleges.

> The fake child accounts were allegedly contacted and solicited for sex by the three New Mexico adult men who were arrested in May of 2024. Two of the three men were arrested at a motel, where they allegedly believed they would be meeting up with a 12-year-old girl, based on their conversations with the decoy accounts.

and

> “The product is very good at connecting people with interests, and if your interest is little girls, it will be really good at connecting you with little girls,” Bejar said.

This is what it's about, right? The article doesn't make it seem like encryption is meaningfully part of this case at all.

> Midway through trial, Meta said it would stop supporting end-to-end-encrypted messaging on Instagram later this year.

There's no indication that the decision, or the announcement, is directly related to the trial; maybe they just happened at the same time? It's a link drawn by CNN without presenting any clear connection.

[−] nclin_ 51d ago
$375 million awarded at $5,000 per child harmed, implying that only 75,000 children were harmed.

Got away with it again, good profit, will repeat.

[−] bradley13 52d ago
We don't want age verification, and we do want E2E encryption. Yet, because Meta is an evil company, we cheer on this judgement.

Reality, folks: you can't have both.

[−] zeeshana07x 52d ago
Fines like this only work if they're large enough to change behavior. $375M for a company Meta's size is more of an accounting entry than a deterrent.
[−] fny 52d ago
This fine from New Mexico is about 0.6% of Meta's annual profit.

If all 50 states sue at the same rate, that'll be a 30% dent, and I'm sure states can sue for more than 0.6% too. That would be historic action against malfeasance and would send a strong FAFO signal to all corporations.
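A back-of-the-envelope check of these numbers (the ~$62.5B annual profit figure is an assumed round number implied by the 0.6% claim, not an official figure):

```python
# Rough sanity check of the fine-vs-profit claim. The profit figure is an
# assumption back-derived from "0.6% of annual profit", not an official number.
fine = 375e6             # New Mexico judgment, USD
annual_profit = 62.5e9   # assumed annual profit, USD

share = fine / annual_profit
print(f"single-state fine: {share:.1%} of profit")            # -> 0.6%
print(f"50 states at that rate: {50 * share:.0%} of profit")  # -> 30%
```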

Let's lobby for it.

[−] exabrial 52d ago
That fine is missing a few zeros on the right side
[−] sarbanharble 52d ago
It takes 7 clicks to turn off ads that promote eating disorders. That's enough proof.
[−] lunias 51d ago
Social media for children should be moderated by their parents, full stop. End-to-end encryption exists. You cannot un-invent it. It is trivial to roll your own encrypted chat service.
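To illustrate the "can't un-invent it" point, here is a toy sketch of a hand-rolled encrypted channel using only the Python standard library. This is a teaching example, not production crypto; a real service should use vetted libraries (e.g. libsodium or the Signal protocol), and the tiny 64-bit prime here is purely for readability:

```python
# Toy end-to-end encryption: Diffie-Hellman key agreement plus a SHA-256
# hash-counter keystream. Illustrative only -- do NOT use for real secrets.
import hashlib
import secrets

P = 2**64 - 59   # a small well-known prime; real DH uses 2048+ bit groups
G = 2

def keypair():
    # private exponent plus the public value g^priv mod p
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

def shared_key(priv, other_pub):
    # both parties derive the same secret: g^(a*b) mod p
    s = pow(other_pub, priv, P)
    return hashlib.sha256(str(s).encode()).digest()

def xor_stream(key, data):
    # XOR the message with a hash-based keystream (encrypt == decrypt)
    out = bytearray()
    block = b""
    for i, b in enumerate(data):
        if i % 32 == 0:
            block = hashlib.sha256(key + i.to_bytes(8, "big")).digest()
        out.append(b ^ block[i % 32])
    return bytes(out)

# Alice and Bob agree on a key over a public channel; the server in the
# middle only ever sees the public values and the ciphertext.
a_priv, a_pub = keypair()
b_priv, b_pub = keypair()
assert shared_key(a_priv, b_pub) == shared_key(b_priv, a_pub)

ct = xor_stream(shared_key(a_priv, b_pub), b"hi bob")
assert xor_stream(shared_key(b_priv, a_pub), ct) == b"hi bob"
```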
[−] WarcrimeActual 52d ago
I haven't read this article, but I can tell you for certain that no verdict was handed down that will punish them in any way that matters. They have and generate more money than they could ever spend and they're functionally above the law because of the money and lawyers they can afford. The law itself is broken in this country and when you get big enough you can literally get away with murder.
[−] tombert 52d ago
They had to pay about $375 million. That's a lot of money, but I suspect that Facebook has made considerably more than that on targeting children.

I'm hardly the first person to use this logic, but if they make more money breaking the law than they have to pay in fines, then it's not a fine, it's a business expense.

[−] intended 52d ago
This particular verdict is a long time coming. How it drives meaningful change is the bigger question.

One of the challenges we need to resolve is the race to the bottom for online communities - engagement metrics will always result in a pH level that supports more acerbic behavior.

There are multiple analyses you can find, if not your own experience, to support the belief that we should be able to do better with our information commons.

Just today, I found a paper that studied a corpus of Twitter discussions and found that bad-faith interactions constituted 68.3% of all replies.

The engineer and analyst side of us will always question these types of analyses.

I’ve read enough papers at this point for the methods to matter more than the conclusion.

1) meta, and the other tech platforms need to open up their research and data. NDAs and business incentives prevent us from having the boring technical conversations.

2) tech needs someone else to be the bogeyman - the way we did for tobacco. The profit incentive ensures profitable predatory features pass review. Expecting firms to ignore quarterly shareholder reviews for warm fuzzies is … setting ourselves up for failure.

Regulators (with teeth) need to be propped up so that the right amount of predictable friction (liability) is introduced.

3) tech firms need an opportunity or forum to come clean. The sheer gap between the practical reality of something like content moderation vs the ignorance of users and regulators - results in surprise and outrage when people find out how the sausage is made.

4) algorithm defaults decide the median experience for participants in our shared marketplace of ideas. The defaults need to be set in a manner that works for humans and society (whatever that might be).

Economies are systems to align incentives to achieve subjective goals.

[−] fragmede 52d ago
So... end to end message encryption means meta can't see messages child molesters are sending to children.
[−] CrzyLngPwd 51d ago
The fine is just one of the costs of doing business for these megacorps.
[−] ourmandave 52d ago
Do we have to wait for any appeals before the performative "mail out $1 settlement checks" routine?
[−] HardwareLust 52d ago
$375M isn't even a slap on the wrist for a company that raked in $60B last year.
[−] deepsun 52d ago
I cheer any decision that holds any private web property (like Facebook) accountable for its users' actions.

It helps to reduce the hegemony of large social platforms and promotes privately owned websites. For example, I know everyone who has permission to post on my website (or I pre-moderate strangers' comments), and each of them is ready to take responsibility for what my website publishes.

Currently the legal stance seems strange to me -- large media platforms are allowed to store, distribute, rank and sell strangers' data, while at the same time claiming they are not responsible for it.

[−] spenvo 51d ago
Meta's own research (and its use of it) has shown that it repeatedly ignores well-substantiated facts about the harms of its products. Now that Section 230 seems like a flawed shield, I fear the takeaway for other companies will be: never conduct honest research in the first place to preserve plausible deniability.

Meta has always wanted the appearance of caring about safety (it helps them attract talent and keep mission-related morale high) while nearly always prioritizing growth (save for tiny blips of time, like in 2017 when the fallout from the Cambridge Analytica scandal was hitting a crescendo). Companies like X, by contrast, are run by people explicitly disinterested in putting significant resources into safety, especially research.

I will also add that, for the past few years, Meta and X both have become extremely hostile to external researchers of their platforms, shutting down access to tools and data.

[−] cedws 52d ago
Wasn't Zuckerberg caught red handed in emails signing off on this? When is he going to be facing consequences?
[−] throw7 52d ago
If Meta did advertise the "safety of its platforms for young users" then they should be held accountable for that. It seems clear from the whistleblowers that Meta had internal data showing the platforms were not safe for young users, but Zuck's gotta get those ads ($$$) in front of young kids.
[−] elwebmaster 52d ago
Can one be opposed to age verification in the OS and yet totally happy that Meta got this fine? There is a very big difference between e2e-encrypted messaging/telephony and social media. Social media is more akin to a phone book. I do not recall there ever being any phone books listing minors. That's completely unacceptable and unnecessary. I am totally OK with phone books (or their modern digital equivalents, which enable people discovery and user-generated content discovery) abiding by the same KYC rules as banks. And being only for adults. Your kids using e2e-encrypted messaging to communicate with friends whom they have met in person? Nothing wrong with that; we all have the right to privacy. Kids listing their contact information publicly? Absolutely not.
[−] Aboutplants 52d ago
Why can’t penalties be tied to a percentage of revenue?
[−] awongh 52d ago
As part of the ongoing enshittification of the internet, tragedy of the commons etc., these big centralized internet platforms decided that instead of being responsible and making their products *slightly* less terrible it was better to maximize short term engagement metrics, and that, egotistically, the chance of there being real consequences for their actions was near zero. (Or, even more cynically, that their yearly performance review was more important).

Now I'm afraid they've screwed everyone over and the idea of an anonymous open internet is now dead- we're gonna see age (read, real ID) verification gating on every site and app soon....

The dumb thing is to look back and see how unimportant it was that Facebook's feed algorithm be this addictive. They already had the network effects and no real competitors. They could have just left it alone.

[−] 0ckpuppet 52d ago
The leaders of these companies don't let their kids use it.
[−] billfor 52d ago
and also https://news.ycombinator.com/item?id=47514916 It might be good to roll all the comments together.
[−] montroser 52d ago
Cost of doing business...
[−] muskyFelon 52d ago
Regulate and fine social media and adtech companies until it's no longer economically feasible to generate the massive profits and stock valuations that are prompting this garbage.
[−] salawat 52d ago
So... Question. Seeing as Zuck is the majority voting shareholder and highest ranked executive, why isn't there a piercing of the corporate veil going on? This isn't some distributed blame case. Ultimately, his decision making led to what the jury finds objectionable. I find it absurd that somehow, the corporate veil is able to absorb even this? Somebody accepted the risk. That somebody is at the top of the pyramid. Want to send a message? Get 'em.
[−] elAhmo 52d ago
They earn this in around 16 hours.
[−] Beefin 52d ago
This is a good flag that you should be rolling your own safety checks. It's not hard, here's a writeup of an ancillary problem/solution: https://mixpeek.com/blog/ip-safety-pre-publication-clearance
[−] camillomiller 51d ago
I would really love to be in the mind of the Meta spokespeople who have to craft messages that completely hide the truth, sound convincing, and then live with it, just to understand how they do it without blowing up. I think that's also quite damaging to someone's mental health.
[−] fuzzfactor 52d ago
I don't know who they have to pay it to, but that's only for New Mexico, which has about two million people, so it works out to about $187.50 per person.

That's pretty cheap when it comes to deception.

The eyes of Texas should be upon this; Texas is 15x the size and should not settle for less than $1000 per person, since deceptive trade practices are taken much more seriously there than in other places.

Now that would set a $30 billion example which may not be enough of a deterrent either.

But there are probably plenty of people for whom a $5000 one-time payment wouldn't come close to being fair compensation for what's already happened, especially with Meta allowed to continue as a going concern; that's got to be psychologically harmful.

To really fix it each state would have to follow "suit" while greatly upping the ante so there's at least hundreds of billions at stake.

Meta can afford it, and who else is responsible for so much widespread, sneaky deception at this scale for so long?
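A quick check of the per-person arithmetic above (population figures are rough assumptions: ~2 million for New Mexico, Texas at ~15x that):

```python
# Per-capita sketch of the judgment; populations are rough round numbers.
nm_fine = 375e6
nm_pop = 2.0e6          # New Mexico, ~2 million people
print(f"per New Mexican: ${nm_fine / nm_pop:.2f}")   # -> $187.50

tx_pop = 15 * nm_pop    # Texas at ~15x New Mexico's size
tx_demand = 1000 * tx_pop
print(f"Texas at $1000/person: ${tx_demand / 1e9:.0f} billion")  # -> $30 billion
```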

[−] badpenny 52d ago
0.6% of last year's profits.
[−] electric_muse 52d ago
The same company intentionally driving minors towards this content (despite claiming to care about them) is also lobbying in secrecy for requiring all of us to scan our ID and face in order to use our phones and computers.

Their stated reason? Child safety.

Their actual reason? You can figure that out.

[−] nixass 52d ago
Oh no those pesky Europeans extorting money from US tech companies. No, wait..
[−] csense 52d ago
We used to believe in freedom of speech and freedom of association.

Since the dawn of the Internet era, we've had a legal principle that platforms are relatively shielded from liability for what their users do.

It's the Internet. There's sexual content and sketchy characters on it. Occasionally people will encounter them -- even if they're under 18.

Anyone who grew up in the mid-1990s or later, think back to your own Internet usage when you were under 18. You probably found something NSFW or NSFL, dealt with it, and came out basically OK after applying your common sense. Maybe it was shocking and mildly traumatizing -- but having negative experiences is how we grow. Part of growing up is honing one's sense of "that link is staying blue" or "I'm not comfortable with this, it's time to GTFO". And it seems a lot safer if you encounter the sketchy side of humanity from the other side of a screen. Think about how a young person's exposure to the underbelly of humanity might have gone in pre-Internet times: Get invited to a party, find out it's in the bad part of town and there are a bunch of sketchy people there -- well, you're exposed to all kinds of physical risks. You can't leave the party as easily as you can put your phone down.

I stopped logging onto Facebook regularly around 2009; I only log in a couple times a year. I hate what Facebook has become in the past decade and a half.

But giving a site with millions of users a multi-hundred-million-dollar fine because some of those users behave badly seems...asinine.

If your kid is old enough and responsible enough to be given unsupervised Internet access, you'd better teach them how to deal with the skeevy stuff they might encounter.

[−] Alen_P 52d ago
Most Facebook users are basically teenagers, so it's no wonder it took them this long to add any real restrictions...or maybe they just wanted us to think they cared.
[−] CobrastanJorji 52d ago
"We remain confident in our record of protecting teens online," said the company that clearly was not punished enough to hurt their confidence.
[−] mattfrommars 52d ago
That’s good! We need to protect our children.

But who gets the $375 million? Does anyone know the cut the law firm will take from this incredible amount of money?

[−] SlightlyLeftPad 52d ago
$375M - That’s it?!
[−] bandie91 51d ago
What is so hard about teaching children not to message strangers online, just like we teach them not to exchange mail with strangers? Also, parents should be able to join the conversation, just like in the analogue world. Call me backward, but I don't want to outsource parenting to the government or to remote businesses.
[−] paxys 52d ago
Happy to see it, but if a fine is the only consequence then they’re going to go back to doing the exact same thing tomorrow.
[−] andrewstuart 52d ago
Age verification isn’t misleading is it?
[−] zombot 52d ago
Still just a drop in the bucket compared to their quarterly profits. When will regulators get wise?
[−] tremon 52d ago
"told to pay"? As in, they're not even fined? What a horrible choice of headline.