The disturbing white paper Red Hat is trying to erase from the internet (osnews.com)

by choult 76 comments 196 points

[−] mmh0000 34d ago
It's weird to me that this is "suddenly" an issue.

It has been known for decades that Red Hat Inc's largest customer is the U.S. Army[1]. It's a very large part of the reason why Red Hat took over development of SELinux and enabled it by default in their distros.

And the Army isn't exactly known for handing out cupcakes...

[1] https://unixdigest.com/includes/files/Army-RedHat-Whitepaper...

[1] "Red Hat’s partnership with the U.S. Army spans 10 years starting with the deployment of Red Hat Enterprise Linux in 2002 and, to this day, the U.S. Army remains one of Red Hat’s largest customers by volume."

[−] greazy 33d ago
Optics. One can argue that Red Hat was working with the DoD on just security. But after this white paper on how to better kill people, that facade has fallen.

Before it was a maybe; now it's a certainty.

Ambiguity is quite comforting.

[−] scotty79 33d ago
Suddenly, war and its associated killings stopped being theoretical. A bunch of people who used to be as dangerous (in practice) to civilization as paintballers started actually using real weapons on real people.

The military in peacetime is cosplaying (LARPing?) war. So there's little resistance to aiding them in their silliness. When they actually start bombing people, it's another story.

The deal was: we aid you in your pretend wars, but you don't start actual ones. That deal has been violated, and people no longer abide by it.

[−] inhumantsar 33d ago
I'm sure all the people who died in Iraq and Afghanistan would be glad to know it was all just pretend.
[−] scotty79 33d ago
It's what happens when you stop just playing pretend, even briefly; the same goes for the current debacle in Iran.

No reasonable person tolerates this behavior. When the military does that, it immediately loses the leniency of smart people.

[−] BoneShard 33d ago
[−] scotty79 33d ago
Only a tiny fraction of the US military is engaged in this kinetic stupidity, and for only a tiny fraction of the time; the rest is LARPing.

The US military is first and foremost a welfare program for people who choose not to perform any useful economic activity.

[−] gillesjacobs 34d ago
[−] nailer 34d ago
It’s low latency for realtime. Discussion of these things has been public for twenty years, since Red Hat released MRG, and this is a conspiracy theory website.
[−] homeslice69 34d ago

>The focus should be on national defense, aid during disasters, and responding to the legitimate requests of sovereign, democratic nations to come to their defense (e.g. helping Ukraine fight off the Russian invasion).

Carving out the particular military engagements your company deems less than justified sounds nice but isn't workable in practice. You have to swallow the whole pill if you want to sell to the DoD.

[−] nickdothutton 34d ago
Better to have smart bombs than dumb ones. Or rather, better to have 1 smart bomb than 1000 dumb ones spread across an entire city in order to pick off the particular building, vehicle, or person you want.
[−] Qem 34d ago
Especially AI hallucination bombs that hit a park named "Police Park" because the AI thinks it's killing policemen[1], or a children's school with "Shahed" in its name[2] because it thinks the school has something to do with drones.

[1] https://x.com/MarioNawfal/status/2029575052535173364

[2] https://www.aljazeera.com/news/2026/3/6/elementary-school-in...

[−] eqvinox 34d ago
There's also a chasm of (non-)accountability.

You or your subordinates target an elementary school: that's a war crime.

Your "battlefield AI" targets an elementary school: software bug, it happens, can't be helped.

[−] thayne 34d ago
This isn't even that new. Part of the motivation for building autonomous nuclear response programs during the cold war was specifically to remove accountability, and guilt, from human operators. But AI does bring it to a new level.
[−] palmotea 33d ago

> Part of the motivation for building autonomous nuclear response programs during the cold war was specifically to remove accountability, and guilt, from human operators.

Details, please. Because I can see the reality more likely being an attempt to avoid conflict by solidifying MAD, by preventing a human from vetoing a second strike.

[−] Uhhrrr 34d ago
The software is never accountable, so the human running it is always accountable.
[−] mmh0000 34d ago
"War Crimes" only apply to the loser of the war and are prosecuted by the victor.

Meaning whatever horrors are done on either side, only the horrors committed by the loser will be "crimes". The inclusion of AI doesn't change that.

[−] breppp 34d ago
Your links talk about the places that were bombed, but I don't see anything apart from conjecture that this was the product of AI targeting.

Also, this vastly underestimates organizations that were able to locate most of the Iranian leadership in their hiding places throughout the war, yet whose Farsi is supposedly so bad they need a Twitter account to tell them this is a park.

[−] floralhangnail 33d ago
What's the running rumor right now of which AI was involved? I heard Claude a while back, but this makes me wonder how much Red Hat could have been involved.
[−] jancsika 34d ago
Channeling my inner Socrates:

You want consensus from non-experts for a plan to use 20 smart bombs.

Your opponent wants consensus for a plan to live-stream a demo of 1 smart bomb, and then use 19 dumb ones.

Your team has more expertise.

Your opponent's plan saves enough money to buy a better PR team than yours, and is still more cost effective than your plan.

Who wins?

[−] whoahwio 34d ago
That “smart” vs “dumb” distinction doesn’t apply here though. What is discussed has nothing to do with the ability to physically land a bomb in a precise location, that problem seems to be solved reasonably well already. “Smart” in this case has more to do with using ML/LLM to select a target.
[−] anigbrowl 34d ago
You can rationalize anything by only considering the upside relative to alternatives' downsides.
[−] xg15 34d ago
The reality looks more like the worst of both worlds to me.

If you genuinely needed only a handful of "surgical strikes", there would be no need to "compress the kill cycle".

What we see in Gaza, Lebanon, and Iran looks more like "smart carpet bombing": some AI system generates a continuous stream of "targets" from sensor and intelligence data, according to whatever criteria political leadership defines and whatever level of "collateral damage" is allowed. Those targets are then immediately fed to drones or warplanes to destroy: essentially a continuous "pipeline" that, "ideally" (in the dreams of these people), would become fully automated.

For THAT kind of vision, "efficiency" in destroying any particular target and checking all legally required boxes as quickly as possible is probably paramount.

(And in addition to that, there are probably still enough "dumb bombs" if no one is looking)

[−] DonHopkins 34d ago
Smart bombs are no good if they are directed by a dumb AI targeting system, a dumb alcoholic accelerationist religious fanatic Secretary of War, or a dumb narcissistic genocidal pedophile President.
[−] HeavyStorm 34d ago
You might be right, but that's terrible
[−] vzqx 34d ago
As someone who works for the DoD, the so called "disturbing" language in the paper is very commonplace in this industry. Idk if or why Red Hat is trying to redact the paper, but I'm sure it's not because they're embarrassed their software is killing people. That's par for the course for defense contractors.
[−] jnwatson 33d ago
We all have to face the reality that people are going to use our software in ways we don't like.

Any productivity improvement software in the wrong hands could make doing bad things more efficient.

[−] philipwhiuk 34d ago
I dunno that 'removes from their website' is sufficient for 'trying to erase from the Internet'

Can we rename this "RedHat removes paper from website on using their software to 'shrink the kill-chain'"

[−] bpavuk 34d ago
who let the Streisand effect out of its cage!?
[−] 1317 34d ago
"I give permission to IBM, its customers, partners, and minions, to use JSLint for evil."
[−] Qem 34d ago

> With that in mind, it seems Red Hat, owned by IBM, is desperately trying to scrub a certain white paper from the internet. Titled “Compress the kill cycle with Red Hat Device Edge”, the 2024 white paper details how Red Hat’s products and technologies can make it easier and faster to, well, kill people.

It appears IBM learned no lessons after WWII: https://en.wikipedia.org/wiki/IBM_and_the_Holocaust

That book will need a sequel soon.

[−] frumplestlatz 34d ago

> With things like the genocide in Gaza ...

Population: ~2,050,000

Density: 15,455.8/sq mi

Words have meaning, and their emotional force derives from that meaning. The knowing misuse of a term like “genocide” for its emotional force is manipulative sophistry.

[−] neilv 34d ago
Besides external PR, does anyone know how this affects internal morale?

Some of the earlier Red Hat people I knew would not be OK with working on weapons systems even under the most legitimate circumstances. And they'd be much more opposed to collaborating with fascist regimes. And I think horrified by the idea of shoveling AI slop and grifter hype into life&death decisions.

Of course the tech industry makeup has changed (overall culture transitioning from hacker idealists to finance bros), and some IBM-ification of Red Hat has also happened. But I'd like to think Red Hat still attracts a more principled pool of talent than FAANG.

[−] yomismoaqui 34d ago
So the hat is red because of all that blood?
[−] gameofliferetro 34d ago
Was this written by an Iranian propaganda machine?
[−] neilv 33d ago
This post looks artificially buried on page 3 now, and the topic is one of the most important things that tech company workers should be thinking about right now.

  75. Team from ETH Zurich make high quality quantum swap gate using a geometric phase (ethz.ch)
  231 points by joko42 1 day ago | flag | hide | 54 comments

  76. The disturbing white paper Red Hat is trying to erase from the internet (osnews.com)
  153 points by choult 6 hours ago | flag | hide | 51 comments

  77. Code is run more than read (2023) (olano.dev)
  137 points by facundo_olano 1 day ago | flag | hide | 102 comments
[−] SoftTalker 34d ago

> I don’t think there’s something inherently wrong with working together with your nation’s military or defense companies, but that all hinges on what, exactly, said military is doing and how those defense companies’ products are being used. The focus should be on national defense, aid during disasters, and responding to the legitimate requests of sovereign, democratic nations to come to their defense

The core purpose of a military is to destroy things and kill people, and the world is controlled by the people who can do that better than others. You can put all the "defense" and "disaster aid" lipstick on that you like but that doesn't change what they train for and what their real purpose is.