This isn’t accurate. Palantir’s business model includes mass surveillance for military/security purposes; a company concerned with privacy should think twice before handing its data to Palantir, even with all the assurances they might give in terms of data governance.
> This isn’t accurate. Palantir’s business model includes mass surveillance for military/security purposes;
You realize that this is not mutually exclusive with what I just wrote?
Palantir builds software for military and security purposes. But the customers don't give this data to Palantir, custody of this data remains with the customer.
> Palantir builds software for military and security purposes. But the customers don't give this data to Palantir, custody of this data remains with the customer.
How is that possible if Palantir software runs on machines Palantir controls?
Heh, the fact that they aren’t mutually exclusive is the problem. Why give someone with mass surveillance ops in other domains access to yet another domain?
This is like saying a Swiss bank would share your secrets because shady people use Swiss banks. No. Confidentiality is literally built into their business model. Getting caught sharing customer data is one of the fastest ways for their business to crumble.
> It also includes a line stating that with permission from the city agency, Palantir can “de-identify” patients’ protected health information and use it for “purposes other than research”.
Under HIPAA, "research" has a very specific definition, which renders "purposes other than research" quite broad. Yes, it's "with permission", but it does depend on the city agency fully understanding what ancillary things Palantir can do with de-identified data once it has left the covered entity and without further explicit permission.
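For context on what "de-identify" means here: HIPAA's Safe Harbor method (45 CFR 164.514(b)(2)) requires removing or generalizing 18 categories of identifiers. Below is a toy sketch of that idea — the field names are hypothetical and the rules are simplified, so treat it as illustration, not compliance guidance:

```python
# Toy sketch of HIPAA "Safe Harbor" de-identification: drop direct
# identifiers, generalize quasi-identifiers. Field names are hypothetical
# and this covers only a subset of the 18 identifier categories.

SAFE_HARBOR_DROP = {
    "name", "street_address", "phone", "fax", "email", "ssn",
    "medical_record_number", "health_plan_id", "account_number",
    "license_number", "vehicle_id", "device_id", "url", "ip_address",
    "biometric_id", "photo",
}

def deidentify(record: dict) -> dict:
    out = {}
    for key, value in record.items():
        if key in SAFE_HARBOR_DROP:
            continue  # direct identifiers are dropped outright
        if key == "zip":
            out["zip3"] = value[:3]  # ZIPs truncated to first 3 digits
        elif key == "birth_date":
            out["birth_year"] = value[:4]  # dates reduced to year (ages >89 need further bucketing)
        else:
            out[key] = value
    return out

patient = {"name": "Jane Doe", "zip": "10027",
           "birth_date": "1984-06-02", "diagnosis": "J45"}
print(deidentify(patient))
# {'zip3': '100', 'birth_year': '1984', 'diagnosis': 'J45'}
```

The point being: a record like this is "de-identified" for HIPAA purposes, yet the remaining quasi-identifiers can still be useful for analytics — which is exactly why "purposes other than research" matters.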
> Palantir builds software that customers use to work with their own data
After DOGE, a movement Palantir aided [1], I think it's fair for folks to wonder to what degree these firms have been infiltrated by extremists. Someone who will convince themselves that exporting data to ICE or the Proud Boys—like the names of every New Yorker whose medical records say they are gay, circumcised or have had an abortion—is the right thing to do. (Or at least funny and inconsequential.)
It's a risk. Not a conclusion. But given Palantir's offering is becoming less differentiated by the day, I think it's fair for people to look for alternatives.
[1] https://www.wired.com/story/palantir-doge-irs-mega-api-data/
The concern is more with the tools that Palantir creates around the domains they service. They analyze, predict, and shape decisions using unproven technology. Palantir controls insights, models, and outcomes, and given the anti-democratic and frankly unhinged extremist worldviews of the founders, it's highly concerning to allow them to create tools for sensitive and nuanced data that have life or death consequences.
In some regards I'd almost rather Palantir runs it, since the DoW would force them to implement very strict data isolation features which hospitals could then get for free. I wouldn't imagine Epic Healthcare Systems would be forced to isolate data so aggressively.
That said I also recognize the moral dilemma and understand why they'd pull out. Frankly I'm surprised they did much work with hospitals at all
NYC schools just passed some AI guidelines as well. No training on student PII data, no final grades, etc. Unfortunately that's a pinprick for the behemoth.
Palantir can install a data backdoor at any time with their software. If you haven't noticed that businesses are openly violating data privacy, you aren't paying attention. I don't have trust in our judicial system when Trump pardons criminals every day.
J.D. Vance and Peter Thiel's Palantir is reportedly getting the software contract for control of Golden Dome, an orbital weapon system built by Elon Musk.
A weapon system capable of targeting any person on Earth controlled by a mass surveillance company. Wonderful.
While you yourself think Palantir’s products are “like Excel”?
They are not. Source: https://www.theguardian.com/news/2026/mar/26/ai-got-the-blam...
Peter Thiel shows up A LOT in those files. I don’t think it’s out of the question that he would use Palantir’s data to assassinate people.
It's not like tech companies deserve the benefit of the doubt when it comes to trust anymore, if they ever did
hmmmmmm
> Custody of the data remains with the customer.
Yea.. like.. how, though?
Here are their setup instructions. It seems pretty clear what is happening to your data, and an unqualified statement that you maintain some nebulous idea of "custody" seems oblivious to even simple risk.
https://www.palantir.com/docs/foundry/data-connection/initia...
This isn't even getting into their "forward deployed software engineers" or how that whole aspect of their "product" works.
> Custody of the data remains with the customer
pinky promise?
76 points by fauigerzigerk 6 months ago | 7 comments
https://news.ycombinator.com/item?id=45061153
Everyone knows what's going on, but also everyone is too afraid to stand up for some reason.