Axios compromised on NPM – Malicious versions drop remote access trojan (stepsecurity.io)

by mtud 807 comments 1934 points

[−] bob1029 46d ago
"Batteries included" ecosystems are the only persistent solution to the package manager problem.

If your first-party tooling contains all the functionality you typically need, you can often be productive with zero third-party dependencies. In practice you will tend to have a few, but you won't be vendoring critical things like HTTP, TCP, JSON, string sanitization, or cryptography. These are beacons for attackers: everything depends on this stuff, so the motivation for attacking these common surfaces is high.

I can literally count on one hand the number of 3rd party dependencies I've used in the last year. Dapper is the only regular thing I can come up with. Sometimes ScottPlot. Both of my SQL providers (MSSQL and SQLite) are first party as well. This is a major reason why they're the only SQL providers I use.

Maybe I am just so traumatized from compliance and auditing in regulated software business, but this feels like a happier way to build software too. My tools tend to stay right where I left them the previous day. I don't have to worry about my hammer or screw drivers stealing all my bitcoin in the middle of the night.

[−] rhdunn 45d ago
There are several issues with "Batteries Included" ecosystems (like Python, C#/.NET, and Java):

1. They are not going to include everything. This includes things like new file formats.

2. They are going to be out of date whenever a standard changes (HTML, etc.), an application changes (e.g. SQLite/PostgreSQL/etc. for SQL/ORM bindings), or an API changes (DirectX, Vulkan, etc.).

3. Things like data structures, graphics APIs, etc. will have performance characteristics that may be different to your use case.

4. They can't cover all niche use cases, such as the different libraries and frameworks for creating games of different genres.

For example, Python's XML DOM implementation only implements a subset of XPath and doesn't support parsing HTML.

The fact that Python, Java, and .NET have large library ecosystems proves that even if you have a "Batteries Included" approach there will always be other things to add.

[−] Groxx 45d ago
"Batteries included" means "ossification is guaranteed", yah. "stdlib is where code goes to die" is a fairly common phrase for a reason.

There's clearly merit to both sides, but personally I think a major underlying cause is that libraries are trusted. Obviously that doesn't match reality. We desperately need a permission system for libraries, it's far harder to sneak stuff in when doing so requires an "adds dangerous permission" change approval.

[−] pjmlp 44d ago

> "Batteries included" means "ossification is guaranteed", yah. "stdlib is where code goes to die" is a fairly common phrase for a reason.

Except I'd rather have ossified batteries that solve my problem, even if not as convenient as more modern alternatives, than not have them at all on a given platform.

[−] lokar 45d ago
Golang seems to do a good job of keeping the standard library up to date and clean
[−] Groxx 45d ago
Largely, yes.

But also everyone sane avoids the built-in http client in any production setting because it has rather severe footguns and complicated (and limited) ability to control it. It can't be fixed in-place due to its API design... and there is no replacement at this point. The closest we got was adding some support for using a Context, with a rather obtuse API (which is now part of the footgunnery).

There's also a v2 of the json package because v1 is similarly full of footguns and lack of reasonable control. The list of quirks to maintain in v2's backport of v1's API in https://github.com/golang/go/issues/71497 (or a smaller overview here: https://go.dev/blog/jsonv2-exp) is quite large and generally very surprising to people. The good news here is that it actually is possible to upgrade v1 "in place" and share the code.

There's a rather large list of such things. And that's in a language that has been doing a relatively good job. In some languages you end up with Perl/Raku or Python 2/3 "it's nearly a different language and the ecosystem is split for many years" outcomes, but Go is nowhere near that.

Because this stuff is in the stdlib, it has taken several years to even discuss a concrete upgrade. For stuff that isn't, ecosystems generally shift rather quickly when a clearly-better library appears, in part because it's a (relatively) level playing field.

[−] hu3 44d ago
This looks like an ad for batteries included to me.

Libraries also don't get it right the first time so they increment minor and major versions.

Then why is it not okay for built-in standard libraries to version their functionality also? Just like Go did with JSON?

The benefits are worth it judging by how ubiquitous Go, Java and .NET are.

I'd rather leverage the billions in support paid by the likes of Google, Oracle and Microsoft to build libraries for me than some random low-bus-factor person, prone to be hacked at any time due to bad security practices.

Setting up a large JavaScript or Rust project is like giving 300 random people on the internet permission to execute code on my machine. Unless I audit every library update (spoiler: no one does it because it's expensive).

[−] TheCoelacanth 44d ago
Libraries don't get it right the first time, but there are often multiple competing libraries which allows more experimentation and finding the right abstraction faster.
[−] Groxx 44d ago
Third party libraries have been avoiding those json footguns (and significantly improving performance) for well over a decade before stdlib got it. Same with logging. And it's looking like it will be over two decades for an even slightly reasonable http client.

Stuff outside stdlib can, and almost always does, improve at an incomparably faster rate.

[−] lokar 44d ago
And I think the Go people seem to do a fairly good job of picking out the best and most universal ideas from these outside efforts and folding them in.
[−] hu3 44d ago
.NET's JSON and their Kestrel HTTP server beg to differ.

Their JSON even does cross-platform SIMD and their Kestrel stack was top 10/20 on techempower benchmarks for a while without the ugly hacks other frameworks/libs use to get there.

stdlib is the science of good enough and sometimes it's far above good enough.

[−] hoppp 41d ago
Rust is especially vulnerable, with Serde included in everything and maintained by 1 person.
[−] lokar 44d ago
For me, the v2 re-writes, as well as the "x" semi-official repo, are a major strength. They tell me there is a trustworthy team working on this stuff. Obviously not everything will always be as great as you might want, but the floor is rising.
[−] Groxx 41d ago
yea, I like the /x/ repos a fair bit. "first-party but unstable" is an extremely useful area to have, and many languages miss it by only having "first-party stable forever" and "third party". you need an experimentation ground to get good ideas and seek feedback, and keeping it as a completely normal library allows people/the ecosystem to choose versions the same way as any other library.
[−] Yokohiii 44d ago
Another downside of a large stdlib is that it can be very confusing. It took me a while to figure out how Unicode is supposed to work in Go, as you have to track down throughout the APIs what the right things to use are. Which is even more annoying because the support is strictly binary and buried everywhere without being super explicit or discoverable.
[−] lokar 44d ago
I'm not sure I understand. Why would a standard library, a collection of what would otherwise be a bunch of independent libraries, bundled together, be more confusing than the same (or probably more) independent libraries published on their own?
[−] nazcan 44d ago
100% to libraries having permissions. If I'm using some code to say compute a hash of a byte array, it should not have access to say the filesystem nor network.
[−] diablevv 45d ago
[dead]
[−] hvb2 45d ago
The goal is not to cover everything, the goal is to cover 90% of the use cases.

For C#, I think they achieved that.

[−] zymhan 45d ago

> They are going to be out of date whenever a standard changes (HTML, etc.)

You might want to elaborate on the "etc.", since HTML updates are glacial.

[−] rhdunn 45d ago
The HTML "Living Standard" is constantly updated [1-6].

The PNG spec [7] has been updated several times in 1996, 1998, 1999, and 2025.

The XPath spec [8] has multiple versions: 1.0 (1999), 2.0 (2007), 3.0 (2014), and 3.1 (2017), with 4.0 in development.

The RDF spec [9] has multiple versions: 1.0 (2004), and 1.1 (2014). Plus the related specs and their associated versions.

The schema.org metadata standard [10] is under active development and is currently on version 30.

[1] https://developer.mozilla.org/en-US/docs/Web/HTML/Reference/... (New)

[2] https://web.dev/baseline/2025 -- popover API, plain text content editable, etc.

[3] https://web.dev/baseline/2024 -- exclusive accordions, declarative shadow root DOM

[4] https://web.dev/baseline/2023 -- inert attribute, lazy loading iframes

[5] https://developer.mozilla.org/en-US/docs/Web/HTML/Reference/... (Baseline 2023)

[6] https://developer.mozilla.org/en-US/docs/Web/HTML/Reference/... (2020)

[7] https://en.wikipedia.org/wiki/PNG

[8] https://en.wikipedia.org/wiki/XPath

[9] https://en.wikipedia.org/wiki/Resource_Description_Framework

[10] https://schema.org/

[−] cpill 45d ago
please! nobody uses XPath (because JSON killed XML), or RDF (the semantic web never happened, and one revision every 10 years is not fast), or schema.org (again, nobody cares). PNG: no change in the last 26 years, not fast. And the HTML "living standard" :D is completely optional and hence not a standard by definition.
[−] yusaydat 45d ago
Xpath is still used for e2e tests and things like scraping. Especially when there aren't better selectors available.
[−] thomasmg 45d ago
The point is that you don't need the very latest version. The 20 years old version is enough.
[−] rhdunn 45d ago
XPath 1.0 is a pain to write queries for. XPath 2.0 adds features that make it easier to write queries. XPath 3.1 adds support for maps, arrays, and JSON.

And the default Python XPath support is severely limited, not even a full 1.0 implementation. You can't use it to do things like element[contains(@attribute, 'value')], so you need to include an external library for real XPath support.
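A minimal stdlib-only sketch of that limitation (the element and attribute names here are made up for illustration):

```python
# ElementTree supports only a small "limited XPath" subset: exact
# attribute matches work, but XPath 1.0 functions like contains() don't.
import xml.etree.ElementTree as ET

root = ET.fromstring('<doc><item kind="foo-bar"/><item kind="baz"/></doc>')

# Supported subset: exact attribute match.
print(len(root.findall('.//item[@kind="baz"]')))  # 1

# contains() is standard XPath 1.0, but ElementTree rejects the predicate.
try:
    root.findall('.//item[contains(@kind, "foo")]')
except SyntaxError as exc:
    print("rejected:", exc)
```

Third-party libraries like lxml accept the contains() form, which is why even this particular "battery" tends to get replaced in practice.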

[−] thomasmg 36d ago
The problem of XPath 3.1 is that it is very complex [1] - this is a long page. Compared to the 1.0 spec, it is just too complex in my view. For the open source project I'm working on, I never felt that the "old" XPath version is painful.

I think simplicity is better. That's why JSON is used nowadays, and XML is not.

[1] https://www.w3.org/TR/xpath-31/ [2] https://jackrabbit.apache.org/oak/docs/query/grammar-xpath.h...

[−] rhdunn 45d ago
XPath is used in processing XML (JATS and other publishing/standards XML files) and can be used to process HTML content.

RDF and the related standards are still used in some areas. If the "Batteries Included" standard library ignores these then those standards will need an external library to support them.

Schema.org is used by Google and other search engines to describe content on the page such as breadcrumbs, publications, paywalled content, cinema screenings, etc. If you are generating websites then you need to produce schema.org metadata to improve the SEO.

Did you notice that a new PNG standard was released in 2025 (last year, with a working draft in 2022) adding support for APNG, HDR, and Exif metadata? Yes, it hasn't changed frequently, but it does change. So if you have PNG support in the standard library you need to update it to support those changes.

And if HTML support is optional then you will need an external library to support it. Hence a "Batteries Included" standard library being incomplete.

[−] hoppp 41d ago
It's used plenty in legacy systems that are still around.
[−] mrits 45d ago
glaciers change faster than HTML
[−] Yokohiii 44d ago
because there is more human effort
[−] zymhan 45d ago
Oof, I honestly hadn't considered that.
[−] CommonGuy 45d ago
Why would they be out of date? The ecosystems themselves (for example .NET) receive regular updates.

Yes, they cannot include everything, but enough that you do not _need_ third party packages.

[−] mrits 45d ago
Python, .NET, and Java are not examples of batteries included.

Django and Spring are.

[−] spixy 45d ago
comparing to Node, .NET is batteries included: built-in Linq vs needing lodash external package, built-in Decimal vs decimal.js package, built-in model validation vs class-validator & class-transformer packages, built-in CSRF/XSRF protection vs csrf-csrf package, I can go on for a while...
[−] ytpete 45d ago
And in fact wasn't a popular Python library just compromised very recently? See https://news.ycombinator.com/item?id=47501426.

So Python's clearly not "batteries included" enough to avoid this kind of risk.

[−] rhdunn 45d ago
That's my point. You can have a large standard library like those languages I mentioned, but that isn't going to include everything nor cover every use case, so you'll have external libraries (via PyPI for Python, NuGet for .NET, and Maven for Java/JVM).
[−] vovavili 45d ago
Python's standard library is definitely much more batteries-included than JavaScript's.
[−] la_fayette 45d ago
depends; JavaScript in the browser has many useful things available which I miss with Python, e.g. fetch, whereas in Python you need a separate package like requests to avoid a clunky API. Java had this issue for a long time as well; since Java 11 there is HttpClient with a convenient API.
[−] wongarsu 46d ago

> In practice you will tend to have a few, but you won't be vendoring out critical things like HTTP, TCP, JSON, string sanitation, cryptography

Unless you are Python, where the standard library includes multiple HTTP libraries and everyone installs the requests package anyways.

Few languages have good models for evolving their standard library, so you end up with lots of bad designs sticking around forever. Libraries are much easier to evolve, giving them the advantage in terms of developer UX and performance.

[−] paintbox 45d ago
What type of developer chooses UX and performance over security? So reckless.

I removed the locks from all the doors, now entering/exiting is 87% faster! After removing all the safety equipment, our vehicles have significantly improved in mileage, acceleration and top speed!

[−] integralid 45d ago

>What type of developer chooses UX and performance over security? So reckless.

Initially I assumed this was sarcastic, but apparently not. UX and performance are what programmers are paid for! Making sure the UX is good is one of the most important parts of a programmer's job.

Security, meanwhile, is a moving target, a goal, something that can never be perfect, just "good enough" (if the NSA wants to hack you, they will). You make it sound like installing third-party packages is basically equivalent to a security hole, while in practice the risk is low, especially if you don't overdo it.

Wild to read extreme security views like that, while at the same time there are people here that run unconstrained AI agents with --dangerous-skip-confirm flags and see nothing wrong with it.

[−] toss1 45d ago
Even more wild to read that sarcasm about "removing locks from doors for 87% speedup" is considered extreme...

And yes, we agree that running unconstrained AI agents with --dangerous-skip-confirm flags and seeing nothing wrong with it is insane. Kind of like just advertising for burglars to come open your doors for you before you get home - yeah, it's lots faster to get in (and to move about the house with all your stuff gone).

[−] zymhan 45d ago
Installing 3rd party packages the way Node and Python devs do regularly _is_ a security hole.
[−] wongarsu 45d ago
Better developer UX can directly lead to better safety. "You are holding it wrong" is a frequent source of security bugs, and better UX reduces the ways you can hold it wrong, or at least makes you more likely to hold it the right way
[−] duskdozer 45d ago
"Security" is often more about corporate CYA than improving my actual security as a user, and sometimes in opposition, and there is often blatant disregard for any UX concession at all. The most secure system is fully encrypted with all copies of the encryption key erased.
[−] seunosewa 45d ago
requests should be in the Python standard library. Hard choices need to be made.
[−] ptx 46d ago
I'm pretty sure it's really one HTTP library: urllib.request is built on top of http.client. But the very Java-inspired API for the former is awful.
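To make the ergonomics gap concrete, here is a JSON POST prepared with stdlib urllib.request; the URL is a made-up placeholder and the request is built but never sent:

```python
# Preparing a JSON POST with urllib.request takes noticeably more
# ceremony than the equivalent requests call (shown in the comment below).
import json
import urllib.request

payload = json.dumps({"q": "axios"}).encode("utf-8")
req = urllib.request.Request(
    "https://example.invalid/search",  # hypothetical endpoint, never contacted
    data=payload,
    headers={"Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req) would actually send it.
print(req.get_method())                # POST
print(req.get_header("Content-type"))  # application/json

# The same request with the third-party requests package:
#   requests.post("https://example.invalid/search", json={"q": "axios"})
```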
[−] nicce 45d ago

> Unless you are Python, where the standard library includes multiple HTTP libraries and everyone installs the requests package anyways.

The amount of time spent defining the same data structures over and over again vs. pip install requests with well-defined data structures.

[−] throwaway2037 45d ago

    >  Few languages have good models for evolving their standard library
Can you name some examples?
[−] afavour 46d ago
The irony is that Node has no need for Axios: native fetch support has been there for years, so in terms of network requests it is batteries included.
[−] dec0dedab0de 45d ago
Batteries included systems are still susceptible to supply chain attacks, they just move slower so it’s not as attractive of a target.

I think packages of a certain size need to be held to higher standards by the repositories. Multiple users should have to approve changes. Maybe enforced scans (though with Trivy's recent compromise that won't be likely any time soon).

Basically anything besides the status quo, where a lone developer can decide on a whim to send something out that will run on millions of machines.

[−] zdc1 46d ago
The other thing that keeps coming up is the github-code-is-fine-but-the-release-artifact-is-a-trojan issue. It really makes me question if "packages" should even exist in JavaScript, or if we could just be importing standard plain source code from a git repo.

I understand why this doesn't work well with legacy projects, but it's something that the language could strive towards.

[−] cozzyd 45d ago
or you don't use a package manager where anyone can just publish a package (i.e. use your system package manager). There is still some risk, but it is much smaller. Like, if xz were distributed by PyPI or NPM, everyone would have been pwned, but instead it was (barely) found.

It's true that system repos don't include everything, but you can create your own repositories if you really need to for a few things. In practice Fedora/EPEL are basically sufficient for my needs. Right now I'm deploying something with Yocto, which is a bit more limited in selection, but it's pretty easy to add my own packages and it at least has hashes so things don't get replaced without me noticing (to be fair, I don't know if the security practices of OpenEmbedded recipes are as strong as Fedora's...).

[−] mrsmrtss 46d ago
Fully agree with this! I think today .NET is probably the most batteries included platform you can get. This means that even if you use third-party libraries, these typically depend only on first-party dependencies, making it much less likely for something shady to sneak in.
[−] Lord_Zero 45d ago
So, you're on Microsoft then; judging by ScottPlot, you write .NET desktop apps. If you use Dapper, you probably use Microsoft.Data.SqlClient, which is... distributed over NuGet and vulnerable to supply chain attack. You may not need many deps as a desktop dev. Modern-day line-of-business apps require a lot more deps: CSVHelper, ClosedXML, AutoMapper, WebOptimizer, NetEscapades.AspNetCore.SecurityHeaders.

Yes, the fewer deps people need the better, but it doesn't fix the core problem. Sharing and distributing code is a key tenet of being able to write modern code.

[−] tliltocatl 45d ago
I agree that dependencies are a liability, but, sadly, "batteries included" didn't work out for Python in practice (i.e. how do I even live without numpy? No, the built-in array module isn't enough).
[−] h4ch1 46d ago
I can't even imagine the scale of the impact with Axios being compromised, nearly every other project uses it for some reason instead of fetch (I never understood why).

Also from the report:

> Neither malicious version contains a single line of malicious code inside axios itself. Instead, both inject a fake dependency, plain-crypto-js@4.2.1, a package that is never imported anywhere in the axios source, whose only purpose is to run a postinstall script that deploys a cross-platform remote access trojan (RAT)

Good news for pnpm/bun users who have to manually approve postinstall scripts.

[−] postalcoder 46d ago
PSA: npm/bun/pnpm/uv now all support setting a minimum release age for packages.

I also have ignore-scripts=true in my ~/.npmrc. Based on the analysis, that alone would have mitigated the vulnerability. bun and pnpm do not execute lifecycle scripts by default.

Here's how to set global configs to set min release age to 7 days:

  ~/.config/uv/uv.toml
  exclude-newer = "7 days"

  ~/.npmrc
  min-release-age=7 # days
  ignore-scripts=true
  
  ~/Library/Preferences/pnpm/rc
  minimum-release-age=10080 # minutes
  
  ~/.bunfig.toml
  [install]
  minimumReleaseAge = 604800 # seconds
(Side note, it's wild that npm, bun, and pnpm have all decided to use different time units for this configuration.)

If you're developing with LLM agents, you should also update your AGENTS.md/CLAUDE.md file with some guidance on how to handle failures stemming from this config as they will cause the agent to unproductively spin its wheels.

[−] himata4113 46d ago
I recommend that everyone on Linux use bwrap and alias all package managers / anything that has post-build logic through it.

I have bwrap configured to override: npm, pip, cargo, mvn, gradle, everything you can think of and I only give it the access it needs, strip anything that is useless to it anyway, deny dbus, sockets, everything. SSH is forwarded via socket (ssh-add).

This limits the blast radius to your CWD and package manager caches and often won't even work since the malware usually expects some things to be available which are not in a permissionless sandbox.

You can think of it as running a docker container, but without the requirement of having to have an image. It is the same thing flatpak is based on.

As for server deployments, container hardening is your friend. Most supply chain attacks target build scripts so as long as you treat your CI/CD as an untrusted environment you should be good - there's quite a few resources on this so won't go into detail.

Bonus points: use the same sandbox for AI.

Stay safe out there.

[−] woodruffw 45d ago
There’s a recurrent pattern with these package compromises: the attacker exfiltrates credentials during an initial phase, then pivots to the next round of packages using those credentials. That’s how we saw them make the Trivy to LiteLLM leap (with a 5 day gap), and it’ll almost certainly be similar in this case.

The solution to this is twofold, and is already implemented in the primary ecosystems being targeted (Python and JS): packagers should use Trusted Publishing to eliminate the need for long lived release credentials, and downstreams should use cooldowns to give security researchers time to identify and quarantine attacks.

(Security is a moving target, and neither of these techniques is going to work indefinitely without new techniques added to the mix. But they would be effective against the current problems we’re seeing.)

[−] vsgherzi 46d ago
Not to beat a dead horse but I see this again and again with dependencies. Each time I get more worried that the same will happen with rust. I understand the fat std library approach won’t work but I really still want a good solution where I can trust packages to be safe and high quality.
[−] wps 46d ago
Genuinely how are you supposed to make sure that none of the software you have on your system pulls this in?

It’s things like this that make me want to swap to Qubes permanently, simply so as not to have my password manager in the same context as compiling software.

[−] a13n 45d ago
Rejecting any packages newer than X days is one nice control, but ultimately it'd be way better to maintain an allowlist of which packages are allowed to run scripts.

Unfortunately npm is friggen awful at this...

You can use --ignore-scripts=true to disable all scripts, but inevitably, some packages will absolutely need to run scripts. There's no way to allowlist specific scripts to run, while blocking all others.

There are third-party npm packages that you can install, like @lavamoat/allow-scripts, but to use these you need to use an entirely different command like npm setup instead of the npm install everyone is familiar with.

This is just awful in so many ways, and it'd be so easy for npm to fix.

[−] tkel 46d ago
JS package managers (pnpm, bun) now ignore postinstall scripts by default. npm is the exception: it still runs them for legacy reasons.

You should probably set your default to not run those scripts. They are mostly unnecessary.

  ~/.npmrc :
  ignore-scripts=true

83M weekly downloads!
[−] jadar 46d ago
How much do you want to bet me that the credential was stolen during the previous LiteLLM incident? At what point are we going to have to stop using these package managers because it's not secure? I've got to admit, it's got me nervous to use Python or Node.js these days, but it's really a universal problem.
[−] socketcluster 45d ago
I've been advocating to minimize the number of dependencies for some time now. Once you've maintained an open source project for several years, you start to understand the true cost of dependencies and you actually start to pay attention to the people behind the libraries. Popularity doesn't necessarily mean reliable or trustworthy or secure.
[−] strogonoff 46d ago
Essential steps to minimise your exposure to NPM supply chain attacks:

— Run Yarn in zero-installs mode (or equivalent for your package manager). Every new or changed dependency gets checked in.

— Disable post-install scripts. If you don’t, at least make sure your package manager prompts for scripts during install, in which case you stop and look at what it’s going to run.

— If third-party code runs in development, including post-install scripts, try your best to make sure it happens in a VM/container.

— Vet every package you add. Popularity is a plus, recent commit time is a minus: if you have this but not that, keep your eyes peeled. Skim through the code on NPM (they will probably never stop labelling it as “beta”), commit history and changelog.

— Vet its dependency tree. Dependencies is a vector for attack on you and your users, and any new developer in the tree is another person you’re trusting to not be malicious and to take all of the above measures, too.

[−] nananana9 46d ago
Package managers are a failed experiment.

We have libraries like SQLite, which is a single .c file that you drag into your project and it immediately does a ton of incredibly useful, non-trivial work for you, while barely increasing your executable's size.

The issue is not dependencies themselves, it's transitive ones. Nobody installs left-pad or is-even-number directly, and "libraries" like these are the vast majority of the attack surface. If you get rid of transitive dependencies, you get rid of the need of a package manager, as installing a package becomes unzipping a few files into a vendor/ folder.

There's so many C libraries like this. Off the top of my head, SQLite, FreeType, OpenSSL, libcurl, libpng/jpeg, stb everything, zlib, lua, SDL, GLFW... I do game development so I'm most familiar with the ones commonly used in game engines, but I'm sure other fields have similarly high quality C libraries.

They also have bindings for every language under the sun. Rust libraries are very rarely used outside of Rust, and C#/Java/JS/Python libraries are never used outside their respective language (aside from Java ones in other JVM langs).

[−] crimsonnoodle58 45d ago
Setting min-release age to 7 days is great, but the only true way to protect from supply chain attacks is restricting network access.

This needs to be done (as we've seen from these recent attacks) in your devenv, ci/cd and prod environments. Not one, or two, but all of these environments.

The easiest way is via something like Kubernetes network policies + a Squid proxy to allow limited trusted domains through, and those domains must not be publicly controllable by attackers. I.e. github.com is not safe to allow, but raw.githubusercontent.com would be, as it doesn't allow data to be submitted to it.

Network firewalls that perform SSL interception and restrict DNS queries are an option also, though more complicated or expensive than the above.

This stops both DNS exfil and HTTP exfil. For your devenv, software like Little Snitch may protect you from these (I'm not 100% sure on DNS exfil here though). Otherwise run your devenv (i.e. VS Code) as a web server, or containerised + VNC, a VM, etc., with the same restrictions.

[−] xinayder 45d ago
Very detailed, and props to the security researchers, but the blog post has several indicators that it was written by AI, to the point that I suspect their malware analysis was also done by an LLM.

I just wish it had more human interaction rather than having GenAI spit out the blog post. It's very repetitive and includes several em dashes.

[−] jmward01 46d ago
This may not be popular, but is there a place for required human actions, or just timed actions, to slow down things like this? For instance, maybe a GH Action to deploy requires a final human click, and changing that to CLI-only triggers a 3-day cooling period with mandatory security emails sent out. Similarly, you switch to read-only for 6 hrs after an email change. There are holes in these ideas, but the basic concept is to treat security more like physical security: your goal isn't always to 100% block, but instead to slow an attacker for xxx minutes to give the rest of the team time to figure out what is going on.
[−] croemer 45d ago
Has anyone else noticed there was a recent sudden flurry of 3000 deleted issues on axios/axios? The jump happened on March 23. Was this a first sign of compromise? Or just a coincidence of an AI agent going rogue?

There are pretty much exactly 3000 deleted issues, with the range starting at https://github.com/axios/axios/issues/7547 (7547) and ending at https://github.com/axios/axios/issues/10546 (10546 which is 7547+2999)

Maybe just a coincidence but they have cubic-dev-ai edit every single PR with a summary. And that bot edits PR descriptions even for outside contributors.

[−] woeirua 46d ago
Supply chain attacks are so scary that I think most companies are going to use agents to hard fork their own versions of a lot of these core libraries instead. It wasn’t practical before. It’s definitely much more doable today.
[−] red_admiral 46d ago
There's a package manager discussion, but the bit that stands out to me is that this started with a credential compromise. At some point when a project gets big enough like axios, maybe the community could chip in to buy the authors a couple of YubiHSM or similar. I wish that _important keys live in hardware_ becomes more standard given the stakes.

Dealing with dependencies is another question; if it's stupid stuff like leftpad then it should be either vendored in or promoted to be a language feature anyway (as it has been).

[−] mr_bob_sacamano 45d ago
# If you have a projects folder containing multiple projects on macOS, you can run this script to recursively scan all subfolders for vulnerable axios versions and the presence of plain-crypto-js, helping you quickly identify potentially affected projects:

  find . -name package.json -not -path '*/node_modules/*' -exec sh -c '
    dir=$(dirname "$1")
    echo "==== $dir ===="
    cd "$dir" || exit
    npm list axios 2>/dev/null | grep -E "1\.14\.1|0\.30\.4"
    grep -A1 "\"axios\"" package-lock.json 2>/dev/null | grep -E "1\.14\.1|0\.30\.4"
    [ -d node_modules/plain-crypto-js ] && echo "POTENTIALLY AFFECTED"
  ' sh {} \;

[−] _lvbh 46d ago
There are so many scanners these days that these things get caught pretty quickly. I think we need npm or someone else to run a registry that only lets through packages that pass these scanners. It could even do the VirusTotal thing of aggregating reports from multiple scanners. NPM publishes attestations for trusted build environments; Google has oss-rebuild.

All it takes is an npm config set to switch registries anyway. The hard part is having a central party able to convince all the various security companies to collaborate, rather than ending up with dozens of different registries, one per company.
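For scale, switching an entire machine over to such a vetted registry really would be a one-line config change (the registry URL here is hypothetical):

```ini
; ~/.npmrc -- point all installs at a scanner-gated mirror (hypothetical URL)
registry=https://vetted-registry.example.com/
```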

Rather than just a hard-coded delay, I think having policies on which checks must pass first makes sense, with overrides for when CVEs show up.

(WIP)

[−] mcintyre1994 46d ago
The frustrating thing here is that axios releases display on npmjs with verified provenance, but the project doesn't use trusted publishing: https://github.com/axios/axios/issues/7055 - meaning the publish token can still be stolen.

I wrongly thought that the verified provenance UI showed that a package has a trusted publishing pipeline, but it seems that's orthogonal.

NPM really needs to move away from these secrets that can be stolen.
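One thing consumers can already do with the provenance that does exist: recent npm versions (8.13+, from memory — check your npm docs) can verify registry signatures and attestations for everything in a project's lockfile:

```shell
# Run inside a project directory; checks each lockfile entry against the
# registry's signatures and any provenance attestations.
npm audit signatures || echo "verification failed (or npm/network unavailable)"
```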

[−] majorbugger 46d ago
Good morning, or as they say in the NPM world, which package got compromised today?
[−] TheTaytay 46d ago
I know there is a cooldown period for npm packages, but I’m beginning to want a cooldown for domains too. According to socket, the C2 server is sfrclak[.]com, which was registered in the last 24 hours.
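A domain-age check like the one socket presumably did is easy to script. A sketch assuming GNU date and whois output containing a "Creation Date:" field (formats vary by TLD):

```shell
# Days elapsed since a YYYY-MM-DD date (GNU date assumed).
days_since() {
  echo $(( ( $(date +%s) - $(date -d "$1" +%s) ) / 86400 ))
}

# Usage sketch (needs whois + network; hostname is illustrative):
#   created=$(whois some-c2-host.com | grep -im1 'creation date' | grep -oE '[0-9]{4}-[0-9]{2}-[0-9]{2}')
#   [ "$(days_since "$created")" -lt 30 ] && echo "registered <30 days ago: treat as hostile"
```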
[−] yoyohello13 46d ago
This is just going to get worse and worse as agentic coding gets better. I think having a big dependency tree may be a thing of the past in the coming years. Seems like eventually new malware will be coming out so fast it will basically be impossible to stop.
[−] Bridged7756 45d ago
At this point, picking Node for a backend is a footgun. Large companies have the funds for private, security-vetted npm repositories, but what about small companies, startups, personal projects? Pnpm makes things more secure with install scripts disabled and a minimum package age, but it's still the same activity — does an extra parachute make skydiving any less inherently dangerous?

I'm not dogmatic about the whole "JS on the backend is a sin" take from backend folks, but it seems like it was the right call. You should stick to packages backed by large orgs, or to languages with good enough standard libraries, like Go, Java, Python, C#.
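For reference, the pnpm mitigations mentioned above are one-time config. A sketch of a pnpm-workspace.yaml, assuming the minimumReleaseAge setting (expressed in minutes) that I believe landed around pnpm 10.16 — check the pnpm docs:

```yaml
# Refuse to install versions published less than 7 days ago
minimumReleaseAge: 10080

# pnpm 10+ skips dependency lifecycle scripts unless allowlisted here
onlyBuiltDependencies:
  - esbuild
```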

[−] raphinou 46d ago
I'm working on a multi-signature solution that helps detect unauthorized releases in the case of an account hijack. It is open source, self-hostable, accountless, and I am looking for feedback!

Website: https://asfaload.com/

GitHub: https://github.com/asfaload/asfaload

Spec: https://github.com/asfaload/spec

[−] Hackbraten 46d ago
I am now migrating all my unencrypted secrets on my machines to encrypted ones. If a tool supports scripted credential providers (e.g. aws-cli or Ansible), I use that feature. Otherwise, I wrap the executable with a script that runs gpg --decrypt and injects an environment variable.

That way, I can at least limit the blast radius when (not if) I catch an infostealer.
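A minimal sketch of that wrapper pattern (names are illustrative; assumes gpg with agent-based passphrase handling): the decrypted secret exists only in the environment of one child process, never in a dotfile or the ambient shell.

```shell
# Decrypt a secret and expose it to exactly one command.
with_secret() {
  secret_file=$1; var_name=$2; shift 2
  env "$var_name=$(gpg --quiet --decrypt "$secret_file")" "$@"
}

# e.g.: with_secret ~/.secrets/npm-token.gpg NPM_TOKEN npm publish
```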

[−] 6thbit 45d ago
I don’t buy the “wait 7 days” being thrown around as a guard.

Wouldn’t that just encourage the bad actors to delay the activation of their payloads a few days or even remotely activated on a switch?

[−] dryarzeg 45d ago
(A bit off-topic; half-joking, half-serious)

What a great time to be alive! Now, that's exactly why I enjoy writing software with minimal dependencies for myself (and sometimes for my family and friends) in my spare time - first, it's fun, and second, turns out it's more secure.

[−] pjmlp 46d ago
The amount of people still using this instead of fetch. Nonetheless, if it wasn't axios, it would be something else.

This is why corporations doing it right don't allow installing the Internet onto dev machines.

Yet everyone gets to throw their joke about PC viruses, while having learnt nothing from them.

[−] aa-jv 46d ago
I have a few projects which rely on npm (and react) and every few months I have to revisit them to do an update and make sure they still build, and I am basically done with npm and the entire ecosystem at this point.

Sure, it's convenient to have so much code to use for basic functionality - but the technical debt of having to maintain these projects is just too damn high.

At this point I think that, if I am forced to use javascript or node for a project, I reconsider involvement in that project. Its ecosystem is just so bonkers I can't justify the effort much longer.

There has to be some kind of "code-review-as-a-service" that can be turned on here to catch these things. It's just so unproductive, every single time.

[−] pier25 45d ago
PSA: from the Claude Code leaks, it looks like it's using Axios (although an older version).
[−] fluxist 46d ago
A command (fish shell) to recursively check for the compromised axios package versions:

    find / -path '*/node_modules/axios/package.json' -type f 2>/dev/null | while read -l f
        set -l v (grep -oP '"version"\s*:\s*"\K(1\.14\.1|0\.30\.4)' $f 2>/dev/null)
        if test -n "$v"
            printf '\a\n\033[1;31m FOUND v%s\033[0m  \033[1;33m%s\033[0m\n' $v (string replace '/package.json' '' -- $f)
        else
            printf '\r\033[2m scanning: %s\033[K\033[0m' (string sub -l 70 -- $f)
        end
    end
    printf '\r\033[K\n\033[1;32m scan complete\033[0m\n'
[−] koolba 46d ago

> Both versions were published using the compromised npm credentials of a lead axios maintainer, bypassing the project's normal GitHub Actions CI/CD pipeline.

Doesn’t npm mandate 2FA as of some time last year? How was that bypassed?

[−] Surac 46d ago
All these supply chain attacks make me nervous about the apps I use. It would be valuable to know whether an app uses such dependencies, but on the other hand, programmers would cut their own sales if they gave you this info.
[−] bluepeter 46d ago
Min release age sucks, but we've been here before. Email attachments used to just run wild too; then everyone added quarantine delays, file blocking and other frictions... and it eventually kinda/sorta worked. This does feel worse, though, with fewer chokepoints and with executing third-party code being the expected norm.

Edit: bottom line is installs are gonna get SOOO much more complicated. You can already see the solution surface... Cooling periods, maintainer profiling, sandbox detonation, lockfile diffing, weird publish path checks. All adds up to one giant PITA for fast easy dev.
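The "lockfile diffing / weird publish path" checks can start very cheap. A sketch (assumes a standard package-lock.json; this only checks tarball hosts, not contents):

```shell
# Flag any "resolved" tarball URL in a lockfile that doesn't point at the
# official registry -- a common tell when a lockfile has been tampered with.
check_lock_hosts() {
  if grep -o '"resolved": *"[^"]*"' "$1" | grep -v 'https://registry\.npmjs\.org/'
  then echo "SUSPICIOUS resolved URLs above"
  else echo "all tarballs resolve to registry.npmjs.org"
  fi
}
```

Something like this drops naturally into CI as one more gate before install.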

[−] zar1048576 46d ago
In case it helps, we open-sourced a tool to audit dependencies for this kind of supply-chain issue. The motivation was that there is a real gap between classic “known vulnerability” scanning and packages whose behavior has simply turned suspicious or malicious. We also use AI to analyze code and dependency changes for more novel or generic malicious behavior that traditional scanners often miss.

Project: https://point-wild.github.io/who-touched-my-packages/

[−] wolvesechoes 46d ago
I am glad I don't need to touch JS or web dev at all.

Now I tend to use Python, Rust and Julia. With Python I keep reusing the same few packages, like numpy and matplotlib. With Rust and Julia I try as much as possible not to use any packages at all, because it always scares me when something that should be pretty simple downloads half of the Internet to my PC.

Julia is even worse than Rust in that regard - for even rudimentary stuff like static arrays or properly namespaced enums, people download 3rd-party packages.

[−] tonymet 45d ago
1/5 of your CLI apps and 1/3 of your GUI apps are npm-based. Each one has 400+ dependencies, none notable enough to go viral when they are breached. And who knows what other packages are currently compromised. We all have 30+ node_modules directories on our disks, and 2/3 of them were shipped by outside vendors, packaged in an archive.

“I’m smart I use fetch instead of axios”. “I pin my versions” – sure but certainly one of your npx or Electron apps uses axios or another less notably compromised package.

Let’s

[−] twodave 45d ago
How is it we've made it this far and we still don't have any kind of independent auditing of basic publish security on NPM? You'd think this would be collectively a trivial and high priority task (to ensure that all publishes for packages over a certain download volume are going through a session that authenticated via MFA, for instance).
[−] cleansy 46d ago
To have an initial smoke test, why not run a diff between version upgrades, and potentially let an LLM summarise the changes? It's a baffling practice that a lot of developers just blindly trust code repos to keep up security standards. Last time I installed some npm package (in a container), it pulled in 521 dependencies and my heart rate jumped a bit.
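That diff is cheap to do by hand today. A sketch (package and versions are illustrative) that compares the published tarballs — which is what actually gets installed, and not always what's in the GitHub repo:

```shell
# Fetch two published versions of a package and diff their contents.
diff_versions() {  # usage: diff_versions <pkg> <old_ver> <new_ver>
  npm pack "$1@$2" "$1@$3" >/dev/null
  mkdir -p old new
  tar -xzf "$1-$2.tgz" -C old
  tar -xzf "$1-$3.tgz" -C new
  diff -r old/package new/package
}

# e.g.: diff_versions axios 1.12.0 1.13.0
# npm can also do this in one step (built in since npm 7):
#   npm diff --diff=axios@1.12.0 --diff=axios@1.13.0
```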
[−] lepuski 46d ago
I believe compartmentalized operating systems like Qubes are the future for defending against these kinds of attacks.

Storing your sensitive data on a single bare-metal OS that constantly downloads and runs packages from unknown maintainers is like handing your house key out to a million people and hoping none of them misuse it.

[−] carlbarrdahl 45d ago
Many of the suggestions in this thread (minimum release age, ignore-scripts) are defenses for the consumers.

I've been working on Proof of Resilience, a set of 4 metrics for OSS, and using that as a scoring oracle for what to fund.

Popularity metrics like downloads, stars, etc. are easy to fake today with AI agents. An interesting property of the metrics below is that gaming them produces better code, not worse.

These are the 4 metrics:

1. Build determinism - does the published artifact match a reproducible build from source?

2. Fuzzing survival - does the package survive fuzz testing?

3. Downstream stability - does it break any repos dependent on this project when pushing a release?

4. Patch velocity - how fast are fixes merged?

Here's a link to the post, still early but would appreciate any feedback.

https://hackmd.io/@carlb/proof-of-resilience

[−] tonymet 46d ago
Has anyone tested general-purpose malware detection on supply chains? Like clamscan. I tried to test the LiteLLM hack, but the affected packages had been pulled. Windows Defender AV has an inference-based detector that may work when signatures have not yet been published.
[−] mtud 46d ago
Supply chain woes continue