It’s been a brutal two weeks for the open source supply chain. On March 19, the Trivy vulnerability scanner, the tool many teams rely on to detect supply chain attacks, was itself compromised (oh, the irony). Trivy is the most widely used open-source vulnerability scanner in the cloud-native ecosystem, with more than 32,000 GitHub stars and more than 100 million Docker Hub pulls. Five days later, a cascading effect from the Trivy breach led directly to the compromise of LiteLLM, one of the most popular Python packages for working with LLMs. Then on March 31, the Axios npm package, with over 100 million weekly downloads, was backdoored with a remote access trojan. This may not be over yet.
What’s striking is not just the pace, but the pattern. In the Trivy and LiteLLM cases, the source code on GitHub was never tampered with. Instead, attackers compromised CI/CD pipelines and publishing infrastructure, so the packages themselves were poisoned on their way to the repositories you install from. The Axios attack went a step further: the attacker directly hijacked a maintainer’s account via stolen credentials and published malicious versions that bypassed the project’s CI/CD entirely. In all three cases, the packages looked legitimate. They came from the expected accounts, through the usual channels.
At Anaconda, protecting the Python software supply chain is foundational to how we build and distribute software. In this post, we’ll walk through the specific mechanisms we’ve built into the Anaconda Distribution to defend against exactly these attack vectors. And while our distribution is focused on Python and conda packages rather than Docker images or npm, the architectural principles apply broadly.
Protecting Against CI/CD Compromise
The challenge facing most open source maintainers is that shipping packages means owning CI/CD infrastructure on top of maintaining the library itself. Every individual maintainer is responsible for securing their own test and deployment pipelines, and that’s exactly the vector these attacks exploited. This isn’t a case of irresponsibility or bad intent. It’s a resource problem: not enough hours in the day.
The Trivy-to-LiteLLM cascade illustrates this perfectly. TeamPCP, the threat group behind the Trivy attack, didn’t need to attack LiteLLM directly. They compromised Trivy, which was running as a vulnerability scanner inside LiteLLM’s CI pipeline without version pinning. That single unmanaged dependency handed over LiteLLM’s PyPI publishing credentials, and from there the attacker backdoored a package with roughly 3.4 million downloads per day.
The Axios compromise took a different but equally dangerous path. As Simon Willison has pointed out, Axios already had a GitHub issue (#7055) from September 2025 recommending they adopt npm’s trusted publishing mechanism. With hundreds of open issues in the repo, it’s easy for that kind of signal to get buried. The attacker used a stolen long-lived npm access token to publish directly to the registry, bypassing the project’s GitHub Actions workflow entirely. Every legitimate Axios release included OpenID Connect (OIDC) provenance metadata and Supply-chain Levels for Software Artifacts (SLSA) build attestations; the malicious versions had none. But npm didn’t enforce that as a gate.
As Snowflake Field CTO Kevin Keller pointed out in a recent analysis of Python and conda supply chain security, package signing alone wouldn’t have prevented the LiteLLM attack. The attacker published through the legitimate pipeline with compromised credentials, so the package could have been validly signed. The signature would have proven provenance; the provenance was the problem. This is why layered defenses matter, and it’s central to how Anaconda approaches security.
Anaconda builds and hosts everything in the Anaconda Distribution on our secure infrastructure. Our security team is continually monitoring, maintaining, and upgrading our infrastructure to protect against these types of attacks. As outlined in our security practices, source code and built artifacts are maintained with strict chain-of-control and are built, scanned, and hashed on a separate secure network within Anaconda, with access limited to a small number of vetted engineers. The attack that compromised LiteLLM, a scanner running loose in someone else’s pipeline, has no equivalent in our build environment because we own and control the entire pipeline ourselves.
Eliminating Typosquatting and Dependency Confusion
On public repositories, anything can be uploaded by anyone with almost any name. This is not a flaw; this is what has enabled the ecosystem to flourish as friction has been removed for anyone to share what they’ve made. But bad actors have exploited this openness to distribute malware. Two key examples are typosquatting, where a malicious package is uploaded with a very similar name to a legitimate package (such as requ3sts for the requests package), and dependency confusion, where a malicious package with the same name as a private corporate package is uploaded to public servers. In both cases, the attacker is counting on a typo or a package resolver’s preference for public sources to execute their malicious code.
This continues to be a significant threat vector. In 2024, security researchers identified over 500 typosquatted variations of popular Python libraries on PyPI in a single coordinated campaign, targeting names like requests, TensorFlow, and BeautifulSoup. In 2025, a malicious package called termncolor (mimicking the legitimate termcolor) was used to deliver a remote access trojan, as were packages like sisaws (typosquatting the Argentine health system integration library sisa). With the rise of LLM-powered coding, a new variation called “slopsquatting” has emerged, where attackers register package names that LLMs commonly hallucinate. An AI coding tool that suggests a nonexistent package creates an opening for an attacker who registers that name first.
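To see just how close these lookalike names are, here is a minimal similarity check of the kind a defensive tool might run before installing. The allowlist and threshold here are illustrative assumptions, not a real deployment’s configuration:

```python
import difflib

# Illustrative sample only; a real check would use your organization's
# full set of approved package names.
KNOWN_GOOD = {"requests", "tensorflow", "beautifulsoup4", "termcolor", "numpy"}

def typosquat_suspects(name, known=KNOWN_GOOD, threshold=0.85):
    """Return known packages whose names are suspiciously similar to `name`.

    A requested name that closely matches, but does not equal, a well-known
    package is the classic typosquatting red flag.
    """
    return sorted(
        pkg for pkg in known
        if pkg != name
        and difflib.SequenceMatcher(None, name.lower(), pkg).ratio() >= threshold
    )
```

Running this against the examples above, `typosquat_suspects("requ3sts")` flags `requests`, and `typosquat_suspects("termncolor")` flags `termcolor`, while legitimate exact names pass through clean.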
Anaconda’s distribution is fully curated. Every package is selected by our team, verified against a reputable source, and checked for legitimate community backing before it ever reaches our repositories. Typosquatting and dependency confusion are threats you don’t need to worry about. What you install from Anaconda is what we deliberately chose to put there.
Seeing Into Compiled Libraries
It’s quite common in the Python ecosystem for packages to include compiled components. The standard Python package format, the wheel, is Python-specific, so any non-Python compiled components are bundled directly into the package. This is called “vendoring”; our internal analysis found that 532,105 wheels across 5,332 projects on PyPI include a vendored library, and that a common library like openssl alone is included in 70,872 wheels across 802 projects. This has two significant security impacts. First, many package and code scanners can’t see inside those binaries to find vulnerabilities. Second, when a vulnerability is discovered in a vendored library, every single package that bundles it needs to issue an update to pull in the fix. In the openssl example, that means more than 800 projects may need to update their packages when an openssl vulnerability is found.
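You can observe this vendoring yourself: a wheel is just a zip archive, so listing the compiled libraries it carries takes only a few lines. The suffix list below is a rough heuristic for this sketch, not an exhaustive classification:

```python
import zipfile

# Heuristic suffixes for compiled, potentially vendored libraries.
BINARY_SUFFIXES = (".so", ".dll", ".dylib", ".pyd")

def vendored_binaries(wheel_file):
    """List files inside a wheel that look like compiled libraries.

    Scanners that only read Python source will not see inside these
    binaries; this is the visibility gap described above.
    """
    with zipfile.ZipFile(wheel_file) as wheel:
        return [
            name for name in wheel.namelist()
            if name.endswith(BINARY_SUFFIXES) or ".so." in name  # e.g. libssl.so.3
        ]
```

Point it at any wheel downloaded from PyPI (the function accepts a path or an open file object) and compare the output against what your vulnerability scanner reports for that package.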
To solve this, Anaconda has standardized on the conda package format, which can package anything, not just Python libraries. We don’t vendor other packages into a conda package. Instead, these dependencies are split into their own packages, each of which is also built by us. This lets us mitigate both issues cleanly.
Because we build each of these binaries, we see the source code and can ensure it hasn’t been tampered with. When a vulnerability is found in openssl, you don’t need to update every package that depends on it. You just need to update openssl itself. As Kevin Keller noted in his recent post on conda supply chain security, building from source in controlled environments combined with TUF-based package signing gives enterprises a curated security layer that public registries simply don’t provide.
This architectural decision is what sets Anaconda and the conda ecosystem apart from wheel-based workflows. In the pip/wheel world, a package like cryptography vendors compiled openssl binaries inside its wheel. If a vulnerability is found in that openssl version, every package that vendors it needs to issue a new release. With conda, you update the openssl package once and every package that depends on it picks up the fix. In the context of these recent attacks, that difference is the difference between a surgical update and a fire drill.
Avoiding Latest Version Bugs
Many of the malicious releases we’ve seen lately were publicly available for only a short window before being detected. LiteLLM’s malicious versions were live for roughly 40 minutes before PyPI quarantined the package, though that was long enough for over 40,000 downloads. The Axios compromise lasted two to three hours. This has led community voices like William Woodruff and Andrew Nesbitt to advocate for “dependency cooldowns” in package managers, where a tool will only install packages that have been published for some minimum period of time. Five package managers, among them pip, npm, pnpm, and uv, have shipped cooldown support in the last six months. The conda community is also working on adding cooldown support.
Cooldowns are a smart approach for the public ecosystem, but Anaconda goes a step further. Since we curate and build every package in our distribution, we don’t release as soon as the upstream maintainer (or bad actor) does. When a new release occurs, we first analyze it and decide whether to accept and build that version. This isn’t a time-based delay; it’s a deliberate human review gate, and that gate alone would have protected against the Trivy, LiteLLM, and Axios incidents. We also test everything we build. If we see discrepancies in our secure environment, we won’t release the package.
It’s worth noting the distinction: a cooldown is a client-side timer that hopes the community catches a problem before the clock runs out. Anaconda’s approach is a curated release gate where Anaconda engineers review, build, and test before anything reaches our repositories. Both reduce risk. Ours doesn’t rely on timing alone.
Surfacing CVEs and Known Vulnerabilities
There are some vulnerabilities that don’t start with malicious intent. Sometimes a legitimate bug in a trusted package gets exploited after it is released, and by then it may already be widely deployed and difficult to remove. Plenty of tools exist to scan and block packages for known vulnerabilities. Anaconda goes further.
Anaconda’s enterprise products surface known vulnerabilities for each package and let you filter on them. We also curate the vulnerability reports ourselves, verifying that a reported CVE actually impacts the specific package we’ve built, which reduces false positives. And since we build our binary packages rather than vendoring them into other packages, we can update just the vulnerable dependency, not every package that includes it. We also provide environment logging so you can see exactly who is using which package across your enterprise. With Anaconda, you’re better protected and not slowed down by false alarms.
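The core of that curation step — checking whether a reported CVE’s affected version range actually covers the build you ship — can be illustrated with a small filter. The report format here is invented for the example; real vulnerability feeds such as OSV use richer schemas, and real version strings need a proper parser like `packaging.version`:

```python
def parse_version(v):
    """Turn '3.0.12' into a comparable tuple (3, 0, 12).

    Illustrative only: real-world version schemes (pre-releases, epochs)
    need a dedicated parser.
    """
    return tuple(int(part) for part in v.split("."))

def affecting_reports(installed, reports):
    """Keep only reports whose [introduced, fixed) range covers `installed`.

    Reports outside that range are the false positives that curation
    filters out before they reach your dashboard.
    """
    v = parse_version(installed)
    return [
        r for r in reports
        if parse_version(r["introduced"]) <= v < parse_version(r["fixed"])
    ]
```

For an installed openssl 3.0.12, a report fixed in 3.0.13 is surfaced, while one that only affected the 1.1.x line is dropped rather than generating noise.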
Additional Steps You Can Take to Protect Yourself
Those are the protections we’ve built into Anaconda to keep you safe, but there are also practices you can adopt regardless of what tools you’re using. Many tie directly back to the lessons of the past two weeks:
Pin your dependencies and use lockfiles
The Axios attack hit hardest in environments using floating version ranges (like ^1.14.0), where a simple npm install would silently pull the compromised version. The same risk applies in Python. Use lockfiles, pin to specific versions, and in CI/CD always install from the lockfile, not from a loose requirements spec.
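A quick sanity check you can run in CI is to reject any requirements line that floats. This simple parser is a sketch under the assumption of a plain requirements file; it doesn’t handle every PEP 508 form:

```python
def unpinned_requirements(lines):
    """Return requirement lines that are not pinned to an exact version.

    Anything without '==' can silently resolve to a newly published,
    possibly malicious, release on the next install.
    """
    problems = []
    for line in lines:
        req = line.split("#", 1)[0].strip()   # drop comments and whitespace
        if not req or req.startswith("-"):    # skip blanks and pip options
            continue
        if "==" not in req:
            problems.append(req)
    return problems
```

Fail the build if the returned list is non-empty; pip’s hash-checking mode (`--require-hashes`) tightens this further by pinning the exact artifact, not just the version number.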
Adopt cooldowns for public packages
If you’re pulling directly from PyPI or npm, configure your tooling to wait before installing freshly published versions. In pip, --uploaded-prior-to gives you this. In conda, the defaults channel provides a natural cooldown through our curation process. For conda-forge and other community channels, cooldown tooling is still emerging (there’s an active discussion at conda/conda#15759).
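If your installer can’t enforce a cooldown natively, the check itself is easy to script. Here is a minimal sketch of the age test; the `upload_time_iso_8601` field comes from PyPI’s JSON API (`https://pypi.org/pypi/<name>/<version>/json`), and the seven-day default is an assumption, not a recommendation:

```python
from datetime import datetime, timedelta, timezone

def passes_cooldown(upload_time_iso, min_age=timedelta(days=7), now=None):
    """True if a release has been public for at least `min_age`.

    `upload_time_iso` is an ISO 8601 timestamp, e.g. the
    `upload_time_iso_8601` field on a release file in PyPI's JSON API.
    """
    uploaded = datetime.fromisoformat(upload_time_iso.replace("Z", "+00:00"))
    now = now or datetime.now(timezone.utc)
    return now - uploaded >= min_age
```

A release published 40 minutes ago — the entire public lifetime of the malicious LiteLLM versions — fails this check by a wide margin.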
Audit your CI/CD secrets and publishing credentials
The entire TeamPCP campaign was built on stolen CI/CD credentials that cascaded from Trivy to LiteLLM to further downstream targets. Review what secrets your pipelines have access to, enforce short-lived tokens where possible, and use trusted publishing (OpenID Connect, or OIDC) instead of long-lived API tokens for package publishing.
Get your packages from a trusted source
If you’ve read this far, you already know why that matters. Whether it’s Anaconda for the Python ecosystem or another curated provider for your stack, the pattern is the same: someone who builds from source, tests what they build, and takes responsibility for what they ship will always be a safer bet than installing the latest from a public registry where anyone can upload anything.
One final point. The LiteLLM vulnerability was not discovered because someone deliberately installed it. A researcher at FutureSearch was testing an MCP plugin in Cursor, and LiteLLM was pulled in automatically as a transitive dependency. The compromise was only caught because of a bug in the malware: a flaw in the payload caused an accidental fork bomb that crashed the researcher’s machine. Without that mistake, the credential stealer would have run silently for days or weeks. This dynamic has played out before, notably with the XZ Utils backdoor, which was caught only because an engineer investigated an unexplained slowdown in SSH logins.
You can follow every best practice and be meticulous about what you download directly, but you can still end up infected by something three levels deep in your dependency tree that you’ve never heard of. The game is changing. As we adopt more agentic tools that make decisions about what goes into our environments, the risks are compounding. The supply chain isn’t just your dependencies anymore. It’s your dependencies’ dependencies’ dependencies.
Ready to take action?
Talk to our team to request an assessment of your software supply chain and learn how Anaconda protects your environment.