In a blog post published in March 2026, Daniel Stenberg, creator and lead developer of curl, makes the case that the software industry's default position of trusting well-known components is no longer adequate. Stenberg argues that users and organisations should actively verify the software they consume, and he uses curl's own practices as a concrete example of how that can be done.
Curl runs on an estimated tens of billions of devices, making it one of the most widely deployed software components in existence. Stenberg lists a range of scenarios in which a project at that scale could be compromised, including a malicious contributor merging tainted code, a breached committer unknowingly distributing modified releases, an extorted team member making unwanted changes, or a hacked distribution server serving altered tarballs. He notes that these scenarios can occur independently or in rapid sequence, and that the consequences of a successful attack on a project of curl's reach could be severe.
"Software and digital security should rely on verification, rather than trust. I want to strongly encourage more users and consumers of software to verify curl. And ideally require that you could do at least this level of verification of other software components in your dependency chains."
- Daniel Stenberg
The curl project has put in place an extensive set of controls intended to make the git repository the authoritative and auditable source of truth. These include enforcing a consistent code style, banning the use of certain C functions deemed difficult to use safely, imposing a ceiling on function complexity, requiring human and automated review of all pull requests, and prohibiting binary blobs and most uses of base64-encoded content, both of which could be used to conceal malicious payloads. Stenberg also describes more than 200 CI jobs that run on every commit, builds using strict compiler settings that treat warnings as errors, continuous fuzzing via Google's OSS-Fuzz project, and mandatory two-factor authentication for all committers. Each of these is designed to make any deviation from expected behaviour visible to anyone following the project.
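A rule like the banned-function list is straightforward to approximate in a CI job. The sketch below is illustrative only, not curl's actual tooling (the project maintains its own style and source checks, and its real banned list differs); the file name, the demo source, and the three function names are assumptions made for the example:

```shell
#!/bin/sh
# Minimal sketch of a "banned C function" check that a CI job could run
# on every commit. The source file and banned list are illustrative.
set -eu

# Hypothetical source file containing a call the policy forbids.
cat > demo.c <<'EOF'
#include <string.h>
void f(char *dst, const char *src) {
    strcpy(dst, src);   /* banned: no bounds check */
}
EOF

# Fail the check if any banned function appears in the sources.
if grep -nE "(strcpy|sprintf|gets)[[:space:]]*\(" demo.c; then
    echo "banned function found" >&2
    status=1
else
    status=0
fi
echo "check exit status: ${status}"
```

In a real pipeline the non-zero status would fail the build, which is what makes the deviation visible to anyone following the project.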
Beyond those internal controls, Stenberg argues for a wider verification ecosystem. He explains that the project provides signed release artefacts and a dedicated verify page on the curl website, so that independent users can check that a release contains only what is in the git repository and that it was signed by the release manager. He acknowledges that he cannot know who those users are, or whether they currently exist, but contends that even a small number of independent verifiers is enough to provide a meaningful check: any one of them can raise the alarm if something looks wrong.
"If even just a few users verify that they got a curl release signed by the curl release manager and they verify that the release contents is untainted and only contains bits that originate from the git repository, then we are in a pretty good state."
- Daniel Stenberg
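The shape of that check can be sketched in a few commands. curl's real releases ship with detached GPG signatures made with the release manager's key; to keep the example self-contained and runnable, a locally generated OpenSSL key pair stands in for that key, and a text file stands in for the tarball:

```shell
#!/bin/sh
# Sketch of signed-release verification. An OpenSSL key pair stands in
# for the release manager's GPG key so the example is self-contained.
set -eu

# Stand-in for the release manager's key pair.
openssl genpkey -algorithm RSA -out rm_private.pem 2>/dev/null
openssl pkey -in rm_private.pem -pubout -out rm_public.pem

# Stand-in for a release tarball.
echo "pretend this is a curl release tarball" > release.tar.xz

# The release manager signs the artefact...
openssl dgst -sha256 -sign rm_private.pem \
    -out release.tar.xz.sig release.tar.xz

# ...and any consumer verifies it against the published public key.
openssl dgst -sha256 -verify rm_public.pem \
    -signature release.tar.xz.sig release.tar.xz   # prints: Verified OK
```

With an actual curl release the equivalent step is a GPG verification of the `.asc` signature file against the release manager's public key, following the instructions on the project's verify page.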
Stenberg ends his post with a direct recommendation to require this level of verification for all dependencies, stating that "software and digital security should rely on verification, rather than trust". Community discussion echoes this position. On LinkedIn, security and platform engineering practitioners such as Cameron Stihel and Ryan Johnston have argued that the XZ Utils backdoor, discovered in 2024 after a long-running effort to insert malicious code through a trusted contributor, showed the limits of reputation-based trust. The attack, which targeted the liblzma library by gaining the maintainers' confidence over time before landing malicious code changes, is precisely the kind of scenario Stenberg describes in his list of threat vectors.
One of the structural tools now available for expressing exactly what a piece of software contains is the Software Bill of Materials. In a talk at QCon London 2026 covered by InfoQ, Viktor Petersson, founder of sbomify, argued that teams are running out of time to adopt SBOMs. He cited the EU Cyber Resilience Act, which opens its first enforcement window in September 2026 and requires full SBOM compliance by December 2027, and warned that its consequences go beyond fines: "CRA is not about fines. They can actually block sales. Your products can be blocked from the European market." US Executive Order 14028, in force since 2021, makes SBOMs a procurement condition for software sold to the federal government, and the FDA requires them for medical devices.
Petersson's talk addressed the full lifecycle of SBOM production, including the step that most teams skip: signing. He argued that skipping it is a mistake, and that the specific tooling matters less than the act of signing itself, which is what provides a verifiable chain of custody: "Any signing is better than no signing. Do sign your SBOMs in your pipeline, not on somebody's machine." This connects directly to Stenberg's argument: curl already provides signed release artefacts and documents the verification steps clearly, giving consumers the chain of custody that Petersson describes as the goal.
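A pipeline signing step in that spirit can be sketched as follows. The CycloneDX skeleton, component names, and file names are assumptions for illustration (real SBOMs come from a generator such as syft or trivy), and an OpenSSL key pair stands in for pipeline-managed signing material; in practice a Sigstore flow such as `cosign sign-blob` is a common choice:

```shell
#!/bin/sh
# Sketch of signing an SBOM as a CI step rather than on a developer
# machine. SBOM contents and key material are illustrative stand-ins.
set -eu

# Stand-in SBOM; a real one would be produced by an SBOM generator.
cat > sbom.cdx.json <<'EOF'
{
  "bomFormat": "CycloneDX",
  "specVersion": "1.5",
  "components": [
    { "type": "library", "name": "libexample", "version": "1.2.3" }
  ]
}
EOF

# Key material owned by the pipeline, never by an individual laptop.
openssl genpkey -algorithm RSA -out ci_key.pem 2>/dev/null
openssl pkey -in ci_key.pem -pubout -out ci_key.pub

# Sign the SBOM and publish the signature next to it, then verify.
openssl dgst -sha256 -sign ci_key.pem -out sbom.cdx.json.sig sbom.cdx.json
openssl dgst -sha256 -verify ci_key.pub \
    -signature sbom.cdx.json.sig sbom.cdx.json
```

The design point, per Petersson, is that the signing key lives with the pipeline, so the signature attests to what the build produced rather than to what someone's laptop contained.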
CI/CD pipelines are also a potential weak point. InfoQ covered the compromise of a widely used GitHub Action in April 2025, which highlighted how a single malicious or compromised action can expose secrets and build artefacts across many projects simultaneously. The incident reinforced calls for tighter controls on third-party actions, for pinning dependencies to specific commit hashes, and for monitoring CI tooling for unexpected changes. Stenberg's approach addresses this directly: curl's CI jobs have read-only access to the source repository and are checked with the zizmor tool to reduce the risk of insecure job configuration.
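The reason commit-hash pinning is recommended over tag pinning can be shown with a local repository standing in for a third-party action (the repository name and commit messages here are illustrative): a tag can be silently re-pointed at new code, while a recorded commit hash cannot.

```shell
#!/bin/sh
# Demonstration: tags are mutable, commit hashes are not. The local
# repository stands in for a third-party GitHub Action.
set -eu

git init -q action-repo
git -C action-repo -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "v1 release"
git -C action-repo tag v1
pinned_sha=$(git -C action-repo rev-parse v1)

# An attacker (or compromised maintainer) moves the tag to new code.
git -C action-repo -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "malicious change"
git -C action-repo tag -f v1

# The tag now resolves to different code; the recorded hash does not.
echo "tag v1 now:  $(git -C action-repo rev-parse v1)"
echo "pinned sha:  ${pinned_sha}"
```

In a workflow file this is the difference between `uses: some/action@v1` and `uses: some/action@<full-commit-sha>`: only the latter guarantees the job runs the code that was reviewed.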
Petersson also pointed to the lifecycle challenge, noting that a real product often has dozens of SBOMs that change on every CI run and that regulators can request the SBOM for a specific past release. He compared current practice to software development before version control: "Dealing with SBOMs today feels like managing source code in the 90s, with patches sent over email." This governance issue reinforces Stenberg's broader point. The tooling to produce, sign, and verify software artefacts exists, and the regulatory pressure to use it is building, so organisations should close the loop by verifying what they consume.