A demand for real consequences: Sonatype's response to CISA's Secure by Design

February 23, 2024 By Brian Fox


In the fast-changing fields of cybersecurity and software development, creating secure software is more crucial than ever. Recently, my colleagues and I at the Open Source Security Foundation (OpenSSF) finalized a response to the latest Secure by Design RFC from the Cybersecurity and Infrastructure Security Agency (CISA). In it, we discussed various best practices and case studies on the Secure Software Development Life Cycle (SDLC), threat models for artificial intelligence, and the economic impact of software upgrades.

While I'm proud of that work, I found that my thoughts on the subject went well beyond what was included in that formal response.

There's a different, perhaps contrarian, viewpoint that I think deserves wider discussion.

The core issue of our cybersecurity challenges: motivation

The problem, as I see it, is not so much a lack of understanding of what needs to be done to secure our digital infrastructure but rather a deficit in the motivation to implement those measures effectively.

I've come to the conclusion that we MUST consider more stringent enforcement mechanisms, including collaboration with regulatory bodies like the Securities and Exchange Commission (SEC) and the Federal Trade Commission (FTC). Financial consequences can encourage the adoption of secure development practices that are already well known. This isn't about new regulations, but about holding organizations responsible for the safety and security of their software.

Passing the onus onto device manufacturers and organizations developing software to ensure that it is secure from the beginning, and over time, reflects similar regulations guiding consumer safety in other industries. It is especially important when software controls our health (e.g., internet-connected pacemakers), our transportation and infrastructure (e.g., autonomous vehicles and power plants), and our finances (e.g., online banking applications).

If no other manufacturing industry is permitted to ship known vulnerable or defective parts in its products, why should software manufacturers be any different? In any other industry, it would be considered gross negligence.

History shows these are solvable problems

I've also previously explored how to learn from food and physical goods supply chains and apply similar concepts to the software supply chain, in a more in-depth paper with the Atlantic Council. Those lessons show this is a solvable issue: we just have to actually want to solve it.

There's room for improvement in the existing enforcement frameworks, showing that better software security is possible under current laws. We can push for more secure software practices by making standards mandatory in practice, not just in theory.

As we move forward, it's essential that we continue this conversation, challenging the status quo and pushing for a safer, more secure digital world.

Below is the full response I personally submitted to CISA. I look forward to continuing this discussion.

Response to CISA

As CTO of Sonatype, I applaud CISA's leadership in establishing clear and comprehensive software security requirements like Secure by Design and Secure by Default. These guidelines are crucial for raising industry standards and fostering a more secure software ecosystem. As a member of the OpenSSF, Sonatype has been working with others in the industry to address the technical aspects of this RFC, and I encourage you to review that joint submission for the specific ideas and information we believe are relevant.

However, I share a growing concern within the security community – the potential over-reliance on requirements without a robust enforcement mechanism. While outlining best practices is essential, translating them into tangible actions that hold manufacturers accountable for security outcomes is equally critical. We must move beyond recommendations and guidance and hold manufacturers responsible for their inaction.

Nearly sixty years ago, Ralph Nader's book "Unsafe at Any Speed" led to public outcry. As a technologist, I find it holds a unique resonance: it wasn't just a critique of automobile design; it was a blueprint for demanding accountability and prioritizing safety in complex systems. Today, as we grapple with the vulnerabilities plaguing our software, software manufacturers by and large still fail to act.

In February of last year, Jen Easterly gave a presentation at Carnegie Mellon titled "Unsafe at Any CPU Speed: The Designed-in Dangers of Technology and What We Can Do About It." Just as Nader's work ignited public outcry, Easterly echoed the need for a paradigm shift, moving beyond checklists and toward accountability. This aligns with my firm belief that stricter enforcement mechanisms are critical.

Now, imagine a world without the National Highway Traffic Safety Administration (NHTSA). Would automobile manufacturers prioritize investigating defective vehicles on their own? No, and we know that to be the case because of Nader's work.

Just as the NHTSA holds car manufacturers responsible for safety flaws, we must transition from aspirational guidelines to a system that compels manufacturers to prioritize security throughout the software lifecycle, embedding it into the very DNA of their products. This is not to undermine the vital role of CISA and similar organizations but to emphasize the need for an effective "forcing function" that complements their invaluable guidance.

Rather, it is a call for CISA to expand beyond guidance to a mission focused on leveraging the power of agencies like the SEC and FTC to enforce software security standards and levy consequences for non-compliance. Financial repercussions, similar to those explored by the SEC for corporate governance failures, could incentivize a proactive security posture within software development.

When it comes to software security, our collective responsibility transcends discussion: industry leaders, policymakers, and consumers must unite to prioritize a culture of security within the software ecosystem and take action. Just as Nader's work served as a catalyst for change, we must collectively embrace this opportunity to shape a future where software is not just functional but inherently secure, safeguarding the well-being of all individuals. All communities. All people in the digital age.

Unfortunately, there are parts of the RFC that raise concerns about capturing the culture-shifting power of Nader's work. For example, in an effort to understand the Economics of Customer Demand, the RFC states, "Software manufacturers generally implement the features customers ask for the most. There is a perception that customers are not asking for security in the products they buy." While it is likely customers are not asking directly for security, this exposes a disconnect in treating security as a feature rather than a requirement.

To better understand the cost of slow and delayed action, consider that a year before Nader's book was published, Surgeon General Luther Terry released the Surgeon General's Report on Smoking and Health, a turning point in public awareness of the links between smoking and cancer. This led to what appeared to be swift action, like adding warning labels to cigarette packaging. However, true change would be slow and meandering. Where Nader's book led to swift, measurable action in improving the safety of automobiles, it was decades before similar action on smoking took hold.

The cost of smoking is tragic, causing the deaths of hundreds of thousands of Americans over the decades while policy change stumbled. Of course, smoking is a complex issue; it is wrapped in psychology and the nature of human habits. This complexity explains why a slow, systematic approach to change by proxy (education, warning labels, and public awareness) did not work. In other words, knowing the health risks of smoking wasn't enough to change behavior.

Real change only began to take hold when new research revealed that smoking's impact was not limited to the health of the smoker. As the public learned that secondhand smoke posed as much risk to those who had made better choices, the tide began to shift. Eventually, growing public demand for legislation at the local and federal levels drove increasingly strict bans on smoking in workplaces, airplanes, and restaurants to improve public health. But this followed years of unnecessary cost to the public.

At the same time, it was difficult for market forces to drive the change society needed. A handful of people here and there refusing to work or eat in a smoke-filled restaurant didn't create enough economic incentive to change those habits. In other words, economic demand from the customer (or the public) rarely drives change until there is a critical public understanding of what's at risk.

Today, poor software development practices persist because the problem looks a lot like secondhand smoke did fifty years ago: the person making the unhealthy choice affects more than themselves. Software that is not secure by design affects everyone at every level.

One car manufacturer did not compel the automotive industry to change. One restaurant banning smoking at the risk of negative profit implications did not protect the health of the public. Change for the good of the public, when the risk is clearly known, requires legislation to ensure the safety of everyone. It requires a voice loud enough for everyone to hear. What we need for software security is the urgency we brought to automobile safety. We need an emotional appeal that underscores the human cost of inaction and the secondhand impact.

Insecure software puts our families, friends, and communities at risk – from financial loss to identity theft and even physical harm. This is a collective responsibility. Industry, policymakers, and consumers must work together to foster a culture of security throughout the software development lifecycle. Sonatype is committed to collaborating with CISA and other stakeholders to explore effective solutions that translate requirements into meaningful action.

Once again, I applaud the efforts of CISA. Education, requirements, and processes must absolutely change. But these are not enough. Everyone understands the importance of software security, yet vulnerabilities persist and attacks become more sophisticated. CISA's mission is to "lead the national effort to understand, manage, and reduce risk to our cyber and physical infrastructure."

It's time to explore stricter enforcement mechanisms. Collaboration with the SEC and other regulatory bodies, such as the FTC, could incentivize adoption of already-documented secure development practices through potential financial repercussions. We must hold organizations accountable for the safety and security of the software they produce. And as we've seen in small glimpses of action from our existing enforcement bodies, it is possible under existing legislation to make a difference.

The time for action is now. I urge CISA to consider a balanced approach that complements valuable requirements with concrete action. Let's move beyond guidelines and explore mechanisms that enforce accountability, encourage innovation, and ultimately prioritize the safety and well-being of our communities in the digital age.

Together, we can build a more secure software future.

Tags: thought leaders, Cybersecurity, government, News and Views, CISA best practices

Written by Brian Fox

Brian Fox is a software developer, innovator and entrepreneur. He is an active contributor within the open source development community, most prominently as a member of the Apache Software Foundation and former Chair of the Apache Maven project. As the CTO and co-founder of Sonatype, he is focused on building a platform for developers and DevOps professionals to build high-quality, secure applications with open source components.