Embracing Obligations: Regulation as a Driver for Quality
By Urs Fässler
As software touches every part of life, people expect higher standards for quality, security, and reliability. The Cyber Resilience Act (CRA) reflects this shift: a necessary response to past industry mistakes. Rather than resisting, we can use this regulation to improve our practices. Here’s why regulation is necessary, how we reached this point, and how we can use it to create better software.
In his 2014 article, Robert C. Martin highlighted the weighty influence and responsibility of software developers. He asked:
Will we ignore the power in our hands and remain a disorganized band of rumpled hapless vagabonds? Will we continue on our undisciplined course, blown by the chaotic winds of business and government, until one of us finally blunders badly enough to wake the sleeping giant of government regulation? Or will we recognize the power we have and decide to use it? And if the latter, will we use that power for good, or for evil? Or will we take responsibility for that power and promise only to wield it in service to our society?
The Obligation of the Programmer, Robert C. Martin
A decade later, it’s clear that government regulation is now awake and watching. The CRA and other regulatory frameworks signal that the unregulated freedom of the software industry is giving way to standards that prioritize security, resilience, and accountability. While regulation might seem burdensome at first, it’s also a response to our industry’s often chaotic approach to quality and security. By embracing these changes, we can set the stage for stronger software practices that don’t just comply with the law but actively build a safer, more reliable digital environment.
Why Regulation Became Inevitable
For years, the software industry operated with a level of freedom that fields like healthcare, finance, and aviation never had. While this allowed for rapid innovation, it often came at the expense of quality and security. Failures, data breaches, and vulnerabilities exposed the risks of relying solely on self-regulation, under which speed was routinely prioritized over safety. The CRA emerged as a formal call to address these risks, enforcing minimum standards for resilience and security in the systems our society depends on.
Examples of Software Quality Failures
The Log4j vulnerability discovered in 2021, known as Log4Shell (CVE-2021-44228), allowed attackers to execute arbitrary code on affected systems, causing widespread breaches across applications and infrastructures. This incident underscored the risks of third-party dependencies and the need for proactive vulnerability management.
In 2024, a faulty content update to CrowdStrike’s Falcon sensor crashed Windows systems worldwide, disrupting IT operations across sectors including airlines, hospitals, and financial institutions. This event emphasized the importance of thoroughly testing software updates, particularly in security products that run with deep system access.
In Switzerland, the Swisscom Emergency Call System experienced a major failure in 2020 and 2021, interrupting emergency services and delaying response times. This highlighted the critical need for reliable software in essential services and rigorous testing before deploying updates.
CRA Guiding Better Practices
At its core, the CRA aims to make software secure, resilient, and reliable. While it may feel restrictive, it’s a structured framework that helps create a more trustworthy digital landscape. Rigorous testing and validation are central to the CRA, ensuring software can handle real-world demands reliably and withstand unforeseen challenges. The CRA also emphasizes secure-by-design development, embedding security from the start instead of addressing it only when vulnerabilities arise. By prioritizing security early on, developers can build more resilient systems that handle threats proactively.
Another cornerstone of the CRA is continuous improvement: developers are urged to treat resilience as an ongoing commitment rather than a one-time goal, which aligns with practices that keep systems patched, up to date, and secure.
The Benefits of a Quality Culture
Adhering to CRA standards provides benefits far beyond regulatory compliance. Building a culture of quality within development teams fosters customer trust, strengthens reputation, and reduces the long-term costs associated with security breaches and failures. Leadership plays a key role in this shift by prioritizing security and quality as core values, providing the necessary resources, and embedding security into every phase of development.
A secure-by-design approach, where developers, security specialists, and stakeholders work together from the start, ensures compliance is integrated rather than added later. Automation supports this culture, with regular security checks and testing as part of the development pipeline. This proactive approach makes CRA compliance part of daily workflows, transforming it from a checkbox requirement into a team-wide commitment to quality.
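The automated checks described above can start very small: a single pipeline step that fails the build when a dependency matches a known advisory. The sketch below is a minimal illustration of that idea, not a real scanner; the advisory list and dependency data are hypothetical, and a production pipeline would query a maintained vulnerability database through a dedicated tool instead.

```python
# Minimal sketch of a CI gate that blocks known-vulnerable dependencies.
# The advisory data is illustrative only; real pipelines should rely on a
# maintained vulnerability database and a dedicated scanning tool.

# Hypothetical advisories: package name -> versions known to be vulnerable.
ADVISORIES = {
    "log4j-core": {"2.14.0", "2.14.1"},  # e.g. versions affected by Log4Shell
}

def find_vulnerable(dependencies):
    """Return (name, version) pairs that match a known advisory."""
    return [
        (name, version)
        for name, version in dependencies.items()
        if version in ADVISORIES.get(name, set())
    ]

def gate(dependencies):
    """Fail the build (non-zero exit) if any dependency is on the list."""
    hits = find_vulnerable(dependencies)
    if hits:
        details = ", ".join(f"{n}=={v}" for n, v in hits)
        raise SystemExit(f"Build blocked: vulnerable dependencies: {details}")

if __name__ == "__main__":
    # Example manifest; in CI this would be parsed from a lock file.
    gate({"log4j-core": "2.17.1", "requests": "2.31.0"})
    print("dependency gate passed")
```

Running such a check on every commit, rather than before releases only, is what turns compliance from a periodic audit into a daily habit.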
Turning Compliance into Leadership
The CRA isn’t just a regulatory hurdle; it’s a chance for software developers to lead in building safer, more resilient systems. With the growing role of software in daily life, society expects our systems to be secure, reliable, and trustworthy. By embracing CRA standards, developers can set new benchmarks for quality and accountability, building systems that meet today’s high expectations.
Rather than viewing regulation as restrictive, we can see it as a guidepost to strengthen our practices and create a foundation of trust. Answering this call to action, we set the stage for a culture of quality in software engineering that meets regulatory standards and moves the industry forward. After all, if software truly does "rule the world," it’s our duty to ensure it’s a world that’s safe and sustainable.