When I read Jen Easterly's congressional testimony from January 31, 2024, I was quite surprised. Check out this quote:
Unfortunately, the technology base underpinning much of our critical infrastructure is inherently insecure, because for decades software developers have been insulated from responsibility for defects in their products. This has led to misaligned incentives that prioritize features and speed to market over security, leaving our nation vulnerable to cyber invasion. That must stop.
The discussion over liability for software has been going on for a long time. Dale Peterson touched on it in his post last week.
Our world depends on software – for which there are varying degrees of quality control. And the companies selling this software intentionally disclaim responsibility for their products via EULAs. Remarkably – but not unexpectedly – this is true even in the ICS space. Here is an excerpt from a leading ICS vendor’s EULA. I don’t intend to pick on this vendor alone, because this type of language is standard practice across the industry.
VENDOR makes no representation or warranty, express or implied, that the operation of the Software will be uninterrupted or error free, or that the functions contained in the Software will meet or satisfy Your intended use or requirements; You assume complete responsibility for decisions made or actions taken based on information obtained using the Software. In addition, due to the continual development of new techniques for intruding upon and attacking networks, VENDOR does not warrant that the Software or any equipment, system or network on which the Software is used will be free of vulnerability to intrusion or attack.
The key passage there: “VENDOR does not warrant that the Software… will be free of vulnerability.”
Can you imagine requiring drivers who cross a high suspension bridge to sign a user license that says “we do not warrant that this bridge will support your vehicle”?
In my CYBR 3383 Security Design Principles class, one of the principles we highlight is “professional liability,” which means putting one’s name, reputation, honor, and career on the line.
Putting your name on the line is the standing expectation for engineers who sign engineering documents, and it extends to other fields as well.
In general, professional liability is missing from the software industry.
While I am not an expert in software development, I observe that there is little support for software engineering licensure. I am not saying a license should be required in all cases, but certain software runs processes upon which millions of people depend for clean water and reliable electricity every day. Shouldn’t there be some overarching minimum professional standard for the people who write that software and for their work?
Easterly envisions:
We must drive toward a future where defects in our technology products are a shocking anomaly, a future underpinned by a software liability regime based on a measurable standard of care and safe harbor provisions for software developers who do responsibly innovate by prioritizing security.
I’d like to hear more about that…