Let’s Start Designing More Secure Software and Systems

Feb. 14: Frederick R. Chang, professor and chair of the Department of Computer Science at SMU Dallas, and co-author Steven B. Lipner wrote a piece advocating for greater cybersecurity safeguards in software and systems. Published in Inside Sources: http://bit.ly/2SPK1G9

Cyber-insecurity is our new normal. Over the last decade we’ve suffered an alarming increase in the frequency and severity of cybersecurity threats and attacks. It’s estimated that by 2021 the cost of cybercrime will reach $6 trillion annually worldwide. The sooner we accept this and redouble our efforts to thwart breaches on the range of digital devices and platforms we use, the sooner we’ll make strides.

That’s why it is important to elevate one particular vulnerability that gives cyber attackers opportunity: the way we engineer systems and software.

Computer systems are critical to the function of cars, planes, medical devices, financial systems, power grids and countless other innovations. Each of our communications devices — phone, laptop or tablet — contains millions of lines of complex computer code. But the wonders of technology that make this possible and improve our lives are too short on defensive measures to safeguard them from hackers. Bad actors are able to attack systems because of two properties of the systems themselves:

—Programmers who create software make mistakes that let attackers manipulate that software and steal or manipulate information or crash systems or applications.

—Engineers and administrators who deploy and configure systems and networks make mistakes that let attackers bypass security controls.

Organizations and their security teams conduct reviews and run tests to detect and eliminate the problems before they lead to harm — but there are usually too many people creating problems and too many ways that problems can arise. Just as automobile manufacturers have realized that you can’t “inspect quality in” to a car at the end of the production line, information technology organizations must realize they can’t expect cybersecurity professionals to audit security into systems after they’re designed, built and operational.

The “cybersecurity skills gap” we have faced for years makes it difficult to staff a workforce capable of combating the level and intensity of attacks. One estimate predicts there will be 3.5 million unfilled cybersecurity positions next year. This skills shortage will not change for the foreseeable future, but even if it did, we would still need to use the scarce available talent in the most effective way possible. Taking a lesson from the automobile industry is one key way to do that.

Information technology organizations must use their security experts to enable engineers to build secure systems. As new kinds of security problems are discovered, the experts identify their causes and then create training, guidance and tools the developers can use to root out similar problems and to keep them from making the same mistakes in the future. Some software development organizations have implemented this approach successfully, and some have tried similar approaches to system and network configuration and operation.

Treating secure software and systems as the responsibility of software and system engineers can work well, but only if the engineering population is aware that security is part of their job. They must never bypass security in the quest for performance, rapid development or ease of use. If engineers understand that security is part of their job — and what that phrase means — then the expertise of the security people will be amplified. That won’t eliminate the need for more expert cybersecurity professionals, but it will make their contribution more efficient and effective, and help make our information and systems safer and more secure.

We must work to make cybersecurity breaches a rare occurrence rather than the spike we saw in 2019: In the first nine months nearly 8 billion records were exposed, an increase of more than 100 percent over the same period in 2018. We fear that the public is becoming numb to the constant barrage of hacks and data breaches. Organizations both large and small have generally done a good job at recovery and restoration following a data breach, and this is certainly a positive development.

But as we enter a new decade we must do more. While we must continue to be diligent about patching software, backing up data, minimizing downtime and the like, we must place more emphasis on enabling all architects and developers to design and create software and systems that are secure from the outset. Security must not be just a Band-Aid applied after the fact. Systems can be made more resilient to cyberattacks, and we must find the will to commit to doing so.

Frederick R. Chang is professor and chair of the Department of Computer Science at Southern Methodist University, and the Bobby B. Lyle Centennial Distinguished Chair in Cyber Security. Steven B. Lipner is the executive director of SAFECode, a nonprofit focused on software assurance. They wrote this for InsideSources.com.