Does DevOps Plus Open Source Equal Security?

Source – forbes.com

There is a cost to prioritizing speed over software quality.

Today’s CIO is in a tough spot. With IT management among the fastest-growing professions, according to the U.S. Bureau of Labor Statistics, more newly promoted CIOs are entering the job market than ever before. On top of this influx of fresh CIOs with limited experience managing increasingly complex IT issues, CEOs and boards are putting more pressure on IT organizations to deliver updates faster while keeping software agile and guaranteeing software security to prevent damaging incidents like the Equifax breach.

As a likely consequence, we’ve seen the buzz around DevOps reach a new echelon of popularity. Everyone is trying to capitalize on this silver bullet to achieve digital transformation and release new features and functionality at speeds previously unreachable. This is the new way of doing business — competing on speed.

Many IT organizations are now adopting DevOps simply because the business has instructed them to, or because it is the answer they hear from their peers and the market. Enterprise IT has historically been eager to believe in its own version of Santa Claus, i.e., a magical solution that will suddenly transform old, heavy infrastructures into lightweight, fast and agile platforms. And let’s be honest: If agile was the Santa of the 2000s, DevOps is the Kris Kringle of 2017.

Hence, there’s an incredible amount of pressure for companies to modernize IT through DevOps. This is leading more CIOs and IT organizations to look toward Silicon Valley and the Fortune 100 for what drives success. We have all heard the stories of Netflix releasing new software packages at least 3,000 times per month and Amazon reaching 11,000 releases annually. While this has created lots of value for these companies very quickly, their pace has remained largely out of reach for enterprise CIOs.

A big reason for that is software complexity. Enterprise software today is incredibly complex. The typical enterprise software system, say, at a large banking organization, is composed of many layers of software added over time by different teams, with each team not necessarily understanding the full scope of the application they have enhanced or the existing vulnerabilities that may resurface because of newly added code.

But now that being agile and delivering innovation at speed have reached board-level status, the voice of the development team is, more often than not, drowned out by naïve business-level decisions that don’t account for software complexity. Even leading analyst firms are hopping on the bandwagon. Forrester, for example, has reported (registration required) that, “Agile and DevOps practices enable innovation at speed, with quality” and that, “Applying agile and DevOps practices enables faster delivery, higher quality and lower risk.”

This pressure on development teams to become agile and work at DevOps speeds has also led to an increase in the use of open-source software. Because open-source components provide ready-made implementations of complex tasks, development teams can build more software, faster. In these cases, open source is an enabler of DevOps.

However, there is a hidden danger in relying more heavily on software you haven’t developed: it typically carries performance and security risks that must be properly identified and fixed before an application goes into production. And when they aren’t, the consequences can be severe, as we saw with Equifax.

While one could rightly argue that open-source software is among the most heavily vetted in the market, there are often only sparse records of which version of that software a given IT organization has adopted. This makes it difficult for teams to test the correct version, leaving them to guess about the quality and security of the application they are building. Add to this the fact that older applications may have been built on open-source components in which new security issues have since been identified, and the CIO’s task becomes even harder.
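That guesswork disappears when the exact components and versions in each build are recorded automatically. The following is a minimal sketch of the idea in Python, using only the standard library’s importlib.metadata module; the output file name component-inventory.json is an illustrative assumption, not any particular vendor’s format.

```python
# Minimal sketch (not any specific vendor's tooling): record exactly which
# open-source components, and which versions, a Python service ships with,
# so later testing and auditing aren't guesswork.
# Assumes Python 3.8+ (importlib.metadata is part of the standard library).
import json
from importlib import metadata


def component_inventory():
    """Return a sorted list of {name, version} entries for every installed distribution."""
    entries = [
        {"name": dist.metadata["Name"], "version": dist.version}
        for dist in metadata.distributions()
    ]
    return sorted(entries, key=lambda entry: (entry["name"] or "").lower())


if __name__ == "__main__":
    # Write the inventory next to the build artifacts so it ships with every release.
    # The file name "component-inventory.json" is an illustrative choice.
    with open("component-inventory.json", "w") as fh:
        json.dump(component_inventory(), fh, indent=2)
```

A manifest like this, generated as part of every build, gives testers and auditors an exact record of what is actually running instead of a reconstruction after the fact.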

Unless developers are logging the exact open-source code they use in an automated fashion, compiling that information later becomes a best-guess exercise, leaving the origins of the software unclear or unknown. This means security vulnerabilities documented by groups like OWASP and NIST can slip by unnoticed and leave business-critical software open to attack.
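Once such an inventory exists, checking it against published advisories can also be automated. Below is a minimal sketch that queries OSV.dev, one example of a public vulnerability database, for each recorded component; the inventory file name and the PyPI ecosystem are assumptions carried over from the sketch above, and a real pipeline would more likely use a dedicated software-composition-analysis tool than this hand-rolled check.

```python
# Minimal sketch: check a recorded component inventory against a public
# vulnerability database. This example queries the OSV.dev API
# (https://api.osv.dev/v1/query); the inventory file name and the "PyPI"
# ecosystem are illustrative assumptions carried over from the sketch above.
import json
import urllib.request

OSV_QUERY_URL = "https://api.osv.dev/v1/query"


def known_vulnerabilities(name, version, ecosystem="PyPI"):
    """Return the advisories OSV.dev reports for one pinned component."""
    payload = json.dumps(
        {"version": version, "package": {"name": name, "ecosystem": ecosystem}}
    ).encode("utf-8")
    request = urllib.request.Request(
        OSV_QUERY_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response).get("vulns", [])


if __name__ == "__main__":
    with open("component-inventory.json") as fh:
        inventory = json.load(fh)

    for entry in inventory:
        for vuln in known_vulnerabilities(entry["name"], entry["version"]):
            # Each advisory carries an identifier such as a CVE or GHSA ID.
            print(entry["name"], entry["version"], vuln.get("id"))
```

Run routinely in the build pipeline, a check like this turns “which versions are we shipping, and what is known to be wrong with them?” from a forensic exercise into a standard report.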

Hackers and cybercriminals are aware of these vulnerabilities, as well as where they live in open-source code.

A common practice among hackers is to identify these security vulnerabilities in open source and then follow companies that appear to be using open-source code as a foundation for custom software development. All they have to do is monitor and wait for the right moment to exploit the vulnerability. The result might be a ransomware attack or the theft of sensitive information, as we saw with Equifax.

According to analyst firm Canalys, “High-profile ransomware attacks and increasingly sophisticated phishing techniques have proved the need for businesses to reinforce their IT security to safeguard data assets and ensure continuity of operation.”

So as CIOs continue to search for the next Santa Claus to deliver value faster, they must not forget to address software complexity and the security risks in open-source components.
