Speed and Security Can Coexist in Mainframe DevOps

Source – devops.com

DevOps teams face a constant tug-of-war in their daily work: they must balance speedy rollouts of high-performing (fast, reliable) applications against the need to keep those applications secure. If the team moves too quickly, an overlooked security vulnerability may make its way into production. If the team is not nimble enough in identifying those security gaps, security checks can slow down the entire development process, hampering organizational agility.

The need to strike this critical balance has led to the rise of DevSecOps, or the integration of security into DevOps best practices through automated techniques. Much has been written about the importance of bringing the mainframe fully into the DevOps fold, evolving the platform to keep pace with the speed of modern software development. But what about the security of mainframe code and mainframe data specifically—incorporating the mainframe into the third leg of the DevSecOps stool?

A recent survey found that 72 percent of customer-facing applications are completely or very dependent on mainframe processing, which means a great deal of personal, sensitive information flows through mainframe code and into mainframe data repositories. The mainframe platform itself is the most secure system on the planet, due in large part to its unique function and design. IBM’s pervasive encryption is a groundbreaking development that enhances data security on the mainframe even further, enabling data at rest (in traditional VSAM files and databases) as well as data in transit to be encrypted at a pace of 12 billion transactions per day, all with no impact on system throughput.

Bulletproof Mainframe Security

Still, highly personal and sensitive data can be compromised if steps aren’t taken to ensure mainframe code and data are as inherently bulletproof as the supporting platform. What steps can help mainframe users achieve the seemingly conflicting goals of fast mainframe development and superior security?

Privatized Test Data

Speed is critical in all aspects of DevOps, including application testing. Mainframe transactions are the heartbeat of modern applications, making testing critical to ensure these components work properly. Many organizations prefer using live, mainframe-based production data in testing, as this provides the most realistic view of how an application will behave “in the wild.” However, in the rush to pull test data, many organizations have fallen into sloppy practices, including failing to disguise this data, which puts sensitive personal information at risk of exposure. Companies that must comply with the EU’s General Data Protection Regulation (GDPR) face another risk, non-compliance, since they may be using customer data for a purpose (application testing) beyond that for which it was originally collected.

Mainframe users can continue to use real production data in application testing, but they need to take special steps such as pseudonymization, which entails randomly scrambling certain values in a database (for instance, names) so that real customer names are dissociated from other personally identifiable information such as addresses or credit card numbers. Making it reasonably difficult to associate individual customers with their personally identifiable information is one way to achieve a higher level of information security for production data used in testing. Creating privatized test data sets must be a repeatable, automated process that lets testers quickly get the test data they need and expedite testing.
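
To make that concrete, here is a minimal Python sketch of one pseudonymization approach: deriving stable, irreversible tokens from identifying fields with a keyed HMAC. The field names and key handling are hypothetical, not any specific product’s API.

```python
import hashlib
import hmac
import secrets

# Hypothetical secret key; in practice this would live in a key vault,
# never alongside the test data it protects.
PSEUDO_KEY = secrets.token_bytes(32)

def pseudonymize(value: str) -> str:
    """Replace a real value with a stable, irreversible token.

    A keyed HMAC (rather than a plain hash) means the mapping cannot be
    rebuilt by anyone who lacks the key, while identical inputs still map
    to the same token, preserving referential integrity across test tables.
    """
    digest = hmac.new(PSEUDO_KEY, value.encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()[:16]

def privatize_record(record: dict) -> dict:
    """Scramble the directly identifying fields of one customer record."""
    sensitive_fields = ("name", "address", "credit_card")  # hypothetical schema
    return {
        field: pseudonymize(value) if field in sensitive_fields else value
        for field, value in record.items()
    }

if __name__ == "__main__":
    production_row = {
        "name": "Jane Doe",
        "address": "1 Main St",
        "credit_card": "4111111111111111",
        "purchase_total": 129.95,  # non-identifying values pass through unchanged
    }
    print(privatize_record(production_row))
```

The deterministic tokens are a deliberate design choice: scrambled values stay consistent across data sets, so joins and lookups in the test environment still behave as they would in production.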

Credentialed User Auditing, User Role Management

The DevOps concept entails more frequent code changes, which naturally leads to more testing and to more developers and testers accessing data and applications. Unfortunately, the involvement of more team members, all working at a faster pace, can sometimes lead to carelessness and inadvertent security risks. Techniques are now available that allow organizations to see exactly what credentialed users did and what data they accessed. Some may view this as bordering on “Big Brother” behavior, but with so much sensitive data continuing to reside on the mainframe and compliance requirements growing (GDPR, for example), such auditing is vital to protecting both the enterprise and its workers, the majority of whom have no ill intent.
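
As a rough illustration, the sketch below logs a structured audit event every time a credentialed user touches a protected resource. The decorator, resource names, and logger destination are hypothetical; on an actual mainframe such events would more likely flow into SMF records or a SIEM.

```python
import json
import logging
from datetime import datetime, timezone
from functools import wraps

# Hypothetical audit sink; a real deployment would ship these events to
# tamper-evident storage rather than the console.
logging.basicConfig(level=logging.INFO, format="%(message)s")
audit_log = logging.getLogger("audit")

def audited(resource: str):
    """Record which credentialed user performed which action on a resource."""
    def decorator(func):
        @wraps(func)
        def wrapper(user: str, *args, **kwargs):
            event = {
                "timestamp": datetime.now(timezone.utc).isoformat(),
                "user": user,
                "resource": resource,
                "action": func.__name__,
            }
            audit_log.info(json.dumps(event))  # write the trail before acting
            return func(user, *args, **kwargs)
        return wrapper
    return decorator

@audited(resource="customer_test_dataset")  # hypothetical resource name
def read_test_data(user: str, dataset_id: str) -> str:
    return f"contents of {dataset_id}"  # placeholder for a real data fetch

if __name__ == "__main__":
    read_test_data("dev_alice", "TEST.CUST.PRIVATIZED")
```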

However, user auditing comes with an important prerequisite: better management of user roles. “Least privilege” is a long-held security best practice of restricting users’ access to only those resources absolutely required for them to do their jobs. The challenge is that when workers change roles, they often are not removed from their previous role definitions and privileged access points; they are simply given new ones on top. Since they may no longer need their old credential sets, retaining them violates the least-privilege concept and represents a significant security threat that auditing alone may not catch. The benefits of auditing, in terms of fostering faster testing and overall development without compromising security, depend on strong user role management.
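
The following hypothetical sketch illustrates the discipline described above: assigning a new role replaces the old one outright rather than accumulating entitlements. The role names and grants are invented for illustration.

```python
# Hypothetical in-memory role store illustrating "revoke before grant":
# a role change must remove old entitlements, not just add new ones.
ROLE_ENTITLEMENTS = {
    "tester": {"read_test_data"},
    "developer": {"read_test_data", "promote_code"},
    "release_manager": {"promote_code", "approve_release"},
}

user_roles: dict[str, set[str]] = {}

def assign_role(user: str, new_role: str) -> None:
    """Move a user to a new role, dropping all previous roles first."""
    user_roles[user] = {new_role}  # replacement, not accumulation

def entitlements(user: str) -> set[str]:
    """Resolve a user's effective grants from their current roles."""
    grants: set[str] = set()
    for role in user_roles.get(user, set()):
        grants |= ROLE_ENTITLEMENTS.get(role, set())
    return grants

if __name__ == "__main__":
    assign_role("alice", "developer")
    assign_role("alice", "release_manager")  # role change; old grants revoked
    assert "read_test_data" not in entitlements("alice")
    print(sorted(entitlements("alice")))  # ['approve_release', 'promote_code']
```

The point is the replacement semantics in assign_role: the common real-world failure is accumulating grants across job changes, which the assertion above would catch.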

Automated Rollouts/Rollback

Automated rollouts make the process of releasing code into production faster and more efficient. A dwindling pool of mainframe specialists means that, in many cases, workloads spend significantly more time queued up waiting for available resources than they do actually being implemented. Removing the need for a developer to manually push new updates into production can go a long way toward speeding up the mainframe cog in the DevOps machine. However, approval layers and audit capabilities must be built in to ensure code security has been tested and verified, and that the source of any potential problem can be traced back quickly and accurately. Automated rollback functionality is also critical in the event that a security flaw does make its way into production, enabling developers to immediately roll back to a previous version of the software until the flaw is fixed.
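
A minimal sketch of that flow follows: the rollout is gated on a verified security scan and a recorded approval, and a failed post-deployment check triggers an immediate, automatic rollback. All of the functions are hypothetical stand-ins for real pipeline tooling.

```python
# Hypothetical rollout pipeline with a built-in approval gate and
# automatic rollback; each stub stands in for real deployment tooling.

def security_scan_passed(version: str) -> bool:
    return True  # stand-in for an automated security verification step

def approvals_granted(version: str) -> bool:
    return True  # stand-in for a recorded, auditable approval layer

def deploy(version: str) -> None:
    print(f"deploying {version}")

def health_check(version: str) -> bool:
    return False  # pretend a flaw surfaced after deployment

def rollout(new_version: str, previous_version: str) -> str:
    # Gate the release: no deployment without a passed scan and approval,
    # so every production change is both verified and traceable.
    if not (security_scan_passed(new_version) and approvals_granted(new_version)):
        return previous_version

    deploy(new_version)
    if not health_check(new_version):
        # Automated rollback: restore the last known-good version
        # immediately while the flaw is investigated and fixed.
        deploy(previous_version)
        return previous_version
    return new_version

if __name__ == "__main__":
    active = rollout("app-v2.4.1", "app-v2.4.0")
    print(f"active version: {active}")
```

In practice the health check would be a smoke test or a monitoring signal; the structure of the gate-deploy-verify-rollback loop matters more than the stubs.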

Much progress has been made in recent years in evolving the mainframe for DevOps, as several examples can attest. But given the sensitive personal data that often runs through, and resides in, mainframe code and data, the scope must expand to DevSecOps, and the steps described here represent a solid initial plan for getting there. Organizations that take them will enjoy the full benefits of a highly agile, secure mainframe environment delivering unparalleled scalability and reliability for their digital transformation initiatives.
