The How and Where of DevOps Automation
Source – itbusinessedge.com
The key challenge to automating a DevOps environment is figuring out what to automate, and how. Multiple platforms are available to help in this quest, of course, but in the end, the enterprise needs to craft its own unique solution based on legacy infrastructure, operational goals, corporate culture, and a host of other factors.
And as a steady stream of financiers of a certain fictional prehistoric biological theme park keeps finding out, over-reliance on automation can be just as problematic as under-reliance.
According to a recent survey by software firm Quasi, about a third of DevOps teams say they have no structured automation program in place yet, and among those that do, only about a third claim to have self-service capabilities for R&D and dev/test processes. This is a problem because, without automation, DevOps cannot achieve the kind of agile development and operational flexibility needed to support next-generation applications and workflows. It is telling that the survey also reports that half of DevOps teams wait up to a month to get access to infrastructure, while the most forward-leaning organizations are cutting this down to mere hours.
Of course, automation exists in many forms in the enterprise and has for many years. The difference between today’s automation and past iterations, however, is that the changes happening now are fundamentally altering the way data environments are built, managed and utilized. Puppet CEO Sanjay Mirchandani noted recently that legacy automation is often too isolated to have a meaningful impact on performance. A more pervasive level of automation is now emerging thanks to virtualization and software-defined infrastructure, which allow resources to be managed as code rather than as discrete elements in a data chain. Already, companies like Fannie Mae are seeing 20 percent performance gains and 30 percent faster development times just by using the current crop of integrated automation tools.
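The "resources as code" idea can be sketched in miniature: a desired state is declared as data, then reconciled against the actual state so that only the differences are applied. The resource names and the `reconcile` function below are invented for illustration; real tools such as Puppet or Terraform follow this desired-state pattern at far larger scale.

```python
# Minimal sketch of declarative, infrastructure-as-code reconciliation.
# All resource names and fields here are hypothetical examples.

desired = {
    "web-01": {"cpu": 4, "memory_gb": 16},
    "web-02": {"cpu": 4, "memory_gb": 16},
    "db-01":  {"cpu": 8, "memory_gb": 64},
}

actual = {
    "web-01": {"cpu": 2, "memory_gb": 16},   # under-provisioned
    "db-01":  {"cpu": 8, "memory_gb": 64},   # already matches the spec
    "old-01": {"cpu": 1, "memory_gb": 2},    # no longer declared
}

def reconcile(desired, actual):
    """Return the create/update/delete actions needed to converge."""
    actions = []
    for name, spec in desired.items():
        if name not in actual:
            actions.append(("create", name, spec))
        elif actual[name] != spec:
            actions.append(("update", name, spec))
    for name in actual:
        if name not in desired:
            actions.append(("delete", name, None))
    return actions

for action in reconcile(desired, actual):
    print(action)
```

Because the spec is just data, it can be versioned, reviewed, and applied repeatedly with the same result, which is what makes this style of automation pervasive rather than isolated.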
Automation is more than just technology, however. It takes a concerted effort by all stakeholders to see it through, even if it means the end of long-held industry practices. A key step in this direction is the development of a working governance model, which a recent report from Capgemini and Sogeti has distilled into six key components:
- Compliance by Design – every organization has unique security and data-handling requirements, and these must extend across cloud environments as well
- Insights – resource consumption and utilization will no longer be fixed, so you’ll need a way to gauge exactly what your automated systems are doing, and how much it costs
- Continuous Monitoring – more than just keeping an eye on things, CM should proactively identify risk and compliance issues so they can be resolved automatically
- Automated Provisioning – to enable applications to move from dev to test to production environments quickly without tying up unnecessary resources
- Integrated CI/CD – script- and template-based DevOps reduces deployment time and supports continual analysis of workflows using an integrated audit trail
- Team Support – proper governance is the responsibility of the entire team and will likely lead to new roles and new responsibilities
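Several of these components, compliance by design and continuous monitoring in particular, come down to encoding policy as executable checks whose results feed an audit trail. Here is a hedged sketch of that idea; the rule names, resource fields, and audit structure are all invented for illustration, not taken from any real governance tool:

```python
# Illustrative policy-as-code check: compliance rules are plain functions,
# and every evaluation is recorded so an audit trail accumulates.
# Rule names and resource fields are hypothetical examples.

AUDIT_LOG = []

RULES = {
    "encryption-at-rest": lambda r: r.get("encrypted", False),
    "no-public-access":   lambda r: not r.get("public", True),
    "tagged-owner":       lambda r: "owner" in r.get("tags", {}),
}

def check_compliance(name, resource):
    """Evaluate every rule against one resource; log and return failures."""
    failures = [rule for rule, ok in RULES.items() if not ok(resource)]
    AUDIT_LOG.append({"resource": name, "failures": failures})
    return failures

bucket = {"encrypted": True, "public": True, "tags": {"owner": "data-team"}}
print(check_compliance("storage-bucket-7", bucket))
```

Running the check on this sample resource flags only the public-access rule, and the same evaluation lands in `AUDIT_LOG`, giving the continual analysis of workflows the report describes.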
It is important to realize that automation does not give the enterprise the luxury of simply setting the data ecosystem on autopilot and forgetting about it. The true value of automation is not to ease the burden on techs and admins but to enhance their ability to make dynamic changes to the data environment in pursuit of new services and new market opportunities.