Four common barriers that impede DevOps at scale
Source – computing.co.uk
The software producers that report the best performance across a variety of metrics are those that release code most frequently. So says Rob Vanstone, technical director at continuous delivery vendor XebiaLabs, speaking at the Computing DevOps Summit this week.
The aim, then, is to iterate faster. The problem for large organisations is how to shorten the software delivery lifecycle at scale.
“What we see is teams starting to do pockets of innovation, typically scripting, typically automation, and then saying ‘hey, look at us’,” he explained. “Management sees those poster children and says ‘we want more of that’, and that’s where the problem starts.”
What works well at a micro level does not necessarily scale up. When you start to ‘industrialise’ DevOps the toolchain becomes stretched and integration between the many components becomes an issue.
In addition to the toolchain there are a number of organisational factors common to most large enterprises that militate against shortening the software delivery lifecycle at scale. Vanstone listed four of the main culprits that XebiaLabs and its implementation partners have identified.
The first is the continued existence of old ways of thinking that clash with more Agile ways of working.
“Although the team may have moved to Scrum, the overall mindset is still Waterfall, with a clear division of skills and labour and no shared ownership,” he said. Such a mindset puts too much of the focus on the technology and not enough on meeting business needs, with the result that the desired outcomes – quicker delivery of better software – suffer.
Second is the fact that the necessary step change is hard to drive through.
“We still see a lot of silos in large organisations, and we see lots of pockets of development where they’re iterating and doing lots of incremental changes, which is necessary, but how do you industrialise that?” he asked.
It is essential that the leadership is fully involved to drive the change through, said Vanstone, and not just playing an observational role.
“We’re not just talking dashboards, but the whole decision making process.”
Third, teams are often afflicted by what Vanstone called “a false sense of maturity”, their efforts divorced from any tangible benefits.
“Generally [the innovation] is happening in pockets and it’s teams saying ‘hey, look, we’ve automated something and can do what used to take three weeks in three hours’,” he said.
“They think they’re mature if they have a level of automation, irrespective of their ability to deliver business outcomes.”
“IT maturity should result in business value or a better experience,” he went on. “If this is not happening then the case for further investment in automation is questionable.”
The fourth common problem is undervaluing QA and testing as a key part of the pipeline. While this may be just about manageable at small scale, as DevOps scales up it will become a serious sticking point – as well as a possible security risk.
“We’ve seen that the central testing practice lacks automation maturity and alignment to DevOps,” he said. “The focus is more on user acceptance testing and integration testing which, while necessary, do nothing to speed the delivery cycle.”
One of the ways to improve that situation is to have proper KPIs and clear goals for what the automation is supposed to achieve.
The tools need to be “joined together in a coherent way so you can get your software in front of your customer as quickly and as safely as you can,” he said.
“What’s really important is to gather data about the process. We can use the data intelligently to show you what you are doing and how you can improve.”
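The article does not specify which KPIs Vanstone has in mind, but the idea of measuring the process rather than just the automation can be illustrated with a minimal sketch. The release records, field names and metrics below are illustrative assumptions, not part of the talk: two commonly tracked delivery measures are lead time (commit to production) and deployment frequency.

```python
from datetime import datetime

# Hypothetical release records: when a change was committed
# and when it reached production.
releases = [
    {"committed": datetime(2017, 6, 1, 9, 0), "deployed": datetime(2017, 6, 2, 15, 0)},
    {"committed": datetime(2017, 6, 3, 10, 0), "deployed": datetime(2017, 6, 3, 18, 0)},
    {"committed": datetime(2017, 6, 5, 11, 0), "deployed": datetime(2017, 6, 7, 9, 0)},
]

def lead_time_hours(release):
    """Elapsed time from commit to production deployment, in hours."""
    return (release["deployed"] - release["committed"]).total_seconds() / 3600

def average_lead_time(releases):
    """Mean lead time across all recorded releases."""
    return sum(lead_time_hours(r) for r in releases) / len(releases)

def deployment_frequency(releases, window_days):
    """Deployments per week over the observation window."""
    return len(releases) / (window_days / 7)

print(f"Average lead time: {average_lead_time(releases):.1f} hours")
print(f"Deployment frequency: {deployment_frequency(releases, 7):.1f} per week")
```

Trends in figures like these, rather than the mere existence of automation, are what would show whether a team is actually delivering faster and where to improve.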