DevOps transformation headaches: who you gonna call? Silo busters

Source: itproportal.com

In the classic Ghostbusters movies, parapsychologists save New York City from being overrun by ghosts. While there may well be some long-deceased software lurking on a dusty mainframe, today’s large organisations are not being held to ransom by ghosts but by the proliferation of data silos. Many of these new silos are an unexpected fallout from the digital transformation drive sweeping big business. The task of silo busting just got tougher.

We’re now in a world where Chief Transformation Officers (CTOs) are in the hot seat and two-speed (or bi-modal) IT is the name of the game. DevOps projects along the lines of those successfully implemented by start-ups and unicorns are injecting a faster pace of change and greater agility into large organisations, with the goal of improving competitiveness. But what works for unicorns isn’t turning out to be quite as smooth a journey for many brownfield enterprises.

Of course, there are some great examples of customer-facing applications built on DevOps foundations that are delivering good results for large organisations. But drill down a little and it’s likely that the data used by these new applications is in a shiny new silo that is not integrated with the other enterprise data residing in legacy silos.

Breaking down silos

This approach goes against one of the CTO’s priorities – breaking down silos. They know that the valuable data amassed in these silos could deliver an advantage over competitors both new and old, if only the silos could be easily integrated to deliver a single, 360-degree view of the data.

As it’s unrealistic for large enterprises to try to compete by throwing out their legacy IT and starting from scratch with a DevOps mindset, they need a way to integrate the vital ‘mode one’ data stored in core legacy CRM, contract management, and billing systems with the new ‘mode two’ data being collected in contemporary business-centric applications (many of which are customer-facing and may have begun life in the marketing rather than the IT department).

While large organisations don’t need a team of people in brown boiler suits with proton packs to exorcise their software ghosts, they could do with the IT equivalent – a silo buster. Taking a silo-busting approach to data management can spirit away data integration headaches.

Let’s take a high-end car manufacturer as an example. Suppose it wants to improve the customer experience in light of increased competition from newer kids on the block such as Tesla. In response, the firm launches a slick multi-platform, multi-device application, built using DevOps principles, which makes it easier for car buyers to check out new models, do comprehensive car comparisons, price up a customised model, and book a test drive. But to drive real value from this platform, all the data generated from these customer interactions needs to be integrated with the manufacturer’s existing dealership data, which lies in legacy silos, including CRM systems, billing systems, production planning databases and so on.

Without the ability to get a 360-degree view, the firm will struggle to see that Mr. Carmad in Birmingham, who has just booked a test drive, was a trusted lease hire customer for many years before suddenly disappearing from view in 2015. Having all of these rich customer insights makes it much easier to personalise the engagement with Mr. Carmad and bring him back to the fold.

The truth is that these new apps can’t live independently from the legacy applications if the business is to thrive. So how can large enterprises blend the more static but resilient legacy IT systems with the newer, more agile and perhaps more volatile ones?

A data-centric approach

There are two ways in which CTOs can approach this two-speed IT disconnect:

  • Focus on creating processes that establish a model for integration. The goal is to ensure that the data collected in one app gets copied over to the old system(s) and stays in both silos. However, the value of this approach is limited because it supports no relationships or queries across silos, and any transactions spanning silos are complex and cumbersome to orchestrate and manage.
  • Take a data-centric approach and aggregate all the data from the old and new silos in an operational data hub (ODH). The hub works as the glue between the old and new IT systems, creating a virtual filing cabinet of all the organisation’s data that can be mixed and matched to unearth new business-critical insights. This model provides the best outcome for organisations undergoing digital transformation, because it overcomes the data integration challenges yet offers the utmost flexibility.

This data-centric approach is made possible by building an operational data hub on top of an enterprise NoSQL database that supports all data types. All data from the new, DevOps-based applications is stored directly in the hub, while data from legacy applications – which remains in its own database silos under, for example, the firm’s CRM and billing systems – is virtualised in the hub on demand. This aggregated view makes it easy to search and analyse ever-changing tracts of data across silos to provide actionable data insights.
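
To make this pattern more concrete, here is a minimal, illustrative Python sketch of a hub that stores new-application documents directly and virtualises legacy records into the same shape on demand, so that one query spans both. Every name in it (OperationalDataHub, ingest, register_legacy_source, the Mr. Carmad sample records) is invented for illustration and does not correspond to any particular vendor’s API.

    # Illustrative operational data hub: documents from new apps are stored
    # directly; legacy rows are mapped ("virtualised") into the same document
    # shape only when a query needs them. All names here are hypothetical.

    class OperationalDataHub:
        def __init__(self):
            self.documents = []       # documents written directly by new apps
            self.legacy_sources = []  # (row fetcher, row-to-document mapper) pairs

        def ingest(self, document):
            """Store a document from a new, DevOps-built application as-is."""
            self.documents.append(document)

        def register_legacy_source(self, fetch_rows, mapper):
            """Register a legacy silo: a row fetcher plus a mapper to hub shape."""
            self.legacy_sources.append((fetch_rows, mapper))

        def search(self, predicate):
            """Query across new documents and virtualised legacy records alike."""
            for doc in self.documents:
                if predicate(doc):
                    yield doc
            for fetch_rows, mapper in self.legacy_sources:
                for row in fetch_rows():      # pulled from the silo on demand
                    doc = mapper(row)         # normalised into the hub's shape
                    if predicate(doc):
                        yield doc

    # Combine a test-drive booking from the new app with a lease-history
    # record held in a (stand-in) legacy CRM silo.
    hub = OperationalDataHub()
    hub.ingest({"source": "configurator-app", "customer": "Mr. Carmad",
                "event": "test_drive_booked", "model": "GT Coupe"})

    legacy_crm_rows = [("Mr. Carmad", "lease", "2009-2015")]
    hub.register_legacy_source(
        fetch_rows=lambda: legacy_crm_rows,
        mapper=lambda row: {"source": "legacy-crm", "customer": row[0],
                            "relationship": row[1], "period": row[2]},
    )

    for doc in hub.search(lambda d: d.get("customer") == "Mr. Carmad"):
        print(doc)

The point is that applications and analysts talk to one interface; reaching into each individual silo is the hub’s job, not the application code’s.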

Predictably this model still demands some new processes, but these aren’t onerous. With the right underlying database, applications are straightforward to program and there is no need to write to many different APIs, because the database takes the strain.

It also eases the growing concern within many large organisations about the widening gap in expertise between the ‘mode one’ IT staff, who are steeped in COBOL, mainframes and SAP and close to pension age, and the ‘mode two/DevOps’ Shoreditch hipsters, who are not long out of university, expert in JavaScript and Scrum, and unused to positions of great responsibility. As the mode one teams start to retire, the presence of an operational data hub as a ‘mode 1.5’ transformation midpoint will help to ensure that the data residing in the legacy systems remains accessible to the new generation of IT teams via the hub’s APIs, until such time as the legacy systems are themselves retired.

Choosing the right enterprise NoSQL database can be confusing: some offer far more functionality than others and can help not only with digital transformation challenges but also with regulatory compliance and governance. A database that supports full ACID transactions, as well as ‘five 9s’ availability, elasticity, scalability and disaster recovery, should be the starting point. On top of that it makes sense to choose one that features:

  • Security and privacy. Legacy systems were typically so isolated that security wasn’t a major concern. By contrast, newer customer-facing applications are more vulnerable to being hacked, as we know only too well from the ever-increasing list of high-profile breaches that make the headlines. Virtualising the legacy data in the data hub, where it mixes with the ‘new’ data, could compound these security challenges.
    Many NoSQL database vendors promote their products’ security functionality, even though it is not fit for purpose in today’s breach-ridden landscape. Be sure to select a database platform that incorporates government-grade security, offering the strongest possible protection against attempts by external hackers or disgruntled employees to wreak havoc with your data.
    Also, choose a database that allows your organisation to meet the requirements of the EU General Data Protection Regulation (GDPR), which sets out new responsibilities for how organisations handle personally identifiable information (PII); a simple redaction sketch follows this list.
  • Search, semantics and geospatial. Having all this rich data in one place is not much use unless it can be searched and analysed quickly and easily. Today a database platform that offers Google-like search is a prerequisite, while integrated semantic and geospatial capabilities are rapidly becoming indispensable too. Semantics makes it easier to discover new relationships, patterns and trends in your data, while filtering and layering geospatial data/content such as text, imagery, and video adds an extra dimension to information analysis for customer/citizen 360 applications.
  • Bitemporal. This capability allows organisations in regulated sectors such as financial services to minimise risk through ‘tech time travel’: time-stamping and rewinding documents to identify changes by looking at data as it was at any point in the past (a minimal sketch of the idea also follows this list). For example, investment banks can use bitemporal capabilities in the regulator-mandated reporting applications that rely on checking and reconciling data drawn into the data hub from legacy systems.
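
As noted in the security and privacy point above, GDPR puts the onus on organisations to control who sees PII. Here is a deliberately simple, hypothetical Python sketch of element-level redaction applied as documents leave a hub; the field names, roles and redact() helper are invented, and in a real platform this would be enforced by built-in security policy rather than application code.

    # Hypothetical role-based redaction of PII before a document is returned.
    PII_FIELDS = {"name", "email", "address", "date_of_birth"}

    def redact(document, role):
        """Return a copy of the document with PII masked for unprivileged roles."""
        if role == "data-protection-officer":   # privileged role sees everything
            return dict(document)
        return {key: ("***REDACTED***" if key in PII_FIELDS else value)
                for key, value in document.items()}

    customer = {"name": "Mr. Carmad", "email": "carmad@example.com",
                "model_of_interest": "GT Coupe"}

    print(redact(customer, role="marketing-analyst"))
    # -> name and email masked, model_of_interest left intact
    print(redact(customer, role="data-protection-officer"))
    # -> the full record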
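
And here is a minimal sketch of the bitemporal idea, again in Python and again purely illustrative: each version of a record carries a valid-time range (when the fact was true in the real world) and a system-time range (when the database held that version), so an ‘as of’ query can rewind along both axes. The dates, field names and as_of() helper are invented for this example.

    from dataclasses import dataclass
    from datetime import date

    FAR_FUTURE = date.max   # stands in for "still true" / "current version"

    @dataclass
    class Version:
        data: dict
        valid_from: date
        valid_to: date      # exclusive end of real-world validity
        system_from: date
        system_to: date     # exclusive end of the database holding this version

    def as_of(versions, valid_at, known_at):
        """Return the record as true at valid_at, as the system knew it at known_at."""
        for v in versions:
            if (v.valid_from <= valid_at < v.valid_to and
                    v.system_from <= known_at < v.system_to):
                return v.data
        return None

    # A lease record whose end date was corrected after the fact.
    history = [
        Version({"customer": "Mr. Carmad", "status": "lease"},
                valid_from=date(2009, 1, 1), valid_to=FAR_FUTURE,
                system_from=date(2009, 1, 1), system_to=date(2016, 3, 1)),
        Version({"customer": "Mr. Carmad", "status": "lease"},
                valid_from=date(2009, 1, 1), valid_to=date(2015, 6, 30),
                system_from=date(2016, 3, 1), system_to=FAR_FUTURE),
    ]

    # What the bank believed in September 2015, versus what it knows today.
    print(as_of(history, valid_at=date(2015, 9, 1), known_at=date(2015, 9, 1)))
    # -> the old belief: still on lease
    print(as_of(history, valid_at=date(2015, 9, 1), known_at=date(2024, 1, 1)))
    # -> None: the corrected record says the lease had already ended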

As DevOps moves towards the mainstream in the enterprise and public sector, the question of how to integrate ever-increasing volumes and types of data from new customer- and business-facing applications with the vast volumes of legacy data looms over many organisations. Right now we are seeing just the tip of the digital transformation iceberg. Two-speed IT will be here for a while yet, meaning that finding the digital data equivalent of a ghostbuster to break down those silos is no longer a phantom concern but a very real one.
