4 experts predict what’s coming in 2018: DevOps, AI, and more
Source – jaxenter.com
Predicting the future is a hard gig. For every right guess, there’s always an over-excited promise that falls short of reality. However, we’re taking a lot of the guesswork out of predicting what’s hot in 2018 by asking the experts what they think.
While nothing is set in stone, it looks like 2018 is going to bring a lot of growth for DevOps, cloud technologies, and data science. However, signs are unclear for what will happen to artificial intelligence and other trendy technologies.
Without further ado, let’s see what our experts had to say!
Patrick McFadin, Vice President Developer Relations, DataStax
AI will go deep into the “trough of disillusionment.” The largest users (Facebook, Google) make it look easy, but companies without that deep experience aren’t seeing the same results.
Stream processing will become further integrated into standard backend databases.
More companies will embrace the multi-cloud as competition heats up between cloud vendors and fear of lock-in becomes more prevalent.
Graph database use cases will become less art and a lot more science as the technology matures.
“Data Autonomy”: fear of the big cloud players will become the main driver for large digital transformation projects. More and more brands will want data autonomy in a multi-cloud world in order to compete and stay ahead. The need and urgency to meet the big cloud players head-on with data-driven applications will intensify.
Chris Carlson, Vice President Product Management, Qualys
JAXenter: How will the role of DevOps/DevSecOps change in 2018?
Chris Carlson: DevOps continues to grow in usage and importance for enterprises of all sizes. Security teams need to understand that DevOps is quickly changing how IT operates, and they need to partner with IT and application development teams much earlier in the planning and execution lifecycle. Building security into the DevOps pipeline, instead of bolting it on after the fact, is what will create successful DevSecOps programs for organizations. Conversely, security teams that try to enable DevSecOps by procuring point solutions that don’t integrate with existing security technologies, processes, and reporting will create even more security silos and introduce blockers that slow down the speed, agility, and automation that DevOps delivers.
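As a toy illustration of “building security into the pipeline,” the sketch below shows a CI gate that blocks a build when a dependency scan reports findings above a severity threshold. The report format, function names, and threshold are invented for the example and are not part of any specific product:

```python
# Hypothetical CI security gate: fail the build early instead of
# bolting security on after deployment. The findings format and
# threshold are invented for illustration.

FAIL_THRESHOLD = 7.0  # CVSS-style score at which the build is blocked

def gate(findings, threshold=FAIL_THRESHOLD):
    """Return (passed, blockers) for a list of scan findings."""
    blockers = [f for f in findings if f["score"] >= threshold]
    return (len(blockers) == 0, blockers)

if __name__ == "__main__":
    scan_report = [
        {"id": "DEP-101", "score": 9.8},  # critical: blocks the build
        {"id": "DEP-202", "score": 4.3},  # low: logged, not blocking
    ]
    passed, blockers = gate(scan_report)
    print("build passed" if passed else
          "build blocked by: " + ", ".join(f["id"] for f in blockers))
```

The point of a gate like this is that it runs on every commit inside the existing pipeline, rather than as a separate after-the-fact security review.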
JAXenter: What do companies need to do in 2018 to improve DevOps processes?
Chris Carlson: Organizations have a lot of opportunities to implement and improve DevOps processes for their development projects. There are many service companies, consultancies, advisories, and vendors that are focusing on helping organizations implement and gain success through DevOps tools and processes.
What is lacking, however, is the seamless and transparent integration of security into DevOps, also known as DevSecOps. The number of security practitioners knowledgeable in DevSecOps is still low. Vendors focused on endpoint and perimeter security have no incentive to evangelize DevSecOps, and the small start-ups providing only limited toolsets for DevOps security don’t have a wide enough solution or big enough mindshare to gain any meaningful traction.
It’s important for companies to establish a culture of DevSecOps and collaboration, and to emphasize its importance, because these processes affect a business’s earning potential and growth.
JAXenter: What will the biggest threats of 2018 be?
Chris Carlson: Threats targeting enterprises will continue to increase, including ransomware against endpoints, organized crime and industrial espionage against endpoints and servers, and attacks against web applications. Many enterprises still lack the ability to protect themselves through good security hygiene (identifying and patching vulnerable systems, locking down open configurations, identifying and fixing web application vulnerabilities). They also lack the ability to hunt, detect, and respond to threats that have bypassed prevention controls and are operating undetected inside the environment. I expect we will start to see data breaches against enterprise applications running in cloud environments, where previously the successful disclosed attacks have been against on-premises applications.
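The “identify and patch vulnerable systems” hygiene step can be sketched as matching a software inventory against an advisory feed. The inventory, package names, and advisory data below are all made up for illustration:

```python
# Toy version of the "identify vulnerable systems" hygiene step:
# match a host inventory against a (made-up) advisory list.

ADVISORIES = {  # package -> first fixed version (illustrative data)
    "webserver": (2, 4, 50),
    "ssl-lib": (1, 1, 1),
}

def parse(version):
    """Turn '2.4.29' into (2, 4, 29) for ordered comparison."""
    return tuple(int(x) for x in version.split("."))

def vulnerable(inventory):
    """Return (host, pkg, version) entries older than the first fixed version."""
    hits = []
    for host, pkg, version in inventory:
        fixed = ADVISORIES.get(pkg)
        if fixed and parse(version) < fixed:
            hits.append((host, pkg, version))
    return hits

if __name__ == "__main__":
    inventory = [
        ("web-01", "webserver", "2.4.29"),  # older than 2.4.50 -> flag
        ("web-02", "webserver", "2.4.51"),  # already patched
        ("db-01", "ssl-lib", "1.0.2"),      # older than 1.1.1 -> flag
    ]
    for host, pkg, version in vulnerable(inventory):
        print(f"{host}: patch {pkg} {version}")
```

Real hygiene programs layer detection and response on top of this kind of inventory-driven loop; the sketch only shows the identification half.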
Matei Zaharia, Chief Technologist at Databricks
A rise in organizations offering datasets for competitive advantage.
Although we already have some of these in certain verticals, we are going to see a massive influx of organizations offering real-world intelligence to other enterprises for a competitive edge. Users will pay by the data set or query, and the difficulty around data cleaning, processing, and machine learning will be eliminated.
Data scientists will continue to grow in number. Data science and computer science are now some of the most popular majors for college students. Although an influx of new graduates will not quench the demand for these roles, we will see more and more organizations starting data science teams and data products.
AI will find more business use cases, starting with verticals. Generic machine learning platforms are difficult for organizations to use, but vertical-specific solutions to common business problems will start to incorporate the newest ML techniques and transform the standard business processes.
Deep learning frameworks will start to converge and move up in abstraction. Although there are currently a large number of frameworks, many of them offer similar feature sets, and with efforts like ONNX, the basic tasks of defining and serving models will be handled well. Instead, developers will start to focus on making these frameworks easier to use for higher level business applications.
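The interchange idea behind efforts like ONNX (define a model once, serve it anywhere) can be pictured with a deliberately tiny toy: one “framework” exports a compute graph as neutral data, and a different “runtime” executes it. The format and op set here are invented and vastly simpler than real ONNX:

```python
# Toy model-interchange format in the spirit of ONNX: framework A
# exports a compute graph as plain data, framework B runs it.
# The op set and format are invented for illustration.

OPS = {"add": lambda x, c: x + c, "mul": lambda x, c: x * c}

def export_model(layers):
    """'Framework A': dump a list of (op, constant) layers to neutral data."""
    return {"format": "toy-ir-v1", "graph": [{"op": op, "c": c} for op, c in layers]}

def run_model(ir, x):
    """'Framework B': execute the neutral graph, knowing nothing of framework A."""
    assert ir["format"] == "toy-ir-v1"
    for node in ir["graph"]:
        x = OPS[node["op"]](x, node["c"])
    return x

if __name__ == "__main__":
    ir = export_model([("mul", 3), ("add", 1)])  # f(x) = 3x + 1
    print(run_model(ir, 2))  # 7
```

Once the exchange layer handles definition and serving, higher-level tooling can compete on ease of use rather than on incompatible graph formats, which is the convergence the prediction describes.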
The cloud will enable new data application architectures. We are already starting to see that data management systems for the cloud can do much more than on-premise systems deployed by forklift onto virtual machines. We expect new offerings from a variety of vendors that take advantage of the scalability, elasticity, availability and ease of management of the public cloud.
Christian Beedgen, Chief Technology Officer, Sumo Logic
AI will not transform the enterprise in the near future.
Previous predictions and claims about the direct impact of AI on enterprises have been overblown. There is excessive hype around how AI will lead us to new discoveries and medical breakthroughs. However, those expecting AI to be the ultimate truth conveyer are mistaken. It will be very hard to design a model that can determine unbiased truth because human bias – whether explicitly or implicitly – will be coded into these data analytics systems and reinforce existing beliefs and prejudices.
With that said, there are certain applications where systems can make better decisions in a shorter amount of time than humans, such as in the case of autonomous vehicles. In 2018 we will begin to see real use cases of the power of AI appear in our everyday lives — it just isn’t ready to be the shining star for the enterprise quite yet. When you look at the maturity of the enterprise, only half of the Global 2000 offer fully digital products. So, despite all of the buzz around digital transformation, there’s a lot of catch-up to be done before many of these companies can even consider looking at advanced developments such as AI.
Demand for multi-cloud, multi-platform will drive the need for multi-choice.
Over the past few years, there has been much debate within enterprise IT about moving critical infrastructure to the cloud – specifically, around which cloud model is the most cost-effective, secure, and scalable. One thing is for certain: the cloud is the present (and future) of enterprise IT. Legacy companies that continue to house their infrastructure predominantly or solely on-premises to support existing or new modern applications will become increasingly irrelevant within a few years, as their cloud-enabled competitors prevail.
Moreover, the pressure is compounded by cloud users demanding choice, which is going to drive massive growth in multi-cloud, multi-platform adoption in 2018. As a result, enterprises will need a unified cloud-native analytics platform that can run across any vendor – whether Amazon, Microsoft, or Google – and include what traditionally runs on-premises. This agnostic model will serve as the backbone of the new world I refer to as the analytics economy, defined by positive disruption at every layer of the stack.
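One way to picture a vendor-agnostic analytics layer is a thin adapter interface that hides which backend a log source lives on. The class and method names below are hypothetical, not any vendor’s API:

```python
# Sketch of a vendor-agnostic log-query layer: each backend adapter
# exposes the same interface, so analytics code above it doesn't care
# whether data lives in AWS, Azure, GCP, or on-premises.
# All names here are hypothetical.

from abc import ABC, abstractmethod

class LogSource(ABC):
    @abstractmethod
    def query(self, term):
        """Return matching log lines for `term`."""

class AwsSource(LogSource):
    def __init__(self, lines): self.lines = lines
    def query(self, term): return [l for l in self.lines if term in l]

class OnPremSource(LogSource):
    def __init__(self, lines): self.lines = lines
    def query(self, term): return [l for l in self.lines if term in l]

def unified_query(sources, term):
    """One query fanned out across every backend, cloud or on-premises."""
    return [line for s in sources for line in s.query(term)]

if __name__ == "__main__":
    sources = [AwsSource(["aws: login ok", "aws: error 500"]),
               OnPremSource(["dc1: error timeout"])]
    print(unified_query(sources, "error"))
```

In a real platform each adapter would wrap a different storage and query engine; the design point is that the analytics layer depends only on the shared interface, not on any one vendor.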