Predictions 2019: How Enterprises Can Improve Software Delivery

eWEEK DATA POINTS: Industry leaders at Delphix, Electric Cloud, IBM, JFrog, Pivotal, XebiaLabs and Datical identify the primary areas on which organizations should focus in 2019 to continue improving software delivery outcomes.


DevOps isn’t a thing you buy; it’s a process you adopt, much like agile software development. DevOps is about constant improvement: moving from DevOps 1.0 to 2.0 and beyond.

DevOps is a software development methodology that combines software development with IT operations. The goal of DevOps is to shorten the systems development life cycle while delivering features, fixes and updates frequently in close alignment with business objectives.

In this eWEEK Data Point article, industry leaders at Delphix, Electric Cloud, IBM, JFrog, Pivotal, XebiaLabs and Datical identify the primary areas on which organizations should focus in 2019 to continue improving software delivery outcomes. Two common themes emerge: Organizations need to get faster and need to do so by eliminating manual efforts. Automation, DevOps and agile development all can fit the bill in various ways here.

Andreas Prins, Vice President of Product Development, XebiaLabs:
“In 2019, developers will start freeing themselves from unproductive, burdensome tasks like scripting pipelines by connecting their activities and CI pipelines, such as Jenkins, to the rest of the software delivery team and DevOps pipeline. Integrating in this way will let other team members autonomously monitor the status of feature delivery and help developers reduce interruptions so they can spend more time creating business value.”
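To make that kind of integration concrete, here is a minimal sketch of how build status could be surfaced to the rest of the delivery team without interrupting developers. Jenkins exposes build metadata as JSON under `<job>/lastBuild/api/json`; the job URL below is a hypothetical example, and the summary format is our own illustration, not XebiaLabs' product behavior.

```python
import json
from urllib.request import urlopen

# Hypothetical job URL for illustration; any Jenkins job exposes
# lastBuild/api/json with "displayName", "building" and "result" fields.
JOB_URL = "https://jenkins.example.com/job/feature-checkout/lastBuild/api/json"

def summarize_build(payload: dict) -> str:
    """Turn a Jenkins lastBuild JSON payload into a one-line status."""
    name = payload.get("displayName", "unknown build")
    if payload.get("building"):
        return f"{name}: still running"
    return f"{name}: {payload.get('result', 'UNKNOWN')}"

def fetch_status(url: str = JOB_URL) -> str:
    """Fetch the latest build metadata (network call) and summarize it."""
    with urlopen(url) as resp:
        return summarize_build(json.load(resp))
```

A status line like this, published to a team dashboard or chat channel, is what lets other team members monitor feature delivery autonomously instead of asking the developer.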

Sunil Mavadia, Global Head of Customer Journey, Electric Cloud:
“As organizations continue to make big bets on DevOps in 2019, it’s important to help them safely adapt to technological shifts underway so they can remain competitive. To that end, we are hearing about five ‘big ticket items’ on their radar:

1. Cloud Migration
2. Automation
3. Artificial Intelligence/Machine Learning
4. DevSecOps
5. Upskilling”

Sanjeev Sharma, Vice President, Global Practice Director of Data Transformation, Delphix:
“As organizations achieve innovation velocity by adopting DevOps at scale, they can now address two key challenges:

1. Security and compliance: Organizations are realizing the value of continuously delivering smaller batches of changes and validating security and compliance with every sprint, rather than as a separate step just before release. Organizations are beginning to include security teams as first-class members of their development squads, bringing continuous validation right into their dev sprints.

2. Data-driven applications: DevOps practices are now being adopted by data producers and consumers, allowing them to treat data as a deployable asset no different from code and accelerating time to value for their data-driven applications. Data-change cycles are being better synchronized with code-delivery cycles. This area, however, is still in its infancy. Organizations will need to make the necessary transformations across tools, practices and skills to manage, change and collaborate around data just as they do around code.”
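Treating database changes as a deployable asset usually means versioned, repeatable migrations that ship through the same pipeline as application code. The sketch below shows the core idea in the spirit of tools like Datical or Liquibase; the table name, change list and use of SQLite are illustrative assumptions, not any vendor's actual implementation.

```python
import sqlite3

# Ordered, versioned schema changes. In practice each change would live
# in its own file under version control and flow through CI like code.
MIGRATIONS = [
    (1, "CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)"),
    (2, "ALTER TABLE customers ADD COLUMN email TEXT"),
]

def migrate(conn: sqlite3.Connection) -> int:
    """Apply any unapplied migrations in order; return how many ran."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS schema_version (version INTEGER PRIMARY KEY)"
    )
    applied = {v for (v,) in conn.execute("SELECT version FROM schema_version")}
    count = 0
    for version, sql in MIGRATIONS:
        if version not in applied:
            conn.execute(sql)
            conn.execute("INSERT INTO schema_version VALUES (?)", (version,))
            count += 1
    conn.commit()
    return count
```

Because each change is recorded, running the migration step twice applies nothing the second time; that idempotence is what makes data changes safe to automate alongside code delivery.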

Kit Merker, VP Business Development, JFrog:
“The biggest challenges that DevOps will help solve are the ones the success of DevOps creates. More frequent releases mean more binaries, more storage, more data to manage—contributing to 44 zettabytes worldwide by 2020. As we continue to generate more metadata about those binaries, on their origin, behavior and security, we can fuel machine learning to bring even more automated improvements and speed to DevOps. It may be a while before the promise of AI in DevOps is fully realized, but we believe we will start to see some exciting advances in 2019.”

Eric Minick, Product Management Lead-DevOps, IBM:
“Establishing continuous delivery pipelines is no longer a fringe activity. It’s quite mainstream, and in 2019 we are going to see more attempts to drive it across the enterprise. I expect this effort to scale continuous delivery (CD) to produce three big trends. First, some enterprises will select approved toolchains that can drive CD for every application. Second, and conversely, other enterprises will embrace a multitude of CD solutions, standardizing on one per major platform: one set of tools for containers, another for mainframes, another for Java apps and another for databases. Coordinating across that diversity will lead to the third big trend: an increased emphasis on release management and on measuring delivery effectiveness and flow across the enterprise.

"With so much data flowing and an emphasis on release management decision making, look out for more AI capabilities in your toolchains.”

Dormain Drewitz, Senior Director, Product Marketing, Pivotal:
"In 2019, we will see more mainstream adopters of DevOps enjoy the benefits of automated patching, with patching cycles collapsing from months to weeks. Software-defined networks will enable more companies to create and operate application platforms that have the network layer treated as code. While developers can't ignore the network entirely (latency and network unreliability are realities), network-as-code will help DevOps teams tremendously. Finally, we will see scattered examples emerge of cloud-native data architectures from non-internet companies. Domain-driven design will be a common foundation for those advancing DevOps practices to include data."

Robert Reeves, Co-founder and CTO, Datical:
“From my perspective, we will definitely stop hearing it referred to as ‘the DevOps.’ Or, I hope so! As DevOps adoption increases, we will start to see adoption by teams that were not even considered in the first iteration of DevOps. This follows the same path as agile, when we started seeing things such as IaC (infrastructure as code), and it will be driven by the clear benefits that those teams (security, database, network) see happening in other areas. Also, we’ll see another security breach directly attributed to manual change and unpatched dependent libraries. Sigh …

“Tom Petty was wrong; ‘waiting’ is not the hardest part. Continuing to improve is the hardest part. DevOps is not something you do once and then claim victory. Like going to the gym and eating better, you must keep finding areas of manual effort and eliminating them. And, yes, executing a script is a manual effort. From the application to the infrastructure to the database to security, we are mired in manual efforts that DevOps can remove. Where the database, system patching or security was previously viewed as ‘too important to leave up to machines,’ we need to change that thinking and say it is ‘too important to leave up to humans.’”
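Reeves' point about unpatched dependent libraries is one place where even a small automated check beats a manual review. The sketch below flags dependencies older than a known-safe version; the package names and version numbers are made up for illustration, and a real pipeline would pull them from a lockfile and a vulnerability feed rather than hard-coded dictionaries.

```python
def parse_version(v: str) -> tuple:
    """Convert '1.4.10' into (1, 4, 10) for simple numeric comparison."""
    return tuple(int(part) for part in v.split("."))

def unpatched(installed: dict, minimum_safe: dict) -> list:
    """Return names of dependencies older than their known-safe versions."""
    return [
        name
        for name, version in installed.items()
        if name in minimum_safe
        and parse_version(version) < parse_version(minimum_safe[name])
    ]

# Hypothetical inventory; in a real pipeline this check would gate the build.
installed = {"libssl": "1.0.2", "loglib": "2.14.1"}
minimum_safe = {"libssl": "1.1.1", "loglib": "2.17.0"}
```

Wired into CI, a failing check here blocks the release automatically, which is exactly the shift from "too important to leave up to machines" to "too important to leave up to humans."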

If you have a suggestion for an eWEEK Data Point article, email [email protected].

Chris Preimesberger

Chris J. Preimesberger is Editor-in-Chief of eWEEK and responsible for all the publication's coverage. In his 15 years and more than 4,000 articles at eWEEK, he has distinguished himself in reporting...