Data management consumes a wide swath of time, computing power and staff hours every day inside an IT system. It can also be the most complicated part of any IT system.
Think I’m kidding? Take a look at QuinStreet’s Webopedia page on data management and see how many components and subcategories fall under it. Alrighty then.
A good data management system is like an efficient library: It’s only as good as its ability to store, retrieve and produce files quickly enough to get a task or workload completed on time.
Despite that simple definition, there’s always a lot to be said about how conventional data management systems work, or don’t. There’s always room for a new idea, as long as it works.
Here are predictions and perspectives for 2018 from knowledgeable professionals in this sector.
Guy Churchward, CEO of DataTorrent: Commoditization of enterprise applications is on the rise.
“Long gone are the days when critical enterprise applications were developed in multi-year cycles. The rapid speed at which companies, and their adversaries (ransomware, account takeovers, etc.), are evolving requires the technology industry to change its enterprise application development and delivery model. The market is moving so much faster now. Enterprise-grade applications need to be deployed in weeks or months, certainly not years, but new apps generally don’t have the ‘bake in’ time that gives them that industrialized robustness. A general rule of thumb is that if a customer knows they need change, then they’re already late. Commoditized enterprise applications give customers critical building blocks at enterprise grade, getting them to an outcome-based result within a quarter or two at most. The demands are such that there won’t be any leeway on availability or timeline, so let’s just get used to developing in the digital economy.”
Greg Hoffer, VP of Engineering, Globalscape: Modernization and digital transformation forge ahead.
“We continue to see companies emphasizing modernization and digital transformation in order to realize the benefits of availability, resilience, performance and cost management. There will continue to be challenges to face, though, in ensuring that digital transformation results in solution architectures that provide appropriate security, visibility and compliance.
“A digital transformation does no good if the end result is worse than the original. Challenges include selecting the right cloud platforms and services; finding the right personnel to architect the appropriate solution; and creating effective coordination among the disparate cloud services, SaaS offerings, API endpoints, etc.
“In 2018, we will see increasing emphasis put into digital transformation, including increased migration of traditional workloads to cloud services, as well as more adoption of the various tools and services that continue to evolve to help with these transformation strategies (like iPaaS, API management and CASB).”
- iPaaS (Integration Platform as a Service): “We’ll see a continued evolution of the iPaaS market, with entrenched, traditional integration players (IBM, Informatica, Mulesoft, SnapLogic, etc.) growing their cloud-based iPaaS capabilities, and consolidation among the iPaaS market’s vast number of players. This is all good news for consumers, because increased iPaaS capabilities help their digital transformation strategies, in addition to providing robust platforms for gluing together best-of-breed cloud services in a reliable, cost-effective, approachable manner.”
- Managed File Transfer: “We should continue to see evolution of the traditional, on-premises MFT offerings into true cloud-native and cloud-enabled solutions. Transacting business across partner relationships and supply chains will continue to be a part of life for commercial enterprises, and thus the need to transfer data to and from internal and external systems will not go away. Deployment options that better support the cloud are a necessary change, as is continued expansion of security and compliance capabilities.”
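The core MFT requirement described above, moving a file between systems without silently corrupting it, can be sketched as a checksum-verified copy. This is a minimal illustration using Python's standard library, not any vendor's API; the function names are invented for the example.

```python
import hashlib
import shutil
from pathlib import Path


def sha256sum(path: Path) -> str:
    """Return the SHA-256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()


def transfer(src: Path, dest_dir: Path) -> Path:
    """Copy a file into dest_dir and verify the copy matches the source."""
    dest_dir.mkdir(parents=True, exist_ok=True)
    dest = dest_dir / src.name
    shutil.copy2(src, dest)  # copy2 preserves timestamps/metadata
    if sha256sum(src) != sha256sum(dest):
        dest.unlink()  # discard the corrupted copy rather than keep bad data
        raise IOError(f"checksum mismatch transferring {src.name}")
    return dest
```

A real MFT product layers encryption, retries, audit logging and compliance reporting on top of this basic copy-and-verify loop; the sketch only shows the integrity step.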
Johnny Ghibril, Vice-President of Technology Solutions at B.Yond:
“In 2018, we will see the first true application of edge computing as it moves from a wireline to a wireless technology. Many operators will realize that the ROI from virtualizing their networks has failed to materialize due to increased operational complexity and poor architectural foundations. Operators will turn to AI to help solve these problems. We will continue to see more adoption of AI, with an increased focus on the cost-saving benefits for network management, service assurance and operations.”
Chad Hollingsworth, co-founder and CEO, Triax Technologies:
- Integration momentum will take hold: “In 2018, integrated systems will no longer be optional, but mandatory. While contractors have traditionally been limited by separate methods and tools for estimating, bidding, collaboration and reporting, more firms will demand a single stream of real-time, data-driven insights that can be used to improve project management and execution. Hardware and software providers will offer more options and flexibility than ever before, and a system’s ability to collect data at scale will be the key differentiator.”
- Use of real-time data will increase: “Increased technology usage leads to more sources of insightful data, and IoT early adopters will place an increasingly high priority on reporting, cloud-based dashboards, and the visualization of data. This plethora of new data, together with emerging tools such as artificial intelligence (AI), machine learning and predictive analytics, will be used to unlock insights to enhance decision-making, increase efficiency, and improve profitability at the jobsite.”
Kelly Stirman, VP of Strategy, CMO of Dremio: Technology vendors will focus on a new problem: Data consumer productivity.
“For most of the past decade, key areas of technology have focused on improving developer productivity. This includes cloud vendors like AWS; data platforms like Hadoop, NoSQL databases, and Splunk; and infrastructure like Docker, Mulesoft, Mesosphere, and Kubernetes. Why? Developers have been the craftspeople responsible for digitizing key areas of society by recasting them as software. Now vendors will start to focus on a new group of users: data consumers. For every developer there are 10 data analysts, data scientists, and data engineers, totaling over 200M today and growing rapidly.
“Everyone likes to say ‘data is the new oil,’ and while products like Tableau have catered to the visualization of data, there are many steps in the ‘data refinery pipeline’ that are still IT-focused and a million miles from the self-service that developers enjoy today with their tools. Vendors will start to close the gap and focus on dramatically improving the productivity of this critical market.”
Nariman Teymourian, HyperGrid CEO: Everybody will be talking about the “edge.”
“The generation of data at the edge is driving the need for compute at the edge. Machine learning and the internet of things will drive the need for compute to be available closer to the edge so that large volumes of data can be processed quickly and actionable results can be delivered back to the machine. The faster the results can be delivered, the greater the competitive advantage.”
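As a rough illustration of why compute at the edge cuts latency, here is a hypothetical controller that keeps a short rolling window of sensor readings and makes the decision locally, so no network round trip sits between the data and the action. The class name, window size, threshold and readings are all invented for the example.

```python
from collections import deque


class EdgeController:
    """Illustrative edge node: decides next to the machine instead of
    shipping raw readings to a distant data center and waiting."""

    def __init__(self, window: int = 5, limit: float = 80.0):
        self.readings = deque(maxlen=window)  # only recent data lives at the edge
        self.limit = limit

    def ingest(self, value: float) -> str:
        """Record a reading and return an action for the machine."""
        self.readings.append(value)
        avg = sum(self.readings) / len(self.readings)
        # The decision costs one local average, not a network round trip.
        return "throttle" if avg > self.limit else "ok"
```

The point of the sketch is the placement, not the logic: the same threshold check run in a remote cloud would return the same answer, just too late to matter for a machine that needs the result now.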
Marsha Ershaghi Hames, Managing Director, Strategy & Development at LRN: We will see a rise in compliance accountability and enforcement.
“With the surge of claims around misconduct, sexual harassment and discrimination, we can expect 2018 to bring a rise in the review and revision of regulations at the state level. Companies will most certainly respond by reviewing their policies and addressing the issues through targeted training, especially with a spotlight on leadership accountability.
“I expect that 2018 will bring a renewed focus on leadership responsibility for corporate culture, and on how this accountability plays out in driving the business. Compliance issues will no longer be viewed as just a legal issue but as a business and reputational issue, and this starts with leaders taking ownership, listening up and setting the tone for behavior accountability. The patterns of allowing and/or ignoring inappropriate conduct are being pressure tested under the microscope of today’s standards of full transparency. Gone are the days of corporations, or even Congress, going to great lengths to pay out settlements that keep damaging allegations, and the harm they do to valuable brand and reputational assets, out of the headlines.”
Adam Famularo, CEO of Erwin:
- Data Governance 2.0 will take hold because Data Governance 1.0 was a failure: “The age of Data Governance 1.0 was characterized by IT serving as data custodians, mainly cataloging data elements without a grasp of their meaning or relationships. Consequently, the costs of controlling data risk often became needlessly excessive and opportunities to drive business agility were missed. It was a total failure. In 2018, Data Governance 2.0 will shine as it moves out of IT’s shadow to encompass the entire enterprise. CFOs, CMOs and all data stakeholders will be involved in data governance, not just traditional data stewards.”
- Data governance will be vital to all aspects of data, including master data management, business intelligence, IoT and AI: “Social media, cars, TVs, refrigerators, thermostats, etc.: everything is connected and producing vast new quantities of data. Big data is only getting bigger. First, organizations will need to be able to process this data at high speed. Second, organizations will need a strategy to manage and integrate this never-ending data stream. Even small chunks of data will accumulate into large volumes if they arrive fast enough, which is why it’s critical for businesses to invest in data architecture, management and governance. All of this new data is disparate, it’s noisy, and in its raw form it’s often useless. So to unlock data’s potential, businesses must take the necessary steps to ‘clean it up,’ which requires an effective strategy to discover, understand, govern and socialize their data. And with all this data comes tremendous risk.”
- Increased focus on data governance will accelerate digital transformation: “In 2017, digital transformation was all the buzz. However, with enterprises tapping into less than one percent of their overall data, we’ve yet to see the full impact of the transformation. While organizations have historically viewed data governance as a compliance exercise, the fact is that data governance is the foundation for digital transformation. As organizations expand data governance out of IT and into the business, and as organizations dedicate resources to prepare for GDPR, you’ll see data being used in more meaningful and game-changing ways.”
- Data governance will change the way the business views and consumes data: “The business will begin to treat its data assets the same way it treats physical assets, both to reduce the regulatory and reputational risks to the organization and to use data as a valuable resource that helps employees excel in their day-to-day jobs.”
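The ‘clean it up’ step mentioned above can be sketched as a minimal validation-and-normalization pass over raw records before they enter governed storage. The schema (`device_id`, `temp_c`) and the plausibility range are assumptions made up for illustration, not any real standard.

```python
def clean(records):
    """Drop malformed records and normalize the rest.
    Fields and ranges here are illustrative, not a real schema."""
    cleaned = []
    for rec in records:
        dev = str(rec.get("device_id", "")).strip().lower()  # normalize IDs
        try:
            temp = float(rec.get("temp_c"))
        except (TypeError, ValueError):
            continue  # noisy/unparseable reading: discard
        if not dev or not (-40.0 <= temp <= 125.0):
            continue  # missing ID or physically implausible value
        cleaned.append({"device_id": dev, "temp_c": temp})
    return cleaned
```

In a governed pipeline the discarded records would typically be quarantined and counted rather than silently dropped, so data stewards can see what the refinery is rejecting and why.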
Patrick O’Keeffe, Executive Director of Software Engineering at Quest Software: Open-source database platforms will see a resurgence.
“Studies have shown that growth in open source database adoption is outpacing traditional database management systems. Though MySQL is currently the most widely used platform, I predict Postgres will show significant growth and start making a dent in MySQL’s market share in the coming year. It is built for high-volume environments, is highly extensible, has a reputation for flawless reliability and stability, and is supported by a commercial vendor offering additional assurance.”
Mark Bregman, Senior Vice-President and CTO, NetApp:
- “Data will become self-aware and even more diverse than it is today. With this, the metadata will enable the data to proactively transport, categorize, analyze and protect itself; in other words, to self-govern. Real-time mapping of the flow between data, applications and storage elements will evolve, and the data will deliver the exact information a user needs when they need it.”
- “Data will be generated at an unprecedented rate that will greatly exceed the ability to transport it. Rather than moving the data, the applications and resources needed to process it will be moved to the data. This will have significant implications for new architectures like edge, core, and cloud. The amount of data ingested in the core will continuously be less than the amount generated at the edge. This must be deliberately enabled to ensure that the right data is being retained for later decision making.”
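One way to read the edge/core point above: the edge ships a compact summary plus only the raw readings worth retaining, rather than the full stream. A minimal sketch, assuming numeric sensor readings and an invented anomaly threshold:

```python
import statistics


def summarize_for_core(readings, keep_anomalies_above=100.0):
    """Reduce a non-empty batch of raw edge readings to a small record
    for the core: aggregates plus only the anomalous raw values.
    The threshold is an illustrative assumption, not a real spec."""
    return {
        "count": len(readings),                 # how much was generated at the edge
        "mean": statistics.fmean(readings),     # cheap aggregate for later analysis
        "max": max(readings),
        # The only raw values the core retains: the "right data" worth keeping.
        "anomalies": [r for r in readings if r > keep_anomalies_above],
    }
```

Deciding what counts as "anomalous" is exactly the deliberate choice the prediction calls for: it is the edge-side policy that determines which data survives for later decision making at the core.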