How to Build a Business Case for Application Modernization

A huge portion of IT resources is being wasted on the support and maintenance of outdated legacy applications. A modern, streamlined application environment can help an enterprise save money and remain competitive. Knowledge Center contributor Tim Pacileo offers CIOs five tips to effectively plan and implement a successful application modernization initiative.

CIOs of large organizations recognize the benefits of modernizing applications and moving away from legacy systems. But starting the process, and justifying the investment needed in an application modernization initiative, can be daunting. And too often, the potential gains of a streamlined environment are deferred in favor of a short-term focus on cost containment through the maintenance of outdated, redundant and inefficient legacy applications.

But the hodgepodge of systems characterizing many business operations today is a house of cards that cannot be sustained indefinitely. Top-performing organizations are focusing on immediate, specific actions that yield incremental gains, while at the same time formulating a long-term vision that positions the enterprise to respond effectively to emerging competitive challenges.

The following are five success factors that characterize an effective approach to planning and implementing an application modernization initiative:

Success factor No. 1: Define the stakes

The bottom line is that mismanagement of application environments is perhaps the single factor most responsible for the unconstrained growth of IT spending in business. Over the past 10 years or so, unit costs of IT services have declined substantially. That consistent decline, however, has been more than offset by growth in demand for IT resources.
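A back-of-the-envelope calculation shows why falling unit costs do not translate into falling spend. The decline and growth rates below are illustrative assumptions, not figures from the article; the point is simply that when demand grows faster than unit costs fall, total spend keeps rising.

    # Indexed starting point: cost per unit of IT service and units of demand in year 0.
    unit_cost = 1.00
    demand = 100.0

    unit_cost_decline = 0.05  # assumed 5% annual decline in unit cost
    demand_growth = 0.12      # assumed 12% annual growth in demand

    # Even with unit costs falling every year, total spend climbs
    # because demand grows faster than costs decline.
    for year in range(1, 6):
        unit_cost *= 1 - unit_cost_decline
        demand *= 1 + demand_growth
        print(f"Year {year}: total spend index = {unit_cost * demand:.1f}")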

If this growing demand for IT resources produced a demonstrable business advantage, all would be well. In fact, a huge portion of the increased utilization of IT is being wasted on the support and maintenance of outdated legacy applications. Ironically, the cost-conscious mentality that has allowed IT executives to show steady improvements in operational efficiency is directly responsible for the wastefulness and inefficiency of many application environments today.

The cycle begins when a business application is first developed. To contain costs, the application is often not properly scaled to accommodate growth, resulting in performance issues and rising maintenance and support costs. Applications designed to run for three to five years end up being used for seven to 10 years, or sometimes even longer, again in the interests of "economy."

When implementing an enterprisewide system, meanwhile, many organizations choose to write interfaces between the new applications and the legacy systems rather than building the additional functionality into the enterprise solution. The rationale is short-term savings on development resources. The result, however, is a myriad of systems, platforms and applications, each with its own maintenance requirements and costs. Eventually, this approach drives IT costs up further, and the overriding objective of bringing in the enterprise application in the first place is lost.
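The hidden cost of that decision is easy to model. The sketch below is a simplified illustration with hypothetical system names and dollar figures; it treats each point-to-point interface as a recurring maintenance obligation rather than a one-time development saving.

    from dataclasses import dataclass

    @dataclass
    class LegacyInterface:
        legacy_system: str         # legacy application kept alive behind the interface
        mechanism: str             # nightly file drop, direct DB link, message queue, etc.
        annual_maintenance: float  # assumed yearly cost to keep the mapping current

    # Hypothetical inventory of point-to-point interfaces into the enterprise solution.
    interfaces = [
        LegacyInterface("legacy_billing", "nightly file drop", 40_000),
        LegacyInterface("legacy_inventory", "direct DB link", 55_000),
        LegacyInterface("legacy_hr", "message queue", 30_000),
    ]

    # The "short-term savings" on development show up later as recurring costs.
    total = sum(i.annual_maintenance for i in interfaces)
    print(f"{len(interfaces)} interfaces cost ${total:,.0f} per year to maintain")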

Again, addressing these problems is seen as too costly and complex, so a mentality of stopgap measures and muddling through prevails. Plans to prioritize applications, complete the transition to a new solution or an enterprisewide system, and retire legacy applications, while acknowledged as necessary, invariably get pushed to the back burner.

A chargeback model can be a valuable tool to quantify the costs of a short-term, penny-pinching approach, and it can help CIOs generate the sense of urgency required to take action. CIOs should also emphasize the risks of being the "last man standing" with a legacy portfolio, and underscore the reality that no silver bullets exist to enable a quick and easy transition.
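As a rough illustration of what a chargeback model does, the sketch below allocates each legacy application's support cost to the business units that consume it. The applications, units, costs and usage shares are all assumptions made for the example, not prescribed values; the point is to make the ongoing price of legacy support visible to the people who incur it.

    # Annual support and maintenance cost per legacy application (assumed figures).
    support_costs = {
        "legacy_billing": 400_000,
        "legacy_inventory": 250_000,
    }

    # Fraction of each application's usage attributable to each business unit (assumed).
    usage_share = {
        "legacy_billing": {"sales": 0.7, "finance": 0.3},
        "legacy_inventory": {"operations": 0.8, "sales": 0.2},
    }

    # Allocate costs back to the consuming units so the cost of legacy support is visible.
    chargeback = {}
    for app, cost in support_costs.items():
        for unit, share in usage_share[app].items():
            chargeback[unit] = chargeback.get(unit, 0.0) + cost * share

    for unit, amount in sorted(chargeback.items()):
        print(f"{unit}: ${amount:,.0f} charged back for legacy application support")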